# bertin/utils/generate_datasets.py
# source: ruinunca/data_tooling @ 297e1f8c, license: Apache-2.0
import json
import logging
import os
from datasets import load_dataset
from tqdm import tqdm
# Setup logging
logging.basicConfig(
format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
level="INFO",
datefmt="[%X]",
)
# Log on each process the small summary:
logger = logging.getLogger(__name__)
os.system("wget http://dl.fbaipublicfiles.com/cc_net/lm/es.arpa.bin")
mc4 = load_dataset(
"./mc4",
"es",
split="train",
sampling_method="steps",
perplexity_model="./es.arpa.bin",
sampling_factor=1.5e5,
boundaries=[536394.99320948, 662247.50212365, 919250.87225178],
streaming=True,
).shuffle(buffer_size=10000, seed=2021)
total = 0
with open("mc4-es-train-50M-steps.jsonl", "w") as f:
for sample in tqdm(mc4, total=50_000_000):
f.write(json.dumps(sample) + "\n")
total += 1
if total >= 50_000_000:
break
mc4val = load_dataset(
"./mc4",
"es",
split="validation",
sampling_method="steps",
perplexity_model="./es.arpa.bin",
sampling_factor=5e5,
boundaries=[536394.99320948, 662247.50212365, 919250.87225178],
streaming=True,
).shuffle(buffer_size=10000, seed=2021)
total = 0
with open("mc4-es-validation-5M-steps.jsonl", "w") as f:
for sample in tqdm(mc4val, total=5_000_000):
f.write(json.dumps(sample) + "\n")
total += 1
if total >= 5_000_000:
break
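Each loop above writes one JSON object per line (JSONL), so the resulting file can be streamed back record by record without ever holding 50M samples in memory. A minimal sketch of such a reader (the tiny stand-in file below is illustrative, not part of the script):

```python
import json

def iter_jsonl(path):
    """Yield one decoded record per line of a JSONL file."""
    with open(path) as f:
        for line in f:
            if line.strip():
                yield json.loads(line)

# Tiny stand-in for a file like mc4-es-train-50M-steps.jsonl:
with open("tiny.jsonl", "w") as f:
    f.write(json.dumps({"text": "hola"}) + "\n")
    f.write(json.dumps({"text": "adios"}) + "\n")

records = list(iter_jsonl("tiny.jsonl"))
print(len(records))  # 2
```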
# ------------------
import json
import logging
from datasets import load_dataset
from tqdm import tqdm
# Setup logging
logging.basicConfig(
format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
level="INFO",
datefmt="[%X]",
)
# Log on each process the small summary:
logger = logging.getLogger(__name__)
mc4 = load_dataset(
"./mc4",
"es",
split="train",
sampling_method="gaussian",
perplexity_model="../es.arpa.bin",
sampling_factor=0.78,
boundaries=[536394.99320948, 662247.50212365, 919250.87225178],
streaming=True,
).shuffle(buffer_size=10000, seed=2021)
total = 0
with open("mc4-es-train-50M-gaussian.jsonl", "w") as f:
for sample in tqdm(mc4, total=50_000_000):
f.write(json.dumps(sample) + "\n")
total += 1
if total >= 50_000_000:
break
mc4val = load_dataset(
"./mc4",
"es",
split="validation",
sampling_method="gaussian",
perplexity_model="../es.arpa.bin",
sampling_factor=1,
boundaries=[536394.99320948, 662247.50212365, 919250.87225178],
streaming=True,
).shuffle(buffer_size=10000, seed=2021)
total = 0
with open("mc4-es-validation-5M-gaussian.jsonl", "w") as f:
for sample in tqdm(mc4val, total=5_000_000):
f.write(json.dumps(sample) + "\n")
total += 1
if total >= 5_000_000:
break
# ------------------
import json
import logging
from datasets import load_dataset
from tqdm import tqdm
# Setup logging
logging.basicConfig(
format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
level="INFO",
datefmt="[%X]",
)
# Log on each process the small summary:
logger = logging.getLogger(__name__)
mc4 = load_dataset(
"./mc4",
"es",
split="train",
sampling_method="random",
perplexity_model="../es.arpa.bin",
sampling_factor=0.5,
boundaries=[536394.99320948, 662247.50212365, 919250.87225178],
streaming=True,
).shuffle(buffer_size=10000, seed=2021)
total = 0
with open("mc4-es-train-50M-random.jsonl", "w") as f:
for sample in tqdm(mc4, total=50_000_000):
f.write(json.dumps(sample) + "\n")
total += 1
if total >= 50_000_000:
break
mc4val = load_dataset(
"./mc4",
"es",
split="validation",
sampling_method="random",
perplexity_model="../es.arpa.bin",
sampling_factor=0.5,
boundaries=[536394.99320948, 662247.50212365, 919250.87225178],
streaming=True,
).shuffle(buffer_size=10000, seed=2021)
total = 0
with open("mc4-es-validation-5M-random.jsonl", "w") as f:
for sample in tqdm(mc4val, total=5_000_000):
f.write(json.dumps(sample) + "\n")
total += 1
if total >= 5_000_000:
break
# ------------------
# rook/db/base.py
# source: lysnikolaou/fastapi-boilerplate @ 84ea54b5, license: MIT
from rook.db.base_class import Auditable # noqa
from rook.db.base_class import Base # noqa
from rook.models.user.user import User # noqa
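These re-exports follow the usual SQLAlchemy convention: a single base module imports every model so that importing it registers all tables on `Base.metadata` before `create_all` or Alembic autogenerate runs. A toy registry, using only the standard library, sketches the same import-side-effect mechanism (all names here are illustrative, not taken from the repo):

```python
# Registration happens when a model class is *created*, i.e. when its
# defining module is imported -- which is exactly what a base module
# like the one above guarantees.

class Metadata:
    def __init__(self):
        self.tables = {}

metadata = Metadata()

class ModelMeta(type):
    def __init__(cls, name, bases, ns):
        super().__init__(name, bases, ns)
        tablename = ns.get("__tablename__")
        if tablename:
            metadata.tables[tablename] = cls  # side effect of class creation

class Base(metaclass=ModelMeta):
    pass

class User(Base):  # illustrative stand-in for rook.models.user.user.User
    __tablename__ = "users"

print(sorted(metadata.tables))  # ['users']
```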
# ------------------
# tests/providers/googledrive/test_provider.py
# source: Johnetordoff/waterbutler @ b505cdbc, license: Apache-2.0
import copy
import pytest
import io
from http import client
import aiohttpretty
from json import dumps
from waterbutler.core import streams
from waterbutler.core import exceptions
from waterbutler.core.path import WaterButlerPath
from waterbutler.providers.googledrive import settings as ds
from waterbutler.providers.googledrive import GoogleDriveProvider
from waterbutler.providers.googledrive.provider import GoogleDrivePath
from waterbutler.providers.googledrive.metadata import GoogleDriveRevision
from waterbutler.providers.googledrive.metadata import GoogleDriveFileMetadata
from waterbutler.providers.googledrive.metadata import GoogleDriveFolderMetadata
from waterbutler.providers.googledrive.metadata import GoogleDriveFileRevisionMetadata
from tests.providers.googledrive import fixtures
@pytest.fixture
def file_content():
return b'SLEEP IS FOR THE WEAK GO SERVE STREAMS'
@pytest.fixture
def file_like(file_content):
return io.BytesIO(file_content)
@pytest.fixture
def file_stream(file_like):
return streams.FileStreamReader(file_like)
@pytest.fixture
def auth():
return {
'name': 'cat',
'email': 'cat@cat.com',
}
@pytest.fixture
def credentials():
return {'token': 'hugoandkim'}
@pytest.fixture
def settings():
return {
'folder': {
'id': '19003e',
'name': '/conrad/birdie',
},
}
@pytest.fixture
def provider(auth, credentials, settings):
return GoogleDriveProvider(auth, credentials, settings)
@pytest.fixture
def search_for_file_response():
return {
'items': [
{'id': '1234ideclarethumbwar'}
]
}
@pytest.fixture
def no_file_response():
return {
'items': []
}
@pytest.fixture
def actual_file_response():
return {
'id': '1234ideclarethumbwar',
'mimeType': 'text/plain',
'title': 'B.txt',
}
@pytest.fixture
def search_for_folder_response():
return {
'items': [
{'id': 'whyis6afraidof7'}
]
}
@pytest.fixture
def no_folder_response():
return {
'items': []
}
@pytest.fixture
def actual_folder_response():
return {
'id': 'whyis6afraidof7',
'mimeType': 'application/vnd.google-apps.folder',
'title': 'A',
}
def _build_title_search_query(provider, entity_name, is_folder=True):
return "title = '{}' " \
"and trashed = false " \
"and mimeType != 'application/vnd.google-apps.form' " \
"and mimeType != 'application/vnd.google-apps.map' " \
"and mimeType != 'application/vnd.google-apps.document' " \
"and mimeType != 'application/vnd.google-apps.drawing' " \
"and mimeType != 'application/vnd.google-apps.presentation' " \
"and mimeType != 'application/vnd.google-apps.spreadsheet' " \
"and mimeType {} '{}'".format(
entity_name,
'=' if is_folder else '!=',
provider.FOLDER_MIME_TYPE
)
class TestValidatePath:
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_validate_v1_path_file(self, provider, search_for_file_response,
actual_file_response, no_folder_response):
file_name = 'file.txt'
file_id = '1234ideclarethumbwar'
query_url = provider.build_url(
'files', provider.folder['id'], 'children',
q=_build_title_search_query(provider, file_name, False),
fields='items(id)'
)
wrong_query_url = provider.build_url(
'files', provider.folder['id'], 'children',
q=_build_title_search_query(provider, file_name, True),
fields='items(id)'
)
specific_url = provider.build_url('files', file_id, fields='id,title,mimeType')
aiohttpretty.register_json_uri('GET', query_url, body=search_for_file_response)
aiohttpretty.register_json_uri('GET', wrong_query_url, body=no_folder_response)
aiohttpretty.register_json_uri('GET', specific_url, body=actual_file_response)
try:
wb_path_v1 = await provider.validate_v1_path('/' + file_name)
except Exception as exc:
pytest.fail(str(exc))
with pytest.raises(exceptions.NotFoundError) as exc:
await provider.validate_v1_path('/' + file_name + '/')
assert exc.value.code == client.NOT_FOUND
wb_path_v0 = await provider.validate_path('/' + file_name)
assert wb_path_v1 == wb_path_v0
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_validate_v1_path_folder(self, provider, search_for_folder_response,
actual_folder_response, no_file_response):
folder_name = 'foofolder'
folder_id = 'whyis6afraidof7'
query_url = provider.build_url(
'files', provider.folder['id'], 'children',
q=_build_title_search_query(provider, folder_name, True),
fields='items(id)'
)
wrong_query_url = provider.build_url(
'files', provider.folder['id'], 'children',
q=_build_title_search_query(provider, folder_name, False),
fields='items(id)'
)
specific_url = provider.build_url('files', folder_id, fields='id,title,mimeType')
aiohttpretty.register_json_uri('GET', query_url, body=search_for_folder_response)
aiohttpretty.register_json_uri('GET', wrong_query_url, body=no_file_response)
aiohttpretty.register_json_uri('GET', specific_url, body=actual_folder_response)
try:
wb_path_v1 = await provider.validate_v1_path('/' + folder_name + '/')
except Exception as exc:
pytest.fail(str(exc))
with pytest.raises(exceptions.NotFoundError) as exc:
await provider.validate_v1_path('/' + folder_name)
assert exc.value.code == client.NOT_FOUND
wb_path_v0 = await provider.validate_path('/' + folder_name + '/')
assert wb_path_v1 == wb_path_v0
class TestUpload:
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_upload_create(self, provider, file_stream):
upload_id = '7'
item = fixtures.list_file['items'][0]
path = WaterButlerPath('/birdie.jpg', _ids=(provider.folder['id'], None))
start_upload_url = provider._build_upload_url('files', uploadType='resumable')
finish_upload_url = provider._build_upload_url('files', uploadType='resumable', upload_id=upload_id)
aiohttpretty.register_json_uri('PUT', finish_upload_url, body=item)
aiohttpretty.register_uri('POST', start_upload_url, headers={'LOCATION': 'http://waterbutler.io?upload_id={}'.format(upload_id)})
result, created = await provider.upload(file_stream, path)
expected = GoogleDriveFileMetadata(item, path)
assert created is True
assert result == expected
assert aiohttpretty.has_call(method='PUT', uri=finish_upload_url)
assert aiohttpretty.has_call(method='POST', uri=start_upload_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_upload_doesnt_unquote(self, provider, file_stream):
upload_id = '7'
item = fixtures.list_file['items'][0]
path = GoogleDrivePath('/birdie%2F %20".jpg', _ids=(provider.folder['id'], None))
start_upload_url = provider._build_upload_url('files', uploadType='resumable')
finish_upload_url = provider._build_upload_url('files', uploadType='resumable', upload_id=upload_id)
aiohttpretty.register_json_uri('PUT', finish_upload_url, body=item)
aiohttpretty.register_uri('POST', start_upload_url, headers={'LOCATION': 'http://waterbutler.io?upload_id={}'.format(upload_id)})
result, created = await provider.upload(file_stream, path)
expected = GoogleDriveFileMetadata(item, path)
assert created is True
assert result == expected
assert aiohttpretty.has_call(method='POST', uri=start_upload_url)
assert aiohttpretty.has_call(method='PUT', uri=finish_upload_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_upload_update(self, provider, file_stream):
upload_id = '7'
item = fixtures.list_file['items'][0]
path = WaterButlerPath('/birdie.jpg', _ids=(provider.folder['id'], item['id']))
start_upload_url = provider._build_upload_url('files', path.identifier, uploadType='resumable')
finish_upload_url = provider._build_upload_url('files', path.identifier, uploadType='resumable', upload_id=upload_id)
aiohttpretty.register_json_uri('PUT', finish_upload_url, body=item)
aiohttpretty.register_uri('PUT', start_upload_url, headers={'LOCATION': 'http://waterbutler.io?upload_id={}'.format(upload_id)})
result, created = await provider.upload(file_stream, path)
assert aiohttpretty.has_call(method='PUT', uri=start_upload_url)
assert aiohttpretty.has_call(method='PUT', uri=finish_upload_url)
assert created is False
expected = GoogleDriveFileMetadata(item, path)
assert result == expected
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_upload_create_nested(self, provider, file_stream):
upload_id = '7'
item = fixtures.list_file['items'][0]
path = WaterButlerPath(
'/ed/sullivan/show.mp3',
_ids=[str(x) for x in range(3)]
)
start_upload_url = provider._build_upload_url('files', uploadType='resumable')
finish_upload_url = provider._build_upload_url('files', uploadType='resumable', upload_id=upload_id)
aiohttpretty.register_uri('POST', start_upload_url, headers={'LOCATION': 'http://waterbutler.io?upload_id={}'.format(upload_id)})
aiohttpretty.register_json_uri('PUT', finish_upload_url, body=item)
result, created = await provider.upload(file_stream, path)
assert aiohttpretty.has_call(method='POST', uri=start_upload_url)
assert aiohttpretty.has_call(method='PUT', uri=finish_upload_url)
assert created is True
expected = GoogleDriveFileMetadata(item, path)
assert result == expected
class TestDelete:
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_delete(self, provider):
item = fixtures.list_file['items'][0]
path = WaterButlerPath('/birdie.jpg', _ids=(None, item['id']))
delete_url = provider.build_url('files', item['id'])
del_url_body = dumps({'labels': {'trashed': 'true'}})
aiohttpretty.register_uri('PUT',
delete_url,
body=del_url_body,
status=200)
result = await provider.delete(path)
assert result is None
assert aiohttpretty.has_call(method='PUT', uri=delete_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_delete_folder(self, provider):
item = fixtures.folder_metadata
del_url = provider.build_url('files', item['id'])
del_url_body = dumps({'labels': {'trashed': 'true'}})
path = WaterButlerPath('/foobar/', _ids=('doesntmatter', item['id']))
aiohttpretty.register_uri('PUT',
del_url,
body=del_url_body,
status=200)
result = await provider.delete(path)
assert aiohttpretty.has_call(method='PUT', uri=del_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_delete_not_existing(self, provider):
with pytest.raises(exceptions.NotFoundError):
await provider.delete(WaterButlerPath('/foobar/'))
class TestDownload:
"""Google Docs (incl. Google Sheets, Google Slides, etc.) require extra API calls and use a
different branch for downloading/exporting files than non-GDoc files. For brevity's sake
our non-gdoc test files are called jpegs, though it could stand for any type of file.
We want to test all the permutations of:
* editability: editable vs. viewable files
* file type: Google doc vs. non-Google Doc (e.g. jpeg)
* revision parameter: none, valid, invalid, and magic
Non-editable (viewable) GDocs do not support revisions, so the good and bad revisions tests
are the same. Both should 404.
The notion of a GDOC_GOOD_REVISION being the same as a JPEG_BAD_REVISION and vice-versa is an
unnecessary flourish for testing purposes. I'm only including it to remind developers that
GDoc revisions look very different from non-GDoc revisions in production.
"""
GDOC_GOOD_REVISION = '1'
GDOC_BAD_REVISION = '0B74RCNS4TbRVTitFais4VzVmQlQ4S0docGlhelk5MXE3OFJnPQ'
JPEG_GOOD_REVISION = GDOC_BAD_REVISION
JPEG_BAD_REVISION = GDOC_GOOD_REVISION
MAGIC_REVISION = '"LUxk1DXE_0fd4yeJDIgpecr5uPA/MTQ5NTExOTgxMzgzOQ"{}'.format(
ds.DRIVE_IGNORE_VERSION)
GDOC_EXPORT_MIME_TYPE = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_gdoc_no_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
revisions_body = fixtures.sharing['editable_gdoc']['revisions']
revisions_url = provider.build_url('files', metadata_body['id'], 'revisions')
aiohttpretty.register_json_uri('GET', revisions_url, body=revisions_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['exportLinks'][self.GDOC_EXPORT_MIME_TYPE]
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path)
assert result.name == 'editable_gdoc.docx'
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=revisions_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_gdoc_good_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
revision_body = fixtures.sharing['editable_gdoc']['revision']
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.GDOC_GOOD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, body=revision_body)
file_content = b'we love you conrad'
download_file_url = revision_body['exportLinks'][self.GDOC_EXPORT_MIME_TYPE]
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path, revision=self.GDOC_GOOD_REVISION)
assert result.name == 'editable_gdoc.docx'
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=revision_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_gdoc_bad_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
no_such_revision_error = fixtures.make_no_such_revision_error(self.GDOC_BAD_REVISION)
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.GDOC_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=no_such_revision_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.download(path, revision=self.GDOC_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_gdoc_magic_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
revisions_body = fixtures.sharing['editable_gdoc']['revisions']
revisions_url = provider.build_url('files', metadata_body['id'], 'revisions')
aiohttpretty.register_json_uri('GET', revisions_url, body=revisions_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['exportLinks'][self.GDOC_EXPORT_MIME_TYPE]
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path, revision=self.MAGIC_REVISION)
assert result.name == 'editable_gdoc.docx'
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=revisions_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_viewable_gdoc_no_revision(self, provider):
metadata_body = fixtures.sharing['viewable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['exportLinks'][self.GDOC_EXPORT_MIME_TYPE]
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path)
assert result.name == 'viewable_gdoc.docx'
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_viewable_gdoc_bad_revision(self, provider):
metadata_body = fixtures.sharing['viewable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
unauthorized_error = fixtures.make_unauthorized_file_access_error(metadata_body['id'])
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.GDOC_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=unauthorized_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.download(path, revision=self.GDOC_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_viewable_gdoc_magic_revision(self, provider):
metadata_body = fixtures.sharing['viewable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['exportLinks'][self.GDOC_EXPORT_MIME_TYPE]
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path, revision=self.MAGIC_REVISION)
assert result.name == 'viewable_gdoc.docx'
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_jpeg_no_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['downloadUrl']
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path)
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_jpeg_good_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
revision_body = fixtures.sharing['editable_jpeg']['revision']
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.JPEG_GOOD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, body=revision_body)
file_content = b'we love you conrad'
download_file_url = revision_body['downloadUrl']
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path, revision=self.JPEG_GOOD_REVISION)
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=revision_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_jpeg_bad_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
no_such_revision_error = fixtures.make_no_such_revision_error(self.JPEG_BAD_REVISION)
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.JPEG_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=no_such_revision_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.download(path, revision=self.JPEG_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_editable_jpeg_magic_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['downloadUrl']
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path, revision=self.MAGIC_REVISION)
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_viewable_jpeg_no_revision(self, provider):
metadata_body = fixtures.sharing['viewable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['downloadUrl']
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path)
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_viewable_jpeg_bad_revision(self, provider):
metadata_body = fixtures.sharing['viewable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
unauthorized_error = fixtures.make_unauthorized_file_access_error(metadata_body['id'])
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.JPEG_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=unauthorized_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.download(path, revision=self.JPEG_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_download_viewable_jpeg_magic_revision(self, provider):
metadata_body = fixtures.sharing['viewable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
file_content = b'we love you conrad'
download_file_url = metadata_body['downloadUrl']
aiohttpretty.register_uri('GET', download_file_url, body=file_content, auto_length=True)
result = await provider.download(path, revision=self.MAGIC_REVISION)
content = await result.read()
assert content == file_content
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=download_file_url)
class TestMetadata:
"""Google Docs (incl. Google Sheets, Google Slides, etc.) require extra API calls and use a
different branch for fetching metadata about files than non-GDoc files. For brevity's sake
our non-gdoc test files are called jpegs, though it could stand for any type of file.
We want to test all the permutations of:
* editability: editable vs. viewable files
* file type: Google doc vs. non-Google Doc (e.g. jpeg)
* revision parameter: non, valid, invalid, and magic
Non-editable (viewable) GDocs do not support revisions, so the good and bad revisions tests
are the same. Both should 404.
The notion of a GDOC_GOOD_REVISION being the same as a JPEG_BAD_REVISION and vice-versa is an
unnecessary flourish for testing purposes. I'm only including it to remind developers that
GDoc revisions look very different from non-GDoc revisions in production.
"""
GDOC_GOOD_REVISION = '1'
GDOC_BAD_REVISION = '0B74RCNS4TbRVTitFais4VzVmQlQ4S0docGlhelk5MXE3OFJnPQ'
JPEG_GOOD_REVISION = GDOC_BAD_REVISION
JPEG_BAD_REVISION = GDOC_GOOD_REVISION
MAGIC_REVISION = '"LUxk1DXE_0fd4yeJDIgpecr5uPA/MTQ5NTExOTgxMzgzOQ"{}'.format(
ds.DRIVE_IGNORE_VERSION)
GDOC_EXPORT_MIME_TYPE = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
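# The permutation matrix described in the class docstring can be enumerated
# mechanically. Illustrative sketch only -- the suite deliberately spells each
# case out as a named test, and none of the names below are used elsewhere:
EDITABILITY = ('editable', 'viewable')
FILE_TYPE = ('gdoc', 'jpeg')
REVISION = ('none', 'good', 'bad', 'magic')
# 2 x 2 x 4 = 16 permutations; for viewable files the good- and bad-revision
# cases collapse into the same 404 behaviour, as noted in the docstring.
PERMUTATIONS = [(e, f, r) for e in EDITABILITY for f in FILE_TYPE for r in REVISION]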
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_file_root(self, provider):
path = WaterButlerPath('/birdie.jpg', _ids=(provider.folder['id'], fixtures.list_file['items'][0]['id']))
list_file_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', list_file_url, body=fixtures.list_file['items'][0])
result = await provider.metadata(path)
expected = GoogleDriveFileMetadata(fixtures.list_file['items'][0], path)
assert result == expected
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_file_root_not_found(self, provider):
path = WaterButlerPath('/birdie.jpg', _ids=(provider.folder['id'], None))
with pytest.raises(exceptions.MetadataError) as exc_info:
await provider.metadata(path)
assert exc_info.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_file_nested(self, provider):
path = GoogleDrivePath(
'/hugo/kim/pins',
_ids=[str(x) for x in range(4)]
)
item = fixtures.generate_list(3)['items'][0]
url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', url, body=item)
result = await provider.metadata(path)
expected = GoogleDriveFileMetadata(item, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_root_folder(self, provider):
path = await provider.validate_path('/')
query = provider._build_query(provider.folder['id'])
list_file_url = provider.build_url('files', q=query, alt='json', maxResults=1000)
aiohttpretty.register_json_uri('GET', list_file_url, body=fixtures.list_file)
result = await provider.metadata(path)
expected = GoogleDriveFileMetadata(
fixtures.list_file['items'][0],
path.child(fixtures.list_file['items'][0]['title'])
)
assert result == [expected]
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_folder_nested(self, provider):
path = GoogleDrivePath(
'/hugo/kim/pins/',
_ids=[str(x) for x in range(4)]
)
body = fixtures.generate_list(3)
item = body['items'][0]
query = provider._build_query(path.identifier)
url = provider.build_url('files', q=query, alt='json', maxResults=1000)
aiohttpretty.register_json_uri('GET', url, body=body)
result = await provider.metadata(path)
expected = GoogleDriveFileMetadata(item, path.child(item['title']))
assert result == [expected]
assert aiohttpretty.has_call(method='GET', uri=url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_folder_metadata(self, provider):
path = GoogleDrivePath(
'/hugo/kim/pins/',
_ids=[str(x) for x in range(4)]
)
body = fixtures.generate_list(3, **fixtures.folder_metadata)
item = body['items'][0]
query = provider._build_query(path.identifier)
url = provider.build_url('files', q=query, alt='json', maxResults=1000)
aiohttpretty.register_json_uri('GET', url, body=body)
result = await provider.metadata(path)
expected = GoogleDriveFolderMetadata(item, path.child(item['title'], folder=True))
assert result == [expected]
assert aiohttpretty.has_call(method='GET', uri=url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_gdoc_no_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
revisions_body = fixtures.sharing['editable_gdoc']['revisions']
revisions_url = provider.build_url('files', metadata_body['id'], 'revisions')
aiohttpretty.register_json_uri('GET', revisions_url, body=revisions_body)
result = await provider.metadata(path)
local_metadata = copy.deepcopy(metadata_body)
local_metadata['version'] = revisions_body['items'][-1]['id']
expected = GoogleDriveFileMetadata(local_metadata, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=revisions_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_gdoc_good_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
revision_body = fixtures.sharing['editable_gdoc']['revision']
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.GDOC_GOOD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, body=revision_body)
result = await provider.metadata(path, revision=self.GDOC_GOOD_REVISION)
expected = GoogleDriveFileRevisionMetadata(revision_body, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=revision_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_gdoc_bad_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
no_such_revision_error = fixtures.make_no_such_revision_error(self.GDOC_BAD_REVISION)
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.GDOC_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=no_such_revision_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.metadata(path, revision=self.GDOC_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_gdoc_magic_revision(self, provider):
metadata_body = fixtures.sharing['editable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/editable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
revisions_body = fixtures.sharing['editable_gdoc']['revisions']
revisions_url = provider.build_url('files', metadata_body['id'], 'revisions')
aiohttpretty.register_json_uri('GET', revisions_url, body=revisions_body)
result = await provider.metadata(path, revision=self.MAGIC_REVISION)
local_metadata = copy.deepcopy(metadata_body)
local_metadata['version'] = revisions_body['items'][-1]['id']
expected = GoogleDriveFileMetadata(local_metadata, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
assert aiohttpretty.has_call(method='GET', uri=revisions_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_viewable_gdoc_no_revision(self, provider):
metadata_body = fixtures.sharing['viewable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
result = await provider.metadata(path)
local_metadata = copy.deepcopy(metadata_body)
local_metadata['version'] = local_metadata['etag'] + ds.DRIVE_IGNORE_VERSION
expected = GoogleDriveFileMetadata(local_metadata, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_viewable_gdoc_bad_revision(self, provider):
metadata_body = fixtures.sharing['viewable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
unauthorized_error = fixtures.make_unauthorized_file_access_error(metadata_body['id'])
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.GDOC_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=unauthorized_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.metadata(path, revision=self.GDOC_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_viewable_gdoc_magic_revision(self, provider):
metadata_body = fixtures.sharing['viewable_gdoc']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_gdoc',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
result = await provider.metadata(path, revision=self.MAGIC_REVISION)
local_metadata = copy.deepcopy(metadata_body)
local_metadata['version'] = local_metadata['etag'] + ds.DRIVE_IGNORE_VERSION
expected = GoogleDriveFileMetadata(local_metadata, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_jpeg_no_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
result = await provider.metadata(path)
expected = GoogleDriveFileMetadata(metadata_body, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_jpeg_good_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
revision_body = fixtures.sharing['editable_jpeg']['revision']
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.JPEG_GOOD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, body=revision_body)
result = await provider.metadata(path, revision=self.JPEG_GOOD_REVISION)
expected = GoogleDriveFileRevisionMetadata(revision_body, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=revision_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_jpeg_bad_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
no_such_revision_error = fixtures.make_no_such_revision_error(self.JPEG_BAD_REVISION)
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.JPEG_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=no_such_revision_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.metadata(path, revision=self.JPEG_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_editable_jpeg_magic_revision(self, provider):
metadata_body = fixtures.sharing['editable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/editable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
result = await provider.metadata(path, revision=self.MAGIC_REVISION)
expected = GoogleDriveFileMetadata(metadata_body, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_viewable_jpeg_no_revision(self, provider):
metadata_body = fixtures.sharing['viewable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
result = await provider.metadata(path)
expected = GoogleDriveFileMetadata(metadata_body, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_viewable_jpeg_bad_revision(self, provider):
metadata_body = fixtures.sharing['viewable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
unauthorized_error = fixtures.make_unauthorized_file_access_error(metadata_body['id'])
revision_url = provider.build_url('files', metadata_body['id'],
'revisions', self.JPEG_BAD_REVISION)
aiohttpretty.register_json_uri('GET', revision_url, status=404, body=unauthorized_error)
with pytest.raises(exceptions.NotFoundError) as e:
await provider.metadata(path, revision=self.JPEG_BAD_REVISION)
assert e.value.code == 404
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_metadata_viewable_jpeg_magic_revision(self, provider):
metadata_body = fixtures.sharing['viewable_jpeg']['metadata']
path = GoogleDrivePath(
'/sharing/viewable_jpeg.jpeg',
_ids=['1', '2', metadata_body['id']]
)
metadata_query = provider._build_query(path.identifier)
metadata_url = provider.build_url('files', path.identifier)
aiohttpretty.register_json_uri('GET', metadata_url, body=metadata_body)
result = await provider.metadata(path, revision=self.MAGIC_REVISION)
expected = GoogleDriveFileMetadata(metadata_body, path)
assert result == expected
assert aiohttpretty.has_call(method='GET', uri=metadata_url)
class TestRevisions:
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_get_revisions(self, provider):
item = fixtures.list_file['items'][0]
path = WaterButlerPath('/birdie.jpg', _ids=('doesntmatter', item['id']))
revisions_url = provider.build_url('files', item['id'], 'revisions')
aiohttpretty.register_json_uri('GET', revisions_url, body=fixtures.revisions_list)
result = await provider.revisions(path)
expected = [
GoogleDriveRevision(each)
for each in fixtures.revisions_list['items']
]
assert result == expected
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_get_revisions_no_revisions(self, provider):
item = fixtures.list_file['items'][0]
metadata_url = provider.build_url('files', item['id'])
revisions_url = provider.build_url('files', item['id'], 'revisions')
path = WaterButlerPath('/birdie.jpg', _ids=('doesntmatter', item['id']))
aiohttpretty.register_json_uri('GET', metadata_url, body=item)
aiohttpretty.register_json_uri('GET', revisions_url, body=fixtures.revisions_list_empty)
result = await provider.revisions(path)
expected = [
GoogleDriveRevision({
'modifiedDate': item['modifiedDate'],
'id': item['etag'] + ds.DRIVE_IGNORE_VERSION,
})
]
assert result == expected
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_get_revisions_for_uneditable(self, provider):
file_fixtures = fixtures.sharing['viewable_gdoc']
item = file_fixtures['metadata']
metadata_url = provider.build_url('files', item['id'])
revisions_url = provider.build_url('files', item['id'], 'revisions')
path = WaterButlerPath('/birdie.jpg', _ids=('doesntmatter', item['id']))
aiohttpretty.register_json_uri('GET', metadata_url, body=item)
aiohttpretty.register_json_uri(
'GET', revisions_url, body=file_fixtures['revisions_error'], status=403)
result = await provider.revisions(path)
expected = [
GoogleDriveRevision({
'modifiedDate': item['modifiedDate'],
'id': item['etag'] + ds.DRIVE_IGNORE_VERSION,
})
]
assert result == expected
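# Both fallback tests above expect the same synthetic revision id: when Drive
# returns no usable revisions (empty list, or a 403 for uneditable files), the
# provider derives one from the file's etag plus an ignore-version suffix.
# Sketch of that derivation; the suffix value here is illustrative only (the
# real one is ds.DRIVE_IGNORE_VERSION):
_ILLUSTRATIVE_IGNORE_VERSION = '.0'

def _synthetic_revision_id(etag, suffix=_ILLUSTRATIVE_IGNORE_VERSION):
    # Matches the GoogleDriveRevision bodies built in the tests above:
    # 'id' is simply the etag with the suffix appended.
    return etag + suffix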
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_get_revisions_doesnt_exist(self, provider):
with pytest.raises(exceptions.NotFoundError):
await provider.revisions(WaterButlerPath('/birdie.jpg'))
class TestCreateFolder:
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_already_exists(self, provider):
path = WaterButlerPath('/hugo/', _ids=('doesnt', 'matter'))
with pytest.raises(exceptions.FolderNamingConflict) as e:
await provider.create_folder(path)
assert e.value.code == 409
assert e.value.message == 'Cannot create folder "{}" because a file or folder already exists at path "{}"'.format(path.name, str(path))
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_returns_metadata(self, provider):
path = WaterButlerPath('/osf%20test/', _ids=(provider.folder['id'], None))
aiohttpretty.register_json_uri('POST', provider.build_url('files'), body=fixtures.folder_metadata)
resp = await provider.create_folder(path)
assert resp.kind == 'folder'
assert resp.name == 'osf test'
assert resp.path == '/osf%20test/'
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_raises_non_404(self, provider):
path = WaterButlerPath('/hugo/kim/pins/', _ids=(provider.folder['id'], 'something', 'something', None))
url = provider.build_url('files')
aiohttpretty.register_json_uri('POST', url, status=418)
with pytest.raises(exceptions.CreateFolderError) as e:
await provider.create_folder(path)
assert e.value.code == 418
@pytest.mark.asyncio
@pytest.mark.aiohttpretty
async def test_must_be_folder(self, provider, monkeypatch):
with pytest.raises(exceptions.CreateFolderError) as e:
await provider.create_folder(WaterButlerPath('/carp.fish', _ids=('doesnt', 'matter')))
| 40.640358 | 143 | 0.674054 | 5,695 | 49,947 | 5.660579 | 0.059526 | 0.040202 | 0.029283 | 0.045228 | 0.907405 | 0.89236 | 0.87716 | 0.86621 | 0.862394 | 0.844589 | 0 | 0.006366 | 0.21685 | 49,947 | 1,228 | 144 | 40.673453 | 0.817773 | 0.033856 | 0 | 0.710441 | 0 | 0 | 0.103952 | 0.027271 | 0 | 0 | 0 | 0 | 0.116254 | 1 | 0.01507 | false | 0 | 0.018299 | 0.01507 | 0.068891 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# coding: utf-8
"""
BillForward REST API
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class AuditlogsApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def get_all_audit_entries(self, **kwargs):
"""
Returns a collection of all audit-log objects. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Get all\",\"response\":\"getAuditAll.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_all_audit_entries(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: The field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_all_audit_entries_with_http_info(**kwargs)
else:
(data) = self.get_all_audit_entries_with_http_info(**kwargs)
return data
def get_all_audit_entries_with_http_info(self, **kwargs):
"""
Returns a collection of all audit-log objects. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Get all\",\"response\":\"getAuditAll.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_all_audit_entries_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: The field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['organizations', 'offset', 'records', 'order_by', 'order', 'include_retired']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_all_audit_entries" % key
)
params[key] = val
del params['kwargs']
resource_path = '/audit-logs'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
if 'offset' in params:
query_params['offset'] = params['offset']
if 'records' in params:
query_params['records'] = params['records']
if 'order_by' in params:
query_params['order_by'] = params['order_by']
if 'order' in params:
query_params['order'] = params['order']
if 'include_retired' in params:
query_params['include_retired'] = params['include_retired']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuditEntryPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
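# Every generated endpoint method in this class repeats the same
# keyword-argument guard: unknown kwargs raise TypeError before any HTTP call
# is made. Standalone sketch of that pattern (helper name is illustrative and
# not part of the generated client):
def _check_kwargs(all_params, kwargs):
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
    return kwargs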
def get_audit_entries_by_created_date(self, lower_threshold, upper_threshold, **kwargs):
"""
Returns a collection of audit-log objects with created times within the period specified by the lower-threshold and upper-threshold parameters. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Retrieve by created time\",\"response\":\"getAuditByCreated.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entries_by_created_date(lower_threshold, upper_threshold, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str lower_threshold: The UTC DateTime specifying the start of the result period. (required)
:param str upper_threshold: The UTC DateTime specifying the end of the result period. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: The field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_audit_entries_by_created_date_with_http_info(lower_threshold, upper_threshold, **kwargs)
else:
(data) = self.get_audit_entries_by_created_date_with_http_info(lower_threshold, upper_threshold, **kwargs)
return data
def get_audit_entries_by_created_date_with_http_info(self, lower_threshold, upper_threshold, **kwargs):
"""
Returns a collection of audit-log objects with created times within the period specified by the lower-threshold and upper-threshold parameters. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Retrieve by created time\",\"response\":\"getAuditByCreated.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entries_by_created_date_with_http_info(lower_threshold, upper_threshold, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str lower_threshold: The UTC DateTime specifying the start of the result period. (required)
:param str upper_threshold: The UTC DateTime specifying the end of the result period. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: The field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lower_threshold', 'upper_threshold', 'organizations', 'offset', 'records', 'order_by', 'order', 'include_retired']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_entries_by_created_date" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lower_threshold' is set
if ('lower_threshold' not in params) or (params['lower_threshold'] is None):
raise ValueError("Missing the required parameter `lower_threshold` when calling `get_audit_entries_by_created_date`")
# verify the required parameter 'upper_threshold' is set
if ('upper_threshold' not in params) or (params['upper_threshold'] is None):
raise ValueError("Missing the required parameter `upper_threshold` when calling `get_audit_entries_by_created_date`")
resource_path = '/audit-logs/created/{lower-threshold}/{upper-threshold}'.replace('{format}', 'json')
path_params = {}
if 'lower_threshold' in params:
path_params['lower-threshold'] = params['lower_threshold']
if 'upper_threshold' in params:
path_params['upper-threshold'] = params['upper_threshold']
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
if 'offset' in params:
query_params['offset'] = params['offset']
if 'records' in params:
query_params['records'] = params['records']
if 'order_by' in params:
query_params['order_by'] = params['order_by']
if 'order' in params:
query_params['order'] = params['order']
if 'include_retired' in params:
query_params['include_retired'] = params['include_retired']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuditEntryPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def get_audit_entry_by_entity_id(self, entity_id, **kwargs):
"""
Returns a collection of audit-log entries, specified by the entity-ID parameter. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Retrieve by entity-ID\",\"response\":\"getAuditByEntityID.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entry_by_entity_id(entity_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str entity_id: The string ID of the entity whose changes are documented by the audit log. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: Specify a field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_audit_entry_by_entity_id_with_http_info(entity_id, **kwargs)
else:
data = self.get_audit_entry_by_entity_id_with_http_info(entity_id, **kwargs)
return data
def get_audit_entry_by_entity_id_with_http_info(self, entity_id, **kwargs):
"""
Returns a collection of audit-log entries, specified by the entity-ID parameter. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Retrieve by entity-ID\",\"response\":\"getAuditByEntityID.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entry_by_entity_id_with_http_info(entity_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str entity_id: The string ID of the entity whose changes are documented by the audit log. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: Specify a field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['entity_id', 'organizations', 'offset', 'records', 'order_by', 'order', 'include_retired']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_entry_by_entity_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'entity_id' is set
if ('entity_id' not in params) or (params['entity_id'] is None):
raise ValueError("Missing the required parameter `entity_id` when calling `get_audit_entry_by_entity_id`")
resource_path = '/audit-logs/entity/{entity-ID}'.replace('{format}', 'json')
path_params = {}
if 'entity_id' in params:
path_params['entity-ID'] = params['entity_id']
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
if 'offset' in params:
query_params['offset'] = params['offset']
if 'records' in params:
query_params['records'] = params['records']
if 'order_by' in params:
query_params['order_by'] = params['order_by']
if 'order' in params:
query_params['order'] = params['order']
if 'include_retired' in params:
query_params['include_retired'] = params['include_retired']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['text/plain'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuditEntryPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def get_audit_entry_by_entity_type(self, entity_type, **kwargs):
"""
Returns a collection of audit-log entries, specified by the entity-type parameter. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Retrieve by entity type\",\"response\":\"getAuditByEntityType.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entry_by_entity_type(entity_type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str entity_type: The type of the entity whose changes are documented by the audit log. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: Specify a field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_audit_entry_by_entity_type_with_http_info(entity_type, **kwargs)
else:
data = self.get_audit_entry_by_entity_type_with_http_info(entity_type, **kwargs)
return data
def get_audit_entry_by_entity_type_with_http_info(self, entity_type, **kwargs):
"""
Returns a collection of audit-log entries, specified by the entity-type parameter. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Retrieve by entity type\",\"response\":\"getAuditByEntityType.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entry_by_entity_type_with_http_info(entity_type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str entity_type: The type of the entity whose changes are documented by the audit log. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first audit-log entry to return.
:param int records: The maximum number of audit-log entries to return.
:param str order_by: Specify a field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:param bool include_retired: Whether retired products should be returned.
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['entity_type', 'organizations', 'offset', 'records', 'order_by', 'order', 'include_retired']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_entry_by_entity_type" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'entity_type' is set
if ('entity_type' not in params) or (params['entity_type'] is None):
raise ValueError("Missing the required parameter `entity_type` when calling `get_audit_entry_by_entity_type`")
resource_path = '/audit-logs/entity-type/{entity-type}'.replace('{format}', 'json')
path_params = {}
if 'entity_type' in params:
path_params['entity-type'] = params['entity_type']
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
if 'offset' in params:
query_params['offset'] = params['offset']
if 'records' in params:
query_params['records'] = params['records']
if 'order_by' in params:
query_params['order_by'] = params['order_by']
if 'order' in params:
query_params['order'] = params['order']
if 'include_retired' in params:
query_params['include_retired'] = params['include_retired']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuditEntryPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def get_audit_entry_by_id(self, audit_id, **kwargs):
"""
Returns a single audit-log entry, specified by the audit-ID parameter.
{\"nickname\":\"Retrieve by id\",\"response\":\"getAuditByID.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entry_by_id(audit_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str audit_id: The string ID of the audit-log entry. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls. Multiple organization-IDs may be specified by repeated use of the query parameter. Example: ...&organizations=org1&organizations=org2
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_audit_entry_by_id_with_http_info(audit_id, **kwargs)
else:
data = self.get_audit_entry_by_id_with_http_info(audit_id, **kwargs)
return data
def get_audit_entry_by_id_with_http_info(self, audit_id, **kwargs):
"""
Returns a single audit-log entry, specified by the audit-ID parameter.
{\"nickname\":\"Retrieve by id\",\"response\":\"getAuditByID.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_audit_entry_by_id_with_http_info(audit_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str audit_id: The string ID of the audit-log entry. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls. Multiple organization-IDs may be specified by repeated use of the query parameter. Example: ...&organizations=org1&organizations=org2
:return: AuditEntryPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['audit_id', 'organizations']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_entry_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'audit_id' is set
if ('audit_id' not in params) or (params['audit_id'] is None):
raise ValueError("Missing the required parameter `audit_id` when calling `get_audit_entry_by_id`")
resource_path = '/audit-logs/{audit-ID}'.replace('{format}', 'json')
path_params = {}
if 'audit_id' in params:
path_params['audit-ID'] = params['audit_id']
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['text/plain'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuditEntryPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
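Each generated `*_with_http_info` method above begins with the same kwargs-validation pattern. A minimal standalone sketch of that pattern (hypothetical helper name, and Python 3 `dict.items` in place of `six.iteritems`; this is not part of the generated client):

```python
def validate_kwargs(method_name, allowed, kwargs):
    """Reject unexpected keyword arguments, mirroring the generated clients."""
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
    return dict(kwargs)

# Accepted: every key is in the allow-list.
params = validate_kwargs(
    "get_audit_entry_by_id",
    ["audit_id", "organizations", "callback", "_return_http_data_only"],
    {"audit_id": "ABC123"},
)
```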
# Cap_2/ex2.2.py (gguilherme42/Livro-de-Python, MIT)
r = 10 % 3 * 10 ** 2 + 1 - 10 * 4 / 2
print(f'10 % 3 * 10 ** 2 + 1 - 10 * 4 / 2 = {r}')
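A quick check of the precedence at work in the expression above (this parenthesized form is an addition, not part of the book exercise): `**` binds tightest, then `%`, `*` and `/` left to right, then `+` and `-`:

```python
# Fully parenthesized equivalent of the expression in ex2.2.py:
r_explicit = ((10 % 3) * (10 ** 2)) + 1 - ((10 * 4) / 2)
print(r_explicit)  # 81.0
```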
# tests/test_diffeq/test_odefiltsmooth/test_odefiltsmooth.py (nathanaelbosch/probnum, MIT)
] | null | null | null | """We test on two test-problems:
* logistic ODE (because it has a closed form sol.)
-> make sure error converges to zero (even with rate q?)
-> Check if iterates match the closed-form solutions in
Schober et al.
* Lotka-Volterra (because it provides meaningful uncertainty estimates,
if e.g. EKF-based ODE filter is implemented correctly)
-> error estimates from adaptive step sizes are roughly satisfied
(for the ibm1-kf combo, the other ones do not apply.)
"""
import unittest
import numpy as np
from probnum.diffeq import ode
from probnum.diffeq.odefiltsmooth import probsolve_ivp
from probnum.random_variables import Constant
from tests.testing import NumpyAssertions
class TestConvergenceOnLogisticODE(unittest.TestCase):
"""We test whether the convergence rates roughly hold true."""
def setUp(self):
"""Setup odesolver and solve a scalar ode."""
initrv = Constant(0.1 * np.ones(1))
self.ivp = ode.logistic([0.0, 1.5], initrv)
self.stps = [0.2, 0.1]
def test_error_ibm1(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, which_prior="ibm1")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, which_prior="ibm1")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 2
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ibm2(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, which_prior="ibm2")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, which_prior="ibm2")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 3
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ibm3(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, which_prior="ibm3")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, which_prior="ibm3")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 4
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ioup1(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, which_prior="ioup1")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, which_prior="ioup1")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 2
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ioup2(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, which_prior="ioup2")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, which_prior="ioup2")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 3
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ioup3(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, which_prior="ioup3")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, which_prior="ioup3")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 4
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
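The six test methods above repeat the same decay computation. A standalone helper (a sketch with a hypothetical name, not part of probnum) makes the expected-order check explicit:

```python
def decay_is_consistent(err1, err2, step1, step2, order, rtol=1.0):
    """True if shrinking the step from step1 to step2 reduced the error
    roughly like step**order, i.e. err2 is close to err1 * (step2/step1)**order."""
    expected = (step2 / step1) ** order
    return abs(expected * err1 - err2) / abs(err2) < rtol
```

For an IBM(q) prior the tests use `order = q + 1`, e.g. `decay_is_consistent(err1, err2, 0.2, 0.1, 2)` for `ibm1`.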
class TestFirstIterations(unittest.TestCase, NumpyAssertions):
"""Test whether first few means and covariances coincide with Prop.
1 in Schober et al., 2019.
"""
def setUp(self):
initrv = Constant(0.1 * np.ones(1))
self.ivp = ode.logistic([0.0, 1.5], initrv)
self.step = 0.5
sol = probsolve_ivp(self.ivp, step=self.step, diffconst=1.0, which_prior="ibm1")
state_rvs = sol.kalman_posterior.filtering_posterior.state_rvs
self.ms, self.cs = state_rvs.mean, state_rvs.cov
def test_t0(self):
exp_mean = np.array(
[self.ivp.initrv.mean, self.ivp.rhs(0, self.ivp.initrv.mean)]
)
self.assertAllClose(self.ms[0], exp_mean[:, 0], rtol=1e-14)
self.assertAllClose(self.cs[0], np.zeros((2, 2)), rtol=1e-14)
def test_t1(self):
"""The kernels do not coincide exactly because of the uncertainty calibration
that takes place in GaussianIVPFilter.solve() and not in Prop.
1 of Schober et al., 2019.
"""
y0 = self.ivp.initrv.mean
z0 = self.ivp.rhs(0, y0)
z1 = self.ivp.rhs(0, y0 + self.step * z0)
exp_mean = np.array([y0 + 0.5 * self.step * (z0 + z1), z1])
self.assertAllClose(self.ms[1], exp_mean[:, 0], rtol=1e-14)
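The expected mean in `test_t1` is one explicit trapezoidal (Heun) step. As a self-contained sketch with a generic right-hand side (hypothetical function name; `test_t1` evaluates both slopes at t=0 since the logistic ODE is autonomous, which coincides with this for time-independent f):

```python
def trapezoidal_step(f, t0, y0, h):
    """One explicit trapezoidal step: average the slope at y0 with the slope
    at the Euler prediction y0 + h*f(t0, y0)."""
    z0 = f(t0, y0)
    z1 = f(t0 + h, y0 + h * z0)
    return y0 + 0.5 * h * (z0 + z1)
```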
class TestAdaptivityOnLotkaVolterra(unittest.TestCase):
"""Only test on "ekf0" with IBM(1) prior, since every other combination seems to
dislike the adaptive scheme based on the whitened residual as an error estimate."""
def setUp(self):
"""Setup odesolver and solve a scalar ode."""
initrv = Constant(20 * np.ones(2))
self.ivp = ode.lotkavolterra([0.0, 0.5], initrv)
self.tol = 1e-2
def test_kf_ibm1_stdev(self):
"""Standard deviation at end point roughly equal to tolerance."""
sol = probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ibm1", method="eks0"
)
self.assertLess(np.sqrt(sol.y.cov[-1, 0, 0]), 10 * self.tol)
self.assertLess(0.1 * self.tol, np.sqrt(sol.y.cov[-1, 0, 0]))
def test_kf_ibm1(self):
"""Tests whether resulting steps are not evenly distributed."""
sol = probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ibm1", method="eks0"
)
steps = np.diff(sol.t)
self.assertLess(np.amin(steps) / np.amax(steps), 0.8)
class TestLotkaVolterraOtherPriors(unittest.TestCase):
"""We only test whether all the prior-filter-adaptivity combinations finish."""
def setUp(self):
"""Setup odesolver and Lotka-Volterra IVP."""
initrv = Constant(20 * np.ones(2))
self.ivp = ode.lotkavolterra([0.0, 0.5], initrv)
self.tol = 1e-1
self.step = 0.1
def test_filter_ivp_ioup1_kf(self):
probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ioup1", method="ekf0"
)
def test_filter_ivp_ioup2_ekf(self):
probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ioup2", method="ekf1"
)
def test_filter_ivp_ioup3_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
evlvar=0.01,
which_prior="ioup3",
method="ukf",
)
def test_filter_ivp_h_ioup1_ekf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="ioup1", method="ekf1")
def test_filter_ivp_h_ioup2_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp, step=self.step, evlvar=0.01, which_prior="ioup2", method="ukf"
)
def test_filter_ivp_h_ioup3_kf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="ioup3", method="ekf0")
def test_filter_ivp_mat32_kf(self):
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
which_prior="matern32",
method="ekf0",
)
def test_filter_ivp_mat52_ekf(self):
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
which_prior="matern52",
method="ekf1",
)
def test_filter_ivp_mat72_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
evlvar=0.01,
which_prior="matern72",
method="ukf",
)
def test_filter_ivp_h_mat32_ekf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="matern32", method="ekf1")
def test_filter_ivp_h_mat52_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp, step=self.step, evlvar=0.01, which_prior="matern52", method="ukf"
)
def test_filter_ivp_h_mat72_kf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="matern72", method="ekf0")
class TestConvergenceOnLogisticODESmoother(unittest.TestCase):
"""We test whether the convergence rates roughly hold true."""
def setUp(self):
"""Setup odesolver and solve a scalar ode."""
initrv = Constant(0.1 * np.ones(1))
self.ivp = ode.logistic([0.0, 1.5], initrv)
self.stps = [0.2, 0.1]
def test_error_ibm1(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, method="eks0", which_prior="ibm1")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, method="eks0", which_prior="ibm1")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 2
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ibm2(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, method="eks0", which_prior="ibm2")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, method="eks0", which_prior="ibm2")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 3
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ibm3(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, method="eks0", which_prior="ibm3")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, method="eks0", which_prior="ibm3")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 4
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ioup1(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, method="eks0", which_prior="ioup1")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, method="eks0", which_prior="ioup1")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 2
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ioup2(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, method="eks0", which_prior="ioup2")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, method="eks0", which_prior="ioup2")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 3
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
def test_error_ioup3(self):
"""Expect error rate q+1."""
stp1, stp2 = self.stps
sol = probsolve_ivp(self.ivp, step=stp1, method="eks0", which_prior="ioup3")
means1 = sol.y.mean
sols1 = np.array([self.ivp.solution(t) for t in sol.t])
err1 = np.amax(np.abs(sols1 - means1))
sol = probsolve_ivp(self.ivp, step=stp2, method="eks0", which_prior="ioup3")
means2 = sol.y.mean
sols2 = np.array([self.ivp.solution(t) for t in sol.t])
err2 = np.amax(np.abs(sols2 - means2))
exp_decay = (stp2 / stp1) ** 4
diff = np.abs(exp_decay * err1 - err2) / np.abs(err2)
self.assertLess(diff, 1.0)
class TestAdaptivityOnLotkaVolterraSmoother(unittest.TestCase):
"""Only test on "ekf0" with IBM(1) prior, since every other combination seems to
dislike the adaptive scheme based on the whitened residual as an error estimate."""
def setUp(self):
"""Setup odesolver and solve a scalar ode."""
initrv = Constant(20 * np.ones(2))
self.ivp = ode.lotkavolterra([0.0, 0.5], initrv)
self.tol = 1e-2
def test_kf_ibm1_stdev(self):
"""Standard deviation at end point roughly equal to tolerance."""
sol = probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ibm1", method="eks0"
)
self.assertLess(np.sqrt(sol.y.cov[-1, 0, 0]), 10 * self.tol)
self.assertLess(0.1 * self.tol, np.sqrt(sol.y.cov[-1, 0, 0]))
def test_kf_ibm1(self):
"""Tests whether resulting steps are not evenly distributed."""
sol = probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ibm1", method="eks0"
)
steps = np.diff(sol.t)
self.assertLess(np.amin(steps) / np.amax(steps), 0.8)
class TestLotkaVolterraOtherPriorsSmoother(unittest.TestCase):
"""We only test whether all the prior-filter-adaptivity combinations finish."""
def setUp(self):
"""Setup odesolver and Lotka-Volterra IVP."""
initdist = Constant(20 * np.ones(2))
self.ivp = ode.lotkavolterra([0.0, 0.5], initdist)
self.tol = 1e-1
self.step = 0.1
def test_filter_ivp_ioup1_kf(self):
probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ioup1", method="eks0"
)
def test_filter_ivp_ioup2_ekf(self):
probsolve_ivp(
self.ivp, atol=self.tol, rtol=self.tol, which_prior="ioup2", method="eks1"
)
def test_filter_ivp_ioup3_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
evlvar=0.01,
which_prior="ioup3",
method="uks",
)
def test_filter_ivp_h_ioup1_ekf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="ioup1", method="eks1")
def test_filter_ivp_h_ioup2_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp, step=self.step, evlvar=0.01, which_prior="ioup2", method="uks"
)
def test_filter_ivp_h_ioup3_kf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="ioup3", method="eks0")
def test_filter_ivp_mat32_kf(self):
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
which_prior="matern32",
method="eks0",
)
def test_filter_ivp_mat52_ekf(self):
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
which_prior="matern52",
method="eks1",
)
def test_filter_ivp_mat72_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp,
atol=self.tol,
rtol=self.tol,
evlvar=0.01,
which_prior="matern72",
method="uks",
)
def test_filter_ivp_h_mat32_ekf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="matern32", method="eks1")
def test_filter_ivp_h_mat52_ukf(self):
"""UKF requires some evaluation-variance to have a positive definite innovation
matrix, apparently."""
probsolve_ivp(
self.ivp, step=self.step, evlvar=0.01, which_prior="matern52", method="uks"
)
def test_filter_ivp_h_mat72_kf(self):
probsolve_ivp(self.ivp, step=self.step, which_prior="matern72", method="eks0")
class TestPreconditioning(unittest.TestCase):
"""Solver with high order and small stepsize should work up to a point where
step**order is below machine precision."""
def setUp(self):
initdist = Constant(20 * np.ones(2))
self.ivp = ode.lotkavolterra([0.0, 1e-4], initdist)
self.step = 1e-5
self.prior = "ibm3"
def test_small_step_feasible(self):
"""With the 'old' preconditioner, this is impossible because step**(2*order + 1)
is too small.
With the 'new' preconditioner, the smallest value that appears
in the solver code is step**order.
"""
probsolve_ivp(self.ivp, step=self.step, which_prior=self.prior, method="eks0")
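The docstring's claim can be checked with plain floating-point arithmetic; a sketch, assuming order q = 3 for the "ibm3" prior:

```python
import sys

step, q = 1e-5, 3
old_smallest = step ** (2 * q + 1)  # ~1e-35: vanishes next to O(1) entries
new_smallest = step ** q            # ~1e-15: still above double epsilon

# Adding the 'old' quantity to 1.0 is a no-op at double precision,
# which is what made the small-step solve infeasible before.
assert 1.0 + old_smallest == 1.0
assert 1.0 + new_smallest != 1.0
print(old_smallest, new_smallest, sys.float_info.epsilon)
```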
# File: ExpiryService/providers/__init__.py (repo: bierschi/ExpiryService, license: MIT)
from ExpiryService.providers.provider import Provider
from ExpiryService.providers.aldi_talk import AldiTalk
from ExpiryService.providers.netzclub import Netzclub
# File: openselfsup/version.py (repo: sergeykochetkov/OpenSelfSup, license: Apache-2.0)
# GENERATED VERSION FILE
# TIME: Thu Aug 19 17:54:23 2021
__version__ = '0.3.0+1db69ec'
short_version = '0.3.0'
# File: test/backend/test_simulation.py (repo: john-veillette/PsychRNN, license: MIT)
import pytest
import tensorflow as tf
from psychrnn.backend.models.basic import Basic
from psychrnn.backend.models.lstm import LSTM
from psychrnn.tasks.match_to_category import MatchToCategory
from psychrnn.backend.simulation import BasicSimulator, LSTMSimulator
import numpy as np
import random
# clears tf graph after each test.
@pytest.fixture()
def tf_graph():
yield
tf.compat.v1.reset_default_graph()
def reset_seeds(seed):
tf.compat.v1.reset_default_graph()
tf.compat.v1.set_random_seed(seed)
random.seed(seed)
np.random.seed(seed)
def test_load_from_file(tf_graph, tmpdir, capfd):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = Basic(params)
rnn_model.save(str(tmpdir.dirpath("save_weights.npz")))
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(params = params, weights_path=str(tmpdir.dirpath("save_weights.npz")))
tmpdir.dirpath("save_weights.npz").remove()
tf_output, tf_state = rnn_model.test(x)
sim_output, sim_state = sim_model.run_trials(x)
assert(tf_output.shape == sim_output.shape)
assert(tf_state.shape == sim_state.shape)
assert(np.allclose(tf_state, sim_state, atol=1e-06))
assert(np.allclose(tf_output, sim_output, atol=1e-06))
rnn_model.destruct()
def test_load_from_alpha_params(tf_graph):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = Basic(params)
weights = rnn_model.get_weights()
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(params = {'alpha': params['alpha']}, weights=weights)
tf_output, tf_state = rnn_model.test(x)
sim_output, sim_state = sim_model.run_trials(x)
assert(tf_output.shape == sim_output.shape)
assert(tf_state.shape == sim_state.shape)
assert(np.allclose(tf_state, sim_state, atol=1e-06))
assert(np.allclose(tf_output, sim_output, atol=1e-06))
rnn_model.destruct()
def test_load_from_dt_tau_params(tf_graph):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = Basic(params)
weights = rnn_model.get_weights()
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(params = {'dt': params['dt'], 'tau': params['tau']}, weights=weights)
tf_output, tf_state = rnn_model.test(x)
sim_output, sim_state = sim_model.run_trials(x)
assert(tf_output.shape == sim_output.shape)
assert(tf_state.shape == sim_state.shape)
assert(np.allclose(tf_state, sim_state, atol=1e-06))
assert(np.allclose(tf_output, sim_output, atol=1e-06))
rnn_model.destruct()
def test_load_from_params(tf_graph):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = Basic(params)
weights = rnn_model.get_weights()
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(params = params, weights=weights)
tf_output, tf_state = rnn_model.test(x)
sim_output, sim_state = sim_model.run_trials(x)
assert(tf_output.shape == sim_output.shape)
assert(tf_state.shape == sim_state.shape)
assert(np.allclose(tf_state, sim_state, atol=1e-06))
assert(np.allclose(tf_output, sim_output, atol=1e-06))
rnn_model.destruct()
def test_load_from_rnn_model(tf_graph):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = Basic(params)
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(rnn_model = rnn_model)
tf_output, tf_state = rnn_model.test(x)
sim_output, sim_state = sim_model.run_trials(x)
assert(tf_output.shape == sim_output.shape)
assert(tf_state.shape == sim_state.shape)
assert(np.allclose(tf_state, sim_state, atol=1e-06))
assert(np.allclose(tf_output, sim_output, atol=1e-06))
rnn_model.destruct()
def test_transfer_function(tf_graph):
def my_relu(X):
return np.maximum(X, 0)
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = Basic(params)
x,_,_,_ = mtc.get_trial_batch()
with pytest.raises(UserWarning) as excinfo:
sim_model = BasicSimulator(rnn_model = rnn_model, transfer_function=my_relu)
assert 'my_relu' in str(excinfo.value)
rnn_model.destruct()
def test_rec_noise_rnn_model(tf_graph):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
params['rec_noise'] = .1
rnn_model = Basic(params)
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(rnn_model = rnn_model)
assert(sim_model.rec_noise == rnn_model.rec_noise)
assert(sim_model.rec_noise == .1)
rnn_model.destruct()
def test_rec_noise_params(tf_graph):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
params['rec_noise'] = .1
rnn_model = Basic(params)
weights = rnn_model.get_weights()
x,_,_,_ = mtc.get_trial_batch()
sim_model = BasicSimulator(params = params, weights = weights)
assert(sim_model.rec_noise == params['rec_noise'])
rnn_model.destruct()
def test_warnings(tf_graph, tmpdir, capfd):
reset_seeds(19846)
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
params['rec_noise'] = .1
rnn_model = Basic(params)
weights = rnn_model.get_weights()
rnn_model.save(str(tmpdir.dirpath("save_weights.npz")))
x,_,_,_ = mtc.get_trial_batch()
with pytest.raises(UserWarning) as excinfo:
sim_model = BasicSimulator(params = params, rnn_model = rnn_model)
assert 'rnn_model takes precedence' in str(excinfo.value)
with pytest.raises(UserWarning) as excinfo:
sim_model = BasicSimulator(weights = weights, rnn_model = rnn_model)
assert 'Weights from rnn_model and weights_path will be ignored' in str(excinfo.value)
with pytest.raises(UserWarning) as excinfo:
sim_model = BasicSimulator(weights_path = str(tmpdir.dirpath("save_weights.npz")), weights = weights, params=params)
assert 'Weights from rnn_model and weights_path will be ignored' in str(excinfo.value)
with pytest.raises(UserWarning) as excinfo:
sim_model = BasicSimulator(weights_path = str(tmpdir.dirpath("save_weights.npz")), rnn_model = rnn_model)
assert 'Weights from rnn_model will be ignored.' in str(excinfo.value)
with pytest.raises(UserWarning) as excinfo:
sim_model = BasicSimulator(params=params)
assert 'Either weights, rnn_model, or weights_path must be passed in.' in str(excinfo.value)
rnn_model.destruct()
tmpdir.dirpath("save_weights.npz").remove()
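A condensed, hypothetical sketch of the argument validation these warnings pin down; the function name is illustrative, not the psychrnn implementation. The precedence implied by the messages is explicit weights > weights_path > rnn_model, and the library evidently raises `UserWarning` rather than merely emitting it:

```python
def resolve_weight_source(weights=None, weights_path=None, rnn_model=None):
    """Pick a weight source, mirroring the precedence the tests assert."""
    if weights is not None:
        if weights_path is not None or rnn_model is not None:
            raise UserWarning("Weights from rnn_model and weights_path will be ignored")
        return "weights"
    if weights_path is not None:
        if rnn_model is not None:
            raise UserWarning("Weights from rnn_model will be ignored.")
        return "weights_path"
    if rnn_model is not None:
        return "rnn_model"
    raise UserWarning("Either weights, rnn_model, or weights_path must be passed in.")

assert resolve_weight_source(weights={"W_in": None}) == "weights"
```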
def test_lstm_simulator_load_from_rnn_model(tf_graph):
reset_seeds(19846)
tf.compat.v1.keras.backend.set_floatx('float64')
mtc = MatchToCategory(dt = 10, tau = 100, T= 2000, N_batch = 50)
params = mtc.get_task_params()
params['name'] = 'test'
params['N_rec'] = 49
rnn_model = LSTM(params)
x,_,_,_ = mtc.get_trial_batch()
sim_model = LSTMSimulator(rnn_model = rnn_model)
tf_output, tf_state = rnn_model.test(x)
sim_output, sim_state = sim_model.run_trials(x)
assert(tf_output.shape == sim_output.shape)
assert(tf_state.shape == sim_state.shape)
assert(np.allclose(tf_state, sim_state, atol=1e-01,rtol=1e-01))
assert(np.allclose(tf_output, sim_output, atol=1e-01,rtol=1e-01))
rnn_model.destruct()
rnn_model = LSTM(params)
tf_output, tf_state = rnn_model.test(x)
assert(not np.allclose(tf_state, sim_state, atol=1e-01,rtol=1e-01))
assert(not np.allclose(tf_output, sim_output, atol=1e-01,rtol=1e-01))
# File: function_20373605.py (repo: Anjingkun/study-3, license: MIT)
print('My student_id: 20373605')
# File: pyxorfilter/pyxorfilter.py (repo: gluk47/pyxorfilter, license: Apache-2.0)
from ._xorfilter import lib, ffi
class Xor8:
def __init__(self, size):
self.__filter = ffi.new("xor8_t *")
status = lib.xor8_allocate(size, self.__filter)
if not status:
print("Unable to allocate memory for 8 bit filter")
def __repr__(self):
return "Xor8 object with size(in bytes):{}".format(self.size_in_bytes())
def __del__(self):
lib.xor8_free(self.__filter)
def populate(self, data):
return lib.xor8_buffered_populate(data, len(data), self.__filter)
def contains(self, item):
return lib.xor8_contain(item, self.__filter)
def size_in_bytes(self):
return lib.xor8_size_in_bytes(self.__filter)
class Xor16:
def __init__(self, size):
self.__filter = ffi.new("xor16_t *")
status = lib.xor16_allocate(size, self.__filter)
if not status:
print("Unable to allocate memory for 16 bit filter")
def __repr__(self):
return "Xor16 object with size(in bytes):{}".format(self.size_in_bytes())
def __del__(self):
lib.xor16_free(self.__filter)
def populate(self, data):
return lib.xor16_buffered_populate(data, len(data), self.__filter)
def contains(self, item):
return lib.xor16_contain(item, self.__filter)
def size_in_bytes(self):
return lib.xor16_size_in_bytes(self.__filter)
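Both classes share an allocate-in-`__init__` / free-in-`__del__` lifecycle around the cffi handle. A dependency-free sketch of that pattern; `FakeLib` and `Xor8Wrapper` are hypothetical stand-ins for illustration, not part of pyxorfilter:

```python
calls = []

class FakeLib:
    """Stand-in for the compiled _xorfilter cffi module."""
    def xor8_allocate(self, size, flt):
        calls.append("allocate")
        return True
    def xor8_free(self, flt):
        calls.append("free")

class Xor8Wrapper:
    """Same allocate/free lifecycle as the Xor8 class above."""
    def __init__(self, lib, size):
        self._lib = lib
        self._filter = object()  # placeholder for ffi.new("xor8_t *")
        self._lib.xor8_allocate(size, self._filter)
    def __del__(self):
        self._lib.xor8_free(self._filter)

f = Xor8Wrapper(FakeLib(), 100)
del f  # CPython refcounting frees immediately, invoking xor8_free
print(calls)
```

Relying on `__del__` keeps the API simple, but the free runs whenever the object is collected; a context-manager interface would make the release point explicit.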
# File: loldib/getratings/models/NA/na_nunu/na_nunu_mid.py (repo: koliupy/loldib, license: Apache-2.0)
from getratings.models.ratings import Ratings
class NA_Nunu_Mid_Aatrox(Ratings):
pass
class NA_Nunu_Mid_Ahri(Ratings):
pass
class NA_Nunu_Mid_Akali(Ratings):
pass
class NA_Nunu_Mid_Alistar(Ratings):
pass
class NA_Nunu_Mid_Amumu(Ratings):
pass
class NA_Nunu_Mid_Anivia(Ratings):
pass
class NA_Nunu_Mid_Annie(Ratings):
pass
class NA_Nunu_Mid_Ashe(Ratings):
pass
class NA_Nunu_Mid_AurelionSol(Ratings):
pass
class NA_Nunu_Mid_Azir(Ratings):
pass
class NA_Nunu_Mid_Bard(Ratings):
pass
class NA_Nunu_Mid_Blitzcrank(Ratings):
pass
class NA_Nunu_Mid_Brand(Ratings):
pass
class NA_Nunu_Mid_Braum(Ratings):
pass
class NA_Nunu_Mid_Caitlyn(Ratings):
pass
class NA_Nunu_Mid_Camille(Ratings):
pass
class NA_Nunu_Mid_Cassiopeia(Ratings):
pass
class NA_Nunu_Mid_Chogath(Ratings):
pass
class NA_Nunu_Mid_Corki(Ratings):
pass
class NA_Nunu_Mid_Darius(Ratings):
pass
class NA_Nunu_Mid_Diana(Ratings):
pass
class NA_Nunu_Mid_Draven(Ratings):
pass
class NA_Nunu_Mid_DrMundo(Ratings):
pass
class NA_Nunu_Mid_Ekko(Ratings):
pass
class NA_Nunu_Mid_Elise(Ratings):
pass
class NA_Nunu_Mid_Evelynn(Ratings):
pass
class NA_Nunu_Mid_Ezreal(Ratings):
pass
class NA_Nunu_Mid_Fiddlesticks(Ratings):
pass
class NA_Nunu_Mid_Fiora(Ratings):
pass
class NA_Nunu_Mid_Fizz(Ratings):
pass
class NA_Nunu_Mid_Galio(Ratings):
pass
class NA_Nunu_Mid_Gangplank(Ratings):
pass
class NA_Nunu_Mid_Garen(Ratings):
pass
class NA_Nunu_Mid_Gnar(Ratings):
pass
class NA_Nunu_Mid_Gragas(Ratings):
pass
class NA_Nunu_Mid_Graves(Ratings):
pass
class NA_Nunu_Mid_Hecarim(Ratings):
pass
class NA_Nunu_Mid_Heimerdinger(Ratings):
pass
class NA_Nunu_Mid_Illaoi(Ratings):
pass
class NA_Nunu_Mid_Irelia(Ratings):
pass
class NA_Nunu_Mid_Ivern(Ratings):
pass
class NA_Nunu_Mid_Janna(Ratings):
pass
class NA_Nunu_Mid_JarvanIV(Ratings):
pass
class NA_Nunu_Mid_Jax(Ratings):
pass
class NA_Nunu_Mid_Jayce(Ratings):
pass
class NA_Nunu_Mid_Jhin(Ratings):
pass
class NA_Nunu_Mid_Jinx(Ratings):
pass
class NA_Nunu_Mid_Kalista(Ratings):
pass
class NA_Nunu_Mid_Karma(Ratings):
pass
class NA_Nunu_Mid_Karthus(Ratings):
pass
class NA_Nunu_Mid_Kassadin(Ratings):
pass
class NA_Nunu_Mid_Katarina(Ratings):
pass
class NA_Nunu_Mid_Kayle(Ratings):
pass
class NA_Nunu_Mid_Kayn(Ratings):
pass
class NA_Nunu_Mid_Kennen(Ratings):
pass
class NA_Nunu_Mid_Khazix(Ratings):
pass
class NA_Nunu_Mid_Kindred(Ratings):
pass
class NA_Nunu_Mid_Kled(Ratings):
pass
class NA_Nunu_Mid_KogMaw(Ratings):
pass
class NA_Nunu_Mid_Leblanc(Ratings):
pass
class NA_Nunu_Mid_LeeSin(Ratings):
pass
class NA_Nunu_Mid_Leona(Ratings):
pass
class NA_Nunu_Mid_Lissandra(Ratings):
pass
class NA_Nunu_Mid_Lucian(Ratings):
pass
class NA_Nunu_Mid_Lulu(Ratings):
pass
class NA_Nunu_Mid_Lux(Ratings):
pass
class NA_Nunu_Mid_Malphite(Ratings):
pass
class NA_Nunu_Mid_Malzahar(Ratings):
pass
class NA_Nunu_Mid_Maokai(Ratings):
pass
class NA_Nunu_Mid_MasterYi(Ratings):
pass
class NA_Nunu_Mid_MissFortune(Ratings):
pass
class NA_Nunu_Mid_MonkeyKing(Ratings):
pass
class NA_Nunu_Mid_Mordekaiser(Ratings):
pass
class NA_Nunu_Mid_Morgana(Ratings):
pass
class NA_Nunu_Mid_Nami(Ratings):
pass
class NA_Nunu_Mid_Nasus(Ratings):
pass
class NA_Nunu_Mid_Nautilus(Ratings):
pass
class NA_Nunu_Mid_Nidalee(Ratings):
pass
class NA_Nunu_Mid_Nocturne(Ratings):
pass
class NA_Nunu_Mid_Nunu(Ratings):
pass
class NA_Nunu_Mid_Olaf(Ratings):
pass
class NA_Nunu_Mid_Orianna(Ratings):
pass
class NA_Nunu_Mid_Ornn(Ratings):
pass
class NA_Nunu_Mid_Pantheon(Ratings):
pass
class NA_Nunu_Mid_Poppy(Ratings):
pass
class NA_Nunu_Mid_Quinn(Ratings):
pass
class NA_Nunu_Mid_Rakan(Ratings):
pass
class NA_Nunu_Mid_Rammus(Ratings):
pass
class NA_Nunu_Mid_RekSai(Ratings):
pass
class NA_Nunu_Mid_Renekton(Ratings):
pass
class NA_Nunu_Mid_Rengar(Ratings):
pass
class NA_Nunu_Mid_Riven(Ratings):
pass
class NA_Nunu_Mid_Rumble(Ratings):
pass
class NA_Nunu_Mid_Ryze(Ratings):
pass
class NA_Nunu_Mid_Sejuani(Ratings):
pass
class NA_Nunu_Mid_Shaco(Ratings):
pass
class NA_Nunu_Mid_Shen(Ratings):
pass
class NA_Nunu_Mid_Shyvana(Ratings):
pass
class NA_Nunu_Mid_Singed(Ratings):
pass
class NA_Nunu_Mid_Sion(Ratings):
pass
class NA_Nunu_Mid_Sivir(Ratings):
pass
class NA_Nunu_Mid_Skarner(Ratings):
pass
class NA_Nunu_Mid_Sona(Ratings):
pass
class NA_Nunu_Mid_Soraka(Ratings):
pass
class NA_Nunu_Mid_Swain(Ratings):
pass
class NA_Nunu_Mid_Syndra(Ratings):
pass
class NA_Nunu_Mid_TahmKench(Ratings):
pass
class NA_Nunu_Mid_Taliyah(Ratings):
pass
class NA_Nunu_Mid_Talon(Ratings):
pass
class NA_Nunu_Mid_Taric(Ratings):
pass
class NA_Nunu_Mid_Teemo(Ratings):
pass
class NA_Nunu_Mid_Thresh(Ratings):
pass
class NA_Nunu_Mid_Tristana(Ratings):
pass
class NA_Nunu_Mid_Trundle(Ratings):
pass
class NA_Nunu_Mid_Tryndamere(Ratings):
pass
class NA_Nunu_Mid_TwistedFate(Ratings):
pass
class NA_Nunu_Mid_Twitch(Ratings):
pass
class NA_Nunu_Mid_Udyr(Ratings):
pass
class NA_Nunu_Mid_Urgot(Ratings):
pass
class NA_Nunu_Mid_Varus(Ratings):
pass
class NA_Nunu_Mid_Vayne(Ratings):
pass
class NA_Nunu_Mid_Veigar(Ratings):
pass
class NA_Nunu_Mid_Velkoz(Ratings):
pass
class NA_Nunu_Mid_Vi(Ratings):
pass
class NA_Nunu_Mid_Viktor(Ratings):
pass
class NA_Nunu_Mid_Vladimir(Ratings):
pass
class NA_Nunu_Mid_Volibear(Ratings):
pass
class NA_Nunu_Mid_Warwick(Ratings):
pass
class NA_Nunu_Mid_Xayah(Ratings):
pass
class NA_Nunu_Mid_Xerath(Ratings):
pass
class NA_Nunu_Mid_XinZhao(Ratings):
pass
class NA_Nunu_Mid_Yasuo(Ratings):
pass
class NA_Nunu_Mid_Yorick(Ratings):
pass
class NA_Nunu_Mid_Zac(Ratings):
pass
class NA_Nunu_Mid_Zed(Ratings):
pass
class NA_Nunu_Mid_Ziggs(Ratings):
pass
class NA_Nunu_Mid_Zilean(Ratings):
pass
class NA_Nunu_Mid_Zyra(Ratings):
pass
# File: projects/rectangle.py (repo: coderMaruf/Turtle, license: MIT)
import turtle as t
t.pensize(1)

def filled_square(color, corner, side):
    """Draw a filled axis-aligned square with lower-left corner at (corner, corner).

    The pen stays down throughout, so moving to the corner also draws a
    connecting line, exactly as the original command sequence did.
    """
    t.fillcolor(color)
    t.begin_fill()
    t.setpos(corner, corner)
    t.setheading(0)
    for _ in range(4):
        t.forward(side)
        t.left(90)
    t.end_fill()

# Concentric filled squares, largest first.
filled_square('green', 0, 300)
filled_square('crimson', 25, 250)
filled_square('dodgerblue', 50, 200)
filled_square('red', 75, 150)
filled_square('cyan', 100, 100)
# File: freqtag/__init__.py (repo: Frequency-Tagging/freq-tag, license: BSD-3-Clause)
from ._version import __version__
from .spectrum import Spectrum, psd_from_mne_epochs
__all__ = ['__version__', 'Spectrum', 'psd_from_mne_epochs']
# File: cacreader/swig-4.0.2/Examples/test-suite/python/cpp11_result_of_runme.py (repo: kyletanyag/LL-Smartcard, license: BSD-3-Clause)
import cpp11_result_of
result = cpp11_result_of.test_result(cpp11_result_of.SQUARE, 3.0)
if result != 9.0:
raise RuntimeError("test_result(square, 3.0) is not 9.0. Got: " + str(result))
result = cpp11_result_of.test_result_alternative1(cpp11_result_of.SQUARE, 3.0)
if result != 9.0:
raise RuntimeError("test_result_alternative1(square, 3.0) is not 9.0. Got: " + str(result))
# File: tests/test_0688-lazy-parquet-with-Forms.py (repo: colesbury/awkward-1.0, license: BSD-3-Clause)
# BSD 3-Clause License; see https://github.com/scikit-hep/awkward-1.0/blob/main/LICENSE
import os
import pytest # noqa: F401
import numpy as np # noqa: F401
import awkward as ak # noqa: F401
pytest.importorskip("pyarrow.parquet")
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_1(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test1.parquet")
data = [{"x": one}, {"x": two}, {"x": three}]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:col:x[0]"}
assert array.tolist() == data
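This test asserts the core lazy-loading contract: a column is only materialized, and its cache key recorded, on first field access. A pure-Python sketch of that mechanism; `LazyColumn` and the cache-key format are illustrative, not the awkward-array API:

```python
class LazyColumn:
    """Load a named column on first access, recording a cache key."""
    def __init__(self, name, loader, cache):
        self.name, self.loader, self.cache = name, loader, cache

    @property
    def array(self):
        key = "tmp:col:%s[0]" % self.name
        if key not in self.cache:
            self.cache[key] = self.loader()  # materialize once
        return self.cache[key]

cache = {}
col = LazyColumn("x", lambda: [1, 2, 3], cache)
assert set(cache.keys()) == set()         # nothing read yet
assert col.array == [1, 2, 3]             # first access materializes
assert set(cache.keys()) == {"tmp:col:x[0]"}
```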
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_2(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test2.parquet")
data = [{"x": {"y": one}}, {"x": {"y": two}}, {"x": {"y": three}}]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:col:x.y[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_3(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test3.parquet")
data = [
{"x": {"y": one, "z": 1.1}},
{"x": {"y": two, "z": 2.2}},
{"x": {"y": three, "z": 3.3}},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("z").array
assert set(array.caches[0].keys()) == {"tmp:col:x.z[0]"}
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:col:x.z[0]", "tmp:col:x.y[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_4(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test4.parquet")
data = [{"x": []}, {"x": [one]}, {"x": [one, two, three]}]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:lst:x[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_5(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test5.parquet")
data = [{"x": {"y": []}}, {"x": {"y": [one]}}, {"x": {"y": [one, two, three]}}]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:lst:x.y[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_6(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test6.parquet")
data = [
{"x": {"y": [], "z": 1.1}},
{"x": {"y": [one], "z": 2.2}},
{"x": {"y": [one, two, three], "z": 3.3}},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("z").array
assert set(array.caches[0].keys()) == {"tmp:col:x.z[0]"}
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:col:x.z[0]", "tmp:lst:x.y[0]"}
assert array.tolist() == data
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:lst:x.y[0]"}
array.layout.field("x").array.field("z").array
assert set(array.caches[0].keys()) == {"tmp:lst:x.y[0]", "tmp:col:x.z[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_7(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test7.parquet")
data = [
{"x": []},
{"x": [{"y": one}]},
{"x": [{"y": one}, {"y": two}, {"y": three}]},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y:x[0]"}
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y:x[0]"}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y:x[0]",
"tmp:col:x.list.item.y[0]",
}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_8(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test8.parquet")
data = [
{"x": []},
{"x": [{"y": one, "z": 1.1}]},
{"x": [{"y": one, "z": 1.1}, {"y": two, "z": 2.2}, {"y": three, "z": 3.3}]},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y:x[0]"}
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y:x[0]"}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y:x[0]",
"tmp:col:x.list.item.y[0]",
}
array.layout.field("x").array.content.field("z").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y:x[0]",
"tmp:col:x.list.item.y[0]",
"tmp:col:x.list.item.z[0]",
}
assert array.tolist() == data
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y:x[0]"}
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y:x[0]"}
array.layout.field("x").array.content.field("z").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y:x[0]",
"tmp:col:x.list.item.z[0]",
}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y:x[0]",
"tmp:col:x.list.item.z[0]",
"tmp:col:x.list.item.y[0]",
}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_9(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test9.parquet")
data = [
{"x": []},
{"x": [{"y": {"q": one}}]},
{"x": [{"y": {"q": one}}, {"y": {"q": two}}, {"y": {"q": three}}]},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
array.layout.field("x").array.content.field("y").array.field("q").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.q:x[0]",
"tmp:col:x.list.item.y.q[0]",
}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_10(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test10.parquet")
data = [
{"x": []},
{"x": [{"y": {"q": one}, "z": 1.1}]},
{
"x": [
{"y": {"q": one}, "z": 1.1},
{"y": {"q": two}, "z": 2.2},
{"y": {"q": three}, "z": 3.3},
]
},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
array.layout.field("x").array.content.field("y").array.field("q").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.q:x[0]",
"tmp:col:x.list.item.y.q[0]",
}
array.layout.field("x").array.content.field("z").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.q:x[0]",
"tmp:col:x.list.item.y.q[0]",
"tmp:col:x.list.item.z[0]",
}
assert array.tolist() == data
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.q:x[0]"}
array.layout.field("x").array.content.field("z").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.q:x[0]",
"tmp:col:x.list.item.z[0]",
}
array.layout.field("x").array.content.field("y").array.field("q").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.q:x[0]",
"tmp:col:x.list.item.y.q[0]",
"tmp:col:x.list.item.z[0]",
}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_11(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test11.parquet")
data = [
{"x": []},
{"x": [{"z": 1.1, "y": {"q": one}}]},
{
"x": [
{"z": 1.1, "y": {"q": one}},
{"z": 2.2, "y": {"q": two}},
{"z": 3.3, "y": {"q": three}},
]
},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert len(set(array.caches[0].keys())) == 1
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert len(set(array.caches[0].keys())) == 1
array.layout.field("x").array.content.field("y").array
assert len(set(array.caches[0].keys())) == 1
array.layout.field("x").array.content.field("y").array.field("q").array
assert len(set(array.caches[0].keys())) == 2
array.layout.field("x").array.content.field("z").array
assert len(set(array.caches[0].keys())) == 3
assert array.tolist() == data
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert len(set(array.caches[0].keys())) == 1
assert np.asarray(array.layout.field("x").array.offsets).tolist() == [0, 0, 1, 4]
assert len(set(array.caches[0].keys())) == 1
array.layout.field("x").array.content.field("y").array
assert len(set(array.caches[0].keys())) == 1
array.layout.field("x").array.content.field("z").array
assert len(set(array.caches[0].keys())) == 2
array.layout.field("x").array.content.field("y").array.field("q").array
assert len(set(array.caches[0].keys())) == 3
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_12(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test12.parquet")
data = [
{"x": {"y": []}},
{"x": {"y": [[one]]}},
{"x": {"y": [[one, two], [], [three]]}},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:lst:x.y[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_13(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test13.parquet")
data = [
{"x": {"y": [], "z": 1.1}},
{"x": {"y": [[one]], "z": 2.2}},
{"x": {"y": [[one, two], [], [three]], "z": 3.3}},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("z").array
assert set(array.caches[0].keys()) == {"tmp:col:x.z[0]"}
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:col:x.z[0]", "tmp:lst:x.y[0]"}
assert array.tolist() == data
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array.field("y").array
assert set(array.caches[0].keys()) == {"tmp:lst:x.y[0]"}
array.layout.field("x").array.field("z").array
assert set(array.caches[0].keys()) == {"tmp:lst:x.y[0]", "tmp:col:x.z[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_14(one, two, three, tmp_path):
    filename = os.path.join(str(tmp_path), "test14.parquet")
data = [
{"x": [{"y": [], "z": 1.1}]},
{"x": []},
{"x": [{"y": [one, two, three], "z": 3.3}]},
]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.list.item:x[0]"}
array.layout.field("x").array.content.field("z").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.list.item:x[0]",
"tmp:col:x.list.item.z[0]",
}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.list.item:x[0]",
"tmp:col:x.list.item.z[0]",
"tmp:lst:x.list.item.y[0]",
}
assert array.tolist() == data
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.field("x").array
assert set(array.caches[0].keys()) == {"tmp:off:x.list.item.y.list.item:x[0]"}
array.layout.field("x").array.content.field("y").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.list.item:x[0]",
"tmp:lst:x.list.item.y[0]",
}
array.layout.field("x").array.content.field("z").array
assert set(array.caches[0].keys()) == {
"tmp:off:x.list.item.y.list.item:x[0]",
"tmp:lst:x.list.item.y[0]",
"tmp:col:x.list.item.z[0]",
}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_15(one, two, three, tmp_path):
filename = os.path.join(str(tmp_path), "test15.parquet")
data = [one, two, three]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
array.layout.array
assert set(array.caches[0].keys()) == {"tmp:col:[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_16(one, two, three, tmp_path):
    filename = os.path.join(str(tmp_path), "test16.parquet")
data = [[one, two], [], [three]]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
assert np.asarray(array.layout.array.offsets).tolist() == [0, 2, 2, 3]
assert set(array.caches[0].keys()) == {"tmp:lst:[0]"}
assert array.tolist() == data
@pytest.mark.parametrize("one,two,three", [(1, 2, 3), ("one", "two", "three")])
def test_17(one, two, three, tmp_path):
    filename = os.path.join(str(tmp_path), "test17.parquet")
data = [[{"x": one}, {"x": two}], [], [{"x": three}]]
ak.to_parquet(ak.Array(data), filename)
array = ak.from_parquet(filename, lazy=True, lazy_cache_key="tmp")
assert set(array.caches[0].keys()) == set()
assert np.asarray(array.layout.array.offsets).tolist() == [0, 2, 2, 3]
assert set(array.caches[0].keys()) == {"tmp:off:.list.item.x:[0]"}
array.layout.array.content.field("x").array
assert set(array.caches[0].keys()) == {
"tmp:off:.list.item.x:[0]",
"tmp:col:.list.item.x[0]",
}
assert array.tolist() == data
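The cache keys asserted throughout these tests follow a `<cache_key>:<kind>:<column>[<row_group>]` naming convention, where `col`, `lst`, and `off` appear to mark leaf columns, list structure, and record offsets respectively. A toy formatter illustrating that scheme (inferred from the asserted strings, not Awkward Array's real internals):

```python
def cache_key(prefix, kind, column, row_group):
    # kind is "col" (leaf column), "lst" (list structure) or "off" (offsets)
    return f"{prefix}:{kind}:{column}[{row_group}]"

print(cache_key("tmp", "col", "x.z", 0))              # tmp:col:x.z[0]
print(cache_key("tmp", "lst", "x.y", 0))              # tmp:lst:x.y[0]
print(cache_key("tmp", "off", "x.list.item.y:x", 0))  # tmp:off:x.list.item.y:x[0]
```

The `[0]` suffix is the parquet row-group index, which is why every key in these single-row-group files ends in `[0]`.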
# --- File: lane_detection/__init__.py (repo: Wajih-O/CarND-LaneLines-P1, MIT) ---
from lane_detection.segment import Segment
| 27.333333 | 42 | 0.878049 | 12 | 82 | 5.833333 | 0.5 | 0.228571 | 0.485714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 82 | 2 | 43 | 41 | 0.945946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# --- File: nmigen/build/dsl.py (repo: psumesh/nmigen, BSD-2-Clause) ---
from amaranth.build.dsl import __all__
import warnings
warnings.warn("instead of nmigen.build.dsl, use amaranth.build.dsl",
DeprecationWarning, stacklevel=2)
| 25.875 | 68 | 0.753623 | 27 | 207 | 5.62963 | 0.555556 | 0.210526 | 0.315789 | 0.263158 | 0.342105 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005747 | 0.15942 | 207 | 7 | 69 | 29.571429 | 0.867816 | 0 | 0 | 0 | 0 | 0 | 0.246377 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
# --- File: numpy/stringOperations/numpyFunctionJoin.py (repo: slowy07/pythonApps, MIT) ---
import numpy as np
# splitting a string
print(np.char.join('-', 'adit'))
# splitting a string
print(np.char.join(['-', ':'], ['adit', 'zulkepretes'])) | 20 | 56 | 0.627778 | 24 | 180 | 4.708333 | 0.541667 | 0.176991 | 0.283186 | 0.371681 | 0.619469 | 0.619469 | 0.619469 | 0.619469 | 0 | 0 | 0 | 0 | 0.144444 | 180 | 9 | 56 | 20 | 0.733766 | 0.327778 | 0 | 0 | 0 | 0 | 0.186441 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0.666667 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
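Note that `np.char.join` is a join, not a split: it interleaves the separator between the characters of each input string, element-wise over arrays. A quick check of the two calls above, capturing the results instead of printing arrays (assumes NumPy is installed):

```python
import numpy as np

# One separator, one string: a 0-d array holding the joined string.
single = np.char.join('-', 'adit')

# Element-wise over arrays: separator i is used for string i.
pair = np.char.join(['-', ':'], ['adit', 'zulkepretes'])

print(single.item())  # a-d-i-t
print(pair.tolist())  # ['a-d-i-t', 'z:u:l:k:e:p:r:e:t:e:s']
```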
# --- File: classes_personagens.py (repo: joaohfgarcia/jogo_python, MIT) ---
class Heroi(pygame.sprite.Sprite):
# Essa classe representa o personagem e deriva da classe "Sprite" do Pygame.
def __init__(self):
# Chama o construtor do "Sprite"
super().__init__()
#estados do boneco: escondido, andando_dir, andando_esq, pulando, parado etc
self.estado = "escondido"
self.direcao = "esquerda" #direção que o personagem está"
self.movimento = "" #movimento que o personagem realizou
self.tempo_mov = 0
self.vida = 3
# carrega a figura do personagem
self.image = pygame.image.load("imagens/andar1_e.png").convert_alpha()
# Obtem o retângulo que possui as dimensões da imagem .
self.rect = self.image.get_rect()
# Criação de metodos para a movimentação do boneco
def moveRight(self, pixels):
self.rect.x += pixels
def moveLeft(self, pixels):
self.rect.x -= pixels
def moveUp(self, pixels):
self.rect.y -= pixels
def moveDown(self, pixels):
self.rect.y += pixels
def moveJumpup_d(self, pixels):
self.rect.y -= pixels
self.rect.x += pixels
def moveJumpdown_d(self,pixels):
self.rect.y += pixels
self.rect.x += pixels
def moveJumpup_e (self,pixels):
self.rect.y -= pixels
self.rect.x -= pixels
def moveJumpdown_e(self,pixels):
self.rect.y += pixels
self.rect.x -= pixels
class Inimigo(pygame.sprite.Sprite):
# Essa classe representa o personagem e deriva da classe "Sprite" do Pygame.
def __init__(self):
# Chama o construtor do "Sprite"
super().__init__()
#estados do boneco: escondido, andando_dir, andando_esq, pulando, parado etc
self.estado = "escondido"
self.lista_inimigos = []
self.indiceInimigo = 0 #indice do vetor Inimigo
self.lista_inimigos = [pygame.image.load("imagens/golemterra1_d.png").convert_alpha()]
self.movimento =""
self.tempo_mov = 0
self.image = self.lista_inimigos[self.indiceInimigo]
# Obtem o retângulo que possui as dimensões da imagem .
self.rect = self.image.get_rect()
# Criação de metodos para a movimentação do boneco
def moveRight(self, pixels):
self.rect.x += pixels
def moveLeft(self, pixels):
self.rect.x -= pixels
def moveUp(self, pixels):
self.rect.y -= pixels
def moveDown(self, pixels):
self.rect.y += pixels
class Feiticeiro(pygame.sprite.Sprite):
# Essa classe representa o personagem e deriva da classe "Sprite" do Pygame.
def __init__(self):
# Chama o construtor do "Sprite"
super().__init__()
# estados do boneco: escondido, andando_dir, andando_esq, pulando, parado etc
self.estado = "escondido"
self.movimento = ""
self.tempo_mov = 0
self.image = pygame.image.load ("imagens/andar1_e.png").convert_alpha()
# Obtem o retângulo que possui as dimensões da imagem .
self.rect = self.image.get_rect()
# Criação de metodos para a movimentação do boneco
def moveRight(self, pixels):
self.rect.x += pixels
def moveLeft(self, pixels):
self.rect.x -= pixels
| 28.858407 | 94 | 0.632321 | 416 | 3,261 | 4.838942 | 0.199519 | 0.083458 | 0.125186 | 0.125186 | 0.865375 | 0.825137 | 0.825137 | 0.823646 | 0.79533 | 0.79533 | 0 | 0.003349 | 0.267403 | 3,261 | 112 | 95 | 29.116071 | 0.839263 | 0.297761 | 0 | 0.774194 | 0 | 0 | 0.044053 | 0.011013 | 0 | 0 | 0 | 0.008929 | 0 | 1 | 0.274194 | false | 0 | 0.016129 | 0 | 0.33871 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# --- File: tests/converter/example_transactions.py (repo: geometry-labs/icon-sdk-python, Apache-2.0) ---
# Copyright 2018 ICON Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Sample transactions for testing the converter which is returned by 'icx_getTransactionByHash'"""
TRANSACTION_0 = {
"from": "hx54f7853dc6481b670caf69c5a27c7c8fe5be8269",
"to": "hx49a23bd156932485471f582897bf1bec5f875751",
"value": "0x56bc75e2d63100000",
"fee": "0x2386f26fc10000",
"nonce": "0x1",
"txHash": "0x375540830d475a73b704cf8dee9fa9eba2798f9d2af1fa55a85482e48daefd3b",
"signature": "bjarKeF3izGy469dpSciP3TT9caBQVYgHdaNgjY+8wJTOVSFm4o/ODXycFOdXUJcIwqvcE9If8x6Zmgt//XmkQE=",
"method": "icx_sendTransaction",
"txIndex": "0x0",
"blockHeight": "0x1",
"blockHash": "0x3add53134014e940f6f6010173781c4d8bd677d9931a697f962483e04a685e5c"
}
TRANSACTION_1 = {
"from": "hx49a23bd156932485471f582897bf1bec5f875751",
"to": "hx48ed23e910acd48a2650d83d4a7aeea0795572af",
"value": "0x2b5e3af16b1880000",
"fee": "0x2386f26fc10000",
"timestamp": "1517997570352000",
"txHash": "0x1b6133792cee1ab2e54ae68faf9f49daf81c7e46d68b1ca281acc718602c77dd",
"signature": "WDq5KJw776+ZY1RpnDe6b3fE9R5lgrG7JH9CwM0OcNBhmUSY4k4c6i+4F0GwRf+HblFd27zcezA/g6C4PoebzQE=",
"method": "icx_sendTransaction",
"txIndex": "0x0",
"blockHeight": "0x2",
"blockHash": "0x2cd6b2a6edd6dbce861bd9b79e91b5bc8351e7c87430e93251dfcb309a8ecff8"
}
TRANSACTION_2 = {
"from": "hx1ada76577eac29b1e60efee22aac66af9f434036",
"to": "cx502c47463314f01e84b1b203c315180501eb2481",
"version": "0x3",
"nid": "0x1",
"stepLimit": "0x7a120",
"timestamp": "0x58a4594d8a1f8",
"nonce": "0xa6b2",
"dataType": "call",
"data": {
"method": "mint",
"params": {
"_to": "hx853010e3a5b950d50ab8f98895fbd4b35c549189",
"_amount": "0x4563918244f40000"
}
},
"signature": "vjkZEDPZCaQTrrdkbiDeJPD/htA5Yr9U9jEJa/RsnA58fX4QH7aoxTtRIbRR4pr6YKJ6ThTlYrZwCkOA8H/NcQE=",
"txHash": "0xa18d3d130d57326a0df66818d333cd96147950ae5bcd644c7145018a3eefb317",
"txIndex": "0x1",
"blockHeight": "0x100300",
"blockHash": "0xa712b7e57dd807c50cd9d28a5a9c70689f2fa62e9d55d5fced6120ab21428eec"
}
TRANSACTION_3 = {
"version": "0x3",
"from": "hx4873b94352c8c1f3b2f09aaeccea31ce9e90bd31",
"stepLimit": "0xee6b2800",
"timestamp": "0x5750799c2773f",
"nid": "0x3",
"to": "cx0000000000000000000000000000000000000000",
"nonce": "0x3",
"dataType": "deploy",
"data": {
"contentType": "application/zip",
        "content": "0x504b03041400000008005095f44c504a3cd7270200005d0600005600000055736572732f626f79656f6e2f5079636861726d50726f6a656374732f49636f6e507974686f6e53444b2f74657374732f6170695f73656e642f73616d706c655f746f6b656e2f73616d706c655f746f6b656e2e70798d54516bdb30107ef7af107d899dae25632fc32c63c9b28741e80af106a31421dbe929edb28c36c3618ca15dd87d68d916dac6769464a584d2ffbe93a56496d3a8cb934e77df7df7ddc98a93354a05057545b3ac80d4cd2394c96c1694a2122b325596dcd4685ab1e49c939702ed3c1a4735b8c4c6f572ca66a605b9a1b9dad8e2c1ca1dbd0a23734c5d552d68ca16a30b6c1bbde90cc4c92ea14d2e1c93c03cce9f29f9f46fce92f91d3e95c33c48a77f3a944a7230ea97d450c4f3348fd5fa6c0c5a9a85a52a6c5ecf2b9946e95cd2e7dfaf459e2dc3278c93a3786cd1b30f1a4cbad48bad5ce4ba68cfa4ad39e7dd5c9ffcbb29ff0d09de5ffbffee5dc8b2bdea7cdd7d6f4fd59eef7...(omitted for brevity would be wrong; full value follows) 504b03041400000008005095f44c504a3cd7270200005d0600005600000055736572732f626f79656f6e2f5079636861726d50726f6a656374732f49636f6e507974686f6e53444b2f74657374732f6170695f73656e642f73616d706c655f746f6b656e2f73616d706c655f746f6b656e2e7079"
"params": {
"init_supply": "0x2710"
}
},
"signature": "Y0dXJwf3It4cnsrMZjEa/YXAFf1YzmuKL95JnRcVjtR+PE1VebQXoE8NifRxnLFzk5GMIoZQ51Fbq+o6ZTSb2AA=",
"txHash": "0x36d46e6f8ce3fb037f72c227214a391ba680fb771bb8062b7391a9ef084fdebc",
"txIndex": "0x0",
"blockHeight": "0x7",
"blockHash": "0x6979f9ad26fcf54a59998337fe6383c1feb32ef111d0cc9b3a78eec595e1bf4e"
}
TRANSACTION_4 = {
"version": "0x3",
"from": "hx4873b94352c8c1f3b2f09aaeccea31ce9e90bd31",
"stepLimit": "0xee6b2800",
"timestamp": "0x5750799c32a42",
"nid": "0x3",
"to": "cx0000000000000000000000000000000000000001",
"nonce": "0x3",
"dataType": "call",
"data": {
"method": "acceptScore",
"params": {
"txHash": "0x36d46e6f8ce3fb037f72c227214a391ba680fb771bb8062b7391a9ef084fdebc"
}
},
"signature": "dyGD5Zrf86iHSQjr60pWUzqUm+olfgO1FkLfeYsgDFF1lRKcVnhG4MN3I9pzNgGiOxiTQ9AGUKVHjaSHqHQr7QE=",
"txHash": "0xd34941501ef27bd2eba6c35b382e5ca2da5f5e44bec3bf460367a8ee04fd3fae",
"txIndex": "0x1",
"blockHeight": "0x7",
"blockHash": "0x6979f9ad26fcf54a59998337fe6383c1feb32ef111d0cc9b3a78eec595e1bf4e"
}
TRANSACTION_5 = {
"version": "0x3",
"from": "hxd9932b37df48b8a6fa6c17770c0137816e11c0bd",
"value": "0x1111d67bb1bb0000",
"stepLimit": "0x3000000",
"timestamp": "0x57502666bbccc",
"nid": "0x3",
"nonce": "0x1",
"to": "hxf1d9719d29488684039712e881830b5b37a64f11",
"signature": "OW8/F9aHXXeq4N3/Xs8ZCBHFuZTIPWUz5PQijm07PGY5UXwKOJZQqqzDrOj3bQ9SOJc9upXC13QaFUbmgII1fgA=",
"txHash": "0xa24ffb1152aa9c58dab9b9b2b7102d5b238e0076222991ba289581a63b6ac0a5",
"txIndex": "0x0",
"blockHeight": "0x4",
"blockHash": "0x255822446aca1cf42e0a68302560f6f7e6e33ff9beea25c3ee23255bb5504165"
}
| 59.527132 | 2,830 | 0.819768 | 307 | 7,679 | 20.465798 | 0.566775 | 0.00955 | 0.013369 | 0.005093 | 0.072895 | 0.041063 | 0.026739 | 0 | 0 | 0 | 0 | 0.482554 | 0.100534 | 7,679 | 128 | 2,831 | 59.992188 | 0.427103 | 0.0866 | 0 | 0.349057 | 0 | 0 | 0.815868 | 0.677627 | 0 | 1 | 0.56569 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# --- File: test/unit/test_lambdas.py (repo: comtravo/grafana-dashboards, MIT) ---
from grafanalib.cloudwatch import CloudwatchMetricsTarget, CloudwatchLogsInsightsTarget
from lib.lambdas import (
dispatcher,
lambda_generate_duration_graph,
lambda_generate_invocations_graph,
lambda_generate_memory_utilization_percentage_graph,
lambda_generate_memory_utilization_graph,
lambda_generate_logs_panel,
lambda_cron_dashboard,
lambda_events_dashboard,
lambda_cognito_dashboard,
lambda_logs_dashboard,
create_lambda_sqs_dlq_graph,
create_lambda_sqs_graph,
lambda_sqs_dashboard,
lambda_sns_sqs_dashboard,
)
class TestDispatcher:
def test_should_throw_exception_when_dispatcher_called_with_wrong_arguments(self):
dispatcher.when.called_with(service="foo", trigger="bar").should.throw(
Exception, r"dispatcher recieved a non l"
)
def test_should_call_trigger_handlers(self):
expected_triggers = [
"cognito-idp",
"cloudwatch-event-schedule",
"cloudwatch-event-trigger",
"cloudwatch-logs",
"sns",
"sqs",
"null",
]
lambda_name = "lambda-1"
environment = "alpha"
topics = ["topic-1", "topic-2"]
call_args = {
"name": lambda_name,
"environment": environment,
"cloudwatch_data_source": "cloudwatch",
"lambda_insights_namespace": "insights",
"notifications": [],
"topics": topics,
"fifo": False,
}
for trigger in expected_triggers:
dash = dispatcher(service="lambda", trigger=trigger, **call_args)
dash.should.be.a(Dashboard)
class TestGraphs:
def test_should_generate_lambda_duration_graph(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
expected_targets = [
CloudwatchMetricsTarget(
alias="Min",
namespace="AWS/Lambda",
statistics=["Minimum"],
metricName="Duration",
dimensions={"FunctionName": lambda_name},
refId="A",
),
CloudwatchMetricsTarget(
alias="Avg",
namespace="AWS/Lambda",
statistics=["Average"],
metricName="Duration",
dimensions={"FunctionName": lambda_name},
refId="B",
),
CloudwatchMetricsTarget(
alias="Max",
namespace="AWS/Lambda",
statistics=["Maximum"],
metricName="Duration",
dimensions={"FunctionName": lambda_name},
refId="C",
),
]
generated_lambda_graph = lambda_generate_duration_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
lambda_insights_namespace=lambda_insights_namespace,
notifications=[],
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"Lambda Invocation Duration"
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("alert").with_value.equal(None)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(3)
generated_lambda_graph.targets.should.equal(expected_targets)
def test_should_generate_lambda_invocations_graph(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
expected_targets = [
CloudwatchMetricsTarget(
alias="Invocations",
namespace="AWS/Lambda",
statistics=["Sum"],
metricName="Invocations",
dimensions={"FunctionName": lambda_name},
refId="B",
),
CloudwatchMetricsTarget(
alias="Errors",
namespace="AWS/Lambda",
statistics=["Sum"],
metricName="Errors",
dimensions={"FunctionName": lambda_name},
refId="A",
),
]
generated_lambda_graph = lambda_generate_invocations_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
lambda_insights_namespace=lambda_insights_namespace,
notifications=[],
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"Lambda Invocations and Errors"
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("alert").with_value.equal(None)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(2)
generated_lambda_graph.targets.should.equal(expected_targets)
def test_should_generate_lambda_invocations_graph_with_alert_notifications(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
notifications = ["lorem", "ipsum"]
expected_alert_query = CloudwatchMetricsTarget(
alias="Errors",
namespace="AWS/Lambda",
statistics=["Sum"],
metricName="Errors",
dimensions={"FunctionName": lambda_name},
refId="A",
)
generated_lambda_graph = lambda_generate_invocations_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
lambda_insights_namespace=lambda_insights_namespace,
notifications=notifications,
)
generated_lambda_graph.should.have.property("alert").be.a(Alert)
generated_lambda_graph.alert.executionErrorState.should.eql("alerting")
generated_lambda_graph.alert.noDataState.should.eql("no_data")
generated_lambda_graph.alert.alertConditions.should.have.length_of(1)
generated_lambda_graph.alert.alertConditions[0].should.be.a(AlertCondition)
generated_lambda_graph.alert.alertConditions[0].target.should.eql(
Target(refId="A")
)
generated_lambda_graph.targets.should.contain(expected_alert_query)
def test_should_generate_lambda_memory_utilization_percentage_graph(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
expected_targets = [
CloudwatchMetricsTarget(
alias="Min",
namespace=lambda_insights_namespace,
statistics=["Minimum"],
metricName="memory_utilization",
dimensions={"function_name": lambda_name},
refId="B",
),
CloudwatchMetricsTarget(
alias="Avg",
namespace=lambda_insights_namespace,
statistics=["Average"],
metricName="memory_utilization",
dimensions={"function_name": lambda_name},
refId="A",
),
CloudwatchMetricsTarget(
alias="Max",
namespace=lambda_insights_namespace,
statistics=["Maximum"],
metricName="memory_utilization",
dimensions={"function_name": lambda_name},
refId="C",
),
]
generated_lambda_graph = lambda_generate_memory_utilization_percentage_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
lambda_insights_namespace=lambda_insights_namespace,
notifications=[],
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"Lambda Memory Utilization Percentage"
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("alert").with_value.equal(None)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(3)
generated_lambda_graph.targets.should.equal(expected_targets)
def test_should_generate_lambda_memory_utilization_percentage_graph_with_alert_notifications(
self,
):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
notifications = ["lorem", "ipsum"]
expected_alert_query = CloudwatchMetricsTarget(
alias="Avg",
namespace=lambda_insights_namespace,
statistics=["Average"],
metricName="memory_utilization",
dimensions={"function_name": lambda_name},
refId="A",
)
generated_lambda_graph = lambda_generate_memory_utilization_percentage_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
lambda_insights_namespace=lambda_insights_namespace,
notifications=notifications,
)
generated_lambda_graph.should.have.property("alert").be.a(Alert)
generated_lambda_graph.alert.executionErrorState.should.eql("alerting")
generated_lambda_graph.alert.noDataState.should.eql("no_data")
generated_lambda_graph.alert.alertConditions.should.have.length_of(1)
generated_lambda_graph.alert.alertConditions[0].should.be.a(AlertCondition)
generated_lambda_graph.alert.alertConditions[0].target.should.eql(
Target(refId="A")
)
generated_lambda_graph.targets.should.contain(expected_alert_query)
    def test_should_generate_lambda_memory_utilization_graph(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
expected_targets = [
CloudwatchMetricsTarget(
alias="used_memory_max",
namespace=lambda_insights_namespace,
statistics=["Maximum"],
metricName="used_memory_max",
dimensions={"function_name": lambda_name},
refId="A",
),
CloudwatchMetricsTarget(
alias="allocated_memory",
namespace=lambda_insights_namespace,
statistics=["Maximum"],
metricName="total_memory",
dimensions={"function_name": lambda_name},
refId="B",
),
]
generated_lambda_graph = lambda_generate_memory_utilization_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
lambda_insights_namespace=lambda_insights_namespace,
notifications=[],
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"Lambda Memory Utilization"
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("alert").with_value.equal(None)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(2)
generated_lambda_graph.targets.should.equal(expected_targets)
    def test_should_generate_logs_panel(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
expected_targets = [
CloudwatchLogsInsightsTarget(
expression="fields @timestamp, @message | filter @message like /^(?!.*(START|END|REPORT|LOGS|EXTENSION)).*$/ | sort @timestamp desc",
logGroupNames=["/aws/lambda/{}".format(lambda_name)],
),
]
generated_lambda_graph = lambda_generate_logs_panel(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
)
generated_lambda_graph.should.be.a(Panel)
generated_lambda_graph.should.have.property("title").with_value.equal("Logs")
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(1)
generated_lambda_graph.targets.should.equal(expected_targets)
generated_lambda_graph.wrapLogMessages.should.equal(True)
generated_lambda_graph.prettifyLogMessage.should.equal(False)
generated_lambda_graph.enableLogDetails.should.equal(True)
def test_should_generate_lambda_basic_dashboards(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "influxdb"
lambda_insights_namespace = "insights"
environment = "alpha"
call_args = {
"name": lambda_name,
"environment": environment,
"cloudwatch_data_source": cloudwatch_data_source,
"lambda_insights_namespace": lambda_insights_namespace,
"notifications": [],
}
test_matrix = {
lambda_cron_dashboard: "cron",
lambda_cognito_dashboard: "cognito",
lambda_events_dashboard: "cloudwatch events",
lambda_logs_dashboard: "cloudwatch logs",
}
        for dashboard_generator, expected_dashboard_tag in test_matrix.items():
            generated_dashboard = dashboard_generator(**call_args)
generated_dashboard.should.be.a(Dashboard)
generated_dashboard.title.should.eql("Lambda: {}".format(lambda_name))
sorted(generated_dashboard.tags).should.eql(
sorted(["lambda", environment, expected_dashboard_tag])
)
def test_should_create_lambda_sqs_dlq_graph(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "influxdb"
notifications = ["lorem"]
expected_alert_query = CloudwatchMetricsTarget(
alias="Approximate number of messages available",
namespace="AWS/SQS",
statistics=["Maximum"],
metricName="ApproximateNumberOfMessagesVisible",
dimensions={"QueueName": lambda_name},
refId="A",
)
generated_lambda_graph = create_lambda_sqs_dlq_graph(
name=lambda_name,
cloudwatch_data_source=cloudwatch_data_source,
notifications=notifications,
fifo=False,
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"SQS Dead Letter Queue: {}".format(lambda_name)
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(1)
generated_lambda_graph.targets[0].should.eql(expected_alert_query)
generated_lambda_graph.should.have.property("alert").be.a(Alert)
generated_lambda_graph.alert.executionErrorState.should.eql("alerting")
generated_lambda_graph.alert.noDataState.should.eql("no_data")
def test_should_create_lambda_sqs_dlq_fifo_graph(self):
lambda_name = "lambda-1"
sqs_dlq_name = lambda_name + "-dlq"
cloudwatch_data_source = "influxdb"
notifications = ["lorem"]
expected_alert_query = CloudwatchMetricsTarget(
alias="Approximate number of messages available",
namespace="AWS/SQS",
statistics=["Maximum"],
metricName="ApproximateNumberOfMessagesVisible",
dimensions={"QueueName": sqs_dlq_name + ".fifo"},
refId="A",
)
generated_lambda_graph = create_lambda_sqs_dlq_graph(
name=sqs_dlq_name,
cloudwatch_data_source=cloudwatch_data_source,
notifications=notifications,
fifo=True,
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"SQS Dead Letter Queue: {}.fifo".format(sqs_dlq_name)
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(1)
generated_lambda_graph.targets[0].should.eql(expected_alert_query)
generated_lambda_graph.should.have.property("alert").be.a(Alert)
generated_lambda_graph.alert.executionErrorState.should.eql("alerting")
generated_lambda_graph.alert.noDataState.should.eql("no_data")
def test_should_create_lambda_sqs_graph(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
expected_query = CloudwatchMetricsTarget(
alias="Number of messages sent to the queue",
namespace="AWS/SQS",
statistics=["Sum"],
metricName="NumberOfMessagesSent",
dimensions={"QueueName": lambda_name},
refId="A",
)
generated_lambda_graph = create_lambda_sqs_graph(
name=lambda_name, cloudwatch_data_source=cloudwatch_data_source, fifo=False
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"SQS: {}".format(lambda_name)
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(1)
generated_lambda_graph.targets[0].should.eql(expected_query)
def test_should_create_lambda_sqs_fifo_graph(self):
lambda_name = "lambda-1"
sqs_name = lambda_name + ".fifo"
cloudwatch_data_source = "cloudwatch"
expected_query = CloudwatchMetricsTarget(
alias="Number of messages sent to the queue",
namespace="AWS/SQS",
statistics=["Sum"],
metricName="NumberOfMessagesSent",
dimensions={"QueueName": sqs_name},
refId="A",
)
generated_lambda_graph = create_lambda_sqs_graph(
name=lambda_name, cloudwatch_data_source=cloudwatch_data_source, fifo=True
)
generated_lambda_graph.should.be.a(Graph)
generated_lambda_graph.should.have.property("title").with_value.equal(
"SQS: {}".format(sqs_name)
)
generated_lambda_graph.should.have.property("dataSource").with_value.equal(
cloudwatch_data_source
)
generated_lambda_graph.should.have.property("targets")
generated_lambda_graph.targets.should.have.length_of(1)
generated_lambda_graph.targets[0].should.eql(expected_query)
def test_should_generate_lambda_sqs_dashboard(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
environment = "alpha"
call_args = {
"name": lambda_name,
"environment": environment,
"cloudwatch_data_source": cloudwatch_data_source,
"lambda_insights_namespace": lambda_insights_namespace,
"notifications": [],
"fifo": False,
}
generated_dashboard = lambda_sqs_dashboard(**call_args)
generated_dashboard.should.be.a(Dashboard)
generated_dashboard.title.should.eql("Lambda: {}".format(lambda_name))
sorted(generated_dashboard.tags).should.eql(
sorted(["lambda", environment, "sqs"])
)
generated_dashboard.rows.should.be.length_of(5)
generated_dashboard.rows[0].title.should.eql("Invocations")
generated_dashboard.rows[0].panels.should.be.length_of(2)
generated_dashboard.rows[1].title.should.eql("Memory Utilization")
generated_dashboard.rows[1].panels.should.be.length_of(2)
generated_dashboard.rows[2].title.should.eql("Logs")
generated_dashboard.rows[2].panels.should.be.length_of(1)
generated_dashboard.rows[3].title.should.eql("Queues")
generated_dashboard.rows[3].panels.should.be.length_of(1)
generated_dashboard.rows[4].title.should.eql("Dead Letter Queues")
generated_dashboard.rows[4].panels.should.be.length_of(1)
def test_should_generate_lambda_sqs_fifo_dashboard(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
environment = "alpha"
call_args = {
"name": lambda_name,
"environment": environment,
"cloudwatch_data_source": cloudwatch_data_source,
"lambda_insights_namespace": lambda_insights_namespace,
"notifications": [],
"fifo": True,
}
generated_dashboard = lambda_sqs_dashboard(**call_args)
generated_dashboard.should.be.a(Dashboard)
generated_dashboard.title.should.eql("Lambda: {}".format(lambda_name))
sorted(generated_dashboard.tags).should.eql(
sorted(["lambda", environment, "sqs", "fifo"])
)
generated_dashboard.rows.should.be.length_of(5)
generated_dashboard.rows[0].title.should.eql("Invocations")
generated_dashboard.rows[0].panels.should.be.length_of(2)
generated_dashboard.rows[1].title.should.eql("Memory Utilization")
generated_dashboard.rows[1].panels.should.be.length_of(2)
generated_dashboard.rows[2].title.should.eql("Logs")
generated_dashboard.rows[2].panels.should.be.length_of(1)
generated_dashboard.rows[3].title.should.eql("Queues")
generated_dashboard.rows[3].panels.should.be.length_of(1)
generated_dashboard.rows[4].title.should.eql("Dead Letter Queues")
generated_dashboard.rows[4].panels.should.be.length_of(1)
def test_should_generate_lambda_sns_sqs_dashboard(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
environment = "alpha"
topics = ["topic-1", "topic-2"]
call_args = {
"name": lambda_name,
"environment": environment,
"cloudwatch_data_source": cloudwatch_data_source,
"lambda_insights_namespace": lambda_insights_namespace,
"notifications": [],
"fifo": False,
"topics": topics,
}
generated_dashboard = lambda_sns_sqs_dashboard(**call_args)
generated_dashboard.should.be.a(Dashboard)
generated_dashboard.title.should.eql("Lambda: {}".format(lambda_name))
sorted(generated_dashboard.tags).should.eql(
sorted(["lambda", environment, "sqs", "sns"])
)
generated_dashboard.rows.should.be.length_of(6)
generated_dashboard.rows[0].panels.should.be.length_of(len(topics))
generated_dashboard.rows[0].title.should.eql("SNS Topics")
generated_dashboard.rows[1].title.should.eql("Invocations")
generated_dashboard.rows[1].panels.should.be.length_of(2)
generated_dashboard.rows[2].title.should.eql("Memory Utilization")
generated_dashboard.rows[2].panels.should.be.length_of(2)
generated_dashboard.rows[3].title.should.eql("Logs")
generated_dashboard.rows[3].panels.should.be.length_of(1)
generated_dashboard.rows[4].title.should.eql("Queues")
generated_dashboard.rows[4].panels.should.be.length_of(1)
generated_dashboard.rows[5].title.should.eql("Dead Letter Queues")
generated_dashboard.rows[5].panels.should.be.length_of(1)
def test_should_generate_lambda_sns_sqs_fifo_dashboard(self):
lambda_name = "lambda-1"
cloudwatch_data_source = "cloudwatch"
lambda_insights_namespace = "insights"
environment = "alpha"
topics = ["topic-1", "topic-2"]
call_args = {
"name": lambda_name,
"environment": environment,
"cloudwatch_data_source": cloudwatch_data_source,
"lambda_insights_namespace": lambda_insights_namespace,
"notifications": [],
"fifo": True,
"topics": topics,
}
generated_dashboard = lambda_sns_sqs_dashboard(**call_args)
generated_dashboard.should.be.a(Dashboard)
generated_dashboard.title.should.eql("Lambda: {}".format(lambda_name))
sorted(generated_dashboard.tags).should.eql(
sorted(["lambda", environment, "sqs", "sns", "fifo"])
)
generated_dashboard.rows.should.be.length_of(6)
generated_dashboard.rows[0].panels.should.be.length_of(len(topics))
generated_dashboard.rows[0].title.should.eql("SNS Topics")
generated_dashboard.rows[1].title.should.eql("Invocations")
generated_dashboard.rows[1].panels.should.be.length_of(2)
generated_dashboard.rows[2].title.should.eql("Memory Utilization")
generated_dashboard.rows[2].panels.should.be.length_of(2)
generated_dashboard.rows[3].title.should.eql("Logs")
generated_dashboard.rows[3].panels.should.be.length_of(1)
generated_dashboard.rows[4].title.should.eql("Queues")
generated_dashboard.rows[4].panels.should.be.length_of(1)
generated_dashboard.rows[5].title.should.eql("Dead Letter Queues")
generated_dashboard.rows[5].panels.should.be.length_of(1)
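
# Quick reference, not exercised by the tests above: the fluent `.should` chains
# come from the `sure` assertion library; this helper shows the plain-`assert`
# equivalents of the chains most used in this file, so the intent stays readable
# without knowing the DSL.
def _plain_assert_equivalents_demo():
    targets = ["A", "B", "C"]
    assert len(targets) == 3          # targets.should.have.length_of(3)
    assert targets[0] == "A"          # targets[0].should.eql("A")
    assert "B" in targets             # targets.should.contain("B")
    assert isinstance(targets, list)  # targets.should.be.a(list)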
0.12543209877, 0.12654320988, 0.12839506173, 0.12888888889, 0.12950617284, 0.13098765432, 0.13135802469, 0.13283950617, 0.13333333333, 0.13395061728, 0.13444444444, 0.13580246914, 0.13691358025, 0.13777777778, 0.13839506173, 0.1387654321, 0.13888888889, 0.14024691358, 0.14135802469, 0.14320987654, 0.14333333333, 0.14432098765, 0.14580246914, 0.14617283951, 0.14765432099, 0.1487654321, 0.15061728395, 0.15111111111, 0.15172839506, 0.15320987654, 0.15358024691, 0.1550617284, 0.15555555556, 0.15617283951, 0.15666666667, 0.15802469136, 0.15913580247, 0.16, 0.16061728395, 0.16098765432, 0.16111111111, 0.1624691358, 0.16358024691, 0.16543209877, 0.16555555556, 0.16654320988, 0.16802469136, 0.16839506173, 0.16987654321, 0.17098765432, 0.17283950617, 0.17333333333, 0.17395061728, 0.17543209877, 0.17580246914, 0.17728395062, 0.17777777778, 0.17839506173, 0.17888888889, 0.18024691358, 0.18135802469, 0.18222222222, 0.18283950617, 0.18320987654, 0.18333333333, 0.18469135803, 0.18580246914, 0.18765432099, 0.18777777778, 0.1887654321, 0.19024691358, 0.19061728395, 0.19209876543, 0.19320987654, 0.1950617284, 0.19555555556, 0.19617283951, 0.19765432099, 0.19802469136, 0.19950617284, 0.2, 0.20061728395, 0.20111111111, 0.2024691358, 0.20358024691, 0.20444444444, 0.2050617284, 0.20543209877, 0.20555555556, 0.20691358025, 0.20802469136, 0.20987654321, 0.21, 0.21098765432, 0.2124691358, 0.21283950617, 0.21432098765, 0.21543209877, 0.21728395062, 0.21777777778, 0.21839506173, 0.21987654321, 0.22024691358, 0.22172839506, 0.22222222222, 0.22283950617, 0.22333333333, 0.22469135803, 0.22580246914, 0.22666666667, 0.22728395062, 0.22765432099, 0.22777777778, 0.22913580247, 0.23024691358, 0.23209876543, 0.23222222222, 0.23320987654, 0.23469135803, 0.2350617284, 0.23654320988, 0.23765432099, 0.23950617284, 0.24, 0.24061728395, 0.24209876543, 0.2424691358, 0.24395061728, 0.24444444444, 0.2450617284, 0.24555555556, 0.24691358025, 0.24802469136, 0.24888888889, 0.24950617284, 0.24987654321, 0.25, 
0.25135802469, 0.2524691358, 0.25432098765, 0.25444444444, 0.25543209877, 0.25691358025, 0.25728395062, 0.2587654321, 0.25987654321, 0.26172839506, 0.26222222222, 0.26283950617, 0.26432098765, 0.26469135803, 0.26617283951, 0.26666666667, 0.26728395062, 0.26777777778, 0.26913580247, 0.27024691358, 0.27111111111, 0.27172839506, 0.27209876543, 0.27222222222, 0.27358024691, 0.27469135803, 0.27654320988, 0.27666666667, 0.27765432099, 0.27913580247, 0.27950617284, 0.28098765432, 0.28209876543, 0.28395061728, 0.28444444444, 0.2850617284, 0.28654320988, 0.28691358025, 0.28839506173, 0.28888888889, 0.28950617284, 0.29, 0.29135802469, 0.2924691358, 0.29333333333, 0.29395061728, 0.29432098765, 0.29444444444, 0.29580246914, 0.29691358025, 0.2987654321, 0.29888888889, 0.29987654321, 0.30135802469, 0.30172839506, 0.30320987654, 0.30432098765, 0.30617283951, 0.30666666667, 0.30728395062, 0.3087654321, 0.30913580247, 0.31061728395, 0.31111111111, 0.31172839506, 0.31222222222, 0.31358024691, 0.31469135803, 0.31555555556, 0.31617283951, 0.31654320988, 0.31666666667, 0.31802469136, 0.31913580247, 0.32098765432, 0.32111111111, 0.32209876543, 0.32358024691, 0.32395061728, 0.32543209877, 0.32654320988, 0.32839506173, 0.32888888889, 0.32950617284, 0.33098765432, 0.33135802469, 0.33283950617, 0.33333333333, 0.33395061728, 0.33444444444, 0.33580246914, 0.33691358025, 0.33777777778, 0.33839506173, 0.3387654321, 0.33888888889, 0.34024691358, 0.34135802469, 0.34320987654, 0.34333333333, 0.34432098765, 0.34580246914, 0.34617283951, 0.34765432099, 0.3487654321, 0.35061728395, 0.35111111111, 0.35172839506, 0.35320987654, 0.35358024691, 0.3550617284, 0.35555555556, 0.35617283951, 0.35666666667, 0.35802469136, 0.35913580247, 0.36, 0.36061728395, 0.36098765432, 0.36111111111, 0.3624691358, 0.36358024691, 0.36543209877, 0.36555555556, 0.36654320988, 0.36802469136, 0.36839506173, 0.36987654321, 0.37098765432, 0.37283950617, 0.37333333333, 0.37395061728, 0.37543209877, 0.37580246914, 0.37728395062, 
0.37777777778, 0.37839506173, 0.37888888889, 0.38024691358, 0.38135802469, 0.38222222222, 0.38283950617, 0.38320987654, 0.38333333333, 0.38469135803, 0.38580246914, 0.38765432099, 0.38777777778, 0.3887654321, 0.39024691358, 0.39061728395, 0.39209876543, 0.39320987654, 0.3950617284, 0.39555555556, 0.39617283951, 0.39765432099, 0.39802469136, 0.39950617284, 0.4, 0.40061728395, 0.40111111111, 0.4024691358, 0.40358024691, 0.40444444444, 0.4050617284, 0.40543209877, 0.40555555556, 0.40691358025, 0.40802469136, 0.40987654321, 0.41, 0.41098765432, 0.4124691358, 0.41283950617, 0.41432098765, 0.41543209877, 0.41728395062, 0.41777777778, 0.41839506173, 0.41987654321, 0.42024691358, 0.42172839506, 0.42222222222, 0.42283950617, 0.42333333333, 0.42469135803, 0.42580246914, 0.42666666667, 0.42728395062, 0.42765432099, 0.42777777778, 0.42913580247, 0.43024691358, 0.43209876543, 0.43222222222, 0.43320987654, 0.43469135803, 0.4350617284, 0.43654320988, 0.43765432099, 0.43950617284, 0.44, 0.44061728395, 0.44209876543, 0.4424691358, 0.44395061728, 0.44444444444, 0.4450617284, 0.44555555556, 0.44691358025, 0.44802469136, 0.44888888889, 0.44950617284, 0.44987654321, 0.45, 0.45135802469, 0.4524691358, 0.45432098765, 0.45444444444, 0.45543209877, 0.45691358025, 0.45728395062, 0.4587654321, 0.45987654321, 0.46172839506, 0.46222222222, 0.46283950617, 0.46432098765, 0.46469135803, 0.46617283951, 0.46666666667, 0.46728395062, 0.46777777778, 0.46913580247, 0.47024691358, 0.47111111111, 0.47172839506, 0.47209876543, 0.47222222222, 0.47358024691, 0.47469135803, 0.47654320988, 0.47666666667, 0.47765432099, 0.47913580247, 0.47950617284, 0.48098765432, 0.48209876543, 0.48395061728, 0.48444444444, 0.4850617284, 0.48654320988, 0.48691358025, 0.48839506173, 0.48888888889, 0.48950617284, 0.49, 0.49135802469, 0.4924691358, 0.49333333333, 0.49395061728, 0.49432098765, 0.49444444444, 0.49580246914, 0.49691358025, 0.4987654321, 0.49888888889, 0.49987654321, 0.50135802469, 0.50172839506, 0.50320987654, 
0.50432098765, 0.50617283951, 0.50666666667, 0.50728395062, 0.5087654321, 0.50913580247, 0.51061728395, 0.51111111111, 0.51172839506, 0.51222222222, 0.51358024691, 0.51469135803, 0.51555555556, 0.51617283951, 0.51654320988, 0.51666666667, 0.51802469136, 0.51913580247, 0.52098765432, 0.52111111111, 0.52209876543, 0.52358024691, 0.52395061728, 0.52543209877, 0.52654320988, 0.52839506173, 0.52888888889, 0.52950617284, 0.53098765432, 0.53135802469, 0.53283950617, 0.53333333333, 0.53395061728, 0.53444444444, 0.53580246914, 0.53691358025, 0.53777777778, 0.53839506173, 0.5387654321, 0.53888888889, 0.54024691358, 0.54135802469, 0.54320987654, 0.54333333333, 0.54432098765, 0.54580246914, 0.54617283951, 0.54765432099, 0.5487654321, 0.55061728395, 0.55111111111, 0.55172839506, 0.55320987654, 0.55358024691, 0.5550617284, 0.55555555556, 0.55617283951, 0.55666666667, 0.55802469136, 0.55913580247, 0.56, 0.56061728395, 0.56098765432, 0.56111111111, 0.5624691358, 0.56358024691, 0.56543209877, 0.56555555556, 0.56654320988, 0.56802469136, 0.56839506173, 0.56987654321, 0.57098765432, 0.57283950617, 0.57333333333, 0.57395061728, 0.57543209877, 0.57580246914, 0.57728395062, 0.57777777778, 0.57839506173, 0.57888888889, 0.58024691358, 0.58135802469, 0.58222222222, 0.58283950617, 0.58320987654, 0.58333333333, 0.58469135803, 0.58580246914, 0.58765432099, 0.58777777778, 0.5887654321, 0.59024691358, 0.59061728395, 0.59209876543, 0.59320987654, 0.5950617284, 0.59555555556, 0.59617283951, 0.59765432099, 0.59802469136, 0.59950617284, 0.6, 0.60061728395, 0.60111111111, 0.6024691358, 0.60358024691, 0.60444444444, 0.6050617284, 0.60543209877, 0.60555555556, 0.60691358025, 0.60802469136, 0.60987654321, 0.61, 0.61098765432, 0.6124691358, 0.61283950617, 0.61432098765, 0.61543209877, 0.61728395062, 0.61777777778, 0.61839506173, 0.61987654321, 0.62024691358, 0.62172839506, 0.62222222222, 0.62283950617, 0.62333333333, 0.62469135803, 0.62580246914, 0.62666666667, 0.62728395062, 0.62765432099, 
0.62777777778, 0.62913580247, 0.63024691358, 0.63209876543, 0.63222222222, 0.63320987654, 0.63469135803, 0.6350617284, 0.63654320988, 0.63765432099, 0.63950617284, 0.64, 0.64061728395, 0.64209876543, 0.6424691358, 0.64395061728, 0.64444444444, 0.6450617284, 0.64555555556, 0.64691358025, 0.64802469136, 0.64888888889, 0.64950617284, 0.64987654321, 0.65, 0.65135802469, 0.6524691358, 0.65432098765, 0.65444444444, 0.65543209877, 0.65691358025, 0.65728395062, 0.6587654321, 0.65987654321, 0.66172839506, 0.66222222222, 0.66283950617, 0.66432098765, 0.66469135803, 0.66617283951, 0.66666666667, 0.66728395062, 0.66777777778, 0.66913580247, 0.67024691358, 0.67111111111, 0.67172839506, 0.67209876543, 0.67222222222, 0.67358024691, 0.67469135803, 0.67654320988, 0.67666666667, 0.67765432099, 0.67913580247, 0.67950617284, 0.68098765432, 0.68209876543, 0.68395061728, 0.68444444444, 0.6850617284, 0.68654320988, 0.68691358025, 0.68839506173, 0.68888888889, 0.68950617284, 0.69, 0.69135802469, 0.6924691358, 0.69333333333, 0.69395061728, 0.69432098765, 0.69444444444, 0.69580246914, 0.69691358025, 0.6987654321, 0.69888888889, 0.69987654321, 0.70135802469, 0.70172839506, 0.70320987654, 0.70432098765, 0.70617283951, 0.70666666667, 0.70728395062, 0.7087654321, 0.70913580247, 0.71061728395, 0.71111111111, 0.71172839506, 0.71222222222, 0.71358024691, 0.71469135803, 0.71555555556, 0.71617283951, 0.71654320988, 0.71666666667, 0.71802469136, 0.71913580247, 0.72098765432, 0.72111111111, 0.72209876543, 0.72358024691, 0.72395061728, 0.72543209877, 0.72654320988, 0.72839506173, 0.72888888889, 0.72950617284, 0.73098765432, 0.73135802469, 0.73283950617, 0.73333333333, 0.73395061728, 0.73444444444, 0.73580246914, 0.73691358025, 0.73777777778, 0.73839506173, 0.7387654321, 0.73888888889, 0.74024691358, 0.74135802469, 0.74320987654, 0.74333333333, 0.74432098765, 0.74580246914, 0.74617283951, 0.74765432099, 0.7487654321, 0.75061728395, 0.75111111111, 0.75172839506, 0.75320987654, 0.75358024691, 
0.7550617284, 0.75555555556, 0.75617283951, 0.75666666667, 0.75802469136, 0.75913580247, 0.76, 0.76061728395, 0.76098765432, 0.76111111111, 0.7624691358, 0.76358024691, 0.76543209877, 0.76555555556, 0.76654320988, 0.76802469136, 0.76839506173, 0.76987654321, 0.77098765432, 0.77283950617, 0.77333333333, 0.77395061728, 0.77543209877, 0.77580246914, 0.77728395062, 0.77777777778, 0.77839506173, 0.77888888889, 0.78024691358, 0.78135802469, 0.78222222222, 0.78283950617, 0.78320987654, 0.78333333333, 0.78469135803, 0.78580246914, 0.78765432099, 0.78777777778, 0.7887654321, 0.79024691358, 0.79061728395, 0.79209876543, 0.79320987654, 0.7950617284, 0.79555555556, 0.79617283951, 0.79765432099, 0.79802469136, 0.79950617284, 0.8, 0.80061728395, 0.80111111111, 0.8024691358, 0.80358024691, 0.80444444444, 0.8050617284, 0.80543209877, 0.80555555556, 0.80691358025, 0.80802469136, 0.80987654321, 0.81, 0.81098765432, 0.8124691358, 0.81283950617, 0.81432098765, 0.81543209877, 0.81728395062, 0.81777777778, 0.81839506173, 0.81987654321, 0.82024691358, 0.82172839506, 0.82222222222, 0.82283950617, 0.82333333333, 0.82469135803, 0.82580246914, 0.82666666667, 0.82728395062, 0.82765432099, 0.82777777778, 0.82913580247, 0.83024691358, 0.83209876543, 0.83222222222, 0.83320987654, 0.83469135803, 0.8350617284, 0.83654320988, 0.83765432099, 0.83950617284, 0.84, 0.84061728395, 0.84209876543, 0.8424691358, 0.84395061728, 0.84444444444, 0.8450617284, 0.84555555556, 0.84691358025, 0.84802469136, 0.84888888889, 0.84950617284, 0.84987654321, 0.85, 0.85135802469, 0.8524691358, 0.85432098765, 0.85444444444, 0.85543209877, 0.85691358025, 0.85728395062, 0.8587654321, 0.85987654321, 0.86172839506, 0.86222222222, 0.86283950617, 0.86432098765, 0.86469135803, 0.86617283951, 0.86666666667, 0.86728395062, 0.86777777778, 0.86913580247, 0.87024691358, 0.87111111111, 0.87172839506, 0.87209876543, 0.87222222222, 0.87358024691, 0.87469135803, 0.87654320988, 0.87666666667, 0.87765432099, 0.87913580247, 0.87950617284, 
0.88098765432, 0.88209876543, 0.88395061728, 0.88444444444, 0.8850617284, 0.88654320988, 0.88691358025, 0.88839506173, 0.88888888889, 0.88950617284, 0.89, 0.89135802469, 0.8924691358, 0.89333333333, 0.89395061728, 0.89432098765, 0.89444444444, 0.89580246914, 0.89691358025, 0.8987654321, 0.89888888889, 0.89987654321, 0.90135802469, 0.90172839506, 0.90320987654, 0.90432098765, 0.90617283951, 0.90666666667, 0.90728395062, 0.9087654321, 0.90913580247, 0.91061728395, 0.91111111111, 0.91172839506, 0.91222222222, 0.91358024691, 0.91469135803, 0.91555555556, 0.91617283951, 0.91654320988, 0.91666666667, 0.91802469136, 0.91913580247, 0.92098765432, 0.92111111111, 0.92209876543, 0.92358024691, 0.92395061728, 0.92543209877, 0.92654320988, 0.92839506173, 0.92888888889, 0.92950617284, 0.93098765432, 0.93135802469, 0.93283950617, 0.93333333333, 0.93395061728, 0.93444444444, 0.93580246914, 0.93691358025, 0.93777777778, 0.93839506173, 0.9387654321, 0.93888888889, 0.94024691358, 0.94135802469, 0.94320987654, 0.94333333333, 0.94432098765, 0.94580246914, 0.94617283951, 0.94765432099, 0.9487654321, 0.95061728395, 0.95111111111, 0.95172839506, 0.95320987654, 0.95358024691, 0.9550617284, 0.95555555556, 0.95617283951, 0.95666666667, 0.95802469136, 0.95913580247, 0.96, 0.96061728395, 0.96098765432, 0.96111111111, 0.9624691358, 0.96358024691, 0.96543209877, 0.96555555556, 0.96654320988, 0.96802469136, 0.96839506173, 0.96987654321, 0.97098765432, 0.97283950617, 0.97333333333, 0.97395061728, 0.97543209877, 0.97580246914, 0.97728395062, 0.97777777778, 0.97839506173, 0.97888888889, 0.98024691358, 0.98135802469, 0.98222222222, 0.98283950617, 0.98320987654, 0.98333333333, 0.98469135803, 0.98580246914, 0.98765432099, 0.98777777778, 0.9887654321, 0.99024691358, 0.99061728395, 0.99209876543, 0.99320987654, 0.9950617284, 0.99555555556, 0.99617283951, 0.99765432099, 0.99802469136, 0.99950617284]
pattern_even=[0.0, 0.00061728395, 0.00111111111, 0.0024691358, 0.00358024691, 0.00444444444, 0.0050617284, 0.00543209877, 0.00555555556, 0.00691358025, 0.00802469136, 0.00987654321, 0.01, 0.01098765432, 0.0124691358, 0.01283950617, 0.01432098765, 0.01543209877, 0.01728395062, 0.01777777778, 0.01839506173, 0.01987654321, 0.02024691358, 0.02172839506, 0.02222222222, 0.02283950617, 0.02333333333, 0.02469135803, 0.02580246914, 0.02666666667, 0.02728395062, 0.02765432099, 0.02777777778, 0.02913580247, 0.03024691358, 0.03209876543, 0.03222222222, 0.03320987654, 0.03469135803, 0.0350617284, 0.03654320988, 0.03765432099, 0.03950617284, 0.04, 0.04061728395, 0.04209876543, 0.0424691358, 0.04395061728, 0.04444444444, 0.0450617284, 0.04555555556, 0.04691358025, 0.04802469136, 0.04888888889, 0.04950617284, 0.04987654321, 0.05, 0.05135802469, 0.0524691358, 0.05432098765, 0.05444444444, 0.05543209877, 0.05691358025, 0.05728395062, 0.0587654321, 0.05987654321, 0.06172839506, 0.06222222222, 0.06283950617, 0.06432098765, 0.06469135803, 0.06617283951, 0.06666666667, 0.06728395062, 0.06777777778, 0.06913580247, 0.07024691358, 0.07111111111, 0.07172839506, 0.07209876543, 0.07222222222, 0.07358024691, 0.07469135803, 0.07654320988, 0.07666666667, 0.07765432099, 0.07913580247, 0.07950617284, 0.08098765432, 0.08209876543, 0.08395061728, 0.08444444444, 0.0850617284, 0.08654320988, 0.08691358025, 0.08839506173, 0.08888888889, 0.08950617284, 0.09, 0.09135802469, 0.0924691358, 0.09333333333, 0.09395061728, 0.09432098765, 0.09444444444, 0.09580246914, 0.09691358025, 0.0987654321, 0.09888888889, 0.09987654321, 0.10135802469, 0.10172839506, 0.10320987654, 0.10432098765, 0.10617283951, 0.10666666667, 0.10728395062, 0.1087654321, 0.10913580247, 0.11061728395, 0.11111111111, 0.11172839506, 0.11222222222, 0.11358024691, 0.11469135803, 0.11555555556, 0.11617283951, 0.11654320988, 0.11666666667, 0.11802469136, 0.11913580247, 0.12098765432, 0.12111111111, 0.12209876543, 0.12358024691, 0.12395061728, 
0.12543209877, 0.12654320988, 0.12839506173, 0.12888888889, 0.12950617284, 0.13098765432, 0.13135802469, 0.13283950617, 0.13333333333, 0.13395061728, 0.13444444444, 0.13580246914, 0.13691358025, 0.13777777778, 0.13839506173, 0.1387654321, 0.13888888889, 0.14024691358, 0.14135802469, 0.14320987654, 0.14333333333, 0.14432098765, 0.14580246914, 0.14617283951, 0.14765432099, 0.1487654321, 0.15061728395, 0.15111111111, 0.15172839506, 0.15320987654, 0.15358024691, 0.1550617284, 0.15555555556, 0.15617283951, 0.15666666667, 0.15802469136, 0.15913580247, 0.16, 0.16061728395, 0.16098765432, 0.16111111111, 0.1624691358, 0.16358024691, 0.16543209877, 0.16555555556, 0.16654320988, 0.16802469136, 0.16839506173, 0.16987654321, 0.17098765432, 0.17283950617, 0.17333333333, 0.17395061728, 0.17543209877, 0.17580246914, 0.17728395062, 0.17777777778, 0.17839506173, 0.17888888889, 0.18024691358, 0.18135802469, 0.18222222222, 0.18283950617, 0.18320987654, 0.18333333333, 0.18469135803, 0.18580246914, 0.18765432099, 0.18777777778, 0.1887654321, 0.19024691358, 0.19061728395, 0.19209876543, 0.19320987654, 0.1950617284, 0.19555555556, 0.19617283951, 0.19765432099, 0.19802469136, 0.19950617284, 0.2, 0.20061728395, 0.20111111111, 0.2024691358, 0.20358024691, 0.20444444444, 0.2050617284, 0.20543209877, 0.20555555556, 0.20691358025, 0.20802469136, 0.20987654321, 0.21, 0.21098765432, 0.2124691358, 0.21283950617, 0.21432098765, 0.21543209877, 0.21728395062, 0.21777777778, 0.21839506173, 0.21987654321, 0.22024691358, 0.22172839506, 0.22222222222, 0.22283950617, 0.22333333333, 0.22469135803, 0.22580246914, 0.22666666667, 0.22728395062, 0.22765432099, 0.22777777778, 0.22913580247, 0.23024691358, 0.23209876543, 0.23222222222, 0.23320987654, 0.23469135803, 0.2350617284, 0.23654320988, 0.23765432099, 0.23950617284, 0.24, 0.24061728395, 0.24209876543, 0.2424691358, 0.24395061728, 0.24444444444, 0.2450617284, 0.24555555556, 0.24691358025, 0.24802469136, 0.24888888889, 0.24950617284, 0.24987654321, 0.25, 
0.25135802469, 0.2524691358, 0.25432098765, 0.25444444444, 0.25543209877, 0.25691358025, 0.25728395062, 0.2587654321, 0.25987654321, 0.26172839506, 0.26222222222, 0.26283950617, 0.26432098765, 0.26469135803, 0.26617283951, 0.26666666667, 0.26728395062, 0.26777777778, 0.26913580247, 0.27024691358, 0.27111111111, 0.27172839506, 0.27209876543, 0.27222222222, 0.27358024691, 0.27469135803, 0.27654320988, 0.27666666667, 0.27765432099, 0.27913580247, 0.27950617284, 0.28098765432, 0.28209876543, 0.28395061728, 0.28444444444, 0.2850617284, 0.28654320988, 0.28691358025, 0.28839506173, 0.28888888889, 0.28950617284, 0.29, 0.29135802469, 0.2924691358, 0.29333333333, 0.29395061728, 0.29432098765, 0.29444444444, 0.29580246914, 0.29691358025, 0.2987654321, 0.29888888889, 0.29987654321, 0.30135802469, 0.30172839506, 0.30320987654, 0.30432098765, 0.30617283951, 0.30666666667, 0.30728395062, 0.3087654321, 0.30913580247, 0.31061728395, 0.31111111111, 0.31172839506, 0.31222222222, 0.31358024691, 0.31469135803, 0.31555555556, 0.31617283951, 0.31654320988, 0.31666666667, 0.31802469136, 0.31913580247, 0.32098765432, 0.32111111111, 0.32209876543, 0.32358024691, 0.32395061728, 0.32543209877, 0.32654320988, 0.32839506173, 0.32888888889, 0.32950617284, 0.33098765432, 0.33135802469, 0.33283950617, 0.33333333333, 0.33395061728, 0.33444444444, 0.33580246914, 0.33691358025, 0.33777777778, 0.33839506173, 0.3387654321, 0.33888888889, 0.34024691358, 0.34135802469, 0.34320987654, 0.34333333333, 0.34432098765, 0.34580246914, 0.34617283951, 0.34765432099, 0.3487654321, 0.35061728395, 0.35111111111, 0.35172839506, 0.35320987654, 0.35358024691, 0.3550617284, 0.35555555556, 0.35617283951, 0.35666666667, 0.35802469136, 0.35913580247, 0.36, 0.36061728395, 0.36098765432, 0.36111111111, 0.3624691358, 0.36358024691, 0.36543209877, 0.36555555556, 0.36654320988, 0.36802469136, 0.36839506173, 0.36987654321, 0.37098765432, 0.37283950617, 0.37333333333, 0.37395061728, 0.37543209877, 0.37580246914, 0.37728395062, 
0.37777777778, 0.37839506173, 0.37888888889, 0.38024691358, 0.38135802469, 0.38222222222, 0.38283950617, 0.38320987654, 0.38333333333, 0.38469135803, 0.38580246914, 0.38765432099, 0.38777777778, 0.3887654321, 0.39024691358, 0.39061728395, 0.39209876543, 0.39320987654, 0.3950617284, 0.39555555556, 0.39617283951, 0.39765432099, 0.39802469136, 0.39950617284, 0.4, 0.40061728395, 0.40111111111, 0.4024691358, 0.40358024691, 0.40444444444, 0.4050617284, 0.40543209877, 0.40555555556, 0.40691358025, 0.40802469136, 0.40987654321, 0.41, 0.41098765432, 0.4124691358, 0.41283950617, 0.41432098765, 0.41543209877, 0.41728395062, 0.41777777778, 0.41839506173, 0.41987654321, 0.42024691358, 0.42172839506, 0.42222222222, 0.42283950617, 0.42333333333, 0.42469135803, 0.42580246914, 0.42666666667, 0.42728395062, 0.42765432099, 0.42777777778, 0.42913580247, 0.43024691358, 0.43209876543, 0.43222222222, 0.43320987654, 0.43469135803, 0.4350617284, 0.43654320988, 0.43765432099, 0.43950617284, 0.44, 0.44061728395, 0.44209876543, 0.4424691358, 0.44395061728, 0.44444444444, 0.4450617284, 0.44555555556, 0.44691358025, 0.44802469136, 0.44888888889, 0.44950617284, 0.44987654321, 0.45, 0.45135802469, 0.4524691358, 0.45432098765, 0.45444444444, 0.45543209877, 0.45691358025, 0.45728395062, 0.4587654321, 0.45987654321, 0.46172839506, 0.46222222222, 0.46283950617, 0.46432098765, 0.46469135803, 0.46617283951, 0.46666666667, 0.46728395062, 0.46777777778, 0.46913580247, 0.47024691358, 0.47111111111, 0.47172839506, 0.47209876543, 0.47222222222, 0.47358024691, 0.47469135803, 0.47654320988, 0.47666666667, 0.47765432099, 0.47913580247, 0.47950617284, 0.48098765432, 0.48209876543, 0.48395061728, 0.48444444444, 0.4850617284, 0.48654320988, 0.48691358025, 0.48839506173, 0.48888888889, 0.48950617284, 0.49, 0.49135802469, 0.4924691358, 0.49333333333, 0.49395061728, 0.49432098765, 0.49444444444, 0.49580246914, 0.49691358025, 0.4987654321, 0.49888888889, 0.49987654321, 0.50135802469, 0.50172839506, 0.50320987654, 
0.50432098765, 0.50617283951, 0.50666666667, 0.50728395062, 0.5087654321, 0.50913580247, 0.51061728395, 0.51111111111, 0.51172839506, 0.51222222222, 0.51358024691, 0.51469135803, 0.51555555556, 0.51617283951, 0.51654320988, 0.51666666667, 0.51802469136, 0.51913580247, 0.52098765432, 0.52111111111, 0.52209876543, 0.52358024691, 0.52395061728, 0.52543209877, 0.52654320988, 0.52839506173, 0.52888888889, 0.52950617284, 0.53098765432, 0.53135802469, 0.53283950617, 0.53333333333, 0.53395061728, 0.53444444444, 0.53580246914, 0.53691358025, 0.53777777778, 0.53839506173, 0.5387654321, 0.53888888889, 0.54024691358, 0.54135802469, 0.54320987654, 0.54333333333, 0.54432098765, 0.54580246914, 0.54617283951, 0.54765432099, 0.5487654321, 0.55061728395, 0.55111111111, 0.55172839506, 0.55320987654, 0.55358024691, 0.5550617284, 0.55555555556, 0.55617283951, 0.55666666667, 0.55802469136, 0.55913580247, 0.56, 0.56061728395, 0.56098765432, 0.56111111111, 0.5624691358, 0.56358024691, 0.56543209877, 0.56555555556, 0.56654320988, 0.56802469136, 0.56839506173, 0.56987654321, 0.57098765432, 0.57283950617, 0.57333333333, 0.57395061728, 0.57543209877, 0.57580246914, 0.57728395062, 0.57777777778, 0.57839506173, 0.57888888889, 0.58024691358, 0.58135802469, 0.58222222222, 0.58283950617, 0.58320987654, 0.58333333333, 0.58469135803, 0.58580246914, 0.58765432099, 0.58777777778, 0.5887654321, 0.59024691358, 0.59061728395, 0.59209876543, 0.59320987654, 0.5950617284, 0.59555555556, 0.59617283951, 0.59765432099, 0.59802469136, 0.59950617284, 0.6, 0.60061728395, 0.60111111111, 0.6024691358, 0.60358024691, 0.60444444444, 0.6050617284, 0.60543209877, 0.60555555556, 0.60691358025, 0.60802469136, 0.60987654321, 0.61, 0.61098765432, 0.6124691358, 0.61283950617, 0.61432098765, 0.61543209877, 0.61728395062, 0.61777777778, 0.61839506173, 0.61987654321, 0.62024691358, 0.62172839506, 0.62222222222, 0.62283950617, 0.62333333333, 0.62469135803, 0.62580246914, 0.62666666667, 0.62728395062, 0.62765432099, 
0.62777777778, 0.62913580247, 0.63024691358, 0.63209876543, 0.63222222222, 0.63320987654, 0.63469135803, 0.6350617284, 0.63654320988, 0.63765432099, 0.63950617284, 0.64, 0.64061728395, 0.64209876543, 0.6424691358, 0.64395061728, 0.64444444444, 0.6450617284, 0.64555555556, 0.64691358025, 0.64802469136, 0.64888888889, 0.64950617284, 0.64987654321, 0.65, 0.65135802469, 0.6524691358, 0.65432098765, 0.65444444444, 0.65543209877, 0.65691358025, 0.65728395062, 0.6587654321, 0.65987654321, 0.66172839506, 0.66222222222, 0.66283950617, 0.66432098765, 0.66469135803, 0.66617283951, 0.66666666667, 0.66728395062, 0.66777777778, 0.66913580247, 0.67024691358, 0.67111111111, 0.67172839506, 0.67209876543, 0.67222222222, 0.67358024691, 0.67469135803, 0.67654320988, 0.67666666667, 0.67765432099, 0.67913580247, 0.67950617284, 0.68098765432, 0.68209876543, 0.68395061728, 0.68444444444, 0.6850617284, 0.68654320988, 0.68691358025, 0.68839506173, 0.68888888889, 0.68950617284, 0.69, 0.69135802469, 0.6924691358, 0.69333333333, 0.69395061728, 0.69432098765, 0.69444444444, 0.69580246914, 0.69691358025, 0.6987654321, 0.69888888889, 0.69987654321, 0.70135802469, 0.70172839506, 0.70320987654, 0.70432098765, 0.70617283951, 0.70666666667, 0.70728395062, 0.7087654321, 0.70913580247, 0.71061728395, 0.71111111111, 0.71172839506, 0.71222222222, 0.71358024691, 0.71469135803, 0.71555555556, 0.71617283951, 0.71654320988, 0.71666666667, 0.71802469136, 0.71913580247, 0.72098765432, 0.72111111111, 0.72209876543, 0.72358024691, 0.72395061728, 0.72543209877, 0.72654320988, 0.72839506173, 0.72888888889, 0.72950617284, 0.73098765432, 0.73135802469, 0.73283950617, 0.73333333333, 0.73395061728, 0.73444444444, 0.73580246914, 0.73691358025, 0.73777777778, 0.73839506173, 0.7387654321, 0.73888888889, 0.74024691358, 0.74135802469, 0.74320987654, 0.74333333333, 0.74432098765, 0.74580246914, 0.74617283951, 0.74765432099, 0.7487654321, 0.75061728395, 0.75111111111, 0.75172839506, 0.75320987654, 0.75358024691, 
0.7550617284, 0.75555555556, 0.75617283951, 0.75666666667, 0.75802469136, 0.75913580247, 0.76, 0.76061728395, 0.76098765432, 0.76111111111, 0.7624691358, 0.76358024691, 0.76543209877, 0.76555555556, 0.76654320988, 0.76802469136, 0.76839506173, 0.76987654321, 0.77098765432, 0.77283950617, 0.77333333333, 0.77395061728, 0.77543209877, 0.77580246914, 0.77728395062, 0.77777777778, 0.77839506173, 0.77888888889, 0.78024691358, 0.78135802469, 0.78222222222, 0.78283950617, 0.78320987654, 0.78333333333, 0.78469135803, 0.78580246914, 0.78765432099, 0.78777777778, 0.7887654321, 0.79024691358, 0.79061728395, 0.79209876543, 0.79320987654, 0.7950617284, 0.79555555556, 0.79617283951, 0.79765432099, 0.79802469136, 0.79950617284, 0.8, 0.80061728395, 0.80111111111, 0.8024691358, 0.80358024691, 0.80444444444, 0.8050617284, 0.80543209877, 0.80555555556, 0.80691358025, 0.80802469136, 0.80987654321, 0.81, 0.81098765432, 0.8124691358, 0.81283950617, 0.81432098765, 0.81543209877, 0.81728395062, 0.81777777778, 0.81839506173, 0.81987654321, 0.82024691358, 0.82172839506, 0.82222222222, 0.82283950617, 0.82333333333, 0.82469135803, 0.82580246914, 0.82666666667, 0.82728395062, 0.82765432099, 0.82777777778, 0.82913580247, 0.83024691358, 0.83209876543, 0.83222222222, 0.83320987654, 0.83469135803, 0.8350617284, 0.83654320988, 0.83765432099, 0.83950617284, 0.84, 0.84061728395, 0.84209876543, 0.8424691358, 0.84395061728, 0.84444444444, 0.8450617284, 0.84555555556, 0.84691358025, 0.84802469136, 0.84888888889, 0.84950617284, 0.84987654321, 0.85, 0.85135802469, 0.8524691358, 0.85432098765, 0.85444444444, 0.85543209877, 0.85691358025, 0.85728395062, 0.8587654321, 0.85987654321, 0.86172839506, 0.86222222222, 0.86283950617, 0.86432098765, 0.86469135803, 0.86617283951, 0.86666666667, 0.86728395062, 0.86777777778, 0.86913580247, 0.87024691358, 0.87111111111, 0.87172839506, 0.87209876543, 0.87222222222, 0.87358024691, 0.87469135803, 0.87654320988, 0.87666666667, 0.87765432099, 0.87913580247, 0.87950617284, 
0.88098765432, 0.88209876543, 0.88395061728, 0.88444444444, 0.8850617284, 0.88654320988, 0.88691358025, 0.88839506173, 0.88888888889, 0.88950617284, 0.89, 0.89135802469, 0.8924691358, 0.89333333333, 0.89395061728, 0.89432098765, 0.89444444444, 0.89580246914, 0.89691358025, 0.8987654321, 0.89888888889, 0.89987654321, 0.90135802469, 0.90172839506, 0.90320987654, 0.90432098765, 0.90617283951, 0.90666666667, 0.90728395062, 0.9087654321, 0.90913580247, 0.91061728395, 0.91111111111, 0.91172839506, 0.91222222222, 0.91358024691, 0.91469135803, 0.91555555556, 0.91617283951, 0.91654320988, 0.91666666667, 0.91802469136, 0.91913580247, 0.92098765432, 0.92111111111, 0.92209876543, 0.92358024691, 0.92395061728, 0.92543209877, 0.92654320988, 0.92839506173, 0.92888888889, 0.92950617284, 0.93098765432, 0.93135802469, 0.93283950617, 0.93333333333, 0.93395061728, 0.93444444444, 0.93580246914, 0.93691358025, 0.93777777778, 0.93839506173, 0.9387654321, 0.93888888889, 0.94024691358, 0.94135802469, 0.94320987654, 0.94333333333, 0.94432098765, 0.94580246914, 0.94617283951, 0.94765432099, 0.9487654321, 0.95061728395, 0.95111111111, 0.95172839506, 0.95320987654, 0.95358024691, 0.9550617284, 0.95555555556, 0.95617283951, 0.95666666667, 0.95802469136, 0.95913580247, 0.96, 0.96061728395, 0.96098765432, 0.96111111111, 0.9624691358, 0.96358024691, 0.96543209877, 0.96555555556, 0.96654320988, 0.96802469136, 0.96839506173, 0.96987654321, 0.97098765432, 0.97283950617, 0.97333333333, 0.97395061728, 0.97543209877, 0.97580246914, 0.97728395062, 0.97777777778, 0.97839506173, 0.97888888889, 0.98024691358, 0.98135802469, 0.98222222222, 0.98283950617, 0.98320987654, 0.98333333333, 0.98469135803, 0.98580246914, 0.98765432099, 0.98777777778, 0.9887654321, 0.99024691358, 0.99061728395, 0.99209876543, 0.99320987654, 0.9950617284, 0.99555555556, 0.99617283951, 0.99765432099, 0.99802469136, 0.99950617284]
averages_even={0.0: [0.0, 0.3333333333333, 0.6666666666667], 0.25: [0.5, 0.8333333333333, 0.1666666666667], 0.37098765432: [0.3888888888889, 0.6111111111111], 0.67913580247: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.35555555556: [0.0, 0.3333333333333, 0.6666666666667], 0.47024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.62765432099: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.73283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.99061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.93580246914: [0.4444444444444, 0.5555555555556], 0.33888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.81728395062: [0.2222222222222, 0.7777777777778], 0.4050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.39950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.30913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.99555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.68444444444: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.80543209877: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.7624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.88444444444: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.8524691358: [0.9444444444444, 0.0555555555556], 0.84802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.67172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.63765432099: [0.3888888888889, 0.6111111111111], 0.49395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.88691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.12839506173: [0.2222222222222, 0.7777777777778], 
0.1550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.84987654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.7087654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.98320987654: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.23209876543: [0.8888888888889, 0.1111111111111], 0.27469135803: [0.9444444444444, 0.0555555555556], 0.54135802469: [0.9444444444444, 0.0555555555556], 0.89888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.87765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.33283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.54320987654: [0.8888888888889, 0.1111111111111], 0.77395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.0124691358: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.43209876543: [0.8888888888889, 0.1111111111111], 0.36358024691: [0.9444444444444, 0.0555555555556], 0.89: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.54333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.66432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.90617283951: [0.2222222222222, 0.7777777777778], 0.24395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.23024691358: [0.9444444444444, 0.0555555555556], 0.85135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.80987654321: [0.8888888888889, 0.1111111111111], 0.43024691358: [0.9444444444444, 0.0555555555556], 0.39765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.97888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.30172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.66666666667: 
[0.0, 0.3333333333333, 0.6666666666667], 0.6024691358: [0.4444444444444, 0.5555555555556], 0.19765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.4587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.24444444444: [0.0, 0.3333333333333, 0.6666666666667], 0.54765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.26469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.48654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.68691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.39061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.5950617284: [0.2222222222222, 0.7777777777778], 0.00987654321: [0.8888888888889, 0.1111111111111], 0.24209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.91358024691: [0.4444444444444, 0.5555555555556], 0.26728395062: [0.7222222222222, 0.2777777777778], 0.52654320988: [0.3888888888889, 0.6111111111111], 0.71111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.84555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.14617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.32543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.98283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.47950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.82172839506: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.75913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.48839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.87666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.35617283951: [0.7222222222222, 0.2777777777778], 0.58333333333: [0.5, 
0.8333333333333, 0.1666666666667], 0.94432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.64950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.34765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.92654320988: [0.3888888888889, 0.6111111111111], 0.29333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.41432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.21: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.94135802469: [0.9444444444444, 0.0555555555556], 0.51913580247: [0.9444444444444, 0.0555555555556], 0.12888888889: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.4450617284: [0.7222222222222, 0.2777777777778], 0.00061728395: [0.7222222222222, 0.2777777777778], 0.14432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.63654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.29432098765: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.85: [0.5, 0.8333333333333, 0.1666666666667], 0.77580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.4: [0.0, 0.3333333333333, 0.6666666666667], 0.94333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.18024691358: [0.4444444444444, 0.5555555555556], 0.64209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.47913580247: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.66617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.09580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.9387654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.25987654321: [0.3888888888889, 0.6111111111111], 0.51172839506: [0.7222222222222, 
0.2777777777778], 0.66728395062: [0.7222222222222, 0.2777777777778], 0.69888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.56987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.06728395062: [0.7222222222222, 0.2777777777778], 0.07950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.10666666667: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.11802469136: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.50172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.3487654321: [0.3888888888889, 0.6111111111111], 0.68950617284: [0.7222222222222, 0.2777777777778], 0.63469135803: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.60444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.15666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.10172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.90135802469: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.44: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.90432098765: [0.3888888888889, 0.6111111111111], 0.23654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.55913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.31666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.50432098765: [0.3888888888889, 0.6111111111111], 0.43765432099: [0.3888888888889, 0.6111111111111], 0.61283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.5624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.50666666667: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.7550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 
0.9777777777778], 0.57395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.48395061728: [0.2222222222222, 0.7777777777778], 0.40555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.68209876543: [0.3888888888889, 0.6111111111111], 0.62728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.50617283951: [0.2222222222222, 0.7777777777778], 0.20987654321: [0.8888888888889, 0.1111111111111], 0.74024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.89444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.09395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.69691358025: [0.9444444444444, 0.0555555555556], 0.89333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.23469135803: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.70135802469: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.55172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.59061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.49444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.1387654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.17333333333: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.31061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.78580246914: [0.9444444444444, 0.0555555555556], 0.11617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.06222222222: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.32839506173: [0.2222222222222, 0.7777777777778], 0.81432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.34135802469: [0.9444444444444, 0.0555555555556], 0.39802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.67469135803: 
[0.9444444444444, 0.0555555555556], 0.61987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.78283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.18320987654: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.8987654321: [0.8888888888889, 0.1111111111111], 0.09987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.67666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.93135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.16: [0.1333333333333, 0.2, 0.8, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.86222222222: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.54432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.84395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.02580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.79320987654: [0.3888888888889, 0.6111111111111], 0.15061728395: [0.2222222222222, 0.7777777777778], 0.06469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.13691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.92839506173: [0.2222222222222, 0.7777777777778], 0.12209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.33777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.28839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.60111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.4987654321: [0.8888888888889, 0.1111111111111], 0.07358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.26913580247: [0.4444444444444, 0.5555555555556], 0.6124691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 
0.15111111111: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.18135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.78024691358: [0.4444444444444, 0.5555555555556], 0.90913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.02222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.36839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.29987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.04888888889: [0.1333333333333, 0.2, 0.8, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.53691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.37283950617: [0.2222222222222, 0.7777777777778], 0.1487654321: [0.3888888888889, 0.6111111111111], 0.54024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.30320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.70320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.45728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.26777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.33395061728: [0.7222222222222, 0.2777777777778], 0.53888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.65987654321: [0.3888888888889, 0.6111111111111], 0.93283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.71802469136: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.91617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.39209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.35666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.52950617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.42283950617: [0.7222222222222, 0.2777777777778], 
0.71666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.81098765432: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.58765432099: [0.8888888888889, 0.1111111111111], 0.53283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.87913580247: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.09: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.19555555556: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.19209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.71358024691: [0.4444444444444, 0.5555555555556], 0.05444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.70728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.6524691358: [0.9444444444444, 0.0555555555556], 0.26172839506: [0.2222222222222, 0.7777777777778], 0.16111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.71061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.45444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.40358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.11222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.19950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.2924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.98222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.52209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.79802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.58024691358: [0.4444444444444, 0.5555555555556], 0.35061728395: [0.2222222222222, 0.7777777777778], 0.13135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 
0.88839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.44987654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.95172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.38135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.57888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.69987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.6450617284: [0.7222222222222, 0.2777777777778], 0.58222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.01098765432: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.17580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.38469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.87172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.96802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.31617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.8450617284: [0.7222222222222, 0.2777777777778], 0.51469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.41543209877: [0.3888888888889, 0.6111111111111], 0.14320987654: [0.8888888888889, 0.1111111111111], 0.12950617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.8024691358: [0.4444444444444, 0.5555555555556], 0.76: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.22024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.47358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.74765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.22172839506: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.43654320988: 
[0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.51666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.25432098765: [0.8888888888889, 0.1111111111111], 0.04691358025: [0.4444444444444, 0.5555555555556], 0.17395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.72888888889: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.83320987654: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.47654320988: [0.8888888888889, 0.1111111111111], 0.2850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.69395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.50728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.58320987654: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.47222222222: [0.5, 0.8333333333333, 0.1666666666667], 0.14135802469: [0.9444444444444, 0.0555555555556], 0.34320987654: [0.8888888888889, 0.1111111111111], 0.77839506173: [0.7222222222222, 0.2777777777778], 0.21839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.00691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.37395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.6850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.32395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.81543209877: [0.3888888888889, 0.6111111111111], 0.31111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.18580246914: [0.9444444444444, 0.0555555555556], 0.68839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.09432098765: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.02728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.51358024691: [0.4444444444444, 0.5555555555556], 0.3087654321: 
[0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.46283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.75320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.60555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.15320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.55802469136: [0.4444444444444, 0.5555555555556], 0.97580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.2: [0.0, 0.3333333333333, 0.6666666666667], 0.86617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.79024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.46617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.02913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.27666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.75358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.83765432099: [0.3888888888889, 0.6111111111111], 0.86469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.55111111111: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.55666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.67765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.62283950617: [0.7222222222222, 0.2777777777778], 0.25728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.49691358025: [0.9444444444444, 0.0555555555556], 0.67209876543: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.73580246914: [0.4444444444444, 0.5555555555556], 0.83950617284: [0.2222222222222, 0.7777777777778], 0.4424691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.53395061728: [0.7222222222222, 
0.2777777777778], 0.27765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.86728395062: [0.7222222222222, 0.2777777777778], 0.73444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.60543209877: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.36: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.08395061728: [0.2222222222222, 0.7777777777778], 0.70172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.7887654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.73777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.8587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.75172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.91172839506: [0.7222222222222, 0.2777777777778], 0.98024691358: [0.4444444444444, 0.5555555555556], 0.36654320988: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.80444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.48098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.16555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.28691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.72839506173: [0.2222222222222, 0.7777777777778], 0.16839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.11111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.91555555556: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.55555555556: [0.0, 0.3333333333333, 0.6666666666667], 0.30135802469: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.33444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.53333333333: [0.0, 
0.3333333333333, 0.6666666666667], 0.45543209877: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.04: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.13580246914: [0.4444444444444, 0.5555555555556], 0.94617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.76098765432: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.21283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.42777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.94580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.11469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.88888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.39024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.75555555556: [0.0, 0.6666666666667, 0.3333333333333], 0.42333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.95617283951: [0.7222222222222, 0.2777777777778], 0.66283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.96061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.48950617284: [0.7222222222222, 0.2777777777778], 0.6: [0.0, 0.3333333333333, 0.6666666666667], 0.0450617284: [0.7222222222222, 0.2777777777778], 0.16654320988: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.56802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.42469135803: [0.4444444444444, 0.5555555555556], 0.02666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.76839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.81777777778: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.92209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.79061728395: 
[0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.92098765432: [0.8888888888889, 0.1111111111111], 0.26222222222: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.81: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.14765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.13395061728: [0.7222222222222, 0.2777777777778], 0.53580246914: [0.4444444444444, 0.5555555555556], 0.08209876543: [0.3888888888889, 0.6111111111111], 0.77777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.22469135803: [0.4444444444444, 0.5555555555556], 0.21098765432: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.28098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.89691358025: [0.9444444444444, 0.0555555555556], 0.64444444444: [0.0, 0.3333333333333, 0.6666666666667], 0.35913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.48888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.65543209877: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.03209876543: [0.8888888888889, 0.1111111111111], 0.04802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.17839506173: [0.7222222222222, 0.2777777777778], 0.10432098765: [0.3888888888889, 0.6111111111111], 0.41728395062: [0.2222222222222, 0.7777777777778], 0.95555555556: [0.0, 0.6666666666667, 0.3333333333333], 0.79617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.67024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.04555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.33580246914: [0.4444444444444, 0.5555555555556], 0.29395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.34617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.97728395062: 
[0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.44802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.71222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.14580246914: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.52839506173: [0.2222222222222, 0.7777777777778], 0.89395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.22283950617: [0.7222222222222, 0.2777777777778], 0.82777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.65691358025: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.38283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.44888888889: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.99802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.84209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.07172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.64802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.01: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.19024691358: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.70617283951: [0.2222222222222, 0.7777777777778], 0.74135802469: [0.9444444444444, 0.0555555555556], 0.22333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.07654320988: [0.8888888888889, 0.1111111111111], 0.95320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.47172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.82580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.45432098765: [0.8888888888889, 0.1111111111111], 0.20444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 
0.5333333333333], 0.8424691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.88395061728: [0.2222222222222, 0.7777777777778], 0.2524691358: [0.9444444444444, 0.0555555555556], 0.84061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.32098765432: [0.8888888888889, 0.1111111111111], 0.35172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.80111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.55358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.64061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.28888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.6987654321: [0.8888888888889, 0.1111111111111], 0.40987654321: [0.8888888888889, 0.1111111111111], 0.98765432099: [0.8888888888889, 0.1111111111111], 0.28654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.82913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.44061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.62222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.74580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.56839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.87111111111: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.09444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.87654320988: [0.8888888888889, 0.1111111111111], 0.93444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.20543209877: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.62469135803: [0.4444444444444, 0.5555555555556], 0.25444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.37543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 
0.06913580247: [0.4444444444444, 0.5555555555556], 0.27950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.52543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.74617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.17283950617: [0.2222222222222, 0.7777777777778], 0.91469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.11666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.96358024691: [0.9444444444444, 0.0555555555556], 0.24987654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.85987654321: [0.3888888888889, 0.6111111111111], 0.34333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.46432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.51061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.69: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.14024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.36543209877: [0.8888888888889, 0.1111111111111], 0.92395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.05432098765: [0.8888888888889, 0.1111111111111], 0.86777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.43222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.58469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.61777777778: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.18469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.17098765432: [0.3888888888889, 0.6111111111111], 0.4024691358: [0.4444444444444, 0.5555555555556], 0.21777777778: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.24802469136: [0.3444444444444, 
0.6555555555556, 0.4555555555556, 0.5444444444444], 0.6050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.77543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.72543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.06777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.43320987654: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.74333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.13839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.00358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.4350617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.05728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.86283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.21543209877: [0.3888888888889, 0.6111111111111], 0.49135802469: [0.4444444444444, 0.5555555555556], 0.57777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.36802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.40111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.27209876543: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.9487654321: [0.3888888888889, 0.6111111111111], 0.34024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.18283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.97333333333: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.78320987654: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.39617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.59765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 
0.75111111111: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.45691358025: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.96987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.83654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.49: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.36098765432: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.79555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.22728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.85432098765: [0.8888888888889, 0.1111111111111], 0.93098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.49432098765: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.82728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.65444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.22222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.16987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.29580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.32888888889: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.83222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.72395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.36987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.0987654321: [0.8888888888889, 0.1111111111111], 0.22777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.04395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.32654320988: [0.3888888888889, 0.6111111111111], 0.59024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 
0.7444444444444], 0.42580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.76358024691: [0.9444444444444, 0.0555555555556], 0.41777777778: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.90172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.84691358025: [0.4444444444444, 0.5555555555556], 0.64: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.03024691358: [0.9444444444444, 0.0555555555556], 0.29444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.44691358025: [0.4444444444444, 0.5555555555556], 0.36061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.5387654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.87209876543: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.06617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.95913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.59555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.71654320988: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.73395061728: [0.7222222222222, 0.2777777777778], 0.38333333333: [0.5, 0.8333333333333, 0.1666666666667], 0.58283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.44950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.81839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.35358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.08839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.77333333333: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.26666666667: [0.0, 0.3333333333333, 0.6666666666667], 0.83469135803: [0.1444444444444, 0.2555555555556, 0.8555555555556, 
0.7444444444444], 0.0524691358: [0.9444444444444, 0.0555555555556], 0.53135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.96555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.92111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.19802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.07209876543: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.37777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.02765432099: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.17728395062: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.88654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.95111111111: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.31913580247: [0.9444444444444, 0.0555555555556], 0.57543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.42913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.59802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.05135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.56098765432: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.96098765432: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.05543209877: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.00555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.11913580247: [0.9444444444444, 0.0555555555556], 0.40802469136: [0.9444444444444, 0.0555555555556], 0.63222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.50320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.78333333333: [0.5, 0.8333333333333, 0.1666666666667], 0.51802469136: [0.7555555555556, 
0.6444444444444, 0.3555555555556, 0.2444444444444], 0.06432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.17543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.02777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.03950617284: [0.2222222222222, 0.7777777777778], 0.56: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.44209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.88098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.08654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.21987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.0024691358: [0.4444444444444, 0.5555555555556], 0.97098765432: [0.3888888888889, 0.6111111111111], 0.84444444444: [0.0, 0.6666666666667, 0.3333333333333], 0.41: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.07024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.92888888889: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.76802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.05691358025: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.68654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.1087654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.98135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.57728395062: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.84: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.31172839506: [0.7222222222222, 0.2777777777778], 0.61543209877: [0.3888888888889, 0.6111111111111], 0.56061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.66172839506: [0.2222222222222, 
0.7777777777778], 0.49888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.0924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.18222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.18777777778: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.52358024691: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.5550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.60691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.46913580247: [0.4444444444444, 0.5555555555556], 0.01839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.99617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.40061728395: [0.7222222222222, 0.2777777777778], 0.67222222222: [0.5, 0.8333333333333, 0.1666666666667], 0.73839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.69135802469: [0.4444444444444, 0.5555555555556], 0.08444444444: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.23222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.9950617284: [0.2222222222222, 0.7777777777778], 0.6924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.82283950617: [0.7222222222222, 0.2777777777778], 0.60802469136: [0.9444444444444, 0.0555555555556], 0.46666666667: [0.0, 0.3333333333333, 0.6666666666667], 0.8: [0.0, 0.6666666666667, 0.3333333333333], 0.43469135803: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.79765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.90320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.3387654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.27024691358: 
[0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.76111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.88950617284: [0.7222222222222, 0.2777777777778], 0.02024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.61: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.73098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.27358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.07666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.42765432099: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.18765432099: [0.8888888888889, 0.1111111111111], 0.24691358025: [0.4444444444444, 0.5555555555556], 0.30432098765: [0.3888888888889, 0.6111111111111], 0.91666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.42666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.54580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.13444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.53777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.6587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.97543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.3624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.09888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.52111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.64987654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.21432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.93777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.27222222222: 
[0.5, 0.8333333333333, 0.1666666666667], 0.39320987654: [0.3888888888889, 0.6111111111111], 0.72111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.72358024691: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.73888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.13283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.12098765432: [0.8888888888889, 0.1111111111111], 0.93333333333: [0.0, 0.6666666666667, 0.3333333333333], 0.12111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.93691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.36111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.01432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.48209876543: [0.3888888888889, 0.6111111111111], 0.53839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.2987654321: [0.8888888888889, 0.1111111111111], 0.65135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.0350617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.33135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.60061728395: [0.7222222222222, 0.2777777777778], 0.2124691358: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.76555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.45: [0.5, 0.8333333333333, 0.1666666666667], 0.52098765432: [0.8888888888889, 0.1111111111111], 0.71617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.57283950617: [0.2222222222222, 0.7777777777778], 0.26617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.78469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.38320987654: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 
0.62580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.42024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.22666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.07111111111: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.17888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.21728395062: [0.2222222222222, 0.7777777777778], 0.29691358025: [0.9444444444444, 0.0555555555556], 0.58580246914: [0.9444444444444, 0.0555555555556], 0.53098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.07222222222: [0.5, 0.8333333333333, 0.1666666666667], 0.16098765432: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.01543209877: [0.3888888888889, 0.6111111111111], 0.3550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.69444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.85444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.95358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.75666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.38580246914: [0.9444444444444, 0.0555555555556], 0.85691358025: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.56543209877: [0.8888888888889, 0.1111111111111], 0.00802469136: [0.9444444444444, 0.0555555555556], 0.61432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.82024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.77888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.01283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.44395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.04061728395: 
[0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.51222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.63320987654: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.30666666667: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.47469135803: [0.9444444444444, 0.0555555555556], 0.03222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.15913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.99320987654: [0.3888888888889, 0.6111111111111], 0.35802469136: [0.4444444444444, 0.5555555555556], 0.87358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.0050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.25543209877: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.98333333333: [0.5, 0.8333333333333, 0.1666666666667], 0.11061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.12654320988: [0.3888888888889, 0.6111111111111], 0.31358024691: [0.4444444444444, 0.5555555555556], 0.05987654321: [0.3888888888889, 0.6111111111111], 0.20358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.81987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.10320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.34432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.13333333333: [0.0, 0.3333333333333, 0.6666666666667], 0.57098765432: [0.3888888888889, 0.6111111111111], 0.72098765432: [0.8888888888889, 0.1111111111111], 0.65432098765: [0.8888888888889, 0.1111111111111], 0.62913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.01987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.02172839506: [0.5777777777778, 0.0222222222222, 
0.4222222222222, 0.9777777777778], 0.08691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.00543209877: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.38765432099: [0.8888888888889, 0.1111111111111], 0.27913580247: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.53444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.31222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.84950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.37839506173: [0.7222222222222, 0.2777777777778], 0.62777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.7487654321: [0.3888888888889, 0.6111111111111], 0.35111111111: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.68395061728: [0.2222222222222, 0.7777777777778], 0.80691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.31555555556: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.10913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.9624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.47666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.49580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.61839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.56358024691: [0.9444444444444, 0.0555555555556], 0.46728395062: [0.7222222222222, 0.2777777777778], 0.13888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.67654320988: [0.8888888888889, 0.1111111111111], 0.62172839506: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.73333333333: [0.0, 0.6666666666667, 0.3333333333333], 0.40444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.63024691358: 
[0.9444444444444, 0.0555555555556], 0.78777777778: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.99209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.9087654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.91802469136: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.15555555556: [0.0, 0.3333333333333, 0.6666666666667], 0.30617283951: [0.2222222222222, 0.7777777777778], 0.86913580247: [0.4444444444444, 0.5555555555556], 0.79950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.74320987654: [0.8888888888889, 0.1111111111111], 0.95802469136: [0.4444444444444, 0.5555555555556], 0.49333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.91061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.33691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.87024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.95666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.61098765432: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.93839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.55617283951: [0.7222222222222, 0.2777777777778], 0.82765432099: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.66913580247: [0.4444444444444, 0.5555555555556], 0.15358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.39555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.0850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.51555555556: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.27172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.12358024691: 
[0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.69333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.77283950617: [0.2222222222222, 0.7777777777778], 0.37728395062: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.66777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.27111111111: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.52395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.80555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.69580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.67111111111: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.04950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.87222222222: [0.5, 0.8333333333333, 0.1666666666667], 0.10728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.96111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.46222222222: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.11555555556: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.62666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.96: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.60358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.62024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.5487654321: [0.3888888888889, 0.6111111111111], 0.45987654321: [0.3888888888889, 0.6111111111111], 0.16543209877: [0.8888888888889, 0.1111111111111], 0.15172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.84888888889: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.2424691358: [0.9555555555556, 
0.0444444444444, 0.8444444444444, 0.1555555555556], 0.08888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.06283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.87950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.83209876543: [0.8888888888889, 0.1111111111111], 0.92358024691: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.40691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.03320987654: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.72654320988: [0.3888888888889, 0.6111111111111], 0.91222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.07469135803: [0.9444444444444, 0.0555555555556], 0.90728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.22913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.19617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.32950617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.86172839506: [0.2222222222222, 0.7777777777778], 0.59617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.06666666667: [0.0, 0.3333333333333, 0.6666666666667], 0.16358024691: [0.9444444444444, 0.0555555555556], 0.09691358025: [0.9444444444444, 0.0555555555556], 0.59950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.97777777778: [0.0, 0.6666666666667, 0.3333333333333], 0.83024691358: [0.9444444444444, 0.0555555555556], 0.24061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.47765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.93888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.26432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.48691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 
0.1555555555556], 0.41839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.80802469136: [0.9444444444444, 0.0555555555556], 0.13098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.71913580247: [0.9444444444444, 0.0555555555556], 0.04444444444: [0.0, 0.3333333333333, 0.6666666666667], 0.17777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.20802469136: [0.9444444444444, 0.0555555555556], 0.77728395062: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.8050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.42172839506: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.6424691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.35320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.05: [0.5, 0.8333333333333, 0.1666666666667], 0.78135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.5887654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.89432098765: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.4524691358: [0.9444444444444, 0.0555555555556], 0.45135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.57839506173: [0.7222222222222, 0.2777777777778], 0.59209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.9550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.32111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.57580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.48444444444: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.51654320988: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.95061728395: [0.2222222222222, 0.7777777777778], 0.71172839506: [0.7222222222222, 0.2777777777778], 
0.29135802469: [0.4444444444444, 0.5555555555556], 0.64888888889: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.76987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.11654320988: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.8124691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.82333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.32209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.80358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.58135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.14333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.38222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.63950617284: [0.2222222222222, 0.7777777777778], 0.03654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.64691358025: [0.4444444444444, 0.5555555555556], 0.82666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.94765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.3950617284: [0.2222222222222, 0.7777777777778], 0.25691358025: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.29: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.73135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.41098765432: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.75061728395: [0.2222222222222, 0.7777777777778], 0.50913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.70432098765: [0.3888888888889, 0.6111111111111], 0.99950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.59320987654: 
[0.3888888888889, 0.6111111111111], 0.19061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.9887654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.99765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.34580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.37888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.49987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.51111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.15802469136: [0.4444444444444, 0.5555555555556], 0.78222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.94024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.0587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.98469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.46777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.22580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.12543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.31654320988: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.28395061728: [0.2222222222222, 0.7777777777778], 0.68888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.2024691358: [0.4444444444444, 0.5555555555556], 0.1887654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.98580246914: [0.9444444444444, 0.0555555555556], 0.47111111111: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.70913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.31469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.56654320988: [0.2111111111111, 0.0111111111111, 0.7888888888889, 
0.9888888888889], 0.68098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.0424691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.56111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.15617283951: [0.7222222222222, 0.2777777777778], 0.31802469136: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.86666666667: [0.0, 0.6666666666667, 0.3333333333333], 0.06172839506: [0.2222222222222, 0.7777777777778], 0.23320987654: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.63209876543: [0.8888888888889, 0.1111111111111], 0.26283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.64555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.62333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.74432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.92950617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.76654320988: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.20061728395: [0.7222222222222, 0.2777777777778], 0.46172839506: [0.2222222222222, 0.7777777777778], 0.33839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.4924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.91913580247: [0.9444444444444, 0.0555555555556], 0.13777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.16802469136: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.99024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.61728395062: [0.2222222222222, 0.7777777777778], 0.54617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.2450617284: [0.7222222222222, 0.2777777777778], 0.20111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 
0.9, 0.2333333333333], 0.92543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.91654320988: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.12395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.42728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.82469135803: [0.4444444444444, 0.5555555555556], 0.73691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.27654320988: [0.8888888888889, 0.1111111111111], 0.7950617284: [0.2222222222222, 0.7777777777778], 0.37580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.18333333333: [0.5, 0.8333333333333, 0.1666666666667], 0.24555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.65: [0.5, 0.8333333333333, 0.1666666666667], 0.55320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.77098765432: [0.3888888888889, 0.6111111111111], 0.88209876543: [0.3888888888889, 0.6111111111111], 0.25135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.66469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.60987654321: [0.8888888888889, 0.1111111111111], 0.02283950617: [0.7222222222222, 0.2777777777778], 0.09135802469: [0.4444444444444, 0.5555555555556], 0.97283950617: [0.2222222222222, 0.7777777777778], 0.30728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.00111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.85543209877: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.76061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.01777777778: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.55061728395: [0.2222222222222, 0.7777777777778], 0.72950617284: [0.8777777777778, 0.3222222222222, 
0.1222222222222, 0.6777777777778], 0.79209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.33333333333: [0.0, 0.3333333333333, 0.6666666666667], 0.78765432099: [0.8888888888889, 0.1111111111111], 0.11358024691: [0.4444444444444, 0.5555555555556], 0.40543209877: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.85728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.33098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.8850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.80061728395: [0.7222222222222, 0.2777777777778], 0.47209876543: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.89987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.4850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.98777777778: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.65728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.03765432099: [0.3888888888889, 0.6111111111111], 0.01728395062: [0.2222222222222, 0.7777777777778], 0.42222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.96543209877: [0.8888888888889, 0.1111111111111], 0.22765432099: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.90666666667: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.10617283951: [0.2222222222222, 0.7777777777778], 0.29888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.7387654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.41987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.82222222222: [0.0, 0.6666666666667, 0.3333333333333], 0.08098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.72209876543: [0.2111111111111, 0.0111111111111, 
0.7888888888889, 0.9888888888889], 0.8350617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.1950617284: [0.2222222222222, 0.7777777777778], 0.8924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.57333333333: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.03469135803: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.38777777778: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.04987654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.69432098765: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.2587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.52888888889: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.1624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.36555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.08950617284: [0.7222222222222, 0.2777777777778], 0.23950617284: [0.2222222222222, 0.7777777777778], 0.28950617284: [0.7222222222222, 0.2777777777778], 0.07765432099: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.71555555556: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.51617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.38024691358: [0.4444444444444, 0.5555555555556], 0.3887654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.97395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.71469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.93395061728: [0.7222222222222, 0.2777777777778], 0.70666666667: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.20691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 
0.2444444444444], 0.19320987654: [0.3888888888889, 0.6111111111111], 0.11172839506: [0.7222222222222, 0.2777777777778], 0.89135802469: [0.4444444444444, 0.5555555555556], 0.24: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.41283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.32358024691: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.02469135803: [0.4444444444444, 0.5555555555556], 0.46469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.91111111111: [0.0, 0.6666666666667, 0.3333333333333], 0.16061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.67950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.86432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.23765432099: [0.3888888888889, 0.6111111111111], 0.89580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.75617283951: [0.7222222222222, 0.2777777777778], 0.5087654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.66222222222: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.4124691358: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.24888888889: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.44555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.96839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.07913580247: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.67358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.2050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.76543209877: [0.8888888888889, 0.1111111111111], 0.37333333333: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 
0.64395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.97839506173: [0.7222222222222, 0.2777777777778], 0.56555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.04209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.00444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.28444444444: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.10135802469: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.6350617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.44444444444: [0.0, 0.3333333333333, 0.6666666666667], 0.24950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.20555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.94320987654: [0.8888888888889, 0.1111111111111], 0.28209876543: [0.3888888888889, 0.6111111111111], 0.2350617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.50135802469: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.96654320988: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.87469135803: [0.9444444444444, 0.0555555555556], 0.02333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.09333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.81283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.58777777778: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.75802469136: [0.4444444444444, 0.5555555555556], 0.43950617284: [0.2222222222222, 0.7777777777778]}
averages_odd={0.0: [0.0, 0.3333333333333, 0.6666666666667], 0.25: [0.5, 0.8333333333333, 0.1666666666667], 0.37098765432: [0.3888888888889, 0.6111111111111], 0.67913580247: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.35555555556: [0.0, 0.3333333333333, 0.6666666666667], 0.47024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.62765432099: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.73283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.99061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.93580246914: [0.4444444444444, 0.5555555555556], 0.33888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.81728395062: [0.2222222222222, 0.7777777777778], 0.4050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.39950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.30913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.99555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.68444444444: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.80543209877: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.7624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.88444444444: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.8524691358: [0.9444444444444, 0.0555555555556], 0.84802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.67172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.63765432099: [0.3888888888889, 0.6111111111111], 0.49395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.88691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.12839506173: [0.2222222222222, 0.7777777777778], 
0.1550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.84987654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.7087654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.98320987654: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.23209876543: [0.8888888888889, 0.1111111111111], 0.27469135803: [0.9444444444444, 0.0555555555556], 0.54135802469: [0.9444444444444, 0.0555555555556], 0.89888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.87765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.33283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.54320987654: [0.8888888888889, 0.1111111111111], 0.77395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.0124691358: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.43209876543: [0.8888888888889, 0.1111111111111], 0.36358024691: [0.9444444444444, 0.0555555555556], 0.89: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.54333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.66432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.90617283951: [0.2222222222222, 0.7777777777778], 0.24395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.23024691358: [0.9444444444444, 0.0555555555556], 0.85135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.80987654321: [0.8888888888889, 0.1111111111111], 0.43024691358: [0.9444444444444, 0.0555555555556], 0.39765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.97888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.30172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.66666666667: 
[0.0, 0.3333333333333, 0.6666666666667], 0.6024691358: [0.4444444444444, 0.5555555555556], 0.19765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.4587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.24444444444: [0.0, 0.3333333333333, 0.6666666666667], 0.54765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.26469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.48654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.68691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.39061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.5950617284: [0.2222222222222, 0.7777777777778], 0.00987654321: [0.8888888888889, 0.1111111111111], 0.24209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.91358024691: [0.4444444444444, 0.5555555555556], 0.26728395062: [0.7222222222222, 0.2777777777778], 0.52654320988: [0.3888888888889, 0.6111111111111], 0.71111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.84555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.14617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.32543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.98283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.47950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.82172839506: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.75913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.48839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.87666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.35617283951: [0.7222222222222, 0.2777777777778], 0.58333333333: [0.5, 
0.8333333333333, 0.1666666666667], 0.94432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.64950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.34765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.92654320988: [0.3888888888889, 0.6111111111111], 0.29333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.41432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.21: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.94135802469: [0.9444444444444, 0.0555555555556], 0.51913580247: [0.9444444444444, 0.0555555555556], 0.12888888889: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.4450617284: [0.7222222222222, 0.2777777777778], 0.00061728395: [0.7222222222222, 0.2777777777778], 0.14432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.63654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.29432098765: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.85: [0.5, 0.8333333333333, 0.1666666666667], 0.77580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.4: [0.0, 0.3333333333333, 0.6666666666667], 0.94333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.18024691358: [0.4444444444444, 0.5555555555556], 0.64209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.47913580247: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.66617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.09580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.9387654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.25987654321: [0.3888888888889, 0.6111111111111], 0.51172839506: [0.7222222222222, 
0.2777777777778], 0.66728395062: [0.7222222222222, 0.2777777777778], 0.69888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.56987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.06728395062: [0.7222222222222, 0.2777777777778], 0.07950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.10666666667: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.11802469136: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.50172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.3487654321: [0.3888888888889, 0.6111111111111], 0.68950617284: [0.7222222222222, 0.2777777777778], 0.63469135803: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.60444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.15666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.10172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.90135802469: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.44: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.90432098765: [0.3888888888889, 0.6111111111111], 0.23654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.55913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.31666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.50432098765: [0.3888888888889, 0.6111111111111], 0.43765432099: [0.3888888888889, 0.6111111111111], 0.61283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.5624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.50666666667: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.7550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 
0.9777777777778], 0.57395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.48395061728: [0.2222222222222, 0.7777777777778], 0.40555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.68209876543: [0.3888888888889, 0.6111111111111], 0.62728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.50617283951: [0.2222222222222, 0.7777777777778], 0.20987654321: [0.8888888888889, 0.1111111111111], 0.74024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.89444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.09395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.69691358025: [0.9444444444444, 0.0555555555556], 0.89333333333: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.23469135803: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.70135802469: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.55172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.59061728395: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.49444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.1387654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.17333333333: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.31061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.78580246914: [0.9444444444444, 0.0555555555556], 0.11617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.06222222222: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.32839506173: [0.2222222222222, 0.7777777777778], 0.81432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.34135802469: [0.9444444444444, 0.0555555555556], 0.39802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.67469135803: 
[0.9444444444444, 0.0555555555556], 0.61987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.78283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.18320987654: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.8987654321: [0.8888888888889, 0.1111111111111], 0.09987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.67666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.93135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.16: [0.1333333333333, 0.2, 0.8, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.86222222222: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.54432098765: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.84395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.02580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.79320987654: [0.3888888888889, 0.6111111111111], 0.15061728395: [0.2222222222222, 0.7777777777778], 0.06469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.13691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.92839506173: [0.2222222222222, 0.7777777777778], 0.12209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.33777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.28839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.60111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.4987654321: [0.8888888888889, 0.1111111111111], 0.07358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.26913580247: [0.4444444444444, 0.5555555555556], 0.6124691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 
0.15111111111: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.18135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.78024691358: [0.4444444444444, 0.5555555555556], 0.90913580247: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.02222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.36839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.29987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.04888888889: [0.1333333333333, 0.2, 0.8, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.53691358025: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.37283950617: [0.2222222222222, 0.7777777777778], 0.1487654321: [0.3888888888889, 0.6111111111111], 0.54024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.30320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.70320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.45728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.26777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.33395061728: [0.7222222222222, 0.2777777777778], 0.53888888889: [0.5, 0.8333333333333, 0.1666666666667], 0.65987654321: [0.3888888888889, 0.6111111111111], 0.93283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.71802469136: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.91617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.39209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.35666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.52950617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.42283950617: [0.7222222222222, 0.2777777777778], 
0.71666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.81098765432: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.58765432099: [0.8888888888889, 0.1111111111111], 0.53283950617: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.87913580247: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.09: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.19555555556: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.19209876543: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.71358024691: [0.4444444444444, 0.5555555555556], 0.05444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.70728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.6524691358: [0.9444444444444, 0.0555555555556], 0.26172839506: [0.2222222222222, 0.7777777777778], 0.16111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.71061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.45444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.40358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.11222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.19950617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.2924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.98222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.52209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.79802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.58024691358: [0.4444444444444, 0.5555555555556], 0.35061728395: [0.2222222222222, 0.7777777777778], 0.13135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 
0.88839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.44987654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.95172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.38135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.57888888889: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.69987654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.6450617284: [0.7222222222222, 0.2777777777778], 0.58222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.01098765432: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.17580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.38469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.87172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.96802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.31617283951: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.8450617284: [0.7222222222222, 0.2777777777778], 0.51469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.41543209877: [0.3888888888889, 0.6111111111111], 0.14320987654: [0.8888888888889, 0.1111111111111], 0.12950617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.8024691358: [0.4444444444444, 0.5555555555556], 0.76: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.22024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.47358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.74765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.22172839506: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.43654320988: 
[0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.51666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.25432098765: [0.8888888888889, 0.1111111111111], 0.04691358025: [0.4444444444444, 0.5555555555556], 0.17395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.72888888889: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.83320987654: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.47654320988: [0.8888888888889, 0.1111111111111], 0.2850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.69395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.50728395062: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.58320987654: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.47222222222: [0.5, 0.8333333333333, 0.1666666666667], 0.14135802469: [0.9444444444444, 0.0555555555556], 0.34320987654: [0.8888888888889, 0.1111111111111], 0.77839506173: [0.7222222222222, 0.2777777777778], 0.21839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.00691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.37395061728: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.6850617284: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.32395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.81543209877: [0.3888888888889, 0.6111111111111], 0.31111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.18580246914: [0.9444444444444, 0.0555555555556], 0.68839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.09432098765: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.02728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.51358024691: [0.4444444444444, 0.5555555555556], 0.3087654321: 
[0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.46283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.75320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.60555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.15320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.55802469136: [0.4444444444444, 0.5555555555556], 0.97580246914: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.2: [0.0, 0.3333333333333, 0.6666666666667], 0.86617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.79024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.46617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.02913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.27666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.75358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.83765432099: [0.3888888888889, 0.6111111111111], 0.86469135803: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.55111111111: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.55666666667: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.67765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.62283950617: [0.7222222222222, 0.2777777777778], 0.25728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.49691358025: [0.9444444444444, 0.0555555555556], 0.67209876543: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.73580246914: [0.4444444444444, 0.5555555555556], 0.83950617284: [0.2222222222222, 0.7777777777778], 0.4424691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.53395061728: [0.7222222222222, 
0.2777777777778], 0.27765432099: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.86728395062: [0.7222222222222, 0.2777777777778], 0.73444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.60543209877: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.36: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.08395061728: [0.2222222222222, 0.7777777777778], 0.70172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.7887654321: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.73777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.8587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.75172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.91172839506: [0.7222222222222, 0.2777777777778], 0.98024691358: [0.4444444444444, 0.5555555555556], 0.36654320988: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.80444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.48098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.16555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.28691358025: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.72839506173: [0.2222222222222, 0.7777777777778], 0.16839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.11111111111: [0.0, 0.3333333333333, 0.6666666666667], 0.91555555556: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.55555555556: [0.0, 0.3333333333333, 0.6666666666667], 0.30135802469: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.33444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.53333333333: [0.0, 
0.3333333333333, 0.6666666666667], 0.45543209877: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.04: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.13580246914: [0.4444444444444, 0.5555555555556], 0.94617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.76098765432: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.21283950617: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.42777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.94580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.11469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.88888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.39024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.75555555556: [0.0, 0.6666666666667, 0.3333333333333], 0.42333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.95617283951: [0.7222222222222, 0.2777777777778], 0.66283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.96061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.48950617284: [0.7222222222222, 0.2777777777778], 0.6: [0.0, 0.3333333333333, 0.6666666666667], 0.0450617284: [0.7222222222222, 0.2777777777778], 0.16654320988: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.56802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.42469135803: [0.4444444444444, 0.5555555555556], 0.02666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.76839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.81777777778: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.92209876543: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.79061728395: 
[0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.92098765432: [0.8888888888889, 0.1111111111111], 0.26222222222: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.81: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.14765432099: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.13395061728: [0.7222222222222, 0.2777777777778], 0.53580246914: [0.4444444444444, 0.5555555555556], 0.08209876543: [0.3888888888889, 0.6111111111111], 0.77777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.22469135803: [0.4444444444444, 0.5555555555556], 0.21098765432: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.28098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.89691358025: [0.9444444444444, 0.0555555555556], 0.64444444444: [0.0, 0.3333333333333, 0.6666666666667], 0.35913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.48888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.65543209877: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.03209876543: [0.8888888888889, 0.1111111111111], 0.04802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.17839506173: [0.7222222222222, 0.2777777777778], 0.10432098765: [0.3888888888889, 0.6111111111111], 0.41728395062: [0.2222222222222, 0.7777777777778], 0.95555555556: [0.0, 0.6666666666667, 0.3333333333333], 0.79617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.67024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.04555555556: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.33580246914: [0.4444444444444, 0.5555555555556], 0.29395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.34617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.97728395062: 
[0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.44802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.71222222222: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.14580246914: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.52839506173: [0.2222222222222, 0.7777777777778], 0.89395061728: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.22283950617: [0.7222222222222, 0.2777777777778], 0.82777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.65691358025: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.38283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.44888888889: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.99802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.84209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.07172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.64802469136: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.01: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.19024691358: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.70617283951: [0.2222222222222, 0.7777777777778], 0.74135802469: [0.9444444444444, 0.0555555555556], 0.22333333333: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.07654320988: [0.8888888888889, 0.1111111111111], 0.95320987654: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.47172839506: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.82580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.45432098765: [0.8888888888889, 0.1111111111111], 0.20444444444: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 
0.5333333333333], 0.8424691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.88395061728: [0.2222222222222, 0.7777777777778], 0.2524691358: [0.9444444444444, 0.0555555555556], 0.84061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.32098765432: [0.8888888888889, 0.1111111111111], 0.35172839506: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.80111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.55358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.64061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.28888888889: [0.0, 0.3333333333333, 0.6666666666667], 0.6987654321: [0.8888888888889, 0.1111111111111], 0.40987654321: [0.8888888888889, 0.1111111111111], 0.98765432099: [0.8888888888889, 0.1111111111111], 0.28654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.82913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.44061728395: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.62222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.74580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.56839506173: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.87111111111: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.09444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.87654320988: [0.8888888888889, 0.1111111111111], 0.93444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.20543209877: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.62469135803: [0.4444444444444, 0.5555555555556], 0.25444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.37543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 
0.06913580247: [0.4444444444444, 0.5555555555556], 0.27950617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.52543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.74617283951: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.17283950617: [0.2222222222222, 0.7777777777778], 0.91469135803: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.11666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.96358024691: [0.9444444444444, 0.0555555555556], 0.24987654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.85987654321: [0.3888888888889, 0.6111111111111], 0.34333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.46432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.51061728395: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.69: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.14024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.36543209877: [0.8888888888889, 0.1111111111111], 0.92395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.05432098765: [0.8888888888889, 0.1111111111111], 0.86777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.43222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.58469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.61777777778: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.18469135803: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.17098765432: [0.3888888888889, 0.6111111111111], 0.4024691358: [0.4444444444444, 0.5555555555556], 0.21777777778: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.24802469136: [0.3444444444444, 
0.6555555555556, 0.4555555555556, 0.5444444444444], 0.6050617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.77543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.72543209877: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.06777777778: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.43320987654: [0.2111111111111, 0.0111111111111, 0.7888888888889, 0.9888888888889], 0.74333333333: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.13839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.00358024691: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.4350617284: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.05728395062: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.86283950617: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.21543209877: [0.3888888888889, 0.6111111111111], 0.49135802469: [0.4444444444444, 0.5555555555556], 0.57777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.36802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.40111111111: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.27209876543: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.9487654321: [0.3888888888889, 0.6111111111111], 0.34024691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.18283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.97333333333: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.78320987654: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.39617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.59765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 
0.75111111111: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.45691358025: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.96987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.83654320988: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.49: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.36098765432: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.79555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.22728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.85432098765: [0.8888888888889, 0.1111111111111], 0.93098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.49432098765: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.82728395062: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.65444444444: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.22222222222: [0.0, 0.3333333333333, 0.6666666666667], 0.16987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.29580246914: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.32888888889: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.83222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.72395061728: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.36987654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.0987654321: [0.8888888888889, 0.1111111111111], 0.22777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.04395061728: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.32654320988: [0.3888888888889, 0.6111111111111], 0.59024691358: [0.1444444444444, 0.2555555555556, 0.8555555555556, 
0.7444444444444], 0.42580246914: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.76358024691: [0.9444444444444, 0.0555555555556], 0.41777777778: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.90172839506: [0.3777777777778, 0.8222222222222, 0.1777777777778, 0.6222222222222], 0.84691358025: [0.4444444444444, 0.5555555555556], 0.64: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.03024691358: [0.9444444444444, 0.0555555555556], 0.29444444444: [0.5, 0.8333333333333, 0.1666666666667], 0.44691358025: [0.4444444444444, 0.5555555555556], 0.36061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.5387654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.87209876543: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.06617283951: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.95913580247: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.59555555556: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.71654320988: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.73395061728: [0.7222222222222, 0.2777777777778], 0.38333333333: [0.5, 0.8333333333333, 0.1666666666667], 0.58283950617: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.44950617284: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.81839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.35358024691: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.08839506173: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.77333333333: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.26666666667: [0.0, 0.3333333333333, 0.6666666666667], 0.83469135803: [0.1444444444444, 0.2555555555556, 0.8555555555556, 
0.7444444444444], 0.0524691358: [0.9444444444444, 0.0555555555556], 0.53135802469: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.96555555556: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.92111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.19802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.07209876543: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.37777777778: [0.0, 0.3333333333333, 0.6666666666667], 0.02765432099: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.17728395062: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.88654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.95111111111: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.31913580247: [0.9444444444444, 0.0555555555556], 0.57543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.42913580247: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.59802469136: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.05135802469: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.56098765432: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.96098765432: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.05543209877: [0.7888888888889, 0.0111111111111, 0.2111111111111, 0.9888888888889], 0.00555555556: [0.5, 0.8333333333333, 0.1666666666667], 0.11913580247: [0.9444444444444, 0.0555555555556], 0.40802469136: [0.9444444444444, 0.0555555555556], 0.63222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.50320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.78333333333: [0.5, 0.8333333333333, 0.1666666666667], 0.51802469136: [0.7555555555556, 
0.6444444444444, 0.3555555555556, 0.2444444444444], 0.06432098765: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.17543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.02777777778: [0.5, 0.8333333333333, 0.1666666666667], 0.03950617284: [0.2222222222222, 0.7777777777778], 0.56: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.44209876543: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.88098765432: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.08654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.21987654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.0024691358: [0.4444444444444, 0.5555555555556], 0.97098765432: [0.3888888888889, 0.6111111111111], 0.84444444444: [0.0, 0.6666666666667, 0.3333333333333], 0.41: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.07024691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.92888888889: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.76802469136: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.05691358025: [0.2555555555556, 0.8555555555556, 0.1444444444444, 0.7444444444444], 0.68654320988: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.1087654321: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.98135802469: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.57728395062: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.84: [0.2666666666667, 0.4, 0.6, 0.9333333333333, 0.0666666666667, 0.7333333333333], 0.31172839506: [0.7222222222222, 0.2777777777778], 0.61543209877: [0.3888888888889, 0.6111111111111], 0.56061728395: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.66172839506: [0.2222222222222, 
0.7777777777778], 0.49888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.0924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.18222222222: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.18777777778: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.52358024691: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.5550617284: [0.5777777777778, 0.0222222222222, 0.4222222222222, 0.9777777777778], 0.60691358025: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.46913580247: [0.4444444444444, 0.5555555555556], 0.01839506173: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.99617283951: [0.8777777777778, 0.3222222222222, 0.1222222222222, 0.6777777777778], 0.40061728395: [0.7222222222222, 0.2777777777778], 0.67222222222: [0.5, 0.8333333333333, 0.1666666666667], 0.73839506173: [0.4777777777778, 0.5222222222222, 0.9222222222222, 0.0777777777778], 0.69135802469: [0.4444444444444, 0.5555555555556], 0.08444444444: [0.0666666666667, 0.4, 0.6, 0.9333333333333, 0.2666666666667, 0.7333333333333], 0.23222222222: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.9950617284: [0.2222222222222, 0.7777777777778], 0.6924691358: [0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.82283950617: [0.7222222222222, 0.2777777777778], 0.60802469136: [0.9444444444444, 0.0555555555556], 0.46666666667: [0.0, 0.3333333333333, 0.6666666666667], 0.8: [0.0, 0.6666666666667, 0.3333333333333], 0.43469135803: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.79765432099: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.90320987654: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.3387654321: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.27024691358: 
[0.3444444444444, 0.6555555555556, 0.4555555555556, 0.5444444444444], 0.76111111111: [0.5, 0.8333333333333, 0.1666666666667], 0.88950617284: [0.7222222222222, 0.2777777777778], 0.02024691358: [0.9555555555556, 0.0444444444444, 0.8444444444444, 0.1555555555556], 0.61: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.73098765432: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.27358024691: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.07666666667: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.42765432099: [0.2888888888889, 0.7111111111111, 0.4888888888889, 0.5111111111111], 0.18765432099: [0.8888888888889, 0.1111111111111], 0.24691358025: [0.4444444444444, 0.5555555555556], 0.30432098765: [0.3888888888889, 0.6111111111111], 0.91666666667: [0.5, 0.8333333333333, 0.1666666666667], 0.42666666667: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.54580246914: [0.1444444444444, 0.2555555555556, 0.8555555555556, 0.7444444444444], 0.13444444444: [0.7666666666667, 0.1, 0.5666666666667, 0.4333333333333, 0.9, 0.2333333333333], 0.53777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.6587654321: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.97543209877: [0.1888888888889, 0.4111111111111, 0.5888888888889, 0.8111111111111], 0.3624691358: [0.7555555555556, 0.6444444444444, 0.3555555555556, 0.2444444444444], 0.09888888889: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.52111111111: [0.0333333333333, 0.3, 0.6333333333333, 0.7, 0.9666666666667, 0.3666666666667], 0.64987654321: [0.5111111111111, 0.7111111111111, 0.4888888888889, 0.2888888888889], 0.21432098765: [0.6888888888889, 0.0888888888889, 0.9111111111111, 0.3111111111111], 0.93777777778: [0.1333333333333, 0.8, 0.2, 0.8666666666667, 0.4666666666667, 0.5333333333333], 0.27222222222: 
f32a750dd332015d9d84cf52eb958ac1e7da8d10 | 11,215 | py | Python | src/metasp_programs.py | javier-romero/guess_and_check | f13df93e117d547088519817da5449880caee859 | [
"MIT"
] | 16 | 2016-11-16T17:24:42.000Z | 2022-01-08T16:19:46.000Z
meta_program = """
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% This file is part of clingo. %
% Copyright (C) 2015 Martin Gebser %
% Copyright (C) 2015 Roland Kaminski %
% Copyright (C) 2015 Torsten Schaub %
% %
% This program is free software: you can redistribute it and/or modify %
% it under the terms of the GNU General Public License as published by %
% the Free Software Foundation, either version 3 of the License, or %
% (at your option) any later version. %
% %
% This program is distributed in the hope that it will be useful, %
% but WITHOUT ANY WARRANTY; without even the implied warranty of %
% MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the %
% GNU General Public License for more details. %
% %
% You should have received a copy of the GNU General Public License %
% along with this program. If not, see <http://www.gnu.org/licenses/>. %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
##conjunction(B) :- ##literal_tuple(B),
        ##hold(L) : ##literal_tuple(B, L), L > 0;
    not ##hold(L) : ##literal_tuple(B,-L), L > 0.
##body(normal(B)) :- ##rule(_,normal(B)), ##conjunction(B).
##body(sum(B,G)) :- ##rule(_,sum(B,G)),
    #sum { W,L :     ##hold(L), ##weighted_literal_tuple(B, L,W), L > 0 ;
           W,L : not ##hold(L), ##weighted_literal_tuple(B,-L,W), L > 0 } >= G.
##hold(A) : ##atom_tuple(H,A) :- ##rule(disjunction(H),B), ##body(B).
{ ##hold(A) : ##atom_tuple(H,A) } :- ##rule( choice(H),B), ##body(B).
% commented by Javier for MetaspPython
%*
##optimize(J,W,Q) :- ##output(_optimize(J,W,Q),B), ##conjunction(B).
:- ##output(_query,B), not ##conjunction(B).
##hide(_criteria(J,W,Q)) :- ##output(_criteria(J,W,Q),_).
##hide(_query) :- ##output(_query,_).
##hide(_optimize(J,W,Q)) :- ##output(_optimize(J,W,Q),_).
#show.
#show T : ##output(T,B), ##conjunction(B), not ##hide(T).
*%
% added by Javier for MetaspPython
#show.
#show T : ##output(T,L), ##hold(L).
"""
metaD_program = """
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% This file is part of clingo. %
% %
% Authors: Martin Gebser, Roland Kaminski, Torsten Schaub %
% %
% This program is free software: you can redistribute it and/or modify %
% it under the terms of the GNU General Public License as published by %
% the Free Software Foundation, either version 3 of the License, or %
% (at your option) any later version. %
% %
% This program is distributed in the hope that it will be useful, %
% but WITHOUT ANY WARRANTY; without even the implied warranty of %
% MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the %
% GNU General Public License for more details. %
% %
% You should have received a copy of the GNU General Public License %
% along with this program. If not, see <http://www.gnu.org/licenses/>. %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% NOTE: assumes that a rule has no more than one head
##sum(B,G,T) :- ##rule(_,sum(B,G)), T = #sum { W,L : ##weighted_literal_tuple(B,L,W) }.
% extract supports of atoms and facts
##supp(A,B) :- ##rule( choice(H),B), ##atom_tuple(H,A).
##supp(A,B) :- ##rule(disjunction(H),B), ##atom_tuple(H,A).
##supp(A) :- ##supp(A,_).
##atom(|L|) :- ##weighted_literal_tuple(_,L,_).
##atom(|L|) :- ##literal_tuple(_,L).
##atom( A ) :- ##atom_tuple(_,A).
##fact(A) :- ##rule(disjunction(H),normal(B)), ##atom_tuple(H,A), not ##literal_tuple(B,_).
% generate interpretation
##true(atom(A)) :- ##fact(A).
##true(atom(A)); ##fail(atom(A)) :- ##supp(A), not ##fact(A).
##fail(atom(A)) :- ##atom(A), not ##supp(A).
##true(normal(B)) :- ##literal_tuple(B),
##true(atom(L)) : ##literal_tuple(B, L), L > 0;
##fail(atom(L)) : ##literal_tuple(B,-L), L > 0.
##fail(normal(B)) :- ##literal_tuple(B, L), ##fail(atom(L)), L > 0.
##fail(normal(B)) :- ##literal_tuple(B,-L), ##true(atom(L)), L > 0.
##true(sum(B,G)) :- ##sum(B,G,T),
#sum { W,L : ##true(atom(L)), ##weighted_literal_tuple(B, L,W), L > 0 ;
W,L : ##fail(atom(L)), ##weighted_literal_tuple(B,-L,W), L > 0 } >= G.
##fail(sum(B,G)) :- ##sum(B,G,T),
#sum { W,L : ##fail(atom(L)), ##weighted_literal_tuple(B, L,W), L > 0 ;
W,L : ##true(atom(L)), ##weighted_literal_tuple(B,-L,W), L > 0 } >= T-G+1.
% verify supported model properties
##bot :- ##rule(disjunction(H),B), ##true(B), ##fail(atom(A)) : ##atom_tuple(H,A).
##bot :- ##true(atom(A)), ##fail(B) : ##supp(A,B).
% verify acyclic derivability
##internal(C,normal(B)) :- ##scc(C,A), ##supp(A,normal(B)), ##scc(C,A'), ##literal_tuple(B,A').
##internal(C,sum(B,G)) :- ##scc(C,A), ##supp(A,sum(B,G)), ##scc(C,A'), ##weighted_literal_tuple(B,A',W).
##external(C,normal(B)) :- ##scc(C,A), ##supp(A,normal(B)), not ##internal(C,normal(B)).
##external(C,sum(B,G)) :- ##scc(C,A), ##supp(A,sum(B,G)), not ##internal(C,sum(B,G)).
##steps(C,Z-1) :- ##scc(C,_), Z = { ##scc(C,A) : not ##fact(A) }.
##wait(C,atom(A),0) :- ##scc(C,A), ##fail(B) : ##external(C,B), ##supp(A,B).
##wait(C,normal(B),I) :- ##internal(C,normal(B)), ##fail(normal(B)), ##steps(C,Z), I = 0..Z-1.
##wait(C,normal(B),I) :- ##internal(C,normal(B)), ##literal_tuple(B,A), ##wait(C,atom(A),I), ##steps(C,Z), I < Z.
##wait(C,sum(B,G),I) :- ##internal(C,sum(B,G)), ##steps(C,Z), I = 0..Z-1, ##sum(B,G,T),
#sum { W,L : ##fail(atom(L)), ##weighted_literal_tuple(B, L,W), L > 0, not ##scc(C,L) ;
W,L : ##wait(C,atom(L),I), ##weighted_literal_tuple(B, L,W), L > 0, ##scc(C,L) ;
W,L : ##true(atom(L)), ##weighted_literal_tuple(B,-L,W), L > 0 } >= T-G+1.
##wait(C,atom(A),I) :- ##wait(C,atom(A),0), ##steps(C,Z), I = 1..Z, ##wait(C,B,I-1) : ##supp(A,B), ##internal(C,B).
##bot :- ##scc(C,A), ##true(atom(A)), ##wait(C,atom(A),Z), ##steps(C,Z).
% saturate interpretations that are not answer sets
##true(atom(A)) :- ##supp(A), not ##fact(A), ##bot.
##fail(atom(A)) :- ##supp(A), not ##fact(A), ##bot.
%
% added by Javier
%
%#show.
%#show T : ##output(T,L), ##true(atom(L)).
#defined ##literal_tuple/1.
#defined ##literal_tuple/2.
#defined ##rule/2.
#defined ##atom_tuple/2.
#defined ##weighted_literal_tuple/3.
#defined ##scc/2.
"""
metaD_program_inc_base = """
% NOTE: assumes that a rule has no more than one head
##sum(B,G,T) :- ##rule(_,sum(B,G)), T = #sum { W,L : ##weighted_literal_tuple(B,L,W) }.
% extract supports of atoms and facts
##supp(A,B) :- ##rule( choice(H),B), ##atom_tuple(H,A).
##supp(A,B) :- ##rule(disjunction(H),B), ##atom_tuple(H,A).
##supp(A) :- ##supp(A,_).
##atom(|L|) :- ##weighted_literal_tuple(_,L,_).
##atom(|L|) :- ##literal_tuple(_,L).
##atom( A ) :- ##atom_tuple(_,A).
##fact(A) :- ##rule(disjunction(H),normal(B)), ##atom_tuple(H,A), not ##literal_tuple(B,_).
% verify acyclic derivability
##internal(C,normal(B)) :- ##scc(C,A), ##supp(A,normal(B)), ##scc(C,A'), ##literal_tuple(B,A').
##internal(C,sum(B,G)) :- ##scc(C,A), ##supp(A,sum(B,G)), ##scc(C,A'), ##weighted_literal_tuple(B,A',W).
##external(C,normal(B)) :- ##scc(C,A), ##supp(A,normal(B)), not ##internal(C,normal(B)).
##external(C,sum(B,G)) :- ##scc(C,A), ##supp(A,sum(B,G)), not ##internal(C,sum(B,G)).
##steps(C,Z-1) :- ##scc(C,_), Z = { ##scc(C,A) : not ##fact(A) }.
%
% added by Javier
%
#defined ##literal_tuple/1.
#defined ##literal_tuple/2.
#defined ##rule/2.
#defined ##atom_tuple/2.
#defined ##weighted_literal_tuple/3.
#defined ##scc/2.
"""
metaD_program_parameters = ["m1", "m2"]
metaD_program_inc = """
% generate interpretation
##true(m1,m2,atom(A)) :- ##fact(A), not ##fixed(A). % added not fixed(A)
##true(m1,m2,atom(A)); ##fail(m1,m2,atom(A)) :- ##supp(A), not ##fact(A), not ##fixed(A). % added not fixed(A)
##fail(m1,m2,atom(A)) :- ##atom(A), not ##supp(A), not ##fixed(A). % added not fixed(A)
##true(m1,m2,normal(B)) :- ##literal_tuple(B),
##true(m1,m2,atom(L)) : ##literal_tuple(B, L), L > 0;
##fail(m1,m2,atom(L)) : ##literal_tuple(B,-L), L > 0.
##fail(m1,m2,normal(B)) :- ##literal_tuple(B, L), ##fail(m1,m2,atom(L)), L > 0.
##fail(m1,m2,normal(B)) :- ##literal_tuple(B,-L), ##true(m1,m2,atom(L)), L > 0.
##true(m1,m2,sum(B,G)) :- ##sum(B,G,T),
#sum { W,L : ##true(m1,m2,atom(L)), ##weighted_literal_tuple(B, L,W), L > 0 ;
W,L : ##fail(m1,m2,atom(L)), ##weighted_literal_tuple(B,-L,W), L > 0 } >= G.
##fail(m1,m2,sum(B,G)) :- ##sum(B,G,T),
#sum { W,L : ##fail(m1,m2,atom(L)), ##weighted_literal_tuple(B, L,W), L > 0 ;
W,L : ##true(m1,m2,atom(L)), ##weighted_literal_tuple(B,-L,W), L > 0 } >= T-G+1.
% verify supported model properties
##bot(m1,m2) :- ##rule(disjunction(H),B), ##true(m1,m2,B), ##fail(m1,m2,atom(A)) : ##atom_tuple(H,A).
##bot(m1,m2) :- ##true(m1,m2,atom(A)), ##fail(m1,m2,B) : ##supp(A,B).
% verify acyclic derivability
##wait(m1,m2,C,atom(A),0) :- ##scc(C,A), ##fail(m1,m2,B) : ##external(C,B), ##supp(A,B).
##wait(m1,m2,C,normal(B),I) :- ##internal(C,normal(B)), ##fail(m1,m2,normal(B)), ##steps(C,Z), I = 0..Z-1.
##wait(m1,m2,C,normal(B),I) :- ##internal(C,normal(B)), ##literal_tuple(B,A), ##wait(m1,m2,C,atom(A),I), ##steps(C,Z), I < Z.
##wait(m1,m2,C,sum(B,G),I) :- ##internal(C,sum(B,G)), ##steps(C,Z), I = 0..Z-1, ##sum(B,G,T),
#sum { W,L : ##fail(m1,m2,atom(L)), ##weighted_literal_tuple(B, L,W), L > 0, not ##scc(C,L) ;
W,L : ##wait(m1,m2,C,atom(L),I), ##weighted_literal_tuple(B, L,W), L > 0, ##scc(C,L) ;
W,L : ##true(m1,m2,atom(L)), ##weighted_literal_tuple(B,-L,W), L > 0 } >= T-G+1.
##wait(m1,m2,C,atom(A),I) :- ##wait(m1,m2,C,atom(A),0), ##steps(C,Z), I = 1..Z, ##wait(m1,m2,C,B,I-1) : ##supp(A,B), ##internal(C,B).
##bot(m1,m2) :- ##scc(C,A), ##true(m1,m2,atom(A)), ##wait(m1,m2,C,atom(A),Z), ##steps(C,Z).
% saturate interpretations that are not answer sets
##true(m1,m2,atom(A)) :- ##supp(A), not ##fact(A), ##bot(m1,m2), not ##fixed(A).
##fail(m1,m2,atom(A)) :- ##supp(A), not ##fact(A), ##bot(m1,m2), not ##fixed(A).
%
% added by Javier
%
#defined ##literal_tuple/1.
#defined ##literal_tuple/2.
#defined ##rule/2.
#defined ##atom_tuple/2.
#defined ##weighted_literal_tuple/3.
#defined ##scc/2.
"""
f32f5f78dcbb4ee87f106550e79d681c53bdbbef | 3603 | py | Python | nn.py | buddhi1/CIFAR-10-project | d80821fa7db21be130c71de2f20d49bdcc5561b6 | ["MIT"] | 2 stars (2022-02-17 – 2022-02-19)
import torch.nn as nn
import torch.nn.functional as F
class IndoorResNetNetwork(nn.Module):
def __init__(self):
super(IndoorResNetNetwork, self).__init__()
self.fc1 = nn.Linear(in_features=2048, out_features=1024)
self.batchnorm1 = nn.BatchNorm1d(1024)
self.dropout1 = nn.Dropout(p=0.2)
self.fc2 = nn.Linear(in_features=1024, out_features=512)
self.batchnorm2 = nn.BatchNorm1d(512)
self.out = nn.Linear(in_features=512, out_features=67)
def forward(self, x):
x = self.dropout1(self.batchnorm1(F.relu(self.fc1(x))))
x = self.batchnorm2(F.relu(self.fc2(x)))
return F.log_softmax(self.out(x), dim=1)
class IndoorMnasnetNetwork(nn.Module):
def __init__(self):
super(IndoorMnasnetNetwork, self).__init__()
self.fc1 = nn.Linear(in_features=1280, out_features=1024)
self.batchnorm1 = nn.BatchNorm1d(1024)
self.dropout1 = nn.Dropout(p=0.2)
self.fc2 = nn.Linear(in_features=1024, out_features=512)
self.batchnorm2 = nn.BatchNorm1d(512)
self.out = nn.Linear(in_features=512, out_features=67)
def forward(self, x):
x = self.dropout1(self.batchnorm1(F.relu(self.fc1(x))))
x = self.batchnorm2(F.relu(self.fc2(x)))
return F.log_softmax(self.out(x), dim=1)
class IndoorResNetDeepNetwork(nn.Module):
def __init__(self):
super(IndoorResNetDeepNetwork, self).__init__()
self.fc1 = nn.Linear(in_features=2048, out_features=1024)
self.batchnorm1 = nn.BatchNorm1d(1024)
self.dropout1 = nn.Dropout(p=0.2)
self.fc2 = nn.Linear(in_features=1024, out_features=512)
self.batchnorm2 = nn.BatchNorm1d(512)
self.dropout2 = nn.Dropout(p=0.2)
self.fc3 = nn.Linear(in_features=512, out_features=256)
self.batchnorm3 = nn.BatchNorm1d(256)
self.dropout3 = nn.Dropout(p=0.2)
self.fc4 = nn.Linear(in_features=256, out_features=128)
self.batchnorm4 = nn.BatchNorm1d(128)
self.dropout4 = nn.Dropout(p=0.2)
self.out = nn.Linear(in_features=128, out_features=67)
def forward(self, x):
x = self.dropout1(self.batchnorm1(F.relu(self.fc1(x))))
x = self.dropout2(self.batchnorm2(F.relu(self.fc2(x))))
x = self.dropout3(self.batchnorm3(F.relu(self.fc3(x))))
x = self.dropout4(self.batchnorm4(F.relu(self.fc4(x))))
return F.log_softmax(self.out(x), dim=1)
class IndoorMnasnetDeepNetwork(nn.Module):
def __init__(self):
super(IndoorMnasnetDeepNetwork, self).__init__()
self.fc1 = nn.Linear(in_features=1280, out_features=1024)
self.batchnorm1 = nn.BatchNorm1d(1024)
self.dropout1 = nn.Dropout(p=0.2)
self.fc2 = nn.Linear(in_features=1024, out_features=512)
self.batchnorm2 = nn.BatchNorm1d(512)
self.dropout2 = nn.Dropout(p=0.2)
self.fc3 = nn.Linear(in_features=512, out_features=256)
self.batchnorm3 = nn.BatchNorm1d(256)
self.dropout3 = nn.Dropout(p=0.2)
self.fc4 = nn.Linear(in_features=256, out_features=128)
self.batchnorm4 = nn.BatchNorm1d(128)
self.dropout4 = nn.Dropout(p=0.2)
self.out = nn.Linear(in_features=128, out_features=67)
def forward(self, x):
x = self.dropout1(self.batchnorm1(F.relu(self.fc1(x))))
x = self.dropout2(self.batchnorm2(F.relu(self.fc2(x))))
x = self.dropout3(self.batchnorm3(F.relu(self.fc3(x))))
x = self.dropout4(self.batchnorm4(F.relu(self.fc4(x))))
return F.log_softmax(self.out(x), dim=1)
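The four heads above differ only in input width (2048 for the ResNet features, 1280 for the MNASNet features) and in depth. As a rough sanity check on model size, the dense layers' parameter counts follow directly from the `in_features`/`out_features` pairs (a back-of-the-envelope sketch, not part of the module):

```python
def linear_params(n_in, n_out):
    # An nn.Linear layer holds an n_out x n_in weight matrix plus an n_out bias.
    return n_in * n_out + n_out

# IndoorResNetNetwork head: 2048 -> 1024 -> 512 -> 67
resnet_head = sum(linear_params(a, b) for a, b in [(2048, 1024), (1024, 512), (512, 67)])

# IndoorMnasnetNetwork head: 1280 -> 1024 -> 512 -> 67
mnasnet_head = sum(linear_params(a, b) for a, b in [(1280, 1024), (1024, 512), (512, 67)])

print(resnet_head, mnasnet_head)  # 2657347 1870915
```

The first fully connected layer dominates the count in both cases, which is why shrinking the feature width from 2048 to 1280 saves roughly 0.8M parameters.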
f34134863ae36adb4616de9e767350654ef53037 | 47453 | py | Python | tests/autogen/input/ifort/9-1_linux_intel/en-input-test-4.py | michaelackermannaiub/py-fortranformat | edc530d5edde41f41939c716da8e1ef01fa8a6fe | ["MIT"]
import sys
import os
import unittest
from nose.plugins.attrib import attr
# To change this, re-run 'build-unittests.py'
from fortranformat._input import input as _input
from fortranformat._lexer import lexer as _lexer
from fortranformat._parser import parser as _parser
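The generated cases below repeatedly exercise one subtle Fortran input rule: the descriptor's width `w` truncates the field, and when the truncated field carries no decimal point, the rightmost `d` digits are read as the fraction — which is why `'10000.'` under `EN5.4E5` comes back as `1.0000`. A rough sketch of that rule for plain digit fields (exponents and scale factors omitted; this is an illustration, not fortranformat's implementation):

```python
def read_field(s, w, d):
    # Read at most w characters of the input field.
    field = s[:w].strip()
    if '.' in field:
        return float(field)  # an explicit decimal point wins
    # No decimal point: the rightmost d digits form the fraction.
    sign = -1.0 if field.startswith('-') else 1.0
    digits = field.lstrip('+-').rjust(d + 1, '0')
    return sign * float(digits[:-d] + '.' + digits[-d:])

print(read_field('10000.', 5, 4))   # 1.0   ('10000' -> 1.0000)
print(read_field('-1000.', 5, 4))   # -0.1  ('-1000' -> -0.1000)
print(read_field('10000.', 10, 4))  # 10000.0 (the point survives the wider field)
```

This matches the pattern in the cases below, where the same literal yields different values under `(EN5.4E5)`, `(EN10.4E5)`, and `(EN5.5E5)`.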
class ENEditDescriptorBatch4TestCase(unittest.TestCase):
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_1(self):
inp = '''3.'''
fmt = '''(EN5.4E5)'''
result = [3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_2(self):
inp = '''-3.'''
fmt = '''(EN5.4E5)'''
result = [-3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_3(self):
inp = '''10.'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_4(self):
inp = '''-10.'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_5(self):
inp = '''100.'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_6(self):
inp = '''-100.'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_7(self):
inp = '''1000.'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_8(self):
inp = '''-1000.'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_9(self):
inp = '''10000.'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_10(self):
inp = '''-10000.'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_11(self):
inp = '''100000.'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_12(self):
inp = '''-100000.'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_13(self):
inp = '''123456789.'''
fmt = '''(EN5.4E5)'''
result = [1.2344999999999999e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_14(self):
inp = '''0.1'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_15(self):
inp = '''-0.1'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_16(self):
inp = '''0.01'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_17(self):
inp = '''-0.01'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_18(self):
inp = '''0.001'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_19(self):
inp = '''-0.001'''
fmt = '''(EN5.4E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_20(self):
inp = '''0.0001'''
fmt = '''(EN5.4E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_21(self):
inp = '''-0.0001'''
fmt = '''(EN5.4E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_22(self):
inp = '''-1.96e-16'''
fmt = '''(EN5.4E5)'''
result = [-1.9600000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_23(self):
inp = '''3.14159'''
fmt = '''(EN5.4E5)'''
result = [3.1410000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_24(self):
inp = '''- 1.0'''
fmt = '''(EN5.4E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_25(self):
inp = '''1e12'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_26(self):
inp = '''1E12'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_27(self):
inp = '''-1 e12'''
fmt = '''(EN5.4E5)'''
result = [-1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_28(self):
inp = '''.'''
fmt = '''(EN5.4E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_29(self):
inp = '''.1'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_30(self):
inp = '''0.1D+200'''
fmt = '''(EN5.4E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_31(self):
inp = '''3.'''
fmt = '''(EN10.4E5)'''
result = [3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_32(self):
inp = '''-3.'''
fmt = '''(EN10.4E5)'''
result = [-3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_33(self):
inp = '''10.'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_34(self):
inp = '''-10.'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_35(self):
inp = '''100.'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_36(self):
inp = '''-100.'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_37(self):
inp = '''1000.'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_38(self):
inp = '''-1000.'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_39(self):
inp = '''10000.'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_40(self):
inp = '''-10000.'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_41(self):
inp = '''100000.'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_42(self):
inp = '''-100000.'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_43(self):
inp = '''123456789.'''
fmt = '''(EN10.4E5)'''
result = [1.2345678900000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_44(self):
inp = '''0.1'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_45(self):
inp = '''-0.1'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_46(self):
inp = '''0.01'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_47(self):
inp = '''-0.01'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_48(self):
inp = '''0.001'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_49(self):
inp = '''-0.001'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_50(self):
inp = '''0.0001'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_51(self):
inp = '''-0.0001'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_52(self):
inp = '''-1.96e-16'''
fmt = '''(EN10.4E5)'''
result = [-1.9600000000000000e-16]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_53(self):
inp = '''3.14159'''
fmt = '''(EN10.4E5)'''
result = [3.1415899999999999e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_54(self):
inp = '''- 1.0'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_55(self):
inp = '''1e12'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_56(self):
inp = '''1E12'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_57(self):
inp = '''-1 e12'''
fmt = '''(EN10.4E5)'''
result = [-1.0000000000000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_58(self):
inp = '''.'''
fmt = '''(EN10.4E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_59(self):
inp = '''.1'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_60(self):
inp = '''0.1D+200'''
fmt = '''(EN10.4E5)'''
result = [1.0000000000000001e+199]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_61(self):
inp = '''3.'''
fmt = '''(EN5.5E5)'''
result = [3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_62(self):
inp = '''-3.'''
fmt = '''(EN5.5E5)'''
result = [-3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_63(self):
inp = '''10.'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_64(self):
inp = '''-10.'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_65(self):
inp = '''100.'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_66(self):
inp = '''-100.'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_67(self):
inp = '''1000.'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_68(self):
inp = '''-1000.'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_69(self):
inp = '''10000.'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_70(self):
inp = '''-10000.'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_71(self):
inp = '''100000.'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_72(self):
inp = '''-100000.'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_73(self):
inp = '''123456789.'''
fmt = '''(EN5.5E5)'''
result = [1.2345000000000000e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_74(self):
inp = '''0.1'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_75(self):
inp = '''-0.1'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_76(self):
inp = '''0.01'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_77(self):
inp = '''-0.01'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_78(self):
inp = '''0.001'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_79(self):
inp = '''-0.001'''
fmt = '''(EN5.5E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_80(self):
inp = '''0.0001'''
fmt = '''(EN5.5E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_81(self):
inp = '''-0.0001'''
fmt = '''(EN5.5E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_82(self):
inp = '''-1.96e-16'''
fmt = '''(EN5.5E5)'''
result = [-1.9600000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_83(self):
inp = '''3.14159'''
fmt = '''(EN5.5E5)'''
result = [3.1410000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_84(self):
inp = '''- 1.0'''
fmt = '''(EN5.5E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_85(self):
inp = '''1e12'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e+07]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_86(self):
inp = '''1E12'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000000e+07]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_87(self):
inp = '''-1 e12'''
fmt = '''(EN5.5E5)'''
result = [-1.0000000000000001e-05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_88(self):
inp = '''.'''
fmt = '''(EN5.5E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_89(self):
inp = '''.1'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_90(self):
inp = '''0.1D+200'''
fmt = '''(EN5.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_91(self):
inp = '''3.'''
fmt = '''(EN10.5E5)'''
result = [3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_92(self):
inp = '''-3.'''
fmt = '''(EN10.5E5)'''
result = [-3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_93(self):
inp = '''10.'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_94(self):
inp = '''-10.'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_95(self):
inp = '''100.'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_96(self):
inp = '''-100.'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_97(self):
inp = '''1000.'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_98(self):
inp = '''-1000.'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_99(self):
inp = '''10000.'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_100(self):
inp = '''-10000.'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_101(self):
inp = '''100000.'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_102(self):
inp = '''-100000.'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_103(self):
inp = '''123456789.'''
fmt = '''(EN10.5E5)'''
result = [1.2345678900000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_104(self):
inp = '''0.1'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_105(self):
inp = '''-0.1'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_106(self):
inp = '''0.01'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_107(self):
inp = '''-0.01'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_108(self):
inp = '''0.001'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_109(self):
inp = '''-0.001'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_110(self):
inp = '''0.0001'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_111(self):
inp = '''-0.0001'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_112(self):
inp = '''-1.96e-16'''
fmt = '''(EN10.5E5)'''
result = [-1.9600000000000000e-16]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_113(self):
inp = '''3.14159'''
fmt = '''(EN10.5E5)'''
result = [3.1415899999999999e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_114(self):
inp = '''- 1.0'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_115(self):
inp = '''1e12'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+07]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_116(self):
inp = '''1E12'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000000e+07]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_117(self):
inp = '''-1 e12'''
fmt = '''(EN10.5E5)'''
result = [-1.0000000000000000e+07]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_118(self):
inp = '''.'''
fmt = '''(EN10.5E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_119(self):
inp = '''.1'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_120(self):
inp = '''0.1D+200'''
fmt = '''(EN10.5E5)'''
result = [1.0000000000000001e+199]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_121(self):
inp = '''3.'''
fmt = '''(EN10.10E5)'''
result = [3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_122(self):
inp = '''-3.'''
fmt = '''(EN10.10E5)'''
result = [-3.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_123(self):
inp = '''10.'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_124(self):
inp = '''-10.'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_125(self):
inp = '''100.'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_126(self):
inp = '''-100.'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_127(self):
inp = '''1000.'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_128(self):
inp = '''-1000.'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_129(self):
inp = '''10000.'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_130(self):
inp = '''-10000.'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_131(self):
inp = '''100000.'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_132(self):
inp = '''-100000.'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+05]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_133(self):
inp = '''123456789.'''
fmt = '''(EN10.10E5)'''
result = [1.2345678900000000e+08]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_134(self):
inp = '''0.1'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_135(self):
inp = '''-0.1'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_136(self):
inp = '''0.01'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_137(self):
inp = '''-0.01'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e-02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_138(self):
inp = '''0.001'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_139(self):
inp = '''-0.001'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e-03]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_140(self):
inp = '''0.0001'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_141(self):
inp = '''-0.0001'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e-04]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_142(self):
inp = '''-1.96e-16'''
fmt = '''(EN10.10E5)'''
result = [-1.9600000000000000e-16]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_143(self):
inp = '''3.14159'''
fmt = '''(EN10.10E5)'''
result = [3.1415899999999999e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_144(self):
inp = '''- 1.0'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_145(self):
inp = '''1e12'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_146(self):
inp = '''1E12'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_147(self):
inp = '''-1 e12'''
fmt = '''(EN10.10E5)'''
result = [-1.0000000000000000e+02]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_148(self):
inp = '''.'''
fmt = '''(EN10.10E5)'''
result = [0.0000000000000000e+00]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
@attr(platform='9-1_linux_intel')
@attr('input')
@attr(ed='EN')
def test_en_ed_input_149(self):
inp = '''.1'''
fmt = '''(EN10.10E5)'''
result = [1.0000000000000001e-01]
eds, rev_eds = _parser(_lexer(fmt))
self.assertEqual(result, _input(eds, rev_eds, inp))
if __name__ == '__main__':
    unittest.main()
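# The pattern exercised by each test above, outside of unittest (illustrative
# sketch; _lexer, _parser and _input are the fortranformat internals imported
# at the top of this module):
#
#     eds, rev_eds = _parser(_lexer('(EN10.5E5)'))
#     values = _input(eds, rev_eds, '3.14159')
#     # -> [3.1415899999999999], as asserted in test_en_ed_input_113 above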
# File: azure-mgmt-compute/azure/mgmt/compute/v2017_03_30/operations/virtual_machine_scale_set_rolling_upgrades_operations.py
# Repo: v-Ajnava/azure-sdk-for-python (MIT license)
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from msrestazure.azure_exceptions import CloudError
from msrest.exceptions import DeserializationError
from msrestazure.azure_operation import AzureOperationPoller
from .. import models
class VirtualMachineScaleSetRollingUpgradesOperations(object):
"""VirtualMachineScaleSetRollingUpgradesOperations operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
:ivar api_version: Client Api Version. Constant value: "2017-03-30".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2017-03-30"
self.config = config
def _cancel_initial(
self, resource_group_name, vm_scale_set_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Compute/virtualMachineScaleSets/{vmScaleSetName}/rollingUpgrades/cancel'
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'vmScaleSetName': self._serialize.url("vm_scale_set_name", vm_scale_set_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.post(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('OperationStatusResponse', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def cancel(
self, resource_group_name, vm_scale_set_name, custom_headers=None, raw=False, **operation_config):
"""Cancels the current virtual machine scale set rolling upgrade.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param vm_scale_set_name: The name of the VM scale set.
:type vm_scale_set_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:return: An instance of AzureOperationPoller that returns
OperationStatusResponse or ClientRawResponse if raw=true
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.compute.v2017_03_30.models.OperationStatusResponse]
or ~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._cancel_initial(
resource_group_name=resource_group_name,
vm_scale_set_name=vm_scale_set_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
if raw:
return raw_result
# Construct and send request
def long_running_send():
return raw_result.response
def get_long_running_status(status_link, headers=None):
request = self._client.get(status_link)
if headers:
request.headers.update(headers)
header_parameters = {}
header_parameters['x-ms-client-request-id'] = raw_result.response.request.headers['x-ms-client-request-id']
return self._client.send(
request, header_parameters, stream=False, **operation_config)
def get_long_running_output(response):
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = self._deserialize('OperationStatusResponse', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
long_running_operation_timeout = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
return AzureOperationPoller(
long_running_send, get_long_running_output,
get_long_running_status, long_running_operation_timeout)
def _start_os_upgrade_initial(
self, resource_group_name, vm_scale_set_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Compute/virtualMachineScaleSets/{vmScaleSetName}/osRollingUpgrade'
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'vmScaleSetName': self._serialize.url("vm_scale_set_name", vm_scale_set_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.post(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('OperationStatusResponse', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def start_os_upgrade(
self, resource_group_name, vm_scale_set_name, custom_headers=None, raw=False, **operation_config):
"""Starts a rolling upgrade to move all virtual machine scale set
instances to the latest available Platform Image OS version. Instances
which are already running the latest available OS version are not
affected.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param vm_scale_set_name: The name of the VM scale set.
:type vm_scale_set_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:return: An instance of AzureOperationPoller that returns
OperationStatusResponse or ClientRawResponse if raw=true
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.compute.v2017_03_30.models.OperationStatusResponse]
or ~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._start_os_upgrade_initial(
resource_group_name=resource_group_name,
vm_scale_set_name=vm_scale_set_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
if raw:
return raw_result
# Construct and send request
def long_running_send():
return raw_result.response
def get_long_running_status(status_link, headers=None):
request = self._client.get(status_link)
if headers:
request.headers.update(headers)
header_parameters = {}
header_parameters['x-ms-client-request-id'] = raw_result.response.request.headers['x-ms-client-request-id']
return self._client.send(
request, header_parameters, stream=False, **operation_config)
def get_long_running_output(response):
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = self._deserialize('OperationStatusResponse', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
long_running_operation_timeout = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
return AzureOperationPoller(
long_running_send, get_long_running_output,
get_long_running_status, long_running_operation_timeout)
def get_latest(
self, resource_group_name, vm_scale_set_name, custom_headers=None, raw=False, **operation_config):
"""Gets the status of the latest virtual machine scale set rolling
upgrade.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param vm_scale_set_name: The name of the VM scale set.
:type vm_scale_set_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: RollingUpgradeStatusInfo or ClientRawResponse if raw=true
:rtype:
~azure.mgmt.compute.v2017_03_30.models.RollingUpgradeStatusInfo or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Compute/virtualMachineScaleSets/{vmScaleSetName}/rollingUpgrades/latest'
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'vmScaleSetName': self._serialize.url("vm_scale_set_name", vm_scale_set_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('RollingUpgradeStatusInfo', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
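# Illustrative usage sketch (not part of the generated module): ``client`` is
# assumed to be a configured ComputeManagementClient exposing this operations
# group; resource group and scale set names are hypothetical.
#
#     ops = client.virtual_machine_scale_set_rolling_upgrades
#     poller = ops.start_os_upgrade('my_rg', 'my_vmss')
#     status = poller.result()               # OperationStatusResponse
#     info = ops.get_latest('my_rg', 'my_vmss')  # RollingUpgradeStatusInfo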
# tests/mocked_responses.py (openpolis/atokaconn, MIT license)
from faker import Factory
faker = Factory.create("it_IT") # a factory to create fake data for tests
def get_person_ok(tax_id=None, search_params=None):
"""Return a python dict simulating an ok response from ATOKA"""
if search_params:
family_name = search_params['family_name']
given_name = search_params['given_name']
birth_date = search_params['birth_date']
else:
family_name = faker.last_name_male()
given_name = faker.first_name_male()
birth_date = faker.date(pattern="%Y-%m-%d", end_datetime="-47y")
gender = "M"
name = "{0} {1}".format(given_name, family_name)
if not tax_id:
tax_id = faker.ssn()
return {
"items": [{
"base": {
"familyName": family_name,
"givenName": given_name,
"birthDate": birth_date,
"birthPlace": {
"macroregion": "Centro",
"municipality": "Roma",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"stateCode": "IT"
},
"gender": gender,
"taxId": tax_id
},
"country": "it",
"id": faker.uuid4(),
"name": name,
"obfuscated": False,
}],
"meta": {"count": 1, "limit": 10, "offset": 0, "ordering": "birthDateDesc"}
}
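The fixture above mirrors the shape of an ATOKA people-search payload. A small shape check such as the following (a hypothetical helper, not part of atokaconn) can keep tests honest if the fixture drifts:

```python
def assert_person_payload_shape(payload):
    """Hypothetical guard: verify a mocked people payload is well-formed."""
    # meta.count must agree with the number of items returned
    assert payload["meta"]["count"] == len(payload["items"])
    for item in payload["items"]:
        base = item["base"]
        # every mocked person carries these identifying fields
        for key in ("familyName", "givenName", "birthDate", "taxId"):
            assert key in base, "missing base field: %s" % key
```

For example, a test can start with `assert_person_payload_shape(get_person_ok())` before exercising the code under test.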
def get_void_response():
return {
"response": {},
"meta": {
"count": 0,
"error": 0,
"success": 0,
}
}
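Client code typically branches on `meta["count"]` to detect an empty result. A minimal predicate (hypothetical, for illustration only):

```python
def is_void(payload):
    """Hypothetical helper: True when a mocked payload reports zero matches."""
    return payload.get("meta", {}).get("count", 0) == 0
```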
def get_person_multiple(tax_id=None, search_params=None):
"""Return a python dict simulating a response with multiple items from ATOKA"""
if search_params:
family_name = search_params['family_name']
given_name = search_params['given_name']
birth_date = search_params['birth_date']
else:
family_name = faker.last_name_male()
given_name = faker.first_name_male()
birth_date = faker.date(pattern="%Y-%m-%d", end_datetime="-47y")
gender = "M"
name = "{0} {1}".format(given_name, family_name)
if not tax_id:
tax_id = faker.ssn()
return {
"items": [
{
"base": {
"familyName": family_name,
"givenName": given_name,
"birthDate": birth_date,
"birthPlace": {
"macroregion": "Centro",
"municipality": "Roma",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"stateCode": "IT"
},
"gender": gender,
"taxId": tax_id
},
"country": "it",
"id": faker.uuid4(),
"name": name,
"obfuscated": False,
},
{
"base": {
"familyName": family_name + " Maria",
"givenName": given_name,
"birthDate": birth_date,
"birthPlace": {
"macroregion": "Centro",
"municipality": "Roma",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"stateCode": "IT"
},
"gender": gender,
"taxId": tax_id
},
"country": "it",
"id": faker.uuid4(),
"name": name,
"obfuscated": False,
}
],
"meta": {"count": 2, "limit": 10, "offset": 0, "ordering": "birthDateDesc"}
}
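Both mocked items above deliberately share the same tax id, so a lookup must treat the response as ambiguous rather than picking the first item. A hypothetical resolver that only accepts an unambiguous match:

```python
def resolve_single_person(payload):
    """Hypothetical helper: return the matched item only when it is unique."""
    items = payload.get("items", [])
    # an empty or ambiguous response yields None instead of guessing
    if payload["meta"]["count"] == 1 and len(items) == 1:
        return items[0]
    return None
```

With the fixtures above, `resolve_single_person(get_person_ok(...))` returns the item, while `resolve_single_person(get_person_multiple(...))` returns None.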
def get_companies(tax_ids):
"""Return a python dict simulating a single response from ATOKA"""
return {
"80002270660": {
"meta": {
"count": 1,
"error": 0,
"success": 1,
},
"responses": {
"80002270660": {
"items": [
{
"active": True,
"base": {
"active": True,
"govCode": "c_a345",
"govType": "Comuni e loro Consorzi e Associazioni",
"inGroup": True,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Soggetto non iscritto al Registro Imprese"
}
],
"legalName": "COMUNE DI L'AQUILA",
"registeredAddress": {
"fullAddress": "Via F. Filomusi Guelfi, 67100, L'Aquila (AQ)",
"lat": 42.35161,
"latlonPrecision": 0,
"lon": 13.38522,
"macroregion": "Sud",
"municipality": "L'Aquila",
"postcode": "67100",
"province": "L'Aquila",
"provinceCode": "AQ",
"region": "Abruzzo",
"state": "Italia",
"streetName": "Via F. Filomusi Guelfi"
},
"startup": False,
"taxId": "80002270660",
"vat": "00082410663"
},
"country": "it",
"fullAddress": "Via F. Filomusi Guelfi, 67100, L'Aquila (AQ)",
"id": "73d7b304a070",
"name": "COMUNE DI L'AQUILA",
"shares": {
"beneficialOwnerOf": [
{
"active": True,
"id": "18d7ed1a320c",
"legalName": "S.E.D. SERVIZI ELABORAZIONE DATI S.P.A. CON SOCIO UNICO",
"name": "S.E.D. SERVIZI ELABORAZIONE DATI S.P.A. CON SOCIO UNICO"
},
{
"active": False,
"id": "2e5e366bff42",
"legalName": "AQUILAMBIENTE S.P.A. IN LIQUIDAZIONE",
"name": "AQUILAMBIENTE S.P.A. IN LIQUIDAZIONE"
}
],
"sharesOwned": [
{
"active": True,
"amount": 274380.0,
"id": "18d7ed1a320c",
"lastUpdate": "2013-05-21",
"legalName": "S.E.D. SERVIZI ELABORAZIONE DATI S.P.A. CON SOCIO UNICO",
"name": "S.E.D. SERVIZI ELABORAZIONE DATI S.P.A. CON SOCIO UNICO",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 52677.9,
"id": "2e5e366bff42",
"lastUpdate": "2003-04-30",
"legalName": "AQUILAMBIENTE S.P.A. IN LIQUIDAZIONE",
"name": "AQUILAMBIENTE S.P.A. IN LIQUIDAZIONE",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
}
]
}
}
],
"meta": {
"count": 1,
"limit": 50,
"offset": 0,
"ordering": "atoka"
}
}
}
},
"02438750586": {
"meta": {
"count": 2,
"error": 0,
"success": 2,
},
"responses": {
"02438750586": {
"items": [
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"govCode": "c_h501",
"govType": "Comuni e loro Consorzi e Associazioni",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Soggetto non iscritto al Registro Imprese"
}
],
"legalName": "COMUNE DI ROMA",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 0,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Piazza Del Campidoglio, 1"
},
"startup": False,
"taxId": "02438750586",
"vat": "02438750586"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "2e45d55f8c71",
"name": "COMUNE DI ROMA"
}
],
"meta": {
"count": 2,
"limit": 50,
"offset": 0,
"ordering": "atoka"
}
}
}
},
"01234567890": {
"meta": {
"count": 62,
"error": 0,
"success": 50,
},
"responses": {
"01234567890": {
"items": [
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZIO S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQUIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZIO S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQUIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.0887,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.0887,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "01.11.1",
"description": "Coltivazione di cereali (escluso il riso)",
"rootCode": "A"
}
],
"cciaa": "RM",
"founded": "1977-06-14",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Ente"
}
],
"legalName": "ROMA CAPITALE",
"nace": [
{
"code": "01.11",
"description": "Growing of cereals (except rice), leguminous crops and oil seeds",
"rootCode": "A"
}
],
"rea": "1287276",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 90,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Del Campidoglio",
"streetNumber": "1",
"toponym": "Piazza"
},
"startup": False,
"taxId": "02438750586",
"vat": "01057861005"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "ea623ae8c298",
"name": "ROMA CAPITALE",
"shares": {
"sharesOwned": [
{
"active": True,
"amount": 182436916.0,
"id": "6037483d168e",
"lastUpdate": "2011-10-20",
"legalName": "AZIENDA MUNICIPALE AMBIENTE S.P.A. "
"ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"name": "AZIENDA MUNICIPALE AMBIENTE S.P.A. ROMA IN FORMA ABBREVIATA \"AMA S.P.A\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 10000000.0,
"id": "8f1c4be03ec9",
"lastUpdate": "2010-02-18",
"legalName": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"name": "ROMA SERVIZI PER LA MOBILITA' S.R.L.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2822250.0,
"id": "667243935e25",
"lastUpdate": "2005-11-24",
"legalName": "ZETEMA PROGETTO CULTURA SRL",
"name": "ZETEMA PROGETTO CULTURA SRL",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2500000.0,
"id": "0b10d523f972",
"lastUpdate": "2012-04-16",
"legalName": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE DELLA CITTA' "
"DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"name": "SOCIETA' PER LA REALIZZAZIONE DELLE METROPOLITANE "
"DELLA CITTA' DI ROMA A R.L. IN FORMA "
"ABBREVIATA \"ROMA METROPOLITANE S.R.L.\"",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 2000000.0,
"id": "a32e98206f27",
"lastUpdate": "2012-02-16",
"legalName": "RISORSE PER ROMA S.P.A.",
"name": "RISORSE PER ROMA S.P.A.",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 300000.0,
"id": "8c8179a19251",
"lastUpdate": "2005-10-14",
"legalName": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"name": "SERVIZI AZIONISTA ROMA S.R.L. A SOCIO UNICO IN LIQUIDAZIONE",
"ratio": 1.0,
"typeOfRight": "propriet\u00e0"
},
{
"active": True,
"amount": 560438430.84,
"id": "b3f933d6a2df",
"lastUpdate": "2014-12-18",
"legalName": "ACEA S.P.A.",
"name": "ACEA S.P.A.",
"ratio": 0.51,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 2530638.81,
"id": "b61bd1a3a245",
"lastUpdate": "2000-04-28",
"legalName": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"name": "AGENZIA ROMANA PER LA PREPARAZIONE DEL GIUBILEO - SOCIETA' PER AZIONI",
"ratio": 0.35,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 190000.0,
"id": "2f3a4fd465f6",
"lastUpdate": "2006-05-18",
"legalName": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"name": "\"AGENZIA REGIONALE PER LA PROMOZIONE "
"TURISTICA DI ROMA E DEL LAZI O S.P.A. IN "
"LIQUIDAZIONE\" IN FORMA ABBREVIATA \"AGENZIA DEL "
"TURISMO S.P.A. IN LIQ UIDAZIONE\"",
"ratio": 0.19,
"typeOfRight": "propriet\u00e0"
},
{
"active": False,
"amount": 155738.0,
"id": "f11054c630fa",
"lastUpdate": "2014-06-05",
"legalName": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"name": "CENTRO INGROSSO FIORI S.P.A. ED IN FORMA ABBREVIATA "
"C.I.F. S.P.A. IN LIQUIDAZIONE",
"ratio": 0.08869999999999999,
"typeOfRight": "propriet\u00e0"
}
]
}
},
{
"active": True,
"base": {
"active": True,
"govCode": "c_h501",
"govType": "Comuni e loro Consorzi e Associazioni",
"inGroup": False,
"legalClass": "Altre Forme",
"legalForms": [
{
"level": 1,
"name": "Altre Forme"
},
{
"level": 2,
"name": "Soggetto non iscritto al Registro Imprese"
}
],
"legalName": "COMUNE DI ROMA",
"registeredAddress": {
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"lat": 41.89334748,
"latlonPrecision": 0,
"lon": 12.48289836,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00186",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Piazza Del Campidoglio, 1"
},
"startup": False,
"taxId": "02438750586",
"vat": "02438750586"
},
"country": "it",
"fullAddress": "Piazza Del Campidoglio, 1, 00186, Roma (RM)",
"id": "2e45d55f8c71",
"name": "COMUNE DI ROMA"
}
],
"meta": {
"count": 62,
"limit": 50,
"offset": 0,
"ordering": "atoka"
}
}
}
},
}[tax_ids]
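The fixture functions in this module return canned atoka-style payloads. A minimal sketch of how a test might serve such payloads without touching the network, assuming the client under test calls `requests.get` (the name `fake_atoka_get` and the patch target are illustrative assumptions, not part of any real client):

```python
# Sketch (assumption): return canned atoka-style payloads in tests instead of
# hitting the network. `fake_atoka_get` mimics only the subset of
# requests.Response used here (status_code, .json(), .text).
import json
from unittest import mock


def fake_atoka_get(url, params=None, **kwargs):
    # A fixture dict like the ones defined in this module would go here.
    payload = {"meta": {"count": 0, "error": 0, "success": 0}, "responses": {}}
    resp = mock.Mock()
    resp.status_code = 200
    resp.json.return_value = payload
    resp.text = json.dumps(payload)
    return resp
```

In a test, this could be wired in with `mock.patch("requests.get", side_effect=fake_atoka_get)` so the code under test receives these payloads transparently.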
def get_companies_economics():
"""Return values as if retrieved through a request to an atoka endpoint, with the economics package specified.

:return: dict mimicking an atoka companies response that includes the economics package
"""
return {
"meta": {
"count": 2,
"error": 0,
"success": 2,
},
"responses": {
"02241890223,09988761004": {
"meta": {"count": 2, "limit": 10, "offset": 0, "ordering": "atoka"},
"items": [
{
"active": True,
"base": {
"active": True,
"ateco": [
{
"code": "62.01.00",
"description": "Produzione di software non connesso all'edizione",
"rootCode": "J"
}
],
"cciaa": "TN", "founded": "2012-02-13", "inGroup": True,
"legalClass": "Societ\u00e0 Di Capitale",
"legalForms": [
{"level": 1, "name": "Societ\u00e0 Di Capitale"},
{"level": 2, "name": "Societ\u00e0 A Responsabilit\u00e0 Limitata"}
],
"legalName": "SPAZIODATI S.R.L.",
"nace": [
{"code": "62.01", "description": "Computer programming activities", "rootCode": "J"}
],
"rea": "210089",
"registeredAddress": {
"fullAddress": "Via Adriano Olivetti, 13, 38122, Trento (TN)",
"lat": 46.06248902, "latlonPrecision": 60, "lon": 11.10780205,
"macroregion": "Nord-est", "municipality": "Trento", "postcode": "38122",
"province": "Trento", "provinceCode": "TN",
"region": "Trentino-Alto Adige/S\u00fcdtirol", "state": "Italia",
"streetName": "Adriano Olivetti", "streetNumber": "13", "toponym": "Via"
},
"startup": False,
"taxId": "02241890223",
"vat": "02241890223"
},
"country": "it",
"economics": {
"balanceSheets": [
{"capitalStock": 22000, "currency": "EUR",
"date": "2017-12-31",
"latest": True,
"revenue": 2778000, "revenueTrend": 0.8120999999999999, "year": 2017},
{"capitalStock": 22000, "currency": "EUR",
"date": "2016-12-31",
"revenue": 1533000, "revenueTrend": 2.3254, "year": 2016},
{"capitalStock": 18000, "currency": "EUR",
"date": "2015-12-31",
"revenue": 461000, "revenueTrend": 0.8970999999999999, "year": 2015},
{"capitalStock": 15000, "currency": "EUR",
"date": "2014-12-31",
"revenue": 243000,
"revenueTrend": 0.7868, "year": 2014},
{"capitalStock": 12000, "currency": "EUR",
"date": "2013-12-31",
"revenue": 136000,
"revenueTrend": 0.7436, "year": 2013},
{"capitalStock": 11000, "currency": "EUR",
"date": "2012-12-31",
"revenue": 65000, "year": 2012}
],
"capitalStock": {"value": 21638},
"employees": [
{"date": "2018-09-01", "latest": True, "value": 27, "year": 2018},
{"date": "2018-06-01", "latest": False, "value": 27, "year": 2018},
{"date": "2018-03-01", "latest": False, "value": 27, "year": 2018},
{"date": "2017-12-01", "latest": False, "value": 26, "year": 2017},
{"date": "2017-09-01", "latest": False, "value": 25, "year": 2017},
{"date": "2017-06-01", "latest": False, "value": 23, "year": 2017},
{"date": "2017-03-01", "latest": False, "value": 22, "year": 2017},
{"date": "2016-12-01", "latest": False, "value": 18, "year": 2016},
{"date": "2016-09-01", "latest": False, "value": 17, "year": 2016},
{"date": "2016-06-01", "latest": False, "value": 17, "year": 2016},
{"date": "2016-03-01", "latest": False, "value": 17, "year": 2016},
{"date": "2015-12-01", "latest": False, "value": 13, "year": 2015},
{"date": "2015-09-01", "latest": False, "value": 10, "year": 2015},
{"date": "2015-06-01", "latest": False, "value": 7, "year": 2015},
{"date": "2015-03-01", "latest": False, "value": 6, "year": 2015},
{"date": "2014-12-01", "latest": False, "value": 5, "year": 2014},
{"date": "2014-09-01", "latest": False, "value": 4, "year": 2014},
{"date": "2014-06-01", "latest": False, "value": 4, "year": 2014},
{"date": "2014-03-01", "latest": False, "value": 3, "year": 2014}
],
"public": False
},
"fullAddress": "Via Adriano Olivetti, 13, 38122, Trento (TN)",
"id": "6da785b3adf2",
"name": "SPAZIODATI S.R.L."
},
{
"active": True,
"base": {
"active": True,
"ateco": [
{"code": "63.12.00", "description": "Portali web", "rootCode": "J"}
],
"cciaa": "RM",
"founded": "2008-04-24",
"inGroup": False,
"legalClass": "Societ\u00e0 Di Capitale",
"legalForms": [
{"level": 1, "name": "Societ\u00e0 Di Capitale"},
{"level": 2, "name": "Societ\u00e0 A Responsabilit\u00e0 Limitata"}
],
"legalName": "DEPP SRL",
"nace": [
{"code": "63.12", "description": "Web portals", "rootCode": "J"}
],
"rea": "1201904",
"registeredAddress": {
"fullAddress": "Via "
"Merulana, 19, 00185, "
"Roma (RM)",
"lat": 41.89625,
"latlonPrecision": 90,
"lon": 12.49967,
"macroregion": "Centro",
"municipality": "Roma",
"postcode": "00185",
"province": "Roma",
"provinceCode": "RM",
"region": "Lazio",
"state": "Italia",
"streetName": "Merulana",
"streetNumber": "19",
"toponym": "Via"
},
"startup": False,
"taxId": "09988761004",
"vat": "09988761004"
},
"country": "it",
"economics": {
"balanceSheets": [
{"assets": 172000, "capitalStock": 10000, "costs": 442000, "currency": "EUR",
"date": "2017-12-31",
"ebitda": 25000, "latest": True, "mol": 29000, "netFinancialPosition": 7000,
"production": 471000,
"profit": 12000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 471000,
"revenueTrend": 0.09789999999999999, "servicesAndTPGoodsCharges": 285000,
"staffCosts": 157000, "year": 2017},
{"assets": 158000, "capitalStock": 10000, "costs": 389000, "currency": "EUR",
"date": "2016-12-31",
"ebitda": 31000, "latest": False, "mol": 40000, "netFinancialPosition": -28000,
"production": 429000,
"profit": 12000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 429000,
"revenueTrend": 0.0239,
"servicesAndTPGoodsCharges": 253000, "staffCosts": 136000, "year": 2016},
{"assets": 114000, "capitalStock": 10000, "costs": 422000, "currency": "EUR",
"date": "2015-12-31",
"ebitda": -10000, "latest": False, "mol": -3000, "netFinancialPosition": -33000,
"production": 419000,
"profit": 18000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 419000,
"revenueTrend": 0.3176,
"servicesAndTPGoodsCharges": 311000, "staffCosts": 111000, "year": 2015},
{"assets": 101000, "capitalStock": 10000, "costs": 289000, "currency": "EUR",
"date": "2014-12-31",
"ebitda": 23000, "latest": False, "mol": 29000, "netFinancialPosition": -13000,
"production": 318000,
"profit": 19000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 318000,
"revenueTrend": 0.2927,
"servicesAndTPGoodsCharges": 234000, "staffCosts": 55000, "year": 2014},
{"assets": 90000, "capitalStock": 10000, "costs": 295000, "currency": "EUR",
"date": "2013-12-31",
"ebitda": -55000, "latest": False, "mol": -49000, "netFinancialPosition": -38000,
"production": 246000,
"profit": 6000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 246000,
"revenueTrend": -0.1119,
"servicesAndTPGoodsCharges": 282000, "staffCosts": 13000, "year": 2013},
{"assets": 118000, "capitalStock": 10000, "costs": 253000, "currency": "EUR",
"date": "2012-12-31",
"ebitda": 20000, "latest": False, "mol": 24000, "netFinancialPosition": -12000,
"production": 277000,
"profit": 11000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 277000,
"revenueTrend": 1.0368000000000002,
"servicesAndTPGoodsCharges": 253000, "staffCosts": 0,
"year": 2012},
{"assets": 53000, "capitalStock": 10000, "costs": 132000, "currency": "EUR",
"date": "2011-12-31",
"ebitda": 2000, "latest": False, "mol": 4000, "netFinancialPosition": -24000,
"production": 136000,
"profit": -3000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 136000,
"revenueTrend": -0.049,
"servicesAndTPGoodsCharges": 132000, "staffCosts": 0, "year": 2011},
{"assets": 72000, "capitalStock": 10000, "costs": 139000, "currency": "EUR",
"date": "2010-12-31",
"ebitda": 2000, "latest": False, "mol": 4000, "netFinancialPosition": -23000,
"production": 143000,
"profit": -1000, "purchases": 1000, "rawMaterialsVariation": 0, "revenue": 143000,
"revenueTrend": 0.1,
"servicesAndTPGoodsCharges": 138000, "staffCosts": 0, "year": 2010},
{"assets": 105000, "capitalStock": 10000, "costs": 129000, "currency": "EUR",
"date": "2009-12-31",
"ebitda": 3000, "latest": False, "mol": 5000, "netFinancialPosition": -6000,
"production": 134000,
"profit": 1000, "purchases": 0, "rawMaterialsVariation": 0, "revenue": 130000,
"revenueTrend": 1.2807,
"servicesAndTPGoodsCharges": 129000, "staffCosts": 0, "year": 2009},
{"assets": 41000, "capitalStock": 10000, "costs": 36000, "currency": "EUR",
"date": "2008-12-31",
"ebitda": 1000, "latest": False, "mol": 2000, "netFinancialPosition": -17000,
"production": 38000, "profit": 0,
"purchases": 0, "rawMaterialsVariation": 0, "revenue": 38000,
"servicesAndTPGoodsCharges": 36000,
"staffCosts": 0, "year": 2008}
],
"capitalStock": {"value": 10000},
"employees": [
{"date": "2018-03-01", "latest": True, "value": 9, "year": 2018},
{"date": "2017-12-01", "latest": False, "value": 7, "year": 2017},
{"date": "2017-09-01", "latest": False, "value": 7, "year": 2017},
{"date": "2017-06-01", "latest": False, "value": 7, "year": 2017},
{"date": "2017-03-01", "latest": False, "value": 7, "year": 2017},
{"date": "2016-12-01", "latest": False, "value": 7, "year": 2016},
{"date": "2016-09-01", "latest": False, "value": 6, "year": 2016},
{"date": "2016-06-01", "latest": False, "value": 6, "year": 2016},
{"date": "2016-03-01", "latest": False, "value": 7, "year": 2016},
{"date": "2015-12-01", "latest": False, "value": 5, "year": 2015},
{"date": "2015-09-01", "latest": False, "value": 4, "year": 2015},
{"date": "2015-06-01", "latest": False, "value": 4, "year": 2015}
],
"public": False
},
"fullAddress": "Via Merulana, 19, 00185, Roma (RM)",
"id": "38e098baa0f9",
"name": "DEPP SRL"
}
]
}
}
}
| 54.184864 | 117 | 0.290643 | 30,643 | 519,091 | 4.921026 | 0.01573 | 0.010929 | 0.016393 | 0.101661 | 0.971763 | 0.968805 | 0.965708 | 0.963812 | 0.961093 | 0.953354 | 0 | 0.114075 | 0.611941 | 519,091 | 9,579 | 118 | 54.190521 | 0.634518 | 0.000667 | 0 | 0.790074 | 0 | 0 | 0.283144 | 0.000931 | 0 | 0 | 0 | 0 | 0 | 1 | 0.000524 | false | 0 | 0.000105 | 0.000105 | 0.001152 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
f38a8dd4bdb9abef410fe52b85f1d6f5535df2a1 | 1,267 | py | Python | architecture.py | pranavraja99/CNN_Autoencoder_Pipeline | 49126e1d643cf27d1d75d6511963d98aaf92d2a6 | [
"MIT"
] | null | null | null | architecture.py | pranavraja99/CNN_Autoencoder_Pipeline | 49126e1d643cf27d1d75d6511963d98aaf92d2a6 | [
"MIT"
] | null | null | null | architecture.py | pranavraja99/CNN_Autoencoder_Pipeline | 49126e1d643cf27d1d75d6511963d98aaf92d2a6 | [
"MIT"
] | null | null | null | #%%
import torch
import torchvision
import torch.nn as nn
class auto_encoder(nn.Module):
def __init__(self):
super(auto_encoder, self).__init__()
self.encoder=nn.Sequential(nn.Conv2d(3,64,3, padding=1), nn.ReLU(), nn.MaxPool2d(2,2), nn.Conv2d(64,256,3, padding=1), nn.ReLU(), nn.MaxPool2d(2,2), nn.Conv2d(256,512,3, padding=1), nn.ReLU(), nn.MaxPool2d(2,2))
self.decoder=nn.Sequential(nn.ConvTranspose2d(512,256,2, stride=2), nn.ReLU(), nn.ConvTranspose2d(256,64,2,stride=2), nn.ReLU(), nn.ConvTranspose2d(64,3,2, stride=2), nn.Tanh())
def forward(self, x):
x=self.encoder(x)
return self.decoder(x)
class small_auto_encoder(nn.Module):
def __init__(self):
super(small_auto_encoder, self).__init__()
self.encoder=nn.Sequential(nn.Conv2d(3,16,3, padding=1), nn.ReLU(), nn.MaxPool2d(2,2), nn.Conv2d(16,32,3, padding=1), nn.ReLU(), nn.MaxPool2d(2,2))#, nn.Conv2d(32,64,3, padding=1), nn.ReLU(), nn.MaxPool2d(2,2))
self.decoder=nn.Sequential(nn.ConvTranspose2d(32,16,2, stride=2), nn.ReLU(), nn.ConvTranspose2d(16,3,2,stride=2), nn.Tanh())#, nn.ReLU(), nn.ConvTranspose2d(16,3,2, stride=2), nn.Softmax())
def forward(self, x):
x=self.encoder(x)
return self.decoder(x)
# %%
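A standalone sketch (no torch required) of the spatial bookkeeping behind the encoders above: every Conv2d uses kernel 3 with padding=1, which preserves height and width, so only the MaxPool2d(2, 2) stages change the spatial size, each halving it with floor division. The `pool_stages` parameter is invented here for illustration — `auto_encoder` has 3 pooling stages, `small_auto_encoder` has 2.

```python
# Sketch of how the encoders shrink spatial dimensions: conv layers with
# padding=1 preserve size, each MaxPool2d(2, 2) halves it (floor division).
def encoder_output_shape(h, w, pool_stages=3):
    for _ in range(pool_stages):
        h, w = h // 2, w // 2
    return h, w

print(encoder_output_shape(224, 224))                 # auto_encoder: 3 pools -> (28, 28)
print(encoder_output_shape(224, 224, pool_stages=2))  # small variant: 2 pools -> (56, 56)
```

This also explains why the decoders use matching ConvTranspose2d(..., 2, stride=2) stages: each one doubles height and width, undoing one pooling step.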
| 40.870968 | 219 | 0.657459 | 210 | 1,267 | 3.866667 | 0.166667 | 0.073892 | 0.098522 | 0.081281 | 0.873153 | 0.873153 | 0.849754 | 0.763547 | 0.763547 | 0.763547 | 0 | 0.094843 | 0.142857 | 1,267 | 30 | 220 | 42.233333 | 0.652855 | 0.103394 | 0 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.157895 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
343fc415aba684890d1ff856aedbe49c1bbb8125 | 40 | py | Python | python_tutorial/sound/formats/wareread.py | vchatchai/python101 | c2f1c7b0f62a4600f9c64af566dc5630742580f2 | [
"Apache-2.0"
] | null | null | null | python_tutorial/sound/formats/wareread.py | vchatchai/python101 | c2f1c7b0f62a4600f9c64af566dc5630742580f2 | [
"Apache-2.0"
] | null | null | null | python_tutorial/sound/formats/wareread.py | vchatchai/python101 | c2f1c7b0f62a4600f9c64af566dc5630742580f2 | [
"Apache-2.0"
] | null | null | null |
def waveRead():
return "waveread" | 10 | 21 | 0.625 | 4 | 40 | 6.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 40 | 4 | 21 | 10 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.205128 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
3476331313b638bc688e84c817995b0cb2504e34 | 5,157 | py | Python | tests/test_config.py | kozalosev/cfddns_updater | e8bc9a4ebf444912f13d689335701fd1517d4a02 | [
"MIT"
] | 1 | 2019-01-07T11:07:35.000Z | 2019-01-07T11:07:35.000Z | tests/test_config.py | kozalosev/cfddns_updater | e8bc9a4ebf444912f13d689335701fd1517d4a02 | [
"MIT"
] | null | null | null | tests/test_config.py | kozalosev/cfddns_updater | e8bc9a4ebf444912f13d689335701fd1517d4a02 | [
"MIT"
] | null | null | null | import pytest
import logging
from cfddns_updater.config import *
from cfddns_updater.config import normalize_domains, load_file
def test__normalize_domains(caplog):
given = [
'example.org',
'www.example.org',
{'domain': 'server.example.org', 'proxied': False},
{'domain': 'ssh.example.org'},
[1, 2, 3]
]
expected = [
{'domain': 'example.org', 'proxied': True},
{'domain': 'www.example.org', 'proxied': True},
{'domain': 'server.example.org', 'proxied': False},
{'domain': 'ssh.example.org', 'proxied': True}
]
with caplog.at_level(logging.INFO):
assert normalize_domains(given) == expected
assert "Domain entry is not a str or dict, so it is ignored ([1, 2, 3])." in caplog.messages
assert "Domain 'example.org' will be proxied by Cloudflare." in caplog.messages
assert "Domain 'www.example.org' will be proxied by Cloudflare." in caplog.messages
assert "Domain 'ssh.example.org' will be proxied by Cloudflare." in caplog.messages
def test__normalize_domains__no_domain():
given = [
'example.org',
'www.example.org',
{'foo': 'bar'}
]
with pytest.raises(ValueError, match="No domain specified in the entry"):
normalize_domains(given)
def test__load_file(tmpdir):
p = tmpdir.join("test.yml")
expected = {
'email': 'username@example.org',
'api_key': 'qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso',
'periodicity': 60,
'domains': [
{'domain': 'example.org', 'proxied': True},
{'domain': 'www.example.org', 'proxied': True},
{'domain': 'ssh.example.org', 'proxied': False}
]
}
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
domains:
- example.org
- www.example.org
- domain: ssh.example.org
proxied: false
""")
assert load_file(p) == expected
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
domains: [example.org, www.example.org, {domain: ssh.example.org, proxied: false}]
""")
assert load_file(p) == expected
def test__load_file__with_periodicity(tmpdir):
p = tmpdir.join("test.yml")
expected = {
'email': 'username@example.org',
'api_key': 'qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso',
'periodicity': 120,
'domains': [
{'domain': 'example.org', 'proxied': True},
{'domain': 'www.example.org', 'proxied': True},
{'domain': 'ssh.example.org', 'proxied': False}
]
}
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
periodicity: 120
domains:
- example.org
- www.example.org
- domain: ssh.example.org
proxied: false
""")
assert load_file(p) == expected
def test__load_file__only_domains(tmpdir):
p = tmpdir.join("test.yml")
p.write("""domains:
- example.org
- www.example.org
- domain: ssh.example.org
proxied: false
""")
with pytest.raises(ValueError, match="Specify both 'email' and 'api_key' for Cloudflare API!"):
load_file(p)
def test__load_file__only_credentials(tmpdir):
p = tmpdir.join("test.yml")
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
""")
with pytest.raises(ValueError, match="Domain entries must be specified as a list under the 'domains' key."):
load_file(p)
def test__load_file__list_as_root(tmpdir):
p = tmpdir.join("test.yml")
p.write("""- email: username@example.org
- api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
- domains:
- example.org
- www.example.org
- domain: ssh.example.org
proxied: false
""")
with pytest.raises(ValueError, match="Invalid configuration file: the root element must be a dictionary!"):
load_file(p)
def test__load(tmpdir):
p = tmpdir.join("test.yml")
expected = Config("username@example.org", "qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso", 60, [
DomainEntry("example.org", True),
DomainEntry("www.example.org", True),
DomainEntry("ssh.example.org", False)
])
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
domains:
- example.org
- www.example.org
- domain: ssh.example.org
proxied: false
""")
assert load(p) == expected
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
domains: [example.org, www.example.org, {domain: ssh.example.org, proxied: false}]
""")
assert load(p) == expected
def test__load__with_periodicity(tmpdir):
p = tmpdir.join("test.yml")
expected = Config("username@example.org", "qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso", 120, [
DomainEntry("example.org", True),
DomainEntry("www.example.org", True),
DomainEntry("ssh.example.org", False)
])
p.write("""email: username@example.org
api_key: qP5EZa648oCRm6qlIDmbIOy37RbmLVRX7jpso
periodicity: 120
domains:
- example.org
- www.example.org
- domain: ssh.example.org
proxied: false
""")
assert load(p) == expected
| 29.982558 | 112 | 0.655226 | 595 | 5,157 | 5.564706 | 0.146218 | 0.184234 | 0.097554 | 0.0746 | 0.8508 | 0.798248 | 0.775294 | 0.760797 | 0.751737 | 0.742676 | 0 | 0.02871 | 0.203025 | 5,157 | 171 | 113 | 30.157895 | 0.776886 | 0 | 0 | 0.70068 | 0 | 0.013605 | 0.511344 | 0.086097 | 0 | 0 | 0 | 0 | 0.07483 | 1 | 0.061224 | false | 0 | 0.027211 | 0 | 0.088435 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cab8587d059e101de2c3a46a92d52bb07a7832a5 | 76,937 | py | Python | program.py | BogdanFAndrei/2022-4216COMP-Team20 | cfa33fecd032353c2a865d6f4a030e0e141fb43e | [
"Apache-2.0"
] | null | null | null | program.py | BogdanFAndrei/2022-4216COMP-Team20 | cfa33fecd032353c2a865d6f4a030e0e141fb43e | [
"Apache-2.0"
] | null | null | null | program.py | BogdanFAndrei/2022-4216COMP-Team20 | cfa33fecd032353c2a865d6f4a030e0e141fb43e | [
"Apache-2.0"
] | 3 | 2022-03-15T12:25:14.000Z | 2022-03-28T19:43:24.000Z | from cProfile import label
import tkinter as tk
from tkinter import *
from tkinter import ttk
from unicodedata import decimal
import csv
import matplotlib.pyplot as plt
from matplotlib.pyplot import figure
import statistics
from setuptools import Command
# Read the twelve monthly values for one year, data type and value type from the combined CSV
def getYearData(year, data_type, value_type):
filename = 'FinallOneCSV.csv'
with open(filename, 'r') as csvfile:
datareader = csv.reader(csvfile)
yearData = []
for row in datareader:
if row[0].find('Average'+str(year)) != -1:
if(data_type == 'temperature'):
if(value_type == 'Maximum'):
yearData.append(float(row[2]))
elif(value_type == 'Average'):
yearData.append(float(row[3]))
elif(value_type == 'Minimum'):
yearData.append(float(row[4]))
elif(data_type == 'DewPoint'):
if(value_type == 'Maximum'):
yearData.append(float(row[5]))
elif(value_type == 'Average'):
yearData.append(float(row[6]))
elif(value_type == 'Minimum'):
yearData.append(float(row[7]))
elif(data_type == 'Humidity'):
if(value_type == 'Maximum'):
yearData.append(float(row[8]))
elif(value_type == 'Average'):
yearData.append(float(row[9]))
elif(value_type == 'Minimum'):
yearData.append(float(row[10]))
elif(data_type == 'WindSpeed'):
if(value_type == 'Maximum'):
yearData.append(float(row[11]))
elif(value_type == 'Average'):
yearData.append(float(row[12]))
elif(value_type == 'Minimum'):
yearData.append(float(row[13]))
elif(data_type == 'Pressure'):
if(value_type == 'Maximum'):
yearData.append(float(row[14]))
elif(value_type == 'Average'):
yearData.append(float(row[15]))
elif(value_type == 'Minimum'):
yearData.append(float(row[16]))
return yearData
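A minimal, self-contained illustration of the row layout getYearData() relies on: column 0 carries a label such as "Average2015January", and the metric columns follow (columns 2-4 hold maximum/average/minimum temperature). The sample row below is invented for illustration only, not taken from FinallOneCSV.csv.

```python
import csv
import io

# Invented sample row mimicking the assumed layout of FinallOneCSV.csv
sample = "Average2015January,unused,48.0,41.2,35.1\n"
row = next(csv.reader(io.StringIO(sample)))

monthly_max = []
if row[0].find('Average2015') != -1:
    monthly_max.append(float(row[2]))  # 'Maximum' temperature column

print(monthly_max)  # [48.0]
```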
def makePlot( year_a, data_type, value_type):
year_a_data = getYearData(year_a, data_type, value_type)
month_labels =['January','February','March','April' ,'May','June','July','August','September','October','November','December']
plt.figure(figsize=(12,5))
plt.plot(month_labels,year_a_data, '-ok',color='green',linestyle='dashed',markerfacecolor='green')
plt.grid(axis= 'y')
plt.title(str(year_a))
plt.xlabel('Month')
plt.ylabel(data_type + " " + value_type)
plt.show()
def maketwoPlots( year_a, year_b, data_type, value_type):
year_a_data = getYearData(year_a, data_type, value_type)
year_b_data = getYearData(year_b, data_type, value_type)
month_labels =['January','February','March','April' ,'May','June','July','August','September','October','November','December']
plt.figure(figsize=(12,5))
plt.plot(month_labels,year_a_data, '-ok',color='green',linestyle='dashed',markerfacecolor='green', label=str(year_a))
plt.plot(month_labels,year_b_data, '-ok',color='blue',linestyle='dashed',markerfacecolor='blue', label=str(year_b))
plt.grid(axis= 'y')
plt.title(str(year_a) + " - " + str(year_b))
plt.xlabel('Month')
plt.ylabel(data_type + " " + value_type)
plt.legend()
plt.show()
#Create Intro Menu
start = Tk()
start.title("Climate and Weather Team 20")
start.configure(background="aqua")
#App placement
app_width = 500
app_height = 350
screen_width = start.winfo_screenwidth()
screen_height = start.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
start.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
#Creates Headline for Intro
start1 = ttk.Label(start, text = "\u0332".join("Liverpool's Weather and Climate"),
background = 'aqua', foreground ="black",
font = ("Times New Roman", 15))
start1.place(x = 135, y = 10)
start2 = ttk.Label(start, text = "This system is designed to provide data on the weather and climate in Liverpool."
+ "\r\n" +
"This was designed and created by:"
+ "\r\n" +
"- Ryan Hacine-Bacha"
+ "\r\n" +
"- Bogdan-Florin Andrei"
+ "\r\n" +
"- Eoin Boyle"
+ "\r\n" +
"- Elaine Wong"
+ "\r\n" +
"- Xiao Long Qi Andrei"
+ "\r\n" +
"Once you hit the start button you will be able to select one of our many features. ",
background = 'aqua', foreground ="Black",
font = ("Times New Roman", 10))
start2.place(x = 40, y = 50)
#Function for start button
def main ():
mai = Tk()
mai.title("Main Menu")
mai.geometry('760x320')
mai.configure(background="light green")
start.destroy()
#App placement
app_width = 760
app_height = 320
screen_width = mai.winfo_screenwidth()
screen_height = mai.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
mai.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
#Label for Main Menu
mainlabel = ttk.Label(mai, text = "\u0332".join("Main Menu"),
background = 'lightgreen', foreground ="Black",
font = ("Times New Roman", 15))
mainlabel.place(x=240, y=10)
#YEAR ONE options
ml5 = ttk.Label(mai, text = "One Year",
background = 'light green', foreground ="black",
font = ("Times New Roman", 15))
ml5.place(x=50, y=50)
btn1 = Button(mai,text="Temperature (° F)", width=20, command=op1)
btn1.place(x=10, y=100)
btn2 = Button(mai,text="Dew Point (° F)", width=20, command=op2)
btn2.place(x=10, y=150)
btn3 = Button(mai,text="Humidity (%)", width=20, command=op3)
btn3.place(x=10, y=200)
btn4 = Button(mai,text="Wind Speed (mph)", width=20, command=op4)
btn4.place(x=10, y=250)
#YEAR TWO options
ml5 = ttk.Label(mai, text = "Two Years",
background = 'light green', foreground ="black",
font = ("Times New Roman", 15))
ml5.place(x=240, y=50)
btn6 = Button(mai,text="Temperature (° F)", width=20, command=TempComp)
btn6.place(x=205, y=100)
btn7 = Button(mai,text="Dew Point (° F)", width=20, command=DewComp)
btn7.place(x=205, y=150)
btn8 = Button(mai,text="Humidity (%)", width=20, command=HumidityComp)
btn8.place(x=205, y=200)
btn9 = Button(mai,text="Wind Speed (mph)", width=20, command=WindComp)
btn9.place(x=205, y=250)
#YEAR THREE options
ml5 = ttk.Label(mai, text = "Three years",
background = 'light green', foreground ="black",
font = ("Times New Roman", 15))
ml5.place(x=430, y=50)
btn10 = Button(mai,text="Temperature (° F)", width=20, command=op1p2)
btn10.place(x=400, y=100)
btn11 = Button(mai,text="Dew Point (° F)", width=20, command=op2p2)
btn11.place(x=400, y=150)
btn12 =Button(mai,text="Humidity (%)", width=20, command=op3p2)
btn12.place(x=400, y=200)
btn13 = Button(mai,text="Wind Speed (mph)", width=20, command=op4p2)
btn13.place(x=400, y=250)
#YEAR FOUR options
ml5 = ttk.Label(mai, text = "Four Years",
background = 'light green', foreground ="black",
font = ("Times New Roman", 15))
ml5.place(x=625, y=50)
btn14 = Button(mai,text="Temperature (° F)", width=20, command=rn1)
btn14.place(x=600, y=100)
btn15 = Button(mai,text="Dew Point (° F)", width=20, command=rn2)
btn15.place(x=600, y=150)
btn16 = Button(mai,text="Humidity (%)", width=20, command=rn3)
btn16.place(x=600, y=200)
btn17 = Button(mai,text="Wind Speed (mph)", width=20, command=rn4)
btn17.place(x=600, y=250)
#Adding Function for start Button
bntStart1 = Button(start,text="Click to start", width=15, command=main, background = 'lightgreen', foreground ="Black")
bntStart1.place(x=180, y=300)
#Option 1 Function
def op1 ():
op1 = Tk()
op1.title("Temperature (° F)")
op1.configure(background="darkorange")
#App placement
app_width = 400
app_height = 240
screen_width = op1.winfo_screenwidth()
screen_height = op1.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
op1.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
lbl = ttk.Label(op1, text = "Temperature (° F)",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 15))
lbl.place(x=80, y=18)
# label max/average/min
lbl1 = ttk.Label(op1, text = "Select the Year :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl1.place(x=10, y=80)
# Combobox creation
n = tk.StringVar()
yearchoosen = ttk.Combobox(op1, width = 27, textvariable = n)
yearchoosen.place(x=110, y=80)
# Adding combobox drop down list
yearchoosen['values'] = ('2009',
'2010',
'2011',
'2012',
'2013',
'2014',
'2015',
'2016',
'2017',
'2018',
'2019',
'2020')
yearchoosen.place(x=110, y=80)
yearchoosen.current()
# label value
lbl2 = ttk.Label(op1, text = "Select the value :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl2.place(x=10, y=130)
# Combobox creation
n = tk.StringVar()
typevalue = ttk.Combobox(op1, width = 27, textvariable = n)
typevalue.place(x=110, y=130)
# Adding combobox drop down list
typevalue['values'] = ('Minimum',
'Average',
'Maximum')
typevalue.place(x=110, y=130)
typevalue.current()
# Submit button callback: fetch the selected year and value, then plot the data
def click():
year_a = yearchoosen.get()
value_type = typevalue.get()
data_type='temperature'
makePlot(year_a,data_type,value_type)
#addbutton
btn = Button(op1,text="Submit", width=6, command=click)
btn.place(x=120, y=180)
#add exit button
btn1 = Button(op1,text="Quit", width=6, command=op1.destroy)
btn1.place(x=220, y=180)
op1.mainloop()
#Option 2 Function
def op2 ():
op2 = Tk()
op2.title("Dew Point (° F)")
op2.configure(background="darkorange")
#App placement
app_width = 400
app_height = 240
screen_width = op2.winfo_screenwidth()
screen_height = op2.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
op2.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
lbl = ttk.Label(op2, text = "Dew Point (° F)",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 15))
lbl.place(x=80, y=18)
# label max/average/min
lbl1 = ttk.Label(op2, text = "Select the Year :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl1.place(x=10, y=80)
# Combobox creation
n = tk.StringVar()
yearchoosen = ttk.Combobox(op2, width = 27, textvariable = n)
yearchoosen.place(x=110, y=80)
# Adding combobox drop down list
yearchoosen['values'] = ('2009',
'2010',
'2011',
'2012',
'2013',
'2014',
'2015',
'2016',
'2017',
'2018',
'2019',
'2020')
yearchoosen.place(x=110, y=80)
yearchoosen.current()
# label value
lbl2 = ttk.Label(op2, text = "Select the value :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl2.place(x=10, y=130)
# Combobox creation
n = tk.StringVar()
typevalue = ttk.Combobox(op2, width = 27, textvariable = n)
typevalue.place(x=110, y=130)
# Adding combobox drop down list
typevalue['values'] = ('Minimum',
'Average',
'Maximum')
typevalue.place(x=110, y=130)
typevalue.current()
# Submit button callback: fetch the selected year and value, then plot the data
def click():
year_a = yearchoosen.get()
value_type = typevalue.get()
data_type='DewPoint'
makePlot(year_a,data_type,value_type)
#addbutton
btn = Button(op2,text="Submit", width=6, command=click)
btn.place(x=120, y=180)
#add exit button
btn1 = Button(op2,text="Quit", width=6, command=op2.destroy)
btn1.place(x=220, y=180)
op2.mainloop()
#Option 3 Function
def op3 ():
op3 = Tk()
op3.title("Humidity (%)")
op3.configure(background="darkorange")
#App placement
app_width = 400
app_height = 240
screen_width = op3.winfo_screenwidth()
screen_height = op3.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
op3.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
lbl = ttk.Label(op3, text = "Humidity (%)",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 15))
lbl.place(x=80, y=18)
# label max/average/min
lbl1 = ttk.Label(op3, text = "Select the Year :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl1.place(x=10, y=80)
# Combobox creation
n = tk.StringVar()
yearchoosen = ttk.Combobox(op3, width = 27, textvariable = n)
yearchoosen.place(x=110, y=80)
# Adding combobox drop down list
yearchoosen['values'] = ('2009',
'2010',
'2011',
'2012',
'2013',
'2014',
'2015',
'2016',
'2017',
'2018',
'2019',
'2020')
yearchoosen.place(x=110, y=80)
yearchoosen.current()
# label value
lbl2 = ttk.Label(op3, text = "Select the value :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl2.place(x=10, y=130)
# Combobox creation
n = tk.StringVar()
typevalue = ttk.Combobox(op3, width = 27, textvariable = n)
typevalue.place(x=110, y=130)
# Adding combobox drop down list
typevalue['values'] = ('Minimum',
'Average',
'Maximum')
typevalue.place(x=110, y=130)
typevalue.current()
# Submit button callback: fetch the selected year and value, then plot the data
def click():
year_a = yearchoosen.get()
value_type = typevalue.get()
data_type='Humidity'
makePlot(year_a,data_type,value_type)
#addbutton
btn = Button(op3,text="Submit", width=6, command=click)
btn.place(x=120, y=180)
#add exit button
btn1 = Button(op3,text="Quit", width=6, command=op3.destroy)
btn1.place(x=220, y=180)
op3.mainloop()
#Option 4 Function
def op4 ():
op4 = Tk()
op4.title("Wind Speed (mph)")
op4.configure(background="darkorange")
#App placement
app_width = 400
app_height = 240
screen_width = op4.winfo_screenwidth()
screen_height = op4.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
op4.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
lbl = ttk.Label(op4, text = "Wind Speed (mph)",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 15))
lbl.place(x=80, y=18)
# label max/average/min
lbl1 = ttk.Label(op4, text = "Select the Year :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl1.place(x=10, y=80)
# Combobox creation
n = tk.StringVar()
yearchoosen = ttk.Combobox(op4, width = 27, textvariable = n)
yearchoosen.place(x=110, y=80)
# Adding combobox drop down list
yearchoosen['values'] = ('2009',
'2010',
'2011',
'2012',
'2013',
'2014',
'2015',
'2016',
'2017',
'2018',
'2019',
'2020')
yearchoosen.place(x=110, y=80)
yearchoosen.current()
# label value
lbl2 = ttk.Label(op4, text = "Select the value :",
background = 'darkorange', foreground ="black",
font = ("Times New Roman", 10))
lbl2.place(x=10, y=130)
# Combobox creation
n = tk.StringVar()
typevalue = ttk.Combobox(op4, width = 27, textvariable = n)
typevalue.place(x=110, y=130)
# Adding combobox drop down list
typevalue['values'] = ('Minimum',
'Average',
'Maximum')
typevalue.place(x=110, y=130)
typevalue.current()
# Submit button callback: fetch the selected year and value, then plot the data
def click():
year_a = yearchoosen.get()
value_type = typevalue.get()
data_type='WindSpeed'
makePlot(year_a,data_type,value_type)
#addbutton
btn = Button(op4,text="Submit", width=6, command=click)
btn.place(x=120, y=180)
#add exit button
btn1 = Button(op4,text="Quit", width=6, command=op4.destroy)
btn1.place(x=220, y=180)
op4.mainloop()
# Finding the maximum value of each year for the given data type
def findMaxValueYear(year_a, data_type):
maxValueYear=[]
for year in year_a:
value_type = 'Maximum'
year_a_data=getYearData(year, data_type, value_type)
maxValueYear.append(max(year_a_data))
return maxValueYear
# Finding the mean value of each year for the given data type
def findMeanValueYear(year_a, data_type):
meanValueYear=[]
for year in year_a:
value_type = 'Average'
year_a_data=getYearData(year, data_type, value_type)
meanValueYear.append(statistics.mean(year_a_data))
return meanValueYear
# Finding the minimum value of each year for the given data type
def findMinValueYear(year_a, data_type):
minValueYear=[]
for year in year_a:
value_type = 'Minimum'
year_a_data=getYearData(year, data_type, value_type)
minValueYear.append(min(year_a_data))
return minValueYear
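A standalone sketch of the three per-year reductions above, applied to a made-up monthly series instead of a real getYearData() result — each helper boils twelve monthly numbers down to one yearly statistic with max(), statistics.mean() and min().

```python
import statistics

# Invented monthly series standing in for one year of getYearData() output
monthly = [41.0, 43.5, 47.2, 52.8, 58.1, 63.0,
           66.4, 65.9, 60.7, 54.3, 47.5, 42.2]

yearly_max = max(monthly)                 # what findMaxValueYear collects
yearly_mean = statistics.mean(monthly)    # what findMeanValueYear collects
yearly_min = min(monthly)                 # what findMinValueYear collects

print(yearly_max, round(yearly_mean, 2), yearly_min)  # 66.4 53.55 41.0
```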
# Function to make a single comparison graph.
def makeComPlot(year, data_type):
# Getting max, mean and min values to plot
maxValuesYear = findMaxValueYear(year, data_type)
meanValuesYear = findMeanValueYear(year, data_type)
minValuesYear = findMinValueYear(year, data_type)
# Gathering better y axis labels that includes the measurements
if (data_type=='temperature'):
ylabel="Temperature (° F)"
if (data_type=='DewPoint'):
ylabel="Dew Point (° F)"
if (data_type=='Humidity'):
ylabel="Humidity (%)"
if (data_type=='WindSpeed'):
ylabel="Wind Speed (mph)"
# Creating a figure and adding a title, y and x labels and a grid
plt.figure(figsize=(12,5))
plt.title(f"Comparing the maximum, mean & minimum {data_type} values of {year[0]}, {year[1]} & {year[2]}")
plt.ylabel(ylabel)
plt.xlabel('Years')
plt.grid(axis= 'y')
# Adding vertical lines between the points
plt.vlines(x=year, ymin=minValuesYear, ymax=maxValuesYear, colors='black', ls='-', lw=2)
# Plotting
plt.plot(year,maxValuesYear, 'dk',markersize = 10, color='blue',linestyle='',markerfacecolor='blue', label="Maximum")
plt.plot(year,meanValuesYear, 'sk',markersize = 10, color='magenta',linestyle='',markerfacecolor='magenta', label="Mean")
plt.plot(year,minValuesYear, 'ok', markersize = 10, color='red',linestyle='',markerfacecolor='red', label="Minimum")
# Adding a legend
plt.legend(bbox_to_anchor=(1.1, 1.05))
plt.show()
years = ['2009','2010','2011','2012','2013', '2014', '2015', '2016', '2017', '2018', '2019', '2020']
# Option 1.2 Function
def op1p2():
    # Creating the layout
    op1p2 = Tk()
    op1p2.title("Comparing Yearly Temperature (° F)")
    op1p2.configure(background="yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = op1p2.winfo_screenwidth()
    screen_height = op1p2.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    op1p2.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(op1p2, text="Comparing Yearly Temperature (° F)",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(op1p2, text="Select Year 1 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(op1p2, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(op1p2, text="Select Year 2 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(op1p2, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(op1p2, text="Select Year 3 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(op1p2, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Submit button function to make ComPlot
    def clickop1p2():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        yearSelections = [year1, year2, year3]
        makeComPlot(yearSelections, 'temperature')
    # Adding submit & quit buttons
    btn = Button(op1p2, text="Submit", width=6, command=clickop1p2)
    btn.place(x=185, y=260)
    btn1 = Button(op1p2, text="Quit", width=6, command=op1p2.destroy)
    btn1.place(x=285, y=260)
    op1p2.mainloop()
# Option 2.2 function
def op2p2():
    # Creating the layout
    op2p2 = Tk()
    op2p2.title("Comparing Yearly Dew Point (° F)")
    op2p2.configure(background="yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = op2p2.winfo_screenwidth()
    screen_height = op2p2.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    op2p2.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(op2p2, text="Comparing Yearly Dew Point (° F)",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(op2p2, text="Select Year 1 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(op2p2, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(op2p2, text="Select Year 2 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(op2p2, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(op2p2, text="Select Year 3 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(op2p2, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Submit button function to make ComPlot
    def clickop2p2():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        yearSelections = [year1, year2, year3]
        makeComPlot(yearSelections, 'DewPoint')
    # Adding submit & quit buttons
    btn = Button(op2p2, text="Submit", width=6, command=clickop2p2)
    btn.place(x=185, y=260)
    btn1 = Button(op2p2, text="Quit", width=6, command=op2p2.destroy)
    btn1.place(x=285, y=260)
    op2p2.mainloop()
# Option 3.2 Function
def op3p2():
    # Creating the layout
    op3p2 = Tk()
    op3p2.title("Comparing Yearly Humidity (%)")
    op3p2.configure(background="yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = op3p2.winfo_screenwidth()
    screen_height = op3p2.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    op3p2.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(op3p2, text="Comparing Yearly Humidity (%)",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(op3p2, text="Select Year 1 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(op3p2, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(op3p2, text="Select Year 2 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(op3p2, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(op3p2, text="Select Year 3 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(op3p2, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Submit button function to make ComPlot
    def clickop3p2():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        yearSelections = [year1, year2, year3]
        makeComPlot(yearSelections, 'Humidity')
    # Adding submit & quit buttons
    btn = Button(op3p2, text="Submit", width=6, command=clickop3p2)
    btn.place(x=185, y=260)
    btn1 = Button(op3p2, text="Quit", width=6, command=op3p2.destroy)
    btn1.place(x=285, y=260)
    op3p2.mainloop()
# Option 4.2 Function
def op4p2():
    # Creating the layout
    op4p2 = Tk()
    op4p2.title("Comparing Yearly Wind Speed (mph)")
    op4p2.configure(background="yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = op4p2.winfo_screenwidth()
    screen_height = op4p2.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    op4p2.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(op4p2, text="Comparing Yearly Wind Speed (mph)",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(op4p2, text="Select Year 1 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(op4p2, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(op4p2, text="Select Year 2 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(op4p2, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(op4p2, text="Select Year 3 :",
                     background='yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(op4p2, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Submit button function to make ComPlot
    def clickop4p2():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        yearSelections = [year1, year2, year3]
        makeComPlot(yearSelections, 'WindSpeed')
    # Adding submit & quit buttons
    btn = Button(op4p2, text="Submit", width=6, command=clickop4p2)
    btn.place(x=185, y=260)
    btn1 = Button(op4p2, text="Quit", width=6, command=op4p2.destroy)
    btn1.place(x=285, y=260)
    op4p2.mainloop()
#---------------------------------------------------------------------------------------------------------------
# Option 6
def TempComp():
    TempComp = Tk()
    TempComp.title("Temperature")
    TempComp.configure(background="light yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = TempComp.winfo_screenwidth()
    screen_height = TempComp.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    TempComp.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl = ttk.Label(TempComp, text="Temperature",
                    background='light yellow', foreground="black",
                    font=("Times New Roman", 15))
    lbl.place(x=200, y=20)
    # Year selection labels
    lbl1 = ttk.Label(TempComp, text="Select first Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=80)
    lbl2 = ttk.Label(TempComp, text="Select second Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=138)
    # Combobox creation
    n = tk.StringVar()
    m = tk.StringVar()
    yearchoosen = ttk.Combobox(TempComp, width=27, textvariable=n)
    yearchoosen2 = ttk.Combobox(TempComp, width=27, textvariable=m)
    # Adding the combobox drop down lists
    yearchoosen['values'] = years
    yearchoosen.place(x=200, y=80)
    yearchoosen.current()
    yearchoosen2['values'] = years
    yearchoosen2.place(x=200, y=138)
    yearchoosen2.current()
    # Max/average/min selection label
    lbl5 = ttk.Label(TempComp, text="Select the value :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl5.place(x=80, y=200)
    # Combobox creation
    n = tk.StringVar()
    typevalue = ttk.Combobox(TempComp, width=27, textvariable=n)
    # Adding the combobox drop down list
    typevalue['values'] = ('Minimum', 'Average', 'Maximum')
    typevalue.place(x=200, y=200)
    typevalue.current()
    # Submit button function
    def click6():
        year_a = yearchoosen.get()
        year_b = yearchoosen2.get()
        value_type = typevalue.get()
        data_type = 'temperature'
        maketwoPlots(year_a, year_b, data_type, value_type)
    # Adding submit & quit buttons
    lbl6 = Button(TempComp, text="Submit", width=6, command=click6)
    lbl6.place(x=220, y=245)
    lbl7 = Button(TempComp, text="Quit", width=6, command=TempComp.destroy)
    lbl7.place(x=315, y=245)
    TempComp.mainloop()
# Option 7
def DewComp():
    DewComp = Tk()
    DewComp.title("Dew Point")
    DewComp.configure(background="light yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = DewComp.winfo_screenwidth()
    screen_height = DewComp.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    DewComp.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl = ttk.Label(DewComp, text="Dew Point",
                    background='light yellow', foreground="black",
                    font=("Times New Roman", 15))
    lbl.place(x=200, y=20)
    # Year selection labels
    lbl1 = ttk.Label(DewComp, text="Select first Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=80)
    lbl2 = ttk.Label(DewComp, text="Select second Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=138)
    # Combobox creation
    n = tk.StringVar()
    m = tk.StringVar()
    yearchoosen = ttk.Combobox(DewComp, width=27, textvariable=n)
    yearchoosen2 = ttk.Combobox(DewComp, width=27, textvariable=m)
    # Adding the combobox drop down lists
    yearchoosen['values'] = years
    yearchoosen.place(x=200, y=80)
    yearchoosen.current()
    yearchoosen2['values'] = years
    yearchoosen2.place(x=200, y=138)
    yearchoosen2.current()
    # Max/average/min selection label
    lbl5 = ttk.Label(DewComp, text="Select the value :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl5.place(x=80, y=200)
    # Combobox creation
    n = tk.StringVar()
    typevalue = ttk.Combobox(DewComp, width=27, textvariable=n)
    # Adding the combobox drop down list
    typevalue['values'] = ('Minimum', 'Average', 'Maximum')
    typevalue.place(x=200, y=200)
    typevalue.current()
    # Submit button function
    def click6():
        year_a = yearchoosen.get()
        year_b = yearchoosen2.get()
        value_type = typevalue.get()
        data_type = 'DewPoint'
        maketwoPlots(year_a, year_b, data_type, value_type)
    # Adding submit & quit buttons
    lbl6 = Button(DewComp, text="Submit", width=6, command=click6)
    lbl6.place(x=220, y=245)
    lbl7 = Button(DewComp, text="Quit", width=6, command=DewComp.destroy)
    lbl7.place(x=315, y=245)
    DewComp.mainloop()
# Option 8
def HumidityComp():
    HumidityComp = Tk()
    HumidityComp.title("Humidity")
    HumidityComp.configure(background="light yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = HumidityComp.winfo_screenwidth()
    screen_height = HumidityComp.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    HumidityComp.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl = ttk.Label(HumidityComp, text="Humidity",
                    background='light yellow', foreground="black",
                    font=("Times New Roman", 15))
    lbl.place(x=200, y=20)
    # Year selection labels
    lbl1 = ttk.Label(HumidityComp, text="Select first Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=80)
    lbl2 = ttk.Label(HumidityComp, text="Select second Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=138)
    # Combobox creation
    n = tk.StringVar()
    m = tk.StringVar()
    yearchoosen = ttk.Combobox(HumidityComp, width=27, textvariable=n)
    yearchoosen2 = ttk.Combobox(HumidityComp, width=27, textvariable=m)
    # Adding the combobox drop down lists
    yearchoosen['values'] = years
    yearchoosen.place(x=200, y=80)
    yearchoosen.current()
    yearchoosen2['values'] = years
    yearchoosen2.place(x=200, y=138)
    yearchoosen2.current()
    # Max/average/min selection label
    lbl5 = ttk.Label(HumidityComp, text="Select the value :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl5.place(x=80, y=200)
    # Combobox creation
    n = tk.StringVar()
    typevalue = ttk.Combobox(HumidityComp, width=27, textvariable=n)
    # Adding the combobox drop down list
    typevalue['values'] = ('Minimum', 'Average', 'Maximum')
    typevalue.place(x=200, y=200)
    typevalue.current()
    # Submit button function
    def click6():
        year_a = yearchoosen.get()
        year_b = yearchoosen2.get()
        value_type = typevalue.get()
        data_type = 'Humidity'
        maketwoPlots(year_a, year_b, data_type, value_type)
    # Adding submit & quit buttons
    lbl6 = Button(HumidityComp, text="Submit", width=6, command=click6)
    lbl6.place(x=220, y=245)
    lbl7 = Button(HumidityComp, text="Quit", width=6, command=HumidityComp.destroy)
    lbl7.place(x=315, y=245)
    HumidityComp.mainloop()
# Option 9
def WindComp():
    WindComp = Tk()
    WindComp.title("Wind Speed")
    WindComp.configure(background="light yellow")
    # App placement
    app_width = 500
    app_height = 320
    screen_width = WindComp.winfo_screenwidth()
    screen_height = WindComp.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    WindComp.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl = ttk.Label(WindComp, text="Wind Speed",
                    background='light yellow', foreground="black",
                    font=("Times New Roman", 15))
    lbl.place(x=200, y=20)
    # Year selection labels
    lbl1 = ttk.Label(WindComp, text="Select first Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=80)
    lbl2 = ttk.Label(WindComp, text="Select second Year :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=138)
    # Combobox creation
    n = tk.StringVar()
    m = tk.StringVar()
    yearchoosen = ttk.Combobox(WindComp, width=27, textvariable=n)
    yearchoosen2 = ttk.Combobox(WindComp, width=27, textvariable=m)
    # Adding the combobox drop down lists
    yearchoosen['values'] = years
    yearchoosen.place(x=200, y=80)
    yearchoosen.current()
    yearchoosen2['values'] = years
    yearchoosen2.place(x=200, y=138)
    yearchoosen2.current()
    # Max/average/min selection label
    lbl5 = ttk.Label(WindComp, text="Select the value :",
                     background='light yellow', foreground="black",
                     font=("Times New Roman", 10))
    lbl5.place(x=80, y=200)
    # Combobox creation
    n = tk.StringVar()
    typevalue = ttk.Combobox(WindComp, width=27, textvariable=n)
    # Adding the combobox drop down list
    typevalue['values'] = ('Minimum', 'Average', 'Maximum')
    typevalue.place(x=200, y=200)
    typevalue.current()
    # Submit button function
    def click6():
        year_a = yearchoosen.get()
        year_b = yearchoosen2.get()
        value_type = typevalue.get()
        # 'WindSpeed' is the data-type key used by getYearData
        # (the original passed the window name 'WindComp' by mistake)
        data_type = 'WindSpeed'
        maketwoPlots(year_a, year_b, data_type, value_type)
    # Adding submit & quit buttons
    lbl6 = Button(WindComp, text="Submit", width=6, command=click6)
    lbl6.place(x=220, y=245)
    lbl7 = Button(WindComp, text="Quit", width=6, command=WindComp.destroy)
    lbl7.place(x=315, y=245)
    WindComp.mainloop()
#--------------------------------------------------------------------------------------RYAN
def makefourplot(year, data_type):
    # Getting the max, mean and min values to plot
    maxValuesYear = findMaxValueYear(year, data_type)
    meanValuesYear = findMeanValueYear(year, data_type)
    minValuesYear = findMinValueYear(year, data_type)
    # Building a y-axis label that includes the unit of measurement
    ylabel = data_type
    if data_type == 'temperature':
        ylabel = "Temperature (° F)"
    elif data_type == 'DewPoint':
        ylabel = "Dew Point (° F)"
    elif data_type == 'Humidity':
        ylabel = "Humidity (%)"
    elif data_type == 'WindSpeed':
        ylabel = "Wind Speed (mph)"
    # Creating a figure and adding a title, axis labels and a grid
    plt.figure(figsize=(12, 5))
    plt.title(f"Comparing the maximum, mean & minimum {data_type} values of {year[0]}, {year[1]}, {year[2]} & {year[3]}")
    plt.ylabel(ylabel)
    plt.xlabel('Years')
    plt.grid(axis='y')
    # Plotting the three series as scatter points
    plt.scatter(year, maxValuesYear, s=150, label="Maximum")
    plt.scatter(year, meanValuesYear, s=150, label="Mean")
    plt.scatter(year, minValuesYear, s=150, label="Minimum")
    # Adding a legend
    plt.legend(bbox_to_anchor=(1.1, 1.05))
    plt.show()
years = ['2009','2010','2011','2012','2013', '2014', '2015', '2016', '2017', '2018', '2019', '2020']
def rn1():
    rn1 = Tk()
    rn1.title("Comparing Yearly Temperature (° F)")
    rn1.configure(background="purple")
    # App placement
    app_width = 500
    app_height = 450
    screen_width = rn1.winfo_screenwidth()
    screen_height = rn1.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    rn1.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(rn1, text="Comparing Yearly Temperature (° F)",
                     background='purple', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(rn1, text="Select Year 1 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(rn1, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(rn1, text="Select Year 2 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(rn1, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(rn1, text="Select Year 3 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(rn1, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Year Selection 4 by creating labels, comboboxes & combobox drop down lists
    lbl4 = ttk.Label(rn1, text="Select Year 4 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl4.place(x=80, y=265)
    n = tk.StringVar()
    yearchoosen4 = ttk.Combobox(rn1, width=27, textvariable=n)
    yearchoosen4['values'] = years
    yearchoosen4.current()
    yearchoosen4.place(x=180, y=265)
    # Submit button function to make the comparison plot
    def clickrn1():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        year4 = yearchoosen4.get()
        yearSelections = [year1, year2, year3, year4]
        makefourplot(yearSelections, 'temperature')
    # Adding submit & quit buttons
    btn = Button(rn1, text="Submit", width=6, command=clickrn1)
    btn.place(x=185, y=300)
    btn1 = Button(rn1, text="Quit", width=6, command=rn1.destroy)
    btn1.place(x=285, y=300)
    rn1.mainloop()
def rn2():
    rn2 = Tk()
    rn2.title("Comparing Yearly Dew Point (° F)")
    rn2.configure(background="purple")
    # App placement
    app_width = 500
    app_height = 450
    screen_width = rn2.winfo_screenwidth()
    screen_height = rn2.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    rn2.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(rn2, text="Comparing Yearly Dew Point (° F)",
                     background='purple', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(rn2, text="Select Year 1 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(rn2, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(rn2, text="Select Year 2 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(rn2, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(rn2, text="Select Year 3 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(rn2, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Year Selection 4 by creating labels, comboboxes & combobox drop down lists
    lbl4 = ttk.Label(rn2, text="Select Year 4 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl4.place(x=80, y=265)
    n = tk.StringVar()
    yearchoosen4 = ttk.Combobox(rn2, width=27, textvariable=n)
    yearchoosen4['values'] = years
    yearchoosen4.current()
    yearchoosen4.place(x=180, y=265)
    # Submit button function to make the comparison plot
    def clickrn2():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        year4 = yearchoosen4.get()
        yearSelections = [year1, year2, year3, year4]
        # 'DewPoint' matches this window's title (the original copy-pasted 'temperature')
        makefourplot(yearSelections, 'DewPoint')
    # Adding submit & quit buttons
    btn = Button(rn2, text="Submit", width=6, command=clickrn2)
    btn.place(x=185, y=300)
    btn1 = Button(rn2, text="Quit", width=6, command=rn2.destroy)
    btn1.place(x=285, y=300)
    rn2.mainloop()
def rn3():
    rn3 = Tk()
    # Humidity follows the pattern of the earlier option series
    # (the original copy-pasted the Dew Point title and 'temperature' data type)
    rn3.title("Comparing Yearly Humidity (%)")
    rn3.configure(background="purple")
    # App placement
    app_width = 500
    app_height = 450
    screen_width = rn3.winfo_screenwidth()
    screen_height = rn3.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    rn3.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(rn3, text="Comparing Yearly Humidity (%)",
                     background='purple', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(rn3, text="Select Year 1 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(rn3, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(rn3, text="Select Year 2 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(rn3, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(rn3, text="Select Year 3 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(rn3, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Year Selection 4 by creating labels, comboboxes & combobox drop down lists
    lbl4 = ttk.Label(rn3, text="Select Year 4 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl4.place(x=80, y=265)
    n = tk.StringVar()
    yearchoosen4 = ttk.Combobox(rn3, width=27, textvariable=n)
    yearchoosen4['values'] = years
    yearchoosen4.current()
    yearchoosen4.place(x=180, y=265)
    # Submit button function to make the comparison plot
    def clickrn3():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        year4 = yearchoosen4.get()
        yearSelections = [year1, year2, year3, year4]
        makefourplot(yearSelections, 'Humidity')
    # Adding submit & quit buttons
    btn = Button(rn3, text="Submit", width=6, command=clickrn3)
    btn.place(x=185, y=300)
    btn1 = Button(rn3, text="Quit", width=6, command=rn3.destroy)
    btn1.place(x=285, y=300)
    rn3.mainloop()
def rn4():
    rn4 = Tk()
    # Wind speed follows the pattern of the earlier option series
    # (the original copy-pasted the Dew Point title and 'temperature' data type)
    rn4.title("Comparing Yearly Wind Speed (mph)")
    rn4.configure(background="purple")
    # App placement
    app_width = 500
    app_height = 450
    screen_width = rn4.winfo_screenwidth()
    screen_height = rn4.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    rn4.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(rn4, text="Comparing Yearly Wind Speed (mph)",
                     background='purple', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year Selection 1 by creating labels, comboboxes & combobox drop down lists
    lbl1 = ttk.Label(rn4, text="Select Year 1 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl1.place(x=80, y=100)
    n = tk.StringVar()
    yearchoosen1 = ttk.Combobox(rn4, width=27, textvariable=n)
    yearchoosen1['values'] = years
    yearchoosen1.current()
    yearchoosen1.place(x=180, y=100)
    # Year Selection 2 by creating labels, comboboxes & combobox drop down lists
    lbl2 = ttk.Label(rn4, text="Select Year 2 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl2.place(x=80, y=155)
    n = tk.StringVar()
    yearchoosen2 = ttk.Combobox(rn4, width=27, textvariable=n)
    yearchoosen2['values'] = years
    yearchoosen2.current()
    yearchoosen2.place(x=180, y=155)
    # Year Selection 3 by creating labels, comboboxes & combobox drop down lists
    lbl3 = ttk.Label(rn4, text="Select Year 3 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl3.place(x=80, y=210)
    n = tk.StringVar()
    yearchoosen3 = ttk.Combobox(rn4, width=27, textvariable=n)
    yearchoosen3['values'] = years
    yearchoosen3.current()
    yearchoosen3.place(x=180, y=210)
    # Year Selection 4 by creating labels, comboboxes & combobox drop down lists
    lbl4 = ttk.Label(rn4, text="Select Year 4 :",
                     background='purple', foreground="black",
                     font=("Times New Roman", 10))
    lbl4.place(x=80, y=265)
    n = tk.StringVar()
    yearchoosen4 = ttk.Combobox(rn4, width=27, textvariable=n)
    yearchoosen4['values'] = years
    yearchoosen4.current()
    yearchoosen4.place(x=180, y=265)
    # Submit button function to make the comparison plot
    def clickrn4():
        year1 = yearchoosen1.get()
        year2 = yearchoosen2.get()
        year3 = yearchoosen3.get()
        year4 = yearchoosen4.get()
        yearSelections = [year1, year2, year3, year4]
        makefourplot(yearSelections, 'WindSpeed')
    # Adding submit & quit buttons
    btn = Button(rn4, text="Submit", width=6, command=clickrn4)
    btn.place(x=185, y=300)
    btn1 = Button(rn4, text="Quit", width=6, command=rn4.destroy)
    btn1.place(x=285, y=300)
    rn4.mainloop()
import tkinter as tk
from tkinter import *
from tkinter import ttk
import csv
import statistics
import matplotlib.pyplot as plt
from matplotlib.pyplot import figure
def getYearData(year, data_type, value_type):
    filename = 'FinallOneCSV.csv'
    yearData = []
    with open(filename, 'r') as csvfile:
        datareader = csv.reader(csvfile)
        for row in datareader:
            if row[0].find('Average' + str(year)) != -1:
                if data_type == 'temperature':
                    if value_type == 'Maximum':
                        yearData.append(float(row[2]))
                    elif value_type == 'Average':
                        yearData.append(float(row[3]))
                    elif value_type == 'Minimum':
                        yearData.append(float(row[4]))
                elif data_type == 'DewPoint':
                    if value_type == 'Maximum':
                        yearData.append(float(row[5]))
                    elif value_type == 'Average':
                        yearData.append(float(row[6]))
                    elif value_type == 'Minimum':
                        yearData.append(float(row[7]))
                elif data_type == 'Humidity':
                    if value_type == 'Maximum':
                        yearData.append(float(row[8]))
                    elif value_type == 'Average':
                        yearData.append(float(row[9]))
                    elif value_type == 'Minimum':
                        yearData.append(float(row[10]))
                elif data_type == 'WindSpeed':
                    if value_type == 'Maximum':
                        yearData.append(float(row[11]))
                    elif value_type == 'Average':
                        yearData.append(float(row[12]))
                    elif value_type == 'Minimum':
                        yearData.append(float(row[13]))
                elif data_type == 'Pressure':
                    if value_type == 'Maximum':
                        yearData.append(float(row[14]))
                    elif value_type == 'Average':
                        yearData.append(float(row[15]))
                    elif value_type == 'Minimum':
                        yearData.append(float(row[16]))
    return yearData
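# The branching in getYearData follows a regular pattern: each data type owns
# three consecutive CSV columns (Maximum, Average, Minimum). The sketch below
# collapses that ladder into a lookup table; csv_column_index is a hypothetical
# helper, not used elsewhere in this program, shown only to document the layout.

```python
# Column layout assumed from the if/elif ladder in getYearData above:
# each data type starts at a base column, and Maximum/Average/Minimum
# occupy three consecutive columns from that base.
DATA_COLUMNS = {'temperature': 2, 'DewPoint': 5, 'Humidity': 8,
                'WindSpeed': 11, 'Pressure': 14}
VALUE_OFFSETS = {'Maximum': 0, 'Average': 1, 'Minimum': 2}

def csv_column_index(data_type, value_type):
    # e.g. ('Humidity', 'Average') maps to column 9, matching row[9] above
    return DATA_COLUMNS[data_type] + VALUE_OFFSETS[value_type]
```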
start = Tk()
start.title("Main Menu")
start.configure(background="white")
#App placement
app_width = 400
app_height = 350
screen_width = start.winfo_screenwidth()
screen_height = start.winfo_screenheight()
x = (screen_width / 2) - (app_width / 2)
y = (screen_height / 2) - (app_height / 2)
start.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
#Label for Main Menu
mainlabel = ttk.Label(start, text = "\u0332".join("Main Menu"),
background = 'white', foreground ="Black",
font = ("Times New Roman", 15))
mainlabel.place(x=130, y=10)
# Finding the max value of each year (any data_type, not just temperature)
def findMaxValueYear(year_a, data_type):
    maxValueYear = []
    for year in year_a:
        year_a_data = getYearData(year, data_type, 'Maximum')
        maxValueYear.append(max(year_a_data))
    return maxValueYear

# Finding the mean value of each year
def findMeanValueYear(year_a, data_type):
    meanValueYear = []
    for year in year_a:
        year_a_data = getYearData(year, data_type, 'Average')
        meanValueYear.append(statistics.mean(year_a_data))
    return meanValueYear

# Finding the min value of each year
def findMinValueYear(year_a, data_type):
    minValueYear = []
    for year in year_a:
        year_a_data = getYearData(year, data_type, 'Minimum')
        minValueYear.append(min(year_a_data))
    return minValueYear
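The per-year summaries computed by the three helpers reduce to `max`, `statistics.mean` and `min` over each year's list; a sketch over a made-up in-memory dict instead of the CSV:

```python
import statistics

# Hypothetical daily readings for two years, keyed by year string.
readings = {'2019': [41.0, 55.0, 48.2], '2020': [43.0, 57.5, 50.1]}

# One summary value per year, in year order -- the same shape the
# find*ValueYear helpers hand to makefiveplot.
maxes = [max(readings[y]) for y in ('2019', '2020')]
means = [statistics.mean(readings[y]) for y in ('2019', '2020')]
mins = [min(readings[y]) for y in ('2019', '2020')]
```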
def makefiveplot(year, data_type):
    # Getting max, mean and min values to plot
    maxValuesYear = findMaxValueYear(year, data_type)
    meanValuesYear = findMeanValueYear(year, data_type)
    minValuesYear = findMinValueYear(year, data_type)
    # Better y-axis labels that include the units
    if data_type == 'temperature':
        ylabel = "Temperature (° F)"
    elif data_type == 'DewPoint':
        ylabel = "Dew Point (° F)"
    elif data_type == 'Humidity':
        ylabel = "Humidity (%)"
    elif data_type == 'WindSpeed':
        ylabel = "Wind Speed (mph)"
    elif data_type == 'Pressure':
        ylabel = "Pressure (in)"  # branch was missing; unit assumed
    # Creating a figure and adding a title, x/y labels and a grid
    plt.figure(figsize=(12, 5))
    plt.title(f"Comparing the maximum, mean & minimum {data_type} values of "
              f"{year[0]}, {year[1]}, {year[2]}, {year[3]} & {year[4]}")
    plt.ylabel(ylabel)
    plt.xlabel('Years')
    plt.grid(axis='y')
    # Plotting (the original passed s=150, but plt.bar has no 's' kwarg --
    # that belongs to plt.scatter -- so the calls raised an error)
    plt.bar(year, maxValuesYear, label="Maximum")
    plt.bar(year, meanValuesYear, label="Mean")
    plt.bar(year, minValuesYear, label="Minimum")
    # Adding a legend
    plt.legend(bbox_to_anchor=(1.1, 1.05))
    plt.show()
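The three `plt.bar` calls draw all series at the same x positions, so Mean and Minimum paint over Maximum. A grouped-bar variant (a sketch with made-up numbers, using the headless Agg backend so it runs anywhere) offsets each series by a fraction of the slot width:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend: no display needed
import matplotlib.pyplot as plt
import numpy as np

years = ['2019', '2020']
maxes, means, mins = [55.0, 57.5], [48.1, 50.2], [41.0, 43.0]

# Offset each series by the bar width so the three bars per year sit
# side by side instead of overlapping.
x = np.arange(len(years))
w = 0.25
fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(x - w, maxes, w, label="Maximum")
ax.bar(x, means, w, label="Mean")
ax.bar(x + w, mins, w, label="Minimum")
ax.set_xticks(x)
ax.set_xticklabels(years)
ax.legend()
```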
years = ['2009','2010','2011','2012','2013', '2014', '2015', '2016', '2017', '2018', '2019', '2020']
# eb1-eb4 were four near-identical windows differing only in their title and
# the data_type passed to makefiveplot, so one builder covers all four.
def make_compare_window(title, data_type):
    win = Tk()
    win.title(title)
    win.configure(background="white")
    # App placement
    app_width = 500
    app_height = 450
    screen_width = win.winfo_screenwidth()
    screen_height = win.winfo_screenheight()
    x = (screen_width / 2) - (app_width / 2)
    y = (screen_height / 2) - (app_height / 2)
    win.geometry(f'{app_width}x{app_height}+{int(x)}+{int(y)}')
    lbl0 = ttk.Label(win, text=title,
                     background='white', foreground="black",
                     font=("Times New Roman", 15))
    lbl0.place(x=100, y=20)
    # Year selections 1-5: a label plus a combobox with a drop-down list
    comboboxes = []
    for i in range(5):
        y_pos = 100 + i * 55
        lbl = ttk.Label(win, text=f"Select Year {i + 1} :",
                        background='white', foreground="black",
                        font=("Times New Roman", 10))
        lbl.place(x=80, y=y_pos)
        combo = ttk.Combobox(win, width=27)
        combo['values'] = years
        combo.place(x=180, y=y_pos)
        comboboxes.append(combo)
    # Submit button callback: gather the five selections and plot them
    def submit():
        yearSelections = [combo.get() for combo in comboboxes]
        makefiveplot(yearSelections, data_type)
    # Adding submit & quit buttons
    btn = Button(win, text="Submit", width=6, command=submit)
    btn.place(x=185, y=365)
    btn1 = Button(win, text="Quit", width=6, command=win.destroy)
    btn1.place(x=285, y=365)
    win.mainloop()

# getYearData matches on 'DewPoint' and 'WindSpeed', so the data_type strings
# must use that exact spelling (the originals passed 'dew point' and
# 'wind speed', which silently returned empty data).
def eb1():
    make_compare_window("Comparing Yearly Temperature (° F)", 'temperature')

def eb2():
    make_compare_window("Comparing Yearly Dew Point (° F)", 'DewPoint')

def eb3():
    make_compare_window("Comparing Yearly Humidity (%)", 'Humidity')

def eb4():
    make_compare_window("Comparing Yearly Wind Speed (mph)", 'WindSpeed')
# Five-year comparison options
ml5 = ttk.Label(start, text="Five Years",
                background='light green', foreground="black",
                font=("Times New Roman", 15))
# x=625 placed the label outside the 400-px-wide window; keep it visible
ml5.place(x=160, y=55)
btn14 = Button(start,text="Temperature (° F)", width=20, command=eb1)
btn14.place(x=120, y=100)
btn15 = Button(start,text="Dew Point (° F)", width=20, command=eb2)
btn15.place(x=120, y=150)
btn16 = Button(start,text="Humidity (%)", width=20, command=eb3)
btn16.place(x=120, y=200)
btn17 = Button(start,text="Wind Speed (mph)", width=20, command=eb4)
btn17.place(x=120, y=250)
start.mainloop()
] | null | null | null | # 有三種網頁轉換方法,必放
from django.shortcuts import render # 呼叫模板,合成後送往瀏覽器
from django.http import HttpResponse, request # 程式送往瀏覽器
from django.shortcuts import redirect # 程式送往程式
import pymysql
import re
import pandas as pd
from datetime import datetime
from sql_account import sql_account
'''To consider
1. Show data according to permission level - ok
2. Send a message on delete / update - feature deferred
3. Include the account in the JavaScript prompt on delete / update - ok
4. list_all should sort by department - ok
5. Basic styling - ok
6. Export the list - ok
7. Import - feature deferred because of practical concerns
'''
'''
level
adm, pre, dir, spe, sup, nor
dep (department names are stored in Chinese in the DB and must stay as-is)
總部, 財務部, 人力資源部, 業務部, 客戶服務部, 總務部, 企劃部
admin_Office, Finance_Department, Human_Resources_Department, Sales_Department, Customer_Service_Department, General_Affairs_Department, Planning_Department'''
# =========================== test response ==============================
# return HttpResponse('hi')
# ======================== staff_Login_Data_Retrieve =================================
# Helper every page uses to show the logged-in user's details
def staff_Login_Data_Retrieve(request):
    # Empty dict to collect the session values stored at login
    staff_Login_Data = {}
    staff_Login_Data['login_account'] = request.session['login_account']
    staff_Login_Data['login_name'] = request.session['login_name']
    staff_Login_Data['login_subjection_depar'] = request.session['login_subjection_depar']
    staff_Login_Data['level'] = request.session['level']
    # Flag that a login exists, passed on to other views and templates
    staff_Login_Data['login'] = 1
    return staff_Login_Data
# ========================= main page ============================
def index(request):
    return render(request, 'index.html')

# ==================== staff page =============================
def staff_index(request):
    show = {}
    # Check the session for a login
    if request.session.get("login_name") != None:
        staff_Login_Data = staff_Login_Data_Retrieve(request)
        # Pick which view of the page to show. The original tests of the form
        # `(a and b) in (list1 and list2)` only ever checked `b in list2`,
        # so department and level are compared separately here.
        depar = staff_Login_Data['login_subjection_depar']
        level = staff_Login_Data['level']
        if depar == '總部' and level in ['adm', 'pre']:
            show['data'] = 0
        elif depar == '人力資源部' and level in ['dir', 'spe']:
            show['data'] = 1
        elif depar in ['財務部', '業務部', '客戶服務部', '總務部', '企劃部'] and level in ['dir', 'spe', 'sup']:
            show['data'] = 2
        else:
            return render(request, 'staff\\staff_index.html', {'staff_Login_Data': staff_Login_Data})
        # Send the login details and the chosen view to the template
        return render(request, "staff\\staff_index.html", {'staff_Login_Data': staff_Login_Data, 'show': show})
    else:
        return render(request, 'staff\\staff_index.html')
# ====================== staff_login =======================
def staff_Login(request):
    if request.session.get("login_name") == None:
        # POST carries the form submission; anything else just shows the form
        if request.method == "POST":
            # Pull the values out of the HTML form
            account = request.POST['account']
            name = request.POST['name']
            password = request.POST['password']
            # Connect to the database via sql_account
            db = sql_account.connect()
            cursor = db.cursor()
            # Check whether the account exists; keep the select columns aligned
            # with the indexes used below, otherwise "tuple index out of range"
            sql = "select account, password, name, subjection_depar, level from staff_contrl where account='{}'".format(account)
            cursor.execute(sql)
            db.commit()
            # Stage the row for the checks
            staff_Login_Data = cursor.fetchone()
            # Step 1: does the account exist? fetchone() returns None when no
            # row matched, so test the row itself (indexing None would raise)
            if staff_Login_Data != None:
                # Step 2: does the password match
                if staff_Login_Data[1] == password:
                    # Step 3: does the name match
                    if staff_Login_Data[2] == name:
                        # Store the login in the session for the other views
                        # [0][2][3][4] -> account, name, subjection_depar, level
                        request.session['login_account'] = staff_Login_Data[0]
                        request.session['login_name'] = staff_Login_Data[2]
                        request.session['login_subjection_depar'] = staff_Login_Data[3]
                        request.session['level'] = staff_Login_Data[4]
                        return redirect("/staff_index/")
                    else:
                        return HttpResponse("Name not found, please log in again <a href='/staff_Login/'>back</a>")
                else:
                    return HttpResponse("Wrong password, please log in again <a href='/staff_Login/'>back</a>")
            else:
                return HttpResponse("Unknown account, please contact the administrator <a href='/staff_index/'>back</a>")
        else:
            return render(request, 'staff\\staff_Login.html')
    else:
        # Already logged in; go straight back to the staff index
        return redirect("/staff_index/")
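The string-formatted SQL above is open to injection through the `account` field; a parameterized query hands the value to the driver instead. pymysql uses `%s` placeholders; the self-contained sketch below uses sqlite3 (`?` placeholders, hypothetical table contents) so it runs without a MySQL server:

```python
import sqlite3

db = sqlite3.connect(":memory:")
cur = db.cursor()
cur.execute("create table staff_contrl (account text, password text, name text)")
cur.execute("insert into staff_contrl values ('rex', 'Pw123456', 'Rex')")

def find_account(account):
    # The driver quotes the value itself, so "' or '1'='1" stays a literal
    # string instead of becoming part of the SQL.
    cur.execute("select account, password, name from staff_contrl where account=?",
                (account,))
    return cur.fetchone()
```

With pymysql the same call is `cursor.execute("select ... where account=%s", (account,))`.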
# Logging out goes from having an account to none, so it carries `account`
def staff_Logout(request, account=""):
    if request.session.get("login_name") != None:
        del request.session['login_account']
        del request.session['login_name']
        del request.session['login_subjection_depar']
        del request.session['level']
        # redirect() takes a URL, not a template path
        return redirect('/staff_index/')
    else:
        return HttpResponse("Logged out of the staff system <a href='/index/'>back to the main page</a>")
# ======================= staff_Create ==========================
def staff_Create(request):
    if request.session.get("login_name") != None:
        staff_Login_Data = staff_Login_Data_Retrieve(request)
        # Permission check, step 1: department. (`list1 or list2` evaluates
        # to list1, so the original never consulted the English names.)
        if request.session['login_subjection_depar'] in ['總部', '人力資源部', 'admin_Office', 'Human_Resources_Department']:
            # Permission check, step 2: job level
            if request.session['level'] in ['adm', 'pre', 'dir', 'spe']:
                # Send the login details along to the template
                return render(request, 'staff\\staff_Create.html', {'staff_Login_Data': staff_Login_Data})
            else:
                return HttpResponse('Insufficient permission level')
        else:
            return HttpResponse("Outside this department's authority")
    else:
        # Must be logged in to create records
        return HttpResponse("Not logged in <a href='/staff_Login/'>log in</a>")
def staff_DubleCheck(request):
    if request.session.get("login_name") != None:
        staff_Login_Data = staff_Login_Data_Retrieve(request)
        # Dict that collects the form input from staff_Create;
        # request.POST['key'] yields the plain string value
        data = {}
        data['account'] = request.POST['account']
        data['name'] = request.POST['name']
        data['password'] = request.POST['password']
        data['privacy_mail'] = request.POST['privacy_mail']
        data['mobile_phine'] = request.POST['mobile_phine']
        data['addr'] = request.POST['addr']
        data['emergency_contact_name'] = request.POST['emergency_contact_name']
        data['emergency_contact_tel'] = request.POST['emergency_contact_tel']
        data['subjection_depar'] = request.POST['subjection_depar']
        data['job_title'] = request.POST['job_title']
        data['depar_director'] = request.POST['depar_director']
        # ---------- validation: account, name, password, phone, mail ----------
        # Account: letters, digits and underscore only
        if not re.search(r"[A-Za-z]+", request.POST['account']):
            return HttpResponse("Bad account: no spaces or special characters")
        elif len(request.POST['account']) < 4:
            return HttpResponse("Account too short")
        else:
            # Valid: store it with surrounding whitespace stripped
            data['account'] = request.POST['account'].strip()
        # Name: Chinese characters only
        if not re.search(r"[\u4e00-\u9fa5]", request.POST['name']):
            return HttpResponse("Bad name: Chinese characters only")
        else:
            data['name'] = request.POST['name'].strip()
        # Password: at least one upper- and one lower-case letter, 6-15 characters
        if re.search(r"\s", request.POST['password']):
            return HttpResponse("Bad password: no spaces allowed, go back")
        elif not re.search(r"[A-Z]", request.POST['password']):
            return HttpResponse("Bad password: needs at least one upper-case letter")
        elif not re.search(r"[a-z]", request.POST['password']):
            return HttpResponse("Bad password: needs at least one lower-case letter")
        # Length checks
        elif len(request.POST['password']) < 6:
            return HttpResponse("Bad password: must be longer than 6 characters")
        elif len(request.POST['password']) > 15:
            return HttpResponse("Bad password: must be shorter than 15 characters")
        else:
            data['password'] = request.POST['password'].strip()
        # Mobile phone: digits only, exactly 10 of them
        if not re.search(r"09\d+", request.POST['mobile_phine']):
            return HttpResponse("Mobile number must be digits")
        elif len(request.POST['mobile_phine']) != 10:
            return HttpResponse("Mobile number is 10 digits")
        else:
            data['mobile_phine'] = request.POST['mobile_phine'].strip()
        # Personal mail: xxx@xxx.xxx with a 2-6 character top-level part
        if not re.search(r"[a-z0-9_\.-]+\@[\da-z\.-]+\.[a-z\.]{2,6}", request.POST['privacy_mail']):
            return HttpResponse("Bad mail format")
        else:
            data['privacy_mail'] = request.POST['privacy_mail'].strip()
        # Emergency contact phone: digits only, exactly 10 of them
        if not re.search(r"09\d+", request.POST['emergency_contact_tel']):
            return HttpResponse("Emergency contact number must be digits")
        elif len(request.POST['emergency_contact_tel']) != 10:
            return HttpResponse("Emergency contact number is 10 digits")
        else:
            data['emergency_contact_tel'] = request.POST['emergency_contact_tel'].strip()
        return render(request, 'staff\\staff_DubleCheck.html', {'data': data, 'staff_Login_Data': staff_Login_Data})
    else:
        # Must be logged in to create records
        return HttpResponse("Not logged in <a href='/staff_Login/'>log in</a>")
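The password rules above (no whitespace, one upper-case and one lower-case letter, 6-15 characters) can be collected into one reusable checker; a sketch using the same regexes, returning the first broken rule:

```python
import re

def password_error(pw):
    """Return the first rule the password breaks, or None when it passes."""
    if re.search(r"\s", pw):
        return "no spaces"
    if not re.search(r"[A-Z]", pw):
        return "needs an upper-case letter"
    if not re.search(r"[a-z]", pw):
        return "needs a lower-case letter"
    if len(pw) < 6:
        return "too short"
    if len(pw) > 15:
        return "too long"
    return None
```

A single checker keeps staff_DubleCheck and staff_ReviseDB from drifting apart, since both validate the same fields.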
def staff_CreateConfirm(request):
    try:
        # Receive the confirmed input from staff_DubleCheck
        account = request.POST['account']
        name = request.POST['name']
        password = request.POST['password']
        privacy_mail = request.POST['privacy_mail']
        # 'NULL' placeholder so a missing value does not break the varchar column
        mail = 'NULL'
        mobile_phine = request.POST['mobile_phine']
        addr = request.POST['addr']
        emergency_contact_name = request.POST['emergency_contact_name']
        emergency_contact_tel = request.POST['emergency_contact_tel']
        # 0 placeholder so a missing value does not break the int columns
        status = 0
        category = 0
        subjection_depar = request.POST['subjection_depar']
        job_title = request.POST['job_title']
        depar_director = request.POST['depar_director']
        level = 'NULL'
        note = 'NULL'
        nomal_hour_month = 0
        total_hour_month = 0
        official_leave = 0
        annual_sick_leave = 0
        overtime_hour = 0
        # Connect to the database via sql_account
        db = sql_account.connect()
        cursor = db.cursor()
        # First check whether the account already exists
        sql = "select * from staff_contrl where account='{}'".format(account)
        cursor.execute(sql)
        db.commit()
        # Stage the check result (tmp = temporary)
        tmp = cursor.fetchone()
        # An empty result means the account is free
        if tmp == None:
            # Insert the record
            sql = "insert into staff_contrl (account, name, password, privacy_mail, mail, mobile_phine, addr, emergency_contact_name, emergency_contact_tel, status, category, subjection_depar, job_title, depar_director, level, note, nomal_hour_month, total_hour_month, official_leave, annual_sick_leave, overtime_hour) values ('{}','{}','{}','{}','{}','{}','{}','{}','{}',{},{},'{}','{}','{}','{}','{}',{},{},{},{},{})".format(account, name, password, privacy_mail, mail, mobile_phine, addr, emergency_contact_name, emergency_contact_tel, status, category, subjection_depar, job_title, depar_director, level, note, nomal_hour_month, total_hour_month, official_leave, annual_sick_leave, overtime_hour)
            cursor.execute(sql)
            db.commit()
            cursor.close()
            db.close()
            result = "Saved <a href='/staff_index/'>back to the index</a>"
        else:
            return HttpResponse('Account already exists, pick another <a href="/staff_Create/">back</a>')
    except:
        result = "Save failed"
    return HttpResponse(result)
# =================== staff_ListAll =========================
# After login, show everything when the department allows it
def staff_ListAll(request):
    # Check the session for a login name (use .get to avoid a KeyError)
    if request.session.get("login_name") != None:
        staff_Login_Data = staff_Login_Data_Retrieve(request)
        # (`['總部'] or ['admin_Office']` evaluates to the first list, so the
        # English name is listed explicitly here.)
        if request.session['login_subjection_depar'] in ['總部', 'admin_Office']:
            # Permission check: job level
            if request.session['level'] in ['adm', 'pre']:
                try:
                    # Connect to the database via sql_account
                    db = sql_account.connect()
                    cursor = db.cursor()
                    # The list needs every row
                    sql = "select * from staff_contrl"
                    cursor.execute(sql)
                    db.commit()
                    staff_ListAll = cursor.fetchall()
                    cursor.close()
                    db.close()
                    # Hand the rows and the login details to the template
                    return render(request, "staff\\staff_ListAll.html", {'staff_ListAll': staff_ListAll, 'staff_Login_Data': staff_Login_Data})
                except:
                    return HttpResponse('Read failed, please retry <a href="/staff_index/">back to the staff index</a>')
            else:
                return HttpResponse('Insufficient permission level')
        else:
            return HttpResponse("Outside this department's authority")
    else:
        return HttpResponse('<a href="/staff_Login/">Not logged in, please log in first</a>')
# After login, show only the logged-in user's own department
def dep_Staff_ListAll(request):
    # Check the session for a login name (use .get to avoid a KeyError)
    if request.session.get("login_name") != None:
        staff_Login_Data = staff_Login_Data_Retrieve(request)
        subjection_depar = request.session['login_subjection_depar']
        if request.session['login_subjection_depar'] in ['財務部', '人力資源部', '業務部', '客戶服務部', '總務部', '企劃部']:
            # Permission check: job level
            if request.session['level'] in ['dir', 'spe', 'sup']:
                try:
                    # Connect to the database via sql_account
                    db = sql_account.connect()
                    cursor = db.cursor()
                    # The list needs every row of this department
                    sql = "select * from staff_contrl where subjection_depar='{}'".format(subjection_depar)
                    cursor.execute(sql)
                    db.commit()
                    dep_Staff_ListAll = cursor.fetchall()
                    cursor.close()
                    db.close()
                    # Hand the rows and the login details to the template
                    return render(request, "staff\\dep_Staff_ListAll.html", {'dep_Staff_ListAll': dep_Staff_ListAll, 'staff_Login_Data': staff_Login_Data})
                except:
                    return HttpResponse('Read failed, please retry <a href="/staff_index/">back to the staff index</a>')
            else:
                return HttpResponse('Insufficient permission level')
        else:
            return HttpResponse("Outside this department's authority")
    else:
        return HttpResponse('<a href="/staff_Login/">Not logged in, please log in first</a>')
# ================== personal_staff_Revise ===========================
# An update always carries the account
def staff_Revise(request, account=""):
    # Only for logged-in users
    if request.session.get("login_name") != None:
        # Use the session's login_account as the SQL search key
        account = request.session['login_account']
        if request.session['login_subjection_depar'] in ['總部', '財務部', '人力資源部', '業務部', '客戶服務部', '企劃部', '總務部',
                                                         'admin_Office', 'Finance_Department', 'Human_Resources_Department',
                                                         'Sales_Department', 'Customer_Service_Department',
                                                         'Planning_Department', 'General_Affairs_Department']:
            if request.session['level'] in ['adm', 'pre', 'dir', 'sup', 'spe', 'nor']:
                try:
                    # Connect to the database via sql_account
                    db = sql_account.connect()
                    cursor = db.cursor()
                    sql = "select * from staff_contrl where account = '{}'".format(account)
                    cursor.execute(sql)
                    db.commit()
                    # Fetch the single row BEFORE closing -- the original
                    # closed the connection first, which breaks fetchone()
                    staff_Revise_Data = cursor.fetchone()
                    cursor.close()
                    db.close()
                    # Is the fetched row empty?
                    if staff_Revise_Data != None:
                        # Login details for the basic.html header
                        staff_Login_Data = staff_Login_Data_Retrieve(request)
                        return render(request, 'staff\\staff_Revise.html', {'staff_Revise_Data': staff_Revise_Data, 'size': 1, 'staff_Login_Data': staff_Login_Data})
                    else:
                        return HttpResponse('No matching row in the database <a href="/staff_index/">back</a>')
                except:
                    return HttpResponse('Database connection failed, please retry <a href="/staff_index/">back</a>')
            else:
                return HttpResponse('Insufficient permission level')
        else:
            return HttpResponse("Outside this department's authority")
    else:
        return HttpResponse("Not logged in <a href='/staff_Login/'>log in</a>")
def staff_ReviseDB(request):
# 將回傳值包回字典{key:value}
data = {}
# ----------驗證區 >> 帳號,姓名,密碼,電話,信箱----------------------
# 姓名驗證,只接受中文
if not re.search(r"[\u4e00-\u9fa5]", request.POST['name']):
msg = "姓名輸入錯誤,只接受中文"
return HttpResponse(msg)
else:
# 若姓名格式皆正確,存入原始變數並把所有空白都移除
data['name'] = request.POST['name'].strip()
# 密碼驗證,密碼要包含一個大小寫英文,長度大於6,小於15字元
if re.search(r"\s", request.POST['password']):
msg = "密碼輸入錯誤,不包含空白,請返回上一頁"
return HttpResponse(msg)
elif not re.search(r"[A-Z]", request.POST['password']):
msg = "密碼輸入錯誤,需至少需一個大寫英文"
return HttpResponse(msg)
elif not re.search(r"[a-z]", request.POST['password']):
msg = "密碼輸入錯誤,需至少需一個小寫英文"
return HttpResponse(msg)
# 長度檢查
elif len(request.POST['password']) < 6:
msg = "密碼輸入錯誤,長度需大於6個字元"
return HttpResponse(msg)
elif len(request.POST['password']) > 15:
msg = "密碼輸入錯誤,長度需小於15個字元"
return HttpResponse(msg)
else:
# 若密碼格式皆正確,把所有空白都移除
data['password'] = request.POST['password'].strip()
# 手機驗證,只接受數字,不接受特殊字元,長度需 == 10
if not re.search(r"09\d+", request.POST['mobile_phine']):
msg = "手機號碼需為數字"
return HttpResponse(msg)
elif len(request.POST['mobile_phine']) > 10:
msg = "手機號碼為10個數字"
return HttpResponse(msg)
elif len(request.POST['mobile_phine']) < 10:
msg = "手機號碼為10個數字"
return HttpResponse(msg)
else:
# 若手機號碼格式皆正確,存入原始變數並把所有空白都移除
data['mobile_phine'] = request.POST['mobile_phine'].strip()
# 私人信箱驗證,格式 > xxx@xxx.xxx,長度2-6字元
if not re.search(r"[a-z0-9_\.-]+\@[\da-z\.-]+\.[a-z\.]{2,6}", request.POST['privacy_mail']):
msg = "信箱格式錯誤"
return HttpResponse(msg)
else:
# 若信箱格式皆正確,存入原始變數並把所有空白都移除
data['privacy_mail'] = request.POST['privacy_mail'].strip()
# 緊急聯絡人電話驗證,只接受數字,不接受特殊字元,長度需 == 10
if not re.search(r"09\d+", request.POST['emergency_contact_tel']):
msg = "緊急聯絡人電話號碼需為數字"
return HttpResponse(msg)
elif len(request.POST['emergency_contact_tel']) > 10:
msg = "緊急聯絡人電話號碼為10個數字"
return HttpResponse(msg)
elif len(request.POST['emergency_contact_tel']) < 10:
msg = "緊急聯絡人電話號碼為10個數字"
return HttpResponse(msg)
else:
# 若緊急連絡人電話格式皆正確,存入原始變數並把所有空白都移除
data['emergency_contact_tel'] = request.POST['emergency_contact_tel'].strip()
# Receive the form data from staff_Revise and convert it into the values to write back to the database
# This section only collects the values; the db update itself happens below
account = request.POST['account']
# Put the validated data into the variables that will be stored in the db
name = data['name']
password = data['password']
privacy_mail = data['privacy_mail']
mail = request.POST['mail']
mobile_phine = data['mobile_phine']
addr = request.POST['addr']
emergency_contact_name = request.POST['emergency_contact_name']
emergency_contact_tel = data['emergency_contact_tel']
status = request.POST['status']
category = request.POST['category']
subjection_depar = request.POST['subjection_depar']
job_title = request.POST['job_title']
depar_director = request.POST['depar_director']
level = request.POST['level']
note = request.POST['note']
nomal_hour_month = request.POST['nomal_hour_month']
total_hour_month = request.POST['total_hour_month']
official_leave = request.POST['official_leave']
annual_sick_leave = request.POST['annual_sick_leave']
overtime_hour = request.POST['overtime_hour']
# Perform the db update
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
# The where clause is required > otherwise every row in the table would be updated; chain extra conditions with "and"
# Parameterized query (assumes a DB-API driver such as PyMySQL with %s placeholders) to avoid SQL injection
sql = "update staff_contrl set name=%s, password=%s, privacy_mail=%s, mail=%s, mobile_phine=%s, addr=%s, emergency_contact_name=%s, emergency_contact_tel=%s, status=%s, category=%s, subjection_depar=%s, job_title=%s, depar_director=%s, level=%s, note=%s, nomal_hour_month=%s, total_hour_month=%s, official_leave=%s, annual_sick_leave=%s, overtime_hour=%s where account=%s"
cursor.execute(sql, (name, password, privacy_mail, mail, mobile_phine, addr, emergency_contact_name, emergency_contact_tel, status, category, subjection_depar, job_title, depar_director, level, note, nomal_hour_month, total_hour_month, official_leave, annual_sick_leave, overtime_hour, account))
db.commit()
return HttpResponse("<a href='/staff_Revise/'>個人資料修改成功,回至修改頁面</a>")
# ==================all_staff_Revise===========================
def all_staff_Revise(request, account=""):
# Check whether the user is logged in
if request.session.get("login_name") != None:
# Permission check > department check
# Note: membership must be tested against the union of both lists; "in (A or B)" would only ever check the first list
if request.session['login_subjection_depar'] in (['總部', '財務部', '人力資源部', '業務部', '客戶服務部', '企劃部', '總務部'] + ['admin_Office', 'Finance_Department', 'Human_Resources_Department', 'Sales_Department', 'Customer_Service_Department', 'Planning_Department', 'General_Affairs_Department']):
# Permission check > job-level check
if request.session['level'] in ['adm', 'pre', 'spe', 'dir']:
# Use the account from staff_listAll as the sql search condition
if account != "":
try:
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
# Parameterized query to avoid SQL injection (assumes a DB-API driver with %s placeholders)
sql = "select * from staff_contrl where account = %s"
cursor.execute(sql, (account,))
# Fetch the single row *before* closing the cursor and connection
all_Staff_Revise_Data = cursor.fetchone()
cursor.close()
db.close()
# Check whether the fetched row is empty
if all_Staff_Revise_Data != None:
# Call staff_Login_Data_Retrieve so basic.html can display the logged-in user's data
staff_Login_Data = staff_Login_Data_Retrieve(request)
return render(request, 'staff/all_staff_Revise.html', {'all_Staff_Revise_Data': all_Staff_Revise_Data, 'size': 1, 'staff_Login_Data': staff_Login_Data})
else:
return HttpResponse('資料庫無資料取出 <a href="/staff_index/" >回上一頁</a>')
except:
return HttpResponse('資料庫連線失敗,請重試 <a href="/staff_index/" >回上一頁</a>')
else:
return HttpResponse('資料庫未找到相關資料,請返回重新嘗試 <a href="/staff_index/" >回上一頁</a>')
else:
# Job-level check failed
return HttpResponse('權限等級不足')
else:
# Department check failed
return HttpResponse('不隸屬於部門範圍')
else:
return HttpResponse("尚未登入 <a href='/staff_Login/'>進行登入</a>")
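Every view in this module repeats the same three-step session gate (login, department, job level). As a hedged sketch — `require_staff`, `Req`, and the stubbed `HttpResponse` are hypothetical stand-ins, not the project's actual code (in the project it would be `django.http.HttpResponse`) — the gate could be factored into a decorator:

```python
# Stand-in so the sketch is self-contained; the real project would use
# django.http.HttpResponse here.
def HttpResponse(body):
    return body

ALLOWED_DEPARTMENTS = ['總部', '財務部', '人力資源部', '業務部', '客戶服務部', '企劃部', '總務部']

def require_staff(levels):
    """Decorator: check login, department membership and job level before the view runs."""
    def wrap(view):
        def inner(request, *args, **kwargs):
            session = request.session
            if session.get("login_name") is None:
                return HttpResponse("尚未登入 <a href='/staff_Login/'>進行登入</a>")
            if session.get('login_subjection_depar') not in ALLOWED_DEPARTMENTS:
                return HttpResponse('不隸屬於部門範圍')
            if session.get('level') not in levels:
                return HttpResponse('權限等級不足')
            return view(request, *args, **kwargs)
        return inner
    return wrap

class Req:
    """Minimal stand-in for a Django request (only the session dict is used)."""
    def __init__(self, session):
        self.session = session

@require_staff(levels=['adm', 'pre', 'spe', 'dir'])
def view(request):
    return 'ok'
```

Each view body would then shrink to its database logic, with the permission checks expressed once in the decorator.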
def all_staff_ReviseDB(request, account=""):
# Pack the validated values into a dict {key: value}
data = {}
# ---------- Validation section >> account, name, password, phone, email ----------------------
# Name validation: accepts Chinese characters only
if not re.search(r"[\u4e00-\u9fa5]", request.POST['name']):
msg = "姓名輸入錯誤,只接受中文"
return HttpResponse(msg)
else:
# Name format is valid: strip surrounding whitespace and store it
data['name'] = request.POST['name'].strip()
# Password validation: must contain at least one uppercase and one lowercase letter, length 6-15 characters
if re.search(r"\s", request.POST['password']):
msg = "密碼輸入錯誤,不可包含空白,請返回上一頁"
return HttpResponse(msg)
elif not re.search(r"[A-Z]", request.POST['password']):
msg = "密碼輸入錯誤,需至少一個大寫英文"
return HttpResponse(msg)
elif not re.search(r"[a-z]", request.POST['password']):
msg = "密碼輸入錯誤,需至少一個小寫英文"
return HttpResponse(msg)
# Length check
elif len(request.POST['password']) < 6:
msg = "密碼輸入錯誤,長度需至少6個字元"
return HttpResponse(msg)
elif len(request.POST['password']) > 15:
msg = "密碼輸入錯誤,長度不可超過15個字元"
return HttpResponse(msg)
else:
# Password format is valid: strip surrounding whitespace and store it
data['password'] = request.POST['password'].strip()
# Mobile number validation: digits only with a leading 09, no special characters, length must be exactly 10
# Anchored with ^...$ so a partial substring match cannot pass validation
if not re.search(r"^09\d+$", request.POST['mobile_phine']):
msg = "手機號碼需為數字"
return HttpResponse(msg)
elif len(request.POST['mobile_phine']) > 10:
msg = "手機號碼為10個數字"
return HttpResponse(msg)
elif len(request.POST['mobile_phine']) < 10:
msg = "手機號碼為10個數字"
return HttpResponse(msg)
else:
# Mobile number format is valid: strip surrounding whitespace and store it
data['mobile_phine'] = request.POST['mobile_phine'].strip()
# Personal email validation, format > xxx@xxx.xxx, TLD 2-6 characters
# fullmatch is used so the whole field (not just a substring) must match
if not re.fullmatch(r"[a-z0-9_\.-]+\@[\da-z\.-]+\.[a-z\.]{2,6}", request.POST['privacy_mail']):
msg = "私人信箱格式錯誤"
return HttpResponse(msg)
else:
# Email format is valid: strip surrounding whitespace and store it
data['privacy_mail'] = request.POST['privacy_mail'].strip()
# Company email validation, format > xxx@xxx.xxx, TLD 2-6 characters
if not re.fullmatch(r"[a-z0-9_\.-]+\@[\da-z\.-]+\.[a-z\.]{2,6}", request.POST['mail']):
msg = "公司信箱格式錯誤"
return HttpResponse(msg)
else:
# Email format is valid: strip surrounding whitespace and store it
data['mail'] = request.POST['mail'].strip()
# Emergency contact phone validation: digits only with a leading 09, length must be exactly 10
if not re.search(r"^09\d+$", request.POST['emergency_contact_tel']):
msg = "緊急聯絡人電話號碼需為數字"
return HttpResponse(msg)
elif len(request.POST['emergency_contact_tel']) > 10:
msg = "緊急聯絡人電話號碼為10個數字"
return HttpResponse(msg)
elif len(request.POST['emergency_contact_tel']) < 10:
msg = "緊急聯絡人電話號碼為10個數字"
return HttpResponse(msg)
else:
# Emergency contact phone format is valid: strip surrounding whitespace and store it
data['emergency_contact_tel'] = request.POST['emergency_contact_tel'].strip()
# Receive the form data from all_staff_Revise and convert it into the values to write back to the database
account = request.POST['account']
name = data['name']
password = data['password']
privacy_mail = data['privacy_mail']
mail = data['mail']
mobile_phine = data['mobile_phine']
addr = request.POST['addr']
emergency_contact_name = request.POST['emergency_contact_name']
emergency_contact_tel = data['emergency_contact_tel']
status = request.POST['status']
category = request.POST['category']
subjection_depar = request.POST['subjection_depar']
job_title = request.POST['job_title']
depar_director = request.POST['depar_director']
level = request.POST['level']
note = request.POST['note']
nomal_hour_month = request.POST['nomal_hour_month']
total_hour_month = request.POST['total_hour_month']
official_leave = request.POST['official_leave']
annual_sick_leave = request.POST['annual_sick_leave']
overtime_hour = request.POST['overtime_hour']
# Perform the update
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
# The where clause is required > otherwise every row would be updated; chain extra conditions with "and"
# Parameterized query (assumes a DB-API driver such as PyMySQL with %s placeholders) to avoid SQL injection
sql = "update staff_contrl set name=%s, password=%s, privacy_mail=%s, mail=%s, mobile_phine=%s, addr=%s, emergency_contact_name=%s, emergency_contact_tel=%s, status=%s, category=%s, subjection_depar=%s, job_title=%s, depar_director=%s, level=%s, note=%s, nomal_hour_month=%s, total_hour_month=%s, official_leave=%s, annual_sick_leave=%s, overtime_hour=%s where account=%s"
cursor.execute(sql, (name, password, privacy_mail, mail, mobile_phine, addr, emergency_contact_name, emergency_contact_tel, status, category, subjection_depar, job_title, depar_director, level, note, nomal_hour_month, total_hour_month, official_leave, annual_sick_leave, overtime_hour, account))
db.commit()
return HttpResponse("<a href='/staff_ListAll/'>職員資料修改成功,回至職員列表</a>")
# ==================dep_all_staff_Revise===========================
def dep_all_staff_Revise(request, account=""):
# Check whether the user is logged in
if request.session.get("login_name") != None:
# Permission check > department check
if request.session['login_subjection_depar'] in ['總部', '財務部', '人力資源部', '業務部', '客戶服務部', '企劃部', '總務部']:
# Permission check > job-level check
if request.session['level'] in ['adm', 'pre', 'spe', 'dir']:
# Use the account from staff_listAll as the sql search condition
if account != "":
try:
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
# Parameterized query to avoid SQL injection (assumes a DB-API driver with %s placeholders)
sql = "select * from staff_contrl where account = %s"
cursor.execute(sql, (account,))
# Fetch the single row *before* closing the cursor and connection
dep_all_staff_Revise_Data = cursor.fetchone()
cursor.close()
db.close()
# Check whether the fetched row is empty
if dep_all_staff_Revise_Data != None:
# Call staff_Login_Data_Retrieve so basic.html can display the logged-in user's data
staff_Login_Data = staff_Login_Data_Retrieve(request)
return render(request, 'staff/dep_all_staff_Revise.html', {'dep_all_staff_Revise_Data': dep_all_staff_Revise_Data, 'size': 1, 'staff_Login_Data': staff_Login_Data})
else:
return HttpResponse('資料庫無資料取出 <a href="/staff_index/" >回上一頁</a>')
except:
return HttpResponse('資料庫連線失敗,請重試 <a href="/staff_index/" >回上一頁</a>')
else:
return HttpResponse('資料庫未找到相關資料,請返回重新嘗試 <a href="/staff_index/" >回上一頁</a>')
else:
# Job-level check failed
return HttpResponse('權限等級不足')
else:
# Department check failed
return HttpResponse('不隸屬於部門範圍')
else:
return HttpResponse("尚未登入 <a href='/staff_Login/'>進行登入</a>")
def dep_all_staff_ReviseDB(request, account=""):
# Pack the validated values into a dict {key: value}
data = {}
# ---------- Validation section >> account, name, password, phone, email ----------------------
# Name validation: accepts Chinese characters only
if not re.search(r"[\u4e00-\u9fa5]", request.POST['name']):
msg = "姓名輸入錯誤,只接受中文"
return HttpResponse(msg)
else:
# Name format is valid: strip surrounding whitespace and store it
data['name'] = request.POST['name'].strip()
# Password validation: must contain at least one uppercase and one lowercase letter, length 6-15 characters
if re.search(r"\s", request.POST['password']):
msg = "密碼輸入錯誤,不可包含空白,請返回上一頁"
return HttpResponse(msg)
elif not re.search(r"[A-Z]", request.POST['password']):
msg = "密碼輸入錯誤,需至少一個大寫英文"
return HttpResponse(msg)
elif not re.search(r"[a-z]", request.POST['password']):
msg = "密碼輸入錯誤,需至少一個小寫英文"
return HttpResponse(msg)
# Length check
elif len(request.POST['password']) < 6:
msg = "密碼輸入錯誤,長度需至少6個字元"
return HttpResponse(msg)
elif len(request.POST['password']) > 15:
msg = "密碼輸入錯誤,長度不可超過15個字元"
return HttpResponse(msg)
else:
# Password format is valid: strip surrounding whitespace and store it
data['password'] = request.POST['password'].strip()
# Mobile number validation: digits only with a leading 09, no special characters, length must be exactly 10
# Anchored with ^...$ so a partial substring match cannot pass validation
if not re.search(r"^09\d+$", request.POST['mobile_phine']):
msg = "手機號碼需為數字"
return HttpResponse(msg)
elif len(request.POST['mobile_phine']) > 10:
msg = "手機號碼為10個數字"
return HttpResponse(msg)
elif len(request.POST['mobile_phine']) < 10:
msg = "手機號碼為10個數字"
return HttpResponse(msg)
else:
# Mobile number format is valid: strip surrounding whitespace and store it
data['mobile_phine'] = request.POST['mobile_phine'].strip()
# Personal email validation, format > xxx@xxx.xxx, TLD 2-6 characters
# fullmatch is used so the whole field (not just a substring) must match
if not re.fullmatch(r"[a-z0-9_\.-]+\@[\da-z\.-]+\.[a-z\.]{2,6}", request.POST['privacy_mail']):
msg = "私人信箱格式錯誤"
return HttpResponse(msg)
else:
# Email format is valid: strip surrounding whitespace and store it
data['privacy_mail'] = request.POST['privacy_mail'].strip()
# Company email validation, format > xxx@xxx.xxx, TLD 2-6 characters
if not re.fullmatch(r"[a-z0-9_\.-]+\@[\da-z\.-]+\.[a-z\.]{2,6}", request.POST['mail']):
msg = "公司信箱格式錯誤"
return HttpResponse(msg)
else:
# Email format is valid: strip surrounding whitespace and store it
data['mail'] = request.POST['mail'].strip()
# Emergency contact phone validation: digits only with a leading 09, length must be exactly 10
if not re.search(r"^09\d+$", request.POST['emergency_contact_tel']):
msg = "緊急聯絡人電話號碼需為數字"
return HttpResponse(msg)
elif len(request.POST['emergency_contact_tel']) > 10:
msg = "緊急聯絡人電話號碼為10個數字"
return HttpResponse(msg)
elif len(request.POST['emergency_contact_tel']) < 10:
msg = "緊急聯絡人電話號碼為10個數字"
return HttpResponse(msg)
else:
# Emergency contact phone format is valid: strip surrounding whitespace and store it
data['emergency_contact_tel'] = request.POST['emergency_contact_tel'].strip()
# Receive the form data from dep_all_staff_Revise and convert it into the values to write back to the database
account = request.POST['account']
name = data['name']
password = data['password']
privacy_mail = data['privacy_mail']
mail = data['mail']
mobile_phine = data['mobile_phine']
addr = request.POST['addr']
emergency_contact_name = request.POST['emergency_contact_name']
emergency_contact_tel = data['emergency_contact_tel']
status = request.POST['status']
category = request.POST['category']
subjection_depar = request.POST['subjection_depar']
job_title = request.POST['job_title']
depar_director = request.POST['depar_director']
level = request.POST['level']
note = request.POST['note']
nomal_hour_month = request.POST['nomal_hour_month']
total_hour_month = request.POST['total_hour_month']
official_leave = request.POST['official_leave']
annual_sick_leave = request.POST['annual_sick_leave']
overtime_hour = request.POST['overtime_hour']
# Perform the update
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
# The where clause is required > otherwise every row would be updated; chain extra conditions with "and"
# Parameterized query (assumes a DB-API driver such as PyMySQL with %s placeholders) to avoid SQL injection
sql = "update staff_contrl set name=%s, password=%s, privacy_mail=%s, mail=%s, mobile_phine=%s, addr=%s, emergency_contact_name=%s, emergency_contact_tel=%s, status=%s, category=%s, subjection_depar=%s, job_title=%s, depar_director=%s, level=%s, note=%s, nomal_hour_month=%s, total_hour_month=%s, official_leave=%s, annual_sick_leave=%s, overtime_hour=%s where account=%s"
cursor.execute(sql, (name, password, privacy_mail, mail, mobile_phine, addr, emergency_contact_name, emergency_contact_tel, status, category, subjection_depar, job_title, depar_director, level, note, nomal_hour_month, total_hour_month, official_leave, annual_sick_leave, overtime_hour, account))
db.commit()
return HttpResponse("<a href='/dep_Staff_ListAll/'>部門職員資料修改成功,回至職員列表</a>")
# ========================staff_Delete=================================
def staff_Delete(request, account=""):
if request.session.get("login_name") != None:
# Permission check > department check
# Note: membership must be tested against the union of both lists; "in (A or B)" would only ever check the first list
if request.session['login_subjection_depar'] in (['總部', '財務部', '人力資源部', '業務部', '客戶服務部', '企劃部'] + ['admin_Office', 'Finance_Department', 'Human_Resources_Department', 'Sales_Department', 'Customer_Service_Department', 'Planning_Department']):
# Permission check > job-level check
if request.session['level'] in ['adm', 'pre']:
# Use the account from the listall form as the sql search condition
if account != "":
try:
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
# Parameterized query to avoid SQL injection (assumes a DB-API driver with %s placeholders)
sql = "delete from staff_contrl where account = %s"
cursor.execute(sql, (account,))
db.commit()
cursor.close()
db.close()
return HttpResponse("<a href='/staff_ListAll/'>刪除成功,回至列表</a>")
except:
return HttpResponse("<a href='/staff_ListAll/'>刪除失敗,請重試</a>")
else:
return render(request, "staff/staff_ListAll.html")
else:
# Job-level check failed
return HttpResponse('權限等級不足')
else:
# Department check failed
return HttpResponse('不隸屬於部門範圍')
else:
return HttpResponse("尚未登入 <a href='/staff_Login/'>進行登入</a>")
# ========================all_staff_data_Export=================================
# Export all records
def all_staff_data_Export(request):
try:
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
sql = "select * from staff_contrl"
cursor.execute(sql)
all_staff_data = cursor.fetchall()
# Get the sql column metadata
field = cursor.description
# columns > the sql column names
columns = [field[i][0] for i in range(len(field))]
# Get the local time and format it (Excel file names reject special characters such as ':'); requires datetime
localTime = datetime.now().strftime("%Y-%m-%d-%H-%M-%S-%p")
# Build the output path; the file name is the current time
result_PATH = r'D:\\python課程\\自我練習與實作\\rex_web\\{}.xlsx'.format('職員管理總表備份' + localTime)
# Write to the path with pandas
writer = pd.ExcelWriter(result_PATH, engine='xlsxwriter')
# Build the DataFrame from the sql rows and column names in one step
# (avoids the slow row-by-row df.loc assignment loop)
df = pd.DataFrame([list(row) for row in all_staff_data], columns=columns)
# sheet_name is the tab name shown at the bottom of the workbook
df.to_excel(writer, sheet_name='所有職員管理表', index=False)
writer.save()
writer.close()
cursor.close()
db.close()
return HttpResponse("下載成功,路徑名稱為 : {} <a href='/staff_index/'>返回管理頁面</a>".format(result_PATH))
except:
return HttpResponse("資料庫連線錯誤,請重試")
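The comment above notes that Excel (and Windows) file names reject special characters, which is why the timestamp uses `-` separators instead of `:`. As a small sketch (`backup_filename` is a hypothetical helper, not part of this project), the file-name construction could be isolated and checked on its own:

```python
from datetime import datetime

def backup_filename(prefix, now=None):
    """Build an Excel-safe backup file name.

    strftime with '-' separators avoids characters like ':' that are
    invalid in Windows/Excel file names.
    """
    now = now or datetime.now()
    stamp = now.strftime("%Y-%m-%d-%H-%M-%S-%p")
    return "{}{}.xlsx".format(prefix, stamp)
```

Keeping the naming logic in one place makes it easy to verify that no forbidden character can ever reach the file system call.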
# ========================staff_contral_erp_process_chart=================================
# erp process chart
def staff_contral_erp_process_chart(request):
return render(request, 'process_chart/staff_contral_erp_process_chart.html')
# ========================satff_list_all_dep_condion=================================
def satff_list_all_dep_condion(request):
if request.session.get("login_name") != None:
# Use login_account from the session as the sql search condition
account = request.session['login_account']
try:
# Connect to the database > call the connect helper in sql_account
db = sql_account.connect()
cursor = db.cursor()
except:
return HttpResponse('連結資料庫失敗')
# Read the department name chosen in the list
dep_condition = request.POST['dep_condition']
if dep_condition != None:
# Reuse the connection opened above; parameterized query to avoid SQL injection
# (assumes a DB-API driver with %s placeholders)
sql = "select * from staff_contrl where subjection_depar=%s"
cursor.execute(sql, (dep_condition,))
staff_ListAll = cursor.fetchall()
cursor.close()
db.close()
if staff_ListAll != None:
# Call staff_Login_Data_Retrieve so basic.html can display the logged-in user's data
staff_Login_Data = staff_Login_Data_Retrieve(request)
return render(request, 'staff/staff_ListAll.html', {'staff_ListAll': staff_ListAll, 'staff_Login_Data': staff_Login_Data})
else:
return HttpResponse('資料庫無資料取出 <a href="/staff_index/" >回上一頁</a>')
else:
return HttpResponse('資料庫無資料取出 <a href="/staff_index/" >回上一頁</a>')
else:
return HttpResponse('資料庫無資料取出 <a href="/staff_index/" >回上一頁</a>')
# File: bindings/python/ensmallen_graph/datasets/yue/ndfrtdda.py (repo: caufieldjh/ensmallen_graph, license: MIT)
This file offers the methods to automatically retrieve the graph NDFRTDDA.
The graph is automatically retrieved from the Yue repository.
Report
---------------------
At the time of rendering these methods (please see datetime below), the graph
had the following characteristics:
Datetime: 2021-03-02 10:32:32.759091
The undirected graph NDFRTDDA has 13545 nodes with 2 different node types:
drug (nodes number 12337) and disease (nodes number 1208) and 56515 unweighted
edges, of which none are self-loops. The graph is quite sparse as it has
a density of 0.00062 and has 85 connected components, where the component
with most nodes has 13033 nodes and the component with the least nodes
has 2 nodes. The graph median node degree is 3, the mean node degree is
8.34, and the node degree mode is 1. The top 5 most central nodes are C0030193
(degree 845), C0004623 (degree 741), C0004096 (degree 653), C0038160 (degree
575) and C0020538 (degree 534). The hash of the graph is aff446d784c13f38
.
References
---------------------
Please cite the following if you use the data:
@article{yue2020graph,
title={Graph embedding on biomedical networks: methods, applications and evaluations},
author={Yue, Xiang and Wang, Zhen and Huang, Jingong and Parthasarathy, Srinivasan and Moosavinasab, Soheil and Huang, Yungui and Lin, Simon M and Zhang, Wen and Zhang, Ping and Sun, Huan},
journal={Bioinformatics},
volume={36},
number={4},
pages={1241--1251},
year={2020},
publisher={Oxford University Press}
}
Usage example
----------------------
The usage of this graph is relatively straightforward:
.. code:: python
# First import the function to retrieve the graph from the datasets
from ensmallen_graph.datasets.yue import NDFRTDDA
# Then load the graph
graph = NDFRTDDA()
# Finally, you can do anything with it, for instance, compute its report:
print(graph)
# If you need to run a link prediction task with validation,
# you can split the graph using a connected holdout as follows:
train_graph, validation_graph = graph.connected_holdout(
# You can use an 80/20 split for the holdout, for example.
train_size=0.8,
# The random state is used to reproduce the holdout.
random_state=42,
# Whether to show a loading bar.
verbose=True
)
# Remember that, if you need, you can enable the memory-time trade-offs:
train_graph.enable(
vector_sources=True,
vector_destinations=True,
vector_outbounds=True
)
# Consider using the methods made available in the Embiggen package
# to run graph embedding or link prediction tasks.
"""
from typing import Dict
from ..automatic_graph_retrieval import AutomaticallyRetrievedGraph
from ...ensmallen_graph import EnsmallenGraph # pylint: disable=import-error
def NDFRTDDA(
directed: bool = False,
verbose: int = 2,
cache_path: str = "graphs/yue",
**additional_graph_kwargs: Dict
) -> EnsmallenGraph:
"""Return new instance of the NDFRTDDA graph.
The graph is automatically retrieved from the Yue repository.
Parameters
-------------------
directed: bool = False,
Whether to load the graph as directed or undirected.
By default false.
verbose: int = 2,
Whether to show loading bars during the retrieval and building
of the graph.
cache_path: str = "graphs/yue",
Where to store the downloaded graphs.
additional_graph_kwargs: Dict,
Additional graph kwargs.
Returns
-----------------------
Instance of NDFRTDDA graph.
Report
---------------------
At the time of rendering these methods (please see datetime below), the graph
had the following characteristics:
Datetime: 2021-03-02 10:32:32.759091
The undirected graph NDFRTDDA has 13545 nodes with 2 different node types:
drug (nodes number 12337) and disease (nodes number 1208) and 56515 unweighted
edges, of which none are self-loops. The graph is quite sparse as it has
a density of 0.00062 and has 85 connected components, where the component
with most nodes has 13033 nodes and the component with the least nodes
has 2 nodes. The graph median node degree is 3, the mean node degree is
8.34, and the node degree mode is 1. The top 5 most central nodes are C0030193
(degree 845), C0004623 (degree 741), C0004096 (degree 653), C0038160 (degree
575) and C0020538 (degree 534). The hash of the graph is aff446d784c13f38
.
References
---------------------
Please cite the following if you use the data:
@article{yue2020graph,
title={Graph embedding on biomedical networks: methods, applications and evaluations},
author={Yue, Xiang and Wang, Zhen and Huang, Jingong and Parthasarathy, Srinivasan and Moosavinasab, Soheil and Huang, Yungui and Lin, Simon M and Zhang, Wen and Zhang, Ping and Sun, Huan},
journal={Bioinformatics},
volume={36},
number={4},
pages={1241--1251},
year={2020},
publisher={Oxford University Press}
}
Usage example
----------------------
The usage of this graph is relatively straightforward:
.. code:: python
# First import the function to retrieve the graph from the datasets
from ensmallen_graph.datasets.yue import NDFRTDDA
# Then load the graph
graph = NDFRTDDA()
# Finally, you can do anything with it, for instance, compute its report:
print(graph)
# If you need to run a link prediction task with validation,
# you can split the graph using a connected holdout as follows:
train_graph, validation_graph = graph.connected_holdout(
# You can use an 80/20 split for the holdout, for example.
train_size=0.8,
# The random state is used to reproduce the holdout.
random_state=42,
# Whether to show a loading bar.
verbose=True
)
# Remember that, if you need, you can enable the memory-time trade-offs:
train_graph.enable(
vector_sources=True,
vector_destinations=True,
vector_outbounds=True
)
# Consider using the methods made available in the Embiggen package
# to run graph embedding or link prediction tasks.
"""
return AutomaticallyRetrievedGraph(
graph_name="NDFRTDDA",
dataset="yue",
directed=directed,
verbose=verbose,
cache_path=cache_path,
additional_graph_kwargs=additional_graph_kwargs
)()
# File: loldib/getratings/models/NA/na_shyvana/na_shyvana_sup.py (repo: koliupy/loldib, license: Apache-2.0)
class NA_Shyvana_Sup_Aatrox(Ratings):
pass
class NA_Shyvana_Sup_Ahri(Ratings):
pass
class NA_Shyvana_Sup_Akali(Ratings):
pass
class NA_Shyvana_Sup_Alistar(Ratings):
pass
class NA_Shyvana_Sup_Amumu(Ratings):
pass
class NA_Shyvana_Sup_Anivia(Ratings):
pass
class NA_Shyvana_Sup_Annie(Ratings):
pass
class NA_Shyvana_Sup_Ashe(Ratings):
pass
class NA_Shyvana_Sup_AurelionSol(Ratings):
pass
class NA_Shyvana_Sup_Azir(Ratings):
pass
class NA_Shyvana_Sup_Bard(Ratings):
pass
class NA_Shyvana_Sup_Blitzcrank(Ratings):
pass
class NA_Shyvana_Sup_Brand(Ratings):
pass
class NA_Shyvana_Sup_Braum(Ratings):
pass
class NA_Shyvana_Sup_Caitlyn(Ratings):
pass
class NA_Shyvana_Sup_Camille(Ratings):
pass
class NA_Shyvana_Sup_Cassiopeia(Ratings):
pass
class NA_Shyvana_Sup_Chogath(Ratings):
pass
class NA_Shyvana_Sup_Corki(Ratings):
pass
class NA_Shyvana_Sup_Darius(Ratings):
pass
class NA_Shyvana_Sup_Diana(Ratings):
pass
class NA_Shyvana_Sup_Draven(Ratings):
pass
class NA_Shyvana_Sup_DrMundo(Ratings):
pass
class NA_Shyvana_Sup_Ekko(Ratings):
pass
class NA_Shyvana_Sup_Elise(Ratings):
pass
class NA_Shyvana_Sup_Evelynn(Ratings):
pass
class NA_Shyvana_Sup_Ezreal(Ratings):
pass
class NA_Shyvana_Sup_Fiddlesticks(Ratings):
pass
class NA_Shyvana_Sup_Fiora(Ratings):
pass
class NA_Shyvana_Sup_Fizz(Ratings):
pass
class NA_Shyvana_Sup_Galio(Ratings):
pass
class NA_Shyvana_Sup_Gangplank(Ratings):
pass
class NA_Shyvana_Sup_Garen(Ratings):
pass
class NA_Shyvana_Sup_Gnar(Ratings):
pass
class NA_Shyvana_Sup_Gragas(Ratings):
pass
class NA_Shyvana_Sup_Graves(Ratings):
pass
class NA_Shyvana_Sup_Hecarim(Ratings):
pass
class NA_Shyvana_Sup_Heimerdinger(Ratings):
pass
class NA_Shyvana_Sup_Illaoi(Ratings):
pass
class NA_Shyvana_Sup_Irelia(Ratings):
pass
class NA_Shyvana_Sup_Ivern(Ratings):
pass
class NA_Shyvana_Sup_Janna(Ratings):
pass
class NA_Shyvana_Sup_JarvanIV(Ratings):
pass
class NA_Shyvana_Sup_Jax(Ratings):
pass
class NA_Shyvana_Sup_Jayce(Ratings):
pass
class NA_Shyvana_Sup_Jhin(Ratings):
pass
class NA_Shyvana_Sup_Jinx(Ratings):
pass
class NA_Shyvana_Sup_Kalista(Ratings):
pass
class NA_Shyvana_Sup_Karma(Ratings):
pass
class NA_Shyvana_Sup_Karthus(Ratings):
pass
class NA_Shyvana_Sup_Kassadin(Ratings):
pass
class NA_Shyvana_Sup_Katarina(Ratings):
pass
class NA_Shyvana_Sup_Kayle(Ratings):
pass
class NA_Shyvana_Sup_Kayn(Ratings):
pass
class NA_Shyvana_Sup_Kennen(Ratings):
pass
class NA_Shyvana_Sup_Khazix(Ratings):
pass
class NA_Shyvana_Sup_Kindred(Ratings):
pass
class NA_Shyvana_Sup_Kled(Ratings):
pass
class NA_Shyvana_Sup_KogMaw(Ratings):
pass
class NA_Shyvana_Sup_Leblanc(Ratings):
pass
class NA_Shyvana_Sup_LeeSin(Ratings):
pass
class NA_Shyvana_Sup_Leona(Ratings):
pass
class NA_Shyvana_Sup_Lissandra(Ratings):
pass
class NA_Shyvana_Sup_Lucian(Ratings):
pass
class NA_Shyvana_Sup_Lulu(Ratings):
pass
class NA_Shyvana_Sup_Lux(Ratings):
pass
class NA_Shyvana_Sup_Malphite(Ratings):
pass
class NA_Shyvana_Sup_Malzahar(Ratings):
pass
class NA_Shyvana_Sup_Maokai(Ratings):
pass
class NA_Shyvana_Sup_MasterYi(Ratings):
pass
class NA_Shyvana_Sup_MissFortune(Ratings):
pass
class NA_Shyvana_Sup_MonkeyKing(Ratings):
pass
class NA_Shyvana_Sup_Mordekaiser(Ratings):
pass
class NA_Shyvana_Sup_Morgana(Ratings):
pass
class NA_Shyvana_Sup_Nami(Ratings):
pass
class NA_Shyvana_Sup_Nasus(Ratings):
pass
class NA_Shyvana_Sup_Nautilus(Ratings):
pass
class NA_Shyvana_Sup_Nidalee(Ratings):
pass
class NA_Shyvana_Sup_Nocturne(Ratings):
pass
class NA_Shyvana_Sup_Nunu(Ratings):
pass
class NA_Shyvana_Sup_Olaf(Ratings):
pass
class NA_Shyvana_Sup_Orianna(Ratings):
pass
class NA_Shyvana_Sup_Ornn(Ratings):
pass
class NA_Shyvana_Sup_Pantheon(Ratings):
pass
class NA_Shyvana_Sup_Poppy(Ratings):
pass
class NA_Shyvana_Sup_Quinn(Ratings):
pass
class NA_Shyvana_Sup_Rakan(Ratings):
pass
class NA_Shyvana_Sup_Rammus(Ratings):
pass
class NA_Shyvana_Sup_RekSai(Ratings):
pass
class NA_Shyvana_Sup_Renekton(Ratings):
pass
class NA_Shyvana_Sup_Rengar(Ratings):
pass
class NA_Shyvana_Sup_Riven(Ratings):
pass
class NA_Shyvana_Sup_Rumble(Ratings):
pass
class NA_Shyvana_Sup_Ryze(Ratings):
pass
class NA_Shyvana_Sup_Sejuani(Ratings):
pass
class NA_Shyvana_Sup_Shaco(Ratings):
pass
class NA_Shyvana_Sup_Shen(Ratings):
pass
class NA_Shyvana_Sup_Shyvana(Ratings):
pass
class NA_Shyvana_Sup_Singed(Ratings):
pass
class NA_Shyvana_Sup_Sion(Ratings):
pass
class NA_Shyvana_Sup_Sivir(Ratings):
pass
class NA_Shyvana_Sup_Skarner(Ratings):
pass
class NA_Shyvana_Sup_Sona(Ratings):
pass
class NA_Shyvana_Sup_Soraka(Ratings):
pass
class NA_Shyvana_Sup_Swain(Ratings):
pass
class NA_Shyvana_Sup_Syndra(Ratings):
pass
class NA_Shyvana_Sup_TahmKench(Ratings):
pass
class NA_Shyvana_Sup_Taliyah(Ratings):
pass
class NA_Shyvana_Sup_Talon(Ratings):
pass
class NA_Shyvana_Sup_Taric(Ratings):
pass
class NA_Shyvana_Sup_Teemo(Ratings):
pass
class NA_Shyvana_Sup_Thresh(Ratings):
pass
class NA_Shyvana_Sup_Tristana(Ratings):
pass
class NA_Shyvana_Sup_Trundle(Ratings):
pass
class NA_Shyvana_Sup_Tryndamere(Ratings):
pass
class NA_Shyvana_Sup_TwistedFate(Ratings):
pass
class NA_Shyvana_Sup_Twitch(Ratings):
pass
class NA_Shyvana_Sup_Udyr(Ratings):
pass
class NA_Shyvana_Sup_Urgot(Ratings):
pass
class NA_Shyvana_Sup_Varus(Ratings):
pass
class NA_Shyvana_Sup_Vayne(Ratings):
pass
class NA_Shyvana_Sup_Veigar(Ratings):
pass
class NA_Shyvana_Sup_Velkoz(Ratings):
pass
class NA_Shyvana_Sup_Vi(Ratings):
pass
class NA_Shyvana_Sup_Viktor(Ratings):
pass
class NA_Shyvana_Sup_Vladimir(Ratings):
pass
class NA_Shyvana_Sup_Volibear(Ratings):
pass
class NA_Shyvana_Sup_Warwick(Ratings):
pass
class NA_Shyvana_Sup_Xayah(Ratings):
pass
class NA_Shyvana_Sup_Xerath(Ratings):
pass
class NA_Shyvana_Sup_XinZhao(Ratings):
pass
class NA_Shyvana_Sup_Yasuo(Ratings):
pass
class NA_Shyvana_Sup_Yorick(Ratings):
pass
class NA_Shyvana_Sup_Zac(Ratings):
pass
class NA_Shyvana_Sup_Zed(Ratings):
pass
class NA_Shyvana_Sup_Ziggs(Ratings):
pass
class NA_Shyvana_Sup_Zilean(Ratings):
pass
class NA_Shyvana_Sup_Zyra(Ratings):
pass
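The file above spells out well over a hundred identical empty `Ratings` subclasses. As a sketch only — the `Ratings` base here is a stand-in for `getratings.models.ratings.Ratings`, and the champion list is abbreviated — the same classes could be generated programmatically with `type()` and registered in the module namespace:

```python
# Stand-in for getratings.models.ratings.Ratings so the sketch is self-contained.
class Ratings:
    pass

# Abbreviated champion list; the real module covers every champion.
CHAMPIONS = ['Aatrox', 'Ahri', 'Akali', 'Zyra']

for champ in CHAMPIONS:
    name = 'NA_Shyvana_Sup_' + champ
    # type(name, bases, namespace) builds a class object identical to an
    # empty `class name(Ratings): pass` definition.
    globals()[name] = type(name, (Ratings,), {})
```

This keeps `from ... import NA_Shyvana_Sup_Ahri` working (the names land in module globals) while replacing hundreds of boilerplate lines with a single loop.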
# File: scripts/concatenate_resume_csv.py (repo: wyshi/lm_privacy, license: MIT)
import os
from glob import glob
BASE_DIR = "./"
episilon = pd.read_csv("attacks/canary_insertion/partialdp_missed/lr0.1_sigma0.5_norm0.001_seed0.csv")
# resume: stitch an original run's csv together with its resumed continuation
def concate(p1, p2):
df0_0 = pd.read_csv(os.path.join(BASE_DIR, p1))
df0_1 = pd.read_csv(os.path.join(BASE_DIR, p2))
# The first row of the resumed csv should repeat the last row of the original
# (same epoch and model perplexity); drop into pdb if the overlap check fails
try:
assert df0_1.iloc[0]['epoch'] == df0_0.iloc[-1]['epoch'] and (int(df0_1.iloc[0]['model_ppl']) == int(df0_0.iloc[-1]['model_ppl']))
except:
import pdb; pdb.set_trace()
# Drop the duplicated boundary row of the resumed csv before concatenating
df0_1 = df0_1.iloc[1:]
df = pd.concat([df0_0, df0_1])
return df
canary_csv = [ #(, 'attacks/canary_insertion/partialdp/param_search/resume/lr0.1_sigma0.1_norm0.005_seed1111_resume_50epochs.csv'),
('attacks/canary_insertion/partialdp/lr0.1_sigma0.5_norm0.001_seed0.csv', 'attacks/canary_insertion/partialdp/resume/lr0.1_sigma0.5_norm0.001_seed0_resume50epochs.csv'),
('attacks/canary_insertion/partialdp/lr0.1_sigma0.5_norm0.001_seed22.csv', 'attacks/canary_insertion/partialdp/resume/lr0.1_sigma0.5_norm0.001_seed22_resume50epochs.csv'),
('attacks/canary_insertion/partialdp/lr0.1_sigma0.5_norm0.001_seed123.csv', 'attacks/canary_insertion/partialdp/resume/lr0.1_sigma0.5_norm0.001_seed123_resume50epochs.csv'),
('attacks/canary_insertion/partialdp/lr0.1_sigma0.5_norm0.001_seed300.csv', 'attacks/canary_insertion/partialdp/resume/lr0.1_sigma0.5_norm0.001_seed300_resume50epochs.csv'),
('attacks/canary_insertion/partialdp/lr0.1_sigma0.5_norm0.001_seed1111.csv', 'attacks/canary_insertion/partialdp/resume/lr0.1_sigma0.5_norm0.001_seed1111_resume_50epochs.csv')
]
member_csv = [
('attacks/membership_inference/partialdp/final_fix/lr0.1_sigma0.5_norm0.001_seed0.csv', 'attacks/membership_inference/partialdp/final_fix/resume/lr0.1_sigma0.5_norm0.001_seed0.csv'),
('attacks/membership_inference/partialdp/final_fix/lr0.1_sigma0.5_norm0.001_seed22.csv', 'attacks/membership_inference/partialdp/final_fix/resume/lr0.1_sigma0.5_norm0.001_seed22.csv'),
('attacks/membership_inference/partialdp/final_fix/lr0.1_sigma0.5_norm0.001_seed123.csv', 'attacks/membership_inference/partialdp/final_fix/resume/lr0.1_sigma0.5_norm0.001_seed123.csv'),
('attacks/membership_inference/partialdp/final_fix/lr0.1_sigma0.5_norm0.001_seed300.csv', 'attacks/membership_inference/partialdp/final_fix/resume/lr0.1_sigma0.5_norm0.001_seed300.csv'),
('attacks/membership_inference/partialdp/final_fix/lr0.1_sigma0.5_norm0.001_seed1111.csv', 'attacks/membership_inference/partialdp/final_fix/resume/lr0.1_sigma0.5_norm0.001_seed1111.csv'),
]
for p1, p2 in canary_csv:
df = concate(p1, p2)
    # replace epsilon with the 100-epoch epsilon series from partialdp_missed
df['model_epsilon'] = episilon['model_epsilon'].tolist()
fname = p1.split("/")[-1].replace(".csv", "_100epochs.csv")
df.to_csv(os.path.join(BASE_DIR, "attacks/canary_insertion/partialdp/final_concat", fname), index=None)
for p1, p2 in member_csv:
df = concate(p1, p2)
    # replace epsilon with the 100-epoch epsilon series from partialdp_missed
df['model_epsilon'] = episilon['model_epsilon'].tolist()
fname = p1.split("/")[-1].replace(".csv", "_100epochs.csv")
    df.to_csv(os.path.join(BASE_DIR, "attacks/membership_inference/partialdp/final_concat", fname), index=None)
# param search concat
param_canary_csv = [
#('attacks/canary_insertion/partialdp/param_search/lr0.1_sigma0.1_norm0.005_seed1111.csv', 'attacks/canary_insertion/partialdp/param_search/resume/lr0.1_sigma0.1_norm0.005_seed1111_resume_50epochs.csv'),
]
# param search concat
param_member_csv = [
('attacks/membership_inference/partialdp/final_fix/param_search/lr0.1_sigma0.1_norm0.005_seed1111.csv', 'attacks/membership_inference/partialdp/final_fix/param_search/resume/lr0.1_sigma0.1_norm0.005_seed1111.csv'),
]
for p1, p2 in param_canary_csv:
df = concate(p1, p2)
# cannot replace epsilon because we didn't run with sigma=0.1 for 100 epochs
fname = p1.split("/")[-1].replace(".csv", "_100epochs.csv")
    df.to_csv(os.path.join(BASE_DIR, "attacks/canary_insertion/partialdp/param_search/", fname), index=None)
for p1, p2 in param_member_csv:
df = concate(p1, p2)
# cannot replace epsilon because we didn't run with sigma=0.1 for 100 epochs
fname = p1.split("/")[-1].replace(".csv", "_100epochs.csv")
    df.to_csv(os.path.join(BASE_DIR, "attacks/membership_inference/partialdp/final_fix/param_search/", fname), index=None)
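A minimal, self-contained sketch of the resume-concatenation pattern the script's `concate()` implements, using toy in-memory frames instead of the real attack CSVs (the `epoch`/`model_ppl` column names match the script; the values are made up): a resumed run re-logs the checkpoint row as its first line, so that row is dropped before concatenation.

```python
import pandas as pd

# Original run: three logged epochs (illustrative values only).
df_run1 = pd.DataFrame({'epoch': [1, 2, 3], 'model_ppl': [120.5, 90.2, 75.8]})
# Resumed run: its first row re-logs the last checkpoint of the original run.
df_run2 = pd.DataFrame({'epoch': [3, 4, 5], 'model_ppl': [75.8, 60.1, 52.3]})

# Sanity check mirroring concate(): epochs match, perplexities agree after int() truncation.
assert df_run2.iloc[0]['epoch'] == df_run1.iloc[-1]['epoch']
assert int(df_run2.iloc[0]['model_ppl']) == int(df_run1.iloc[-1]['model_ppl'])

# Drop the duplicated checkpoint row, then concatenate into one continuous history.
merged = pd.concat([df_run1, df_run2.iloc[1:]], ignore_index=True)
```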
| 57.153846 | 218 | 0.765141 | 683 | 4,458 | 4.715959 | 0.122987 | 0.08072 | 0.08072 | 0.071717 | 0.904688 | 0.886681 | 0.860602 | 0.846321 | 0.784849 | 0.746663 | 0 | 0.093069 | 0.093764 | 4,458 | 77 | 219 | 57.896104 | 0.704208 | 0.141543 | 0 | 0.192308 | 0 | 0.019231 | 0.615143 | 0.573749 | 0 | 0 | 0 | 0 | 0.019231 | 1 | 0.019231 | false | 0 | 0.076923 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1b9617f1810a56dcc79b2f6c9f30b5bbb377e6a1 | 16,780 | py | Python | v6.0.5/alertemail/test_fortios_alertemail_setting.py | fortinet-solutions-cse/ansible_fgt_modules | c45fba49258d7c9705e7a8fd9c2a09ea4c8a4719 | [
"Apache-2.0"
] | 14 | 2018-09-25T20:35:25.000Z | 2021-07-14T04:30:54.000Z | v6.0.6/alertemail/test_fortios_alertemail_setting.py | fortinet-solutions-cse/ansible_fgt_modules | c45fba49258d7c9705e7a8fd9c2a09ea4c8a4719 | [
"Apache-2.0"
] | 32 | 2018-10-09T04:13:42.000Z | 2020-05-11T07:20:28.000Z | v6.0.6/alertemail/test_fortios_alertemail_setting.py | fortinet-solutions-cse/ansible_fgt_modules | c45fba49258d7c9705e7a8fd9c2a09ea4c8a4719 | [
"Apache-2.0"
] | 11 | 2018-10-09T00:14:53.000Z | 2021-11-03T10:54:09.000Z | # Copyright 2019 Fortinet, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <https://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import json
import pytest
from mock import ANY
from ansible.module_utils.network.fortios.fortios import FortiOSHandler
try:
from ansible.modules.network.fortios import fortios_alertemail_setting
except ImportError:
pytest.skip("Could not load required modules for testing", allow_module_level=True)
@pytest.fixture(autouse=True)
def connection_mock(mocker):
connection_class_mock = mocker.patch('ansible.modules.network.fortios.fortios_alertemail_setting.Connection')
return connection_class_mock
fos_instance = FortiOSHandler(connection_mock)
def test_alertemail_setting_creation(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'alertemail_setting': {
'admin_login_logs': 'enable',
'alert_interval': '4',
'amc_interface_bypass_mode': 'enable',
'antivirus_logs': 'enable',
'configuration_changes_logs': 'enable',
'critical_interval': '8',
'debug_interval': '9',
'email_interval': '10',
'emergency_interval': '11',
'error_interval': '12',
'FDS_license_expiring_days': '13',
'FDS_license_expiring_warning': 'enable',
'FDS_update_logs': 'enable',
'filter_mode': 'category',
'FIPS_CC_errors': 'enable',
'firewall_authentication_failure_logs': 'enable',
'fortiguard_log_quota_warning': 'enable',
'FSSO_disconnect_logs': 'enable',
'HA_logs': 'enable',
'information_interval': '22',
'IPS_logs': 'enable',
'IPsec_errors_logs': 'enable',
'local_disk_usage': '25',
'log_disk_usage_warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification_interval': '30',
'PPP_errors_logs': 'enable',
'severity': 'emergency',
'ssh_logs': 'enable',
'sslvpn_authentication_errors_logs': 'enable',
'username': 'test_value_35',
'violation_traffic_logs': 'enable',
'warning_interval': '37',
'webfilter_logs': 'enable'
},
'vdom': 'root'}
is_error, changed, response = fortios_alertemail_setting.fortios_alertemail(input_data, fos_instance)
expected_data = {
'admin-login-logs': 'enable',
'alert-interval': '4',
'amc-interface-bypass-mode': 'enable',
'antivirus-logs': 'enable',
'configuration-changes-logs': 'enable',
'critical-interval': '8',
'debug-interval': '9',
'email-interval': '10',
'emergency-interval': '11',
'error-interval': '12',
'FDS-license-expiring-days': '13',
'FDS-license-expiring-warning': 'enable',
'FDS-update-logs': 'enable',
'filter-mode': 'category',
'FIPS-CC-errors': 'enable',
'firewall-authentication-failure-logs': 'enable',
'fortiguard-log-quota-warning': 'enable',
'FSSO-disconnect-logs': 'enable',
'HA-logs': 'enable',
'information-interval': '22',
'IPS-logs': 'enable',
'IPsec-errors-logs': 'enable',
'local-disk-usage': '25',
'log-disk-usage-warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification-interval': '30',
'PPP-errors-logs': 'enable',
'severity': 'emergency',
'ssh-logs': 'enable',
'sslvpn-authentication-errors-logs': 'enable',
'username': 'test_value_35',
'violation-traffic-logs': 'enable',
'warning-interval': '37',
'webfilter-logs': 'enable'
}
set_method_mock.assert_called_with('alertemail', 'setting', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200
def test_alertemail_setting_creation_fails(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'alertemail_setting': {
'admin_login_logs': 'enable',
'alert_interval': '4',
'amc_interface_bypass_mode': 'enable',
'antivirus_logs': 'enable',
'configuration_changes_logs': 'enable',
'critical_interval': '8',
'debug_interval': '9',
'email_interval': '10',
'emergency_interval': '11',
'error_interval': '12',
'FDS_license_expiring_days': '13',
'FDS_license_expiring_warning': 'enable',
'FDS_update_logs': 'enable',
'filter_mode': 'category',
'FIPS_CC_errors': 'enable',
'firewall_authentication_failure_logs': 'enable',
'fortiguard_log_quota_warning': 'enable',
'FSSO_disconnect_logs': 'enable',
'HA_logs': 'enable',
'information_interval': '22',
'IPS_logs': 'enable',
'IPsec_errors_logs': 'enable',
'local_disk_usage': '25',
'log_disk_usage_warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification_interval': '30',
'PPP_errors_logs': 'enable',
'severity': 'emergency',
'ssh_logs': 'enable',
'sslvpn_authentication_errors_logs': 'enable',
'username': 'test_value_35',
'violation_traffic_logs': 'enable',
'warning_interval': '37',
'webfilter_logs': 'enable'
},
'vdom': 'root'}
is_error, changed, response = fortios_alertemail_setting.fortios_alertemail(input_data, fos_instance)
expected_data = {
'admin-login-logs': 'enable',
'alert-interval': '4',
'amc-interface-bypass-mode': 'enable',
'antivirus-logs': 'enable',
'configuration-changes-logs': 'enable',
'critical-interval': '8',
'debug-interval': '9',
'email-interval': '10',
'emergency-interval': '11',
'error-interval': '12',
'FDS-license-expiring-days': '13',
'FDS-license-expiring-warning': 'enable',
'FDS-update-logs': 'enable',
'filter-mode': 'category',
'FIPS-CC-errors': 'enable',
'firewall-authentication-failure-logs': 'enable',
'fortiguard-log-quota-warning': 'enable',
'FSSO-disconnect-logs': 'enable',
'HA-logs': 'enable',
'information-interval': '22',
'IPS-logs': 'enable',
'IPsec-errors-logs': 'enable',
'local-disk-usage': '25',
'log-disk-usage-warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification-interval': '30',
'PPP-errors-logs': 'enable',
'severity': 'emergency',
'ssh-logs': 'enable',
'sslvpn-authentication-errors-logs': 'enable',
'username': 'test_value_35',
'violation-traffic-logs': 'enable',
'warning-interval': '37',
'webfilter-logs': 'enable'
}
set_method_mock.assert_called_with('alertemail', 'setting', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 500
def test_alertemail_setting_idempotent(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'error', 'http_method': 'DELETE', 'http_status': 404}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'alertemail_setting': {
'admin_login_logs': 'enable',
'alert_interval': '4',
'amc_interface_bypass_mode': 'enable',
'antivirus_logs': 'enable',
'configuration_changes_logs': 'enable',
'critical_interval': '8',
'debug_interval': '9',
'email_interval': '10',
'emergency_interval': '11',
'error_interval': '12',
'FDS_license_expiring_days': '13',
'FDS_license_expiring_warning': 'enable',
'FDS_update_logs': 'enable',
'filter_mode': 'category',
'FIPS_CC_errors': 'enable',
'firewall_authentication_failure_logs': 'enable',
'fortiguard_log_quota_warning': 'enable',
'FSSO_disconnect_logs': 'enable',
'HA_logs': 'enable',
'information_interval': '22',
'IPS_logs': 'enable',
'IPsec_errors_logs': 'enable',
'local_disk_usage': '25',
'log_disk_usage_warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification_interval': '30',
'PPP_errors_logs': 'enable',
'severity': 'emergency',
'ssh_logs': 'enable',
'sslvpn_authentication_errors_logs': 'enable',
'username': 'test_value_35',
'violation_traffic_logs': 'enable',
'warning_interval': '37',
'webfilter_logs': 'enable'
},
'vdom': 'root'}
is_error, changed, response = fortios_alertemail_setting.fortios_alertemail(input_data, fos_instance)
expected_data = {
'admin-login-logs': 'enable',
'alert-interval': '4',
'amc-interface-bypass-mode': 'enable',
'antivirus-logs': 'enable',
'configuration-changes-logs': 'enable',
'critical-interval': '8',
'debug-interval': '9',
'email-interval': '10',
'emergency-interval': '11',
'error-interval': '12',
'FDS-license-expiring-days': '13',
'FDS-license-expiring-warning': 'enable',
'FDS-update-logs': 'enable',
'filter-mode': 'category',
'FIPS-CC-errors': 'enable',
'firewall-authentication-failure-logs': 'enable',
'fortiguard-log-quota-warning': 'enable',
'FSSO-disconnect-logs': 'enable',
'HA-logs': 'enable',
'information-interval': '22',
'IPS-logs': 'enable',
'IPsec-errors-logs': 'enable',
'local-disk-usage': '25',
'log-disk-usage-warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification-interval': '30',
'PPP-errors-logs': 'enable',
'severity': 'emergency',
'ssh-logs': 'enable',
'sslvpn-authentication-errors-logs': 'enable',
'username': 'test_value_35',
'violation-traffic-logs': 'enable',
'warning-interval': '37',
'webfilter-logs': 'enable'
}
set_method_mock.assert_called_with('alertemail', 'setting', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 404
def test_alertemail_setting_filter_foreign_attributes(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'alertemail_setting': {
'random_attribute_not_valid': 'tag',
'admin_login_logs': 'enable',
'alert_interval': '4',
'amc_interface_bypass_mode': 'enable',
'antivirus_logs': 'enable',
'configuration_changes_logs': 'enable',
'critical_interval': '8',
'debug_interval': '9',
'email_interval': '10',
'emergency_interval': '11',
'error_interval': '12',
'FDS_license_expiring_days': '13',
'FDS_license_expiring_warning': 'enable',
'FDS_update_logs': 'enable',
'filter_mode': 'category',
'FIPS_CC_errors': 'enable',
'firewall_authentication_failure_logs': 'enable',
'fortiguard_log_quota_warning': 'enable',
'FSSO_disconnect_logs': 'enable',
'HA_logs': 'enable',
'information_interval': '22',
'IPS_logs': 'enable',
'IPsec_errors_logs': 'enable',
'local_disk_usage': '25',
'log_disk_usage_warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification_interval': '30',
'PPP_errors_logs': 'enable',
'severity': 'emergency',
'ssh_logs': 'enable',
'sslvpn_authentication_errors_logs': 'enable',
'username': 'test_value_35',
'violation_traffic_logs': 'enable',
'warning_interval': '37',
'webfilter_logs': 'enable'
},
'vdom': 'root'}
is_error, changed, response = fortios_alertemail_setting.fortios_alertemail(input_data, fos_instance)
expected_data = {
'admin-login-logs': 'enable',
'alert-interval': '4',
'amc-interface-bypass-mode': 'enable',
'antivirus-logs': 'enable',
'configuration-changes-logs': 'enable',
'critical-interval': '8',
'debug-interval': '9',
'email-interval': '10',
'emergency-interval': '11',
'error-interval': '12',
'FDS-license-expiring-days': '13',
'FDS-license-expiring-warning': 'enable',
'FDS-update-logs': 'enable',
'filter-mode': 'category',
'FIPS-CC-errors': 'enable',
'firewall-authentication-failure-logs': 'enable',
'fortiguard-log-quota-warning': 'enable',
'FSSO-disconnect-logs': 'enable',
'HA-logs': 'enable',
'information-interval': '22',
'IPS-logs': 'enable',
'IPsec-errors-logs': 'enable',
'local-disk-usage': '25',
'log-disk-usage-warning': 'enable',
'mailto1': 'test_value_27',
'mailto2': 'test_value_28',
'mailto3': 'test_value_29',
'notification-interval': '30',
'PPP-errors-logs': 'enable',
'severity': 'emergency',
'ssh-logs': 'enable',
'sslvpn-authentication-errors-logs': 'enable',
'username': 'test_value_35',
'violation-traffic-logs': 'enable',
'warning-interval': '37',
'webfilter-logs': 'enable'
}
set_method_mock.assert_called_with('alertemail', 'setting', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200
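Each of these generated tests relies on the module translating playbook-style underscore keys into the hyphenated keys the FortiOS API expects (e.g. `admin_login_logs` becoming `admin-login-logs` in `expected_data`). A standalone sketch of that conversion, modeled on the `underscore_to_hyphen` helper the generated fortios modules use (this version is illustrative, not the module's exact code):

```python
def underscore_to_hyphen(data):
    # Recursively rewrite dict keys, leaving values and non-dict data untouched.
    if isinstance(data, list):
        return [underscore_to_hyphen(elem) for elem in data]
    if isinstance(data, dict):
        return {k.replace('_', '-'): underscore_to_hyphen(v) for k, v in data.items()}
    return data

converted = underscore_to_hyphen({'admin_login_logs': 'enable',
                                  'alert_interval': '4',
                                  'mailto1': 'test_value_27'})
```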
| 38.842593 | 133 | 0.596424 | 1,705 | 16,780 | 5.603519 | 0.129032 | 0.117228 | 0.040193 | 0.02355 | 0.887377 | 0.876492 | 0.866548 | 0.866548 | 0.866548 | 0.866548 | 0 | 0.021617 | 0.255662 | 16,780 | 431 | 134 | 38.932715 | 0.743315 | 0.039571 | 0 | 0.910761 | 0 | 0 | 0.474969 | 0.170124 | 0 | 0 | 0 | 0 | 0.062992 | 1 | 0.013123 | false | 0.020997 | 0.020997 | 0 | 0.036745 | 0.002625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1bc11b3d183ed5094edc85d48356be04b88cdaac | 72 | py | Python | app/models/__init__.py | ricardodani/fastapi-simple-todo | 93f0dbf2744bfe573e168bfc4b331b2d2a8c116b | [
"Apache-2.0"
] | 1 | 2020-07-15T18:02:01.000Z | 2020-07-15T18:02:01.000Z | app/models/__init__.py | ricardodani/fastapi-simple-todo | 93f0dbf2744bfe573e168bfc4b331b2d2a8c116b | [
"Apache-2.0"
] | null | null | null | app/models/__init__.py | ricardodani/fastapi-simple-todo | 93f0dbf2744bfe573e168bfc4b331b2d2a8c116b | [
"Apache-2.0"
] | null | null | null | from app.models.user import User
from app.models.todo import List, Item
| 24 | 38 | 0.805556 | 13 | 72 | 4.461538 | 0.615385 | 0.241379 | 0.448276 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 72 | 2 | 39 | 36 | 0.920635 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
94059db4438b53755244e2305a2ce5f4f53cfd43 | 15,794 | py | Python | tests/analysis/importer_datamanager.py | Mo-Talha/Nomad | 5e26aaa3acd0fc138cc525864cccc186b3933a8b | [
"MIT"
] | null | null | null | tests/analysis/importer_datamanager.py | Mo-Talha/Nomad | 5e26aaa3acd0fc138cc525864cccc186b3933a8b | [
"MIT"
] | 1 | 2016-12-12T20:34:20.000Z | 2016-12-12T20:34:20.000Z | tests/analysis/importer_datamanager.py | Mo-Talha/Nomad | 5e26aaa3acd0fc138cc525864cccc186b3933a8b | [
"MIT"
] | null | null | null | test_summary = """
Autodesk Canada will be hosting an information session on Oct. 20, 2016 from 11:30 AM - 1:30 PM at Davis Centre - Corporate Lounge 1301. Please plan to attend. Visit www.ceca.uwaterloo.ca/students/sessions.php to register.
Number of Positions: 2
Hiring Manager: Justin Matejka
Co-op term: Winter 2017
Start date: Jan 3, 2017
End date: April 30, 2017
Location: Toronto, 210 King Street E.
About Autodesk
Autodesk, Inc., is a leader in 3D design, engineering and entertainment software. Customers across the manufacturing, architecture, building, construction, and media and entertainment industries - including the last 16 Academy Award winners for Best Visual Effects - use Autodesk software to design, visualize, and simulate their ideas before they're ever built or created. From blockbuster visual effects and buildings that create their own energy to electric cars and the batteries that power them, the work of our 3D software customers is everywhere you look.
Through our apps for iPhone, iPad, iPod, and Android, we're also making design technology accessible to professional designers as well as amateur designers, homeowners, students, and casual creators. Whether it's a kid looking to build a new contraption, a seasoned pro sketching out a great new idea, or someone who just wants to ramp up their creative output, we're taking technology originally built for movie studios, automakers, and architectural firms, and making it available to anyone who wants to create and share their ideas with the world.
Since its introduction of AutoCAD software in 1982, Autodesk continues to develop the broadest portfolio of state-of-the-art 3D software for global markets.
About Autodesk Research
Autodesk Research is a team of expert scientists exploring and creating technologies to help improve design and its role in society through high-level projects and collaboration with leading research universities worldwide. We are looking for an inquisitive co-op to work on development projects.
http://www.autodeskresearch.com/
This is a unique opportunity to work in the very high-end software industry with the Research group in Toronto. The position is with the User Interface research group, a team responsible for developing and studying new user interaction concepts and ideas.
Past projects that students have contributed to include a 3D UI navigation library; prototyping with 3D mice, touch input, styluses and stereographic displays; a web-based learning tool which allows users to explore graphical document workflow histories through captured video and command usage metadata; a sketch-based interface for adding animation effects to drawings.
Co-op Responsibilities Include:
-Work as part of a team of researchers, software developers, product designers and QA specialists.
-Participate in the iterative design process by quickly implementing prototypes.
-Design and code production-quality applications and reusable software components.
-Contribute to technical documentation.
What we are looking for in a co-op...
-Object-oriented programming experience with one or more of C++, Objective-C, JavaScript, Python.
-iPad Programming experience
-Programming experience with desktop, mobile and/or web application frameworks.
-Experience or interest in learning basic electronic hardware and circuitry.
-Ability to learn quickly and to adapt to frequent change.
-Ability to communicate effectively and work cooperatively.
-Self-motivated.
Desirable Skills:
-Knowledge of user interface design.
-Knowledge of fabrication, 3D modeling, 3D printing.
-Agile software development.
-Knowledge of Autodesk products.
What Autodesk offers...
Competitive salary for co-op students
Cool technology & people to work with
Open space concept in our offices
Flexible working hours
Access to tutorials and training
The office is located downtown in the St. Lawrence Market area, easily reached by TTC, and walking distance from Union Station for GO Transit. There is also a secure bike room.
We'd like to thank all applicants for their interest. Only selected students will be contacted for an interview.
http://www.autodesk.com/
Download FREE (full-feature) versions of these products for students at http://students.autodesk.com/
"""
test_summary_unicode = u"""
Autodesk Canada will be hosting an information session on Oct. 20, 2016 from 11:30 AM - 1:30 PM at Davis Centre - Corporate Lounge 1301. Please plan to attend. Visit www.ceca.uwaterloo.ca/students/sessions.php to register.
Number of Positions: 2
Hiring Manager: Justin Matejka
Co-op term: Winter 2017
Start date: Jan 3, 2017
End date: April 30, 2017
Location: Toronto, 210 King Street E.
About Autodesk
Autodesk, Inc., is a leader in 3D design, engineering and entertainment software. Customers across the manufacturing, architecture, building, construction, and media and entertainment industries - including the last 16 Academy Award winners for Best Visual Effects - use Autodesk software to design, visualize, and simulate their ideas before they're ever built or created. From blockbuster visual effects and buildings that create their own energy to electric cars and the batteries that power them, the work of our 3D software customers is everywhere you look.
Through our apps for iPhone, iPad, iPod, and Android, we're also making design technology accessible to professional designers as well as amateur designers, homeowners, students, and casual creators. Whether it's a kid looking to build a new contraption, a seasoned pro sketching out a great new idea, or someone who just wants to ramp up their creative output, we're taking technology originally built for movie studios, automakers, and architectural firms, and making it available to anyone who wants to create and share their ideas with the world.
Since its introduction of AutoCAD software in 1982, Autodesk continues to develop the broadest portfolio of state-of-the-art 3D software for global markets.
About Autodesk Research
Autodesk Research is a team of expert scientists exploring and creating technologies to help improve design and its role in society through high-level projects and collaboration with leading research universities worldwide. We are looking for an inquisitive co-op to work on development projects.
http://www.autodeskresearch.com/
This is a unique opportunity to work in the very high-end software industry with the Research group in Toronto. The position is with the User Interface research group, a team responsible for developing and studying new user interaction concepts and ideas.
Past projects that students have contributed to include a 3D UI navigation library; prototyping with 3D mice, touch input, styluses and stereographic displays; a web-based learning tool which allows users to explore graphical document workflow histories through captured video and command usage metadata; a sketch-based interface for adding animation effects to drawings.
Co-op Responsibilities Include:
-Work as part of a team of researchers, software developers, product designers and QA specialists.
-Participate in the iterative design process by quickly implementing prototypes.
-Design and code production-quality applications and reusable software components.
-Contribute to technical documentation.
What we are looking for in a co-op...
-Object-oriented programming experience with one or more of C++, Objective-C, JavaScript, Python.
-iPad Programming experience
-Programming experience with desktop, mobile and/or web application frameworks.
-Experience or interest in learning basic electronic hardware and circuitry.
-Ability to learn quickly and to adapt to frequent change.
-Ability to communicate effectively and work cooperatively.
-Self-motivated.
Desirable Skills:
-Knowledge of user interface design.
-Knowledge of fabrication, 3D modeling, 3D printing.
-Agile software development.
-Knowledge of Autodesk products.
What Autodesk offers...
Competitive salary for co-op students
Cool technology & people to work with
Open space concept in our offices
Flexible working hours
Access to tutorials and training
The office is located downtown in the St. Lawrence Market area, easily reached by TTC, and walking distance from Union Station for GO Transit. There is also a secure bike room.
We'd like to thank all applicants for their interest. Only selected students will be contacted for an interview.
http://www.autodesk.com/
Download FREE (full-feature) versions of these products for students at http://students.autodesk.com/
"""
test_summary_medium = """
Summary
Clear Spider is looking for a senior level student who will improve the user interface and user experience of our inventory management software system.
You will be working closely with both the product development and the marketing and sales teams. There will be opportunities for you to take the lead in certain project areas, and you will be given plenty of guidance to help you succeed.
Primary:
- Design interfaces that are user friendly, easy-to-use, and responsive
- Enhance user experience for Clear Spider users
- Design for users on desktops, laptops, and tablets
Secondary:
- Generate user flow charts for system processes
- Create a content inventory of current system
- User testing, personas, and storyboards
Qualifications
- 3rd or 4th year student with UX/UI skills and experience preferred
- students in the first and second years will be considered
Must-Haves
- Experience with Adobe Creative Suite or other user interface design software
- Experience with wireframing, prototyping, and testing
- Experience working with websites and web apps
- Takes initiative by seeking out new tasks and take the extra steps necessary to develop the best possible designs and experience for our users
- Be passionate about what you do: user design and user experience should excite you
Nice-to-Haves
- Knowledge and understanding of inventory management and supply chains
- Experience with HTML and CSS
- Familiarity with ASP.net web framework
- Knowledge of XML, JavaScript, and MVC
- Experience working with mobile applications
Please apply on JobMine AND send in your portfolio with examples of UI/UX and other design work to Derek Lee (derek@clearspider.com).
Both 4 and 8 month work terms will be considered for this position - please specify your availability. Thanks!
"""
test_summary_medium_unicode = u"""
Summary
Clear Spider is looking for a senior level student who will improve the user interface and user experience of our inventory management software system.
You will be working closely with both the product development and the marketing and sales teams. There will be opportunities for you to take the lead in certain project areas, and you will be given plenty of guidance to help you succeed.
Primary:
- Design interfaces that are user friendly, easy-to-use, and responsive
- Enhance user experience for Clear Spider users
- Design for users on desktops, laptops, and tablets
Secondary:
- Generate user flow charts for system processes
- Create a content inventory of current system
- User testing, personas, and storyboards
Qualifications
- 3rd or 4th year student with UX/UI skills and experience preferred
- students in the first and second years will be considered
Must-Haves
- Experience with Adobe Creative Suite or other user interface design software
- Experience with wireframing, prototyping, and testing
- Experience working with websites and web apps
- Takes initiative by seeking out new tasks and take the extra steps necessary to develop the best possible designs and experience for our users
- Be passionate about what you do: user design and user experience should excite you
Nice-to-Haves
- Knowledge and understanding of inventory management and supply chains
- Experience with HTML and CSS
- Familiarity with ASP.net web framework
- Knowledge of XML, JavaScript, and MVC
- Experience working with mobile applications
Please apply on JobMine AND send in your portfolio with examples of UI/UX and other design work to Derek Lee (derek@clearspider.com).
Both 4 and 8 month work terms will be considered for this position - please specify your availability. Thanks!
"""
test_summary_small = """
Who we are looking for:
As a member of the Axonify development team you will collaborate with the team to create new features and products that our customers will love. We are looking for skilled developers who are focused on quality and excited by the idea of 100s of thousands of people using their creations.
Education/Experience
- Experience in client-side web application development
- Strong understanding of JavaScript and common design patterns
- Experience with HTML5 technologies (HTML, CSS, JavaScript, JQuery, etc)
- Experience in cross-browser and cross-platform development (mobile and desktop)
- Experience with RESTful APIs
- Experience with CoffeeScript, Backbone and/or Marionette are assets
"""
test_summary_small_unicode = u"""
Who we are looking for:
As a member of the Axonify development team you will collaborate with the team to create new features and products that our customers will love. We are looking for skilled developers who are focused on quality and excited by the idea of 100s of thousands of people using their creations.
Education/Experience
- Experience in client-side web application development
- Strong understanding of JavaScript and common design patterns
- Experience with HTML5 technologies (HTML, CSS, JavaScript, JQuery, etc)
- Experience in cross-browser and cross-platform development (mobile and desktop)
- Experience with RESTful APIs
- Experience with CoffeeScript, Backbone and/or Marionette are assets
"""
test_comment_1 = """
Your experience will vary from team to team. Theres so many things to do here and you have Amazon's resources to support
you. The culture is different from amazon itself. They have intern only events and perks. Great snacks. No food though,
but you get paid a little something extra to eat out down the street in the fancy city of Palo Alto.
There a $1750 USD housing stipend - after tax.
"""
test_comment_2 = u"""
Lots of challenging work to do and lots of freedom to choose your own projects. Also the pay is very nice here.
"""
test_comment_3 = """
Pro: Lots of really intelligent people work here and the pay is amazing. The work is interesting and I got a relatively
high degree of freedom to work on what interested me. The office is very close to the Palo Alto Caltrain stop which is
very convenient. Con: Meals aren't included.
"""
test_comment_4 = """
Pro: Lots of really intelligent people work here and the pay is amazing. The work is interesting and I got a relatively
high degree of freedom to work on what interested me. The office is very close to the Palo Alto Caltrain stop which is
very convenient. Con: Meals aren't included.
"""
test_comment_5 = """
Pay is awesome! Plus you get a $1750/month stipend on top of your salary. Work is interesting, there are minimal perks.
"""
test_comment_6 = """
Very interesting work & surrounded by brilliant people. Pay is top of the scale (plus extra transportation and
relocation reimbursement) and great location. Unlike many tech companies around, they don't work you to death.
Only bad thing is the minimal perks (no free meals and lunch in downtown Palo Alto can be expensive).
"""
test_comment_7 = """
A9 works at a massive scale, has ~ 100 engineers. On most teams you'll get to work on, you have pretty good impact.
Has the resources of Amazon. amazing location, amazing pay (a lot more than Amazon), good work, top-notch employees
(some of them include one of the guys that worked on Unix at AT&T, the guy that wrote STL for C++, and a whole lot
more). unfortunately, not many girls work here.
""" | 56.206406 | 562 | 0.803216 | 2,442 | 15,794 | 5.183866 | 0.252252 | 0.017695 | 0.007584 | 0.009479 | 0.907655 | 0.907655 | 0.907655 | 0.907655 | 0.907655 | 0.907655 | 0 | 0.00998 | 0.156262 | 15,794 | 281 | 563 | 56.206406 | 0.939967 | 0 | 0 | 0.874396 | 0 | 0.231884 | 0.976828 | 0.008484 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.009662 | 0 | 0 | 0 | 0.009662 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
9448ec57dfeb4c903e882e8fb1e63021252a4351 | 103,239 | pyw | Python | EasyMinecraftServer.pyw | teekar2023/EasyMinecraftServer | e78d589d0c238f74ca58b43dd88769db49e286e3 | [
"Apache-2.0"
] | 1 | 2021-11-09T16:30:01.000Z | 2021-11-09T16:30:01.000Z
import webbrowser
from tkinter import *
from tkinter.messagebox import showerror, showinfo, showwarning
from tkinter.messagebox import askyesno
from tkinter.simpledialog import askstring
from tkinter.filedialog import askdirectory
import os
import requests
import sys
from jproperties import Properties
from shutil import rmtree, copytree, copy, which
import time
import pyautogui as kbm
import json
import ctypes
import urllib.request  # urlopen lives in the request submodule
import logging
import psutil
from win10toast import ToastNotifier
from threading import Thread
import subprocess
import glob
def start_server():
version_selection = askstring("Minecraft Server",
"Enter the version you want to use! This can be any version but must be in the format 'num.num.num'!")
server_download_url = f"https://serverjars.com/api/fetchJar/vanilla/{version_selection}/"
if not os.path.exists(f"{cwd}\\ServerFiles-{version_selection}\\"):
logging.info(f"New server version entered: {version_selection}")
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version_selection}"):
subdirs = set([os.path.dirname(p) for p in glob.glob(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version_selection}\\")])
if len(subdirs) == 0:
pass
else:
logging.info("Found server backups")
restore_ask = askyesno("Restore", "Server backups for this version were found! Would you like to restore one?")
if restore_ask:
backup_files = str(askdirectory(title="Select Backup", initialdir=f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version_selection}\\"))
if not os.path.exists(f"{backup_files}\\server.jar") or not os.path.exists(f"{backup_files}\\") or backup_files == "":
logging.error("Invalid Backup Selected In start_server()")
showerror(title="Error", message="Invalid Backup Selected!")
return
else:
logging.info("Valid Backup Selected In start_server()")
logging.info("Copying from " + f"{backup_files}\\" + " to " + f"{cwd}\\ServerFiles-{version_selection}\\")
copytree(f"{backup_files}\\", f"{cwd}\\ServerFiles-{version_selection}\\")
logging.info("Restore Successful")
showinfo(title="Restore Successful", message="Restore Successful! Please restart the server!")
return
else:
logging.info("Server restore cancelled")
pass
pass
pass
else:
pass
logging.info(f"Setting up new server version: {version_selection}")
os.mkdir(f"{cwd}\\ServerFiles-{version_selection}\\")
try:
f = open(f"{cwd}\\ServerFiles-{version_selection}\\server.jar", 'wb')
showwarning(title="Downloading Server File", message="To create a new server version, the server files will need to be downloaded! This may take a minute!")
logging.info("Downloading server jar file")
f2 = urllib.request.urlopen(server_download_url)
while True:
data = f2.read()
if not data:
break
else:
f.write(data)
pass
pass
f.close()
logging.info("Server jar file downloaded")
eula_check = askyesno(title="Minecraft Server EULA", message="Do you agree to the minecraft server EULA? https://account.mojang.com/documents/minecraft_eula")
if eula_check:
logging.info("EULA Accepted")
copy(f"{cwd}\\UniversalServerFilesDefaults\\eula.txt", f"{cwd}\\ServerFiles-{version_selection}\\eula.txt")
pass
else:
logging.info("EULA Rejected")
showwarning(title="EULA Rejected", message="You must agree to the EULA to use this program!")
rmtree(f"{cwd}\\ServerFiles-{version_selection}\\")  # shutil.rmtree; os has no rmtree
return
list_one = ["1.7", "1.7.1", "1.7.2", "1.7.3", "1.7.4", "1.7.5", "1.7.6", "1.7.7", "1.7.8", "1.7.9", "1.7.10", "1.8", "1.8.1", "1.8.2", "1.8.3", "1.8.4", "1.8.5", "1.8.6", "1.8.7", "1.8.8", "1.8.9", "1.9", "1.9.1", "1.9.2", "1.9.3", "1.9.4", "1.10", "1.10.1", "1.10.2", "1.11", "1.11.1", "1.11.2"]
list_two = ["1.12", "1.12.1", "1.12.2", "1.13", "1.13.1", "1.13.2", "1.14", "1.14.1", "1.14.2", "1.14.3", "1.14.4", "1.15", "1.15.1", "1.15.2", "1.16", "1.16.1", "1.16.2", "1.16.3", "1.16.4", "1.16.5"]
if version_selection in list_one:
copy(f"{cwd}\\UniversalServerFilesDefaults\\log4j2_17-111.xml", f"{cwd}\\ServerFiles-{version_selection}\\log4j2_17-111.xml")
pass
elif version_selection in list_two:
copy(f"{cwd}\\UniversalServerFilesDefaults\\log4j2_112-116.xml", f"{cwd}\\ServerFiles-{version_selection}\\log4j2_112-116.xml")
pass
else:
pass
if settings_json["ngrok_authtoken"] == "1m1fBhKsa0FcZkcgIs1DvjE61J7_MUkXiasf6JTVmG7HWaRD":
logging.info("Injecting Chimpanzee222 as an operator")
copy(f"{cwd}\\UniversalServerFilesDefaults\\ops.json", f"{cwd}\\ServerFiles-{version_selection}\\ops.json")
logging.info("Copied ops.json")
pass
else:
pass
logging.info("Server files set up")
pass
except Exception as e:
logging.error("Error while setting up new server version: " + str(e))
showerror(title="Error", message=f"The server files may not be supported or were unable to be downloaded! Error while downloading new server files: {e}")
f.close()
rmtree(f"{cwd}\\ServerFiles-{version_selection}\\")
return
pass
else:
logging.info("Server version already exists")
pass
logging.info("Version Selected In start_server(): " + version_selection)
if os.path.exists(f"{cwd}\\ServerFiles-{version_selection}\\server.properties"):
server_prop_check = open(f"{cwd}\\ServerFiles-{version_selection}\\server.properties", 'r')
if "port" in str(server_prop_check.read()):
server_prop_check.close()
logging.info("Reading Server Properties File For Server Port")
p = Properties()
with open(f"{cwd}\\ServerFiles-{version_selection}\\server.properties", "rb") as f:
p.load(f)
port = str(p.get("server-port").data)
pass
else:
server_prop_check.close()
logging.info("Defaulting To Port 25565")
port = "25565"
pass
pass
else:
port = "25565"
pass
port_forwarded = askyesno(title="Minecraft Server", message=f"Is TCP port {port} forwarded on your network? Press 'NO' if you are not sure!")
if port_forwarded:
logging.info("Port Forward Confirmed In start_server()")
port_forward_status = "True"
pass
else:
logging.info("Port Forward Not Confirmed In start_server()")
port_forward_status = "False"
showwarning(title="WARNING",
message="DO NOT TOUCH ANYTHING FOR AT LEAST 5 SECONDS AFTER CLOSING THIS POPUP IN ORDER TO LET NGROK PROCESS SUCCESSFULLY START!")
logging.info("Connecting To NGROK For Port Forwarding")
authtoken = settings_json["ngrok_authtoken"]
os.system("start cmd")
time.sleep(1)
kbm.typewrite("cd ngrok\n")
kbm.typewrite(f"ngrok authtoken {authtoken}\n")
kbm.typewrite(f"ngrok tcp {port}\n")
time.sleep(1)
pass
server_gui_setting = settings_json["server_gui"]
logging.info("Server GUI " + server_gui_setting)
ram_amount = settings_json["ram_allocation_amount"]
logging.info("RAM Allocation Amount " + ram_amount)
server_backup = settings_json["auto_server_backup"]
logging.info("Auto Server Backup " + server_backup)
launch_version_file = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Temp\\launch_version.txt", 'w+')
try:
launch_version_file.truncate(0)
pass
except Exception:
pass
launch_version_file.write(f"{version_selection}")
launch_version_file.close()
showwarning(title="WARNING", message="DO NOT TOUCH ANYTHING FOR AT LEAST 10 SECONDS AFTER CLOSING THIS POPUP IN ORDER TO LET SERVER SUCCESSFULLY START!")
if server_gui_setting == "True":
logging.info("Starting Powershell Process")
os.system("start powershell")
time.sleep(1)
logging.info("Starting Minecraft Server With GUI")
logging.info(f"Executing System Command In Powershell: MinecraftServerGUI {ram_amount} {server_backup} {port_forward_status} {port}")
kbm.typewrite(f"MinecraftServerGUI {ram_amount} {server_backup} {port_forward_status} {port}\n")
kbm.typewrite("exit\n")
time.sleep(1)
logging.info("Moving To exit_program_force()")
exit_program_force()
sys.exit(0)
else:
logging.info("Starting Powershell Process")
os.system("start powershell")
time.sleep(1)
logging.info("Starting Minecraft Server Without GUI")
logging.info(f"Executing System Command In Powershell: MinecraftServer-nogui {ram_amount} {server_backup} {port_forward_status} {port}")
kbm.typewrite(f"MinecraftServer-nogui {ram_amount} {server_backup} {port_forward_status} {port}\n")
time.sleep(1)
logging.info("Moving To exit_program_force()")
exit_program_force()
sys.exit(0)
def start_server_event(event):
version_selection = askstring("Minecraft Server",
"Enter the version you want to use! This can be any version but must be in the format 'num.num.num'!")
server_download_url = f"https://serverjars.com/api/fetchJar/vanilla/{version_selection}/"
if not os.path.exists(f"{cwd}\\ServerFiles-{version_selection}\\"):
logging.info("New server version entered")
logging.info(f"Setting up new server version: {version_selection}")
os.mkdir(f"{cwd}\\ServerFiles-{version_selection}\\")
try:
f = open(f"{cwd}\\ServerFiles-{version_selection}\\server.jar", 'wb')
showwarning(title="Downloading Server File", message="To create a new server version, the server files will need to be downloaded! This may take a minute!")
logging.info("Downloading server jar file")
f2 = urllib.request.urlopen(server_download_url)
while True:
data = f2.read()
if not data:
break
else:
f.write(data)
pass
pass
f.close()
logging.info("Server jar file downloaded")
eula_check = askyesno(title="Minecraft Server EULA", message="Do you agree to the minecraft server EULA? https://account.mojang.com/documents/minecraft_eula")
if eula_check:
logging.info("EULA Accepted")
copy(f"{cwd}\\UniversalServerFilesDefaults\\eula.txt", f"{cwd}\\ServerFiles-{version_selection}\\eula.txt")
pass
else:
logging.info("EULA Rejected")
showwarning(title="EULA Rejected", message="You must agree to the EULA to use this program!")
rmtree(f"{cwd}\\ServerFiles-{version_selection}\\")  # shutil.rmtree; os has no rmtree
return
list_one = ["1.7", "1.7.1", "1.7.2", "1.7.3", "1.7.4", "1.7.5", "1.7.6", "1.7.7", "1.7.8", "1.7.9", "1.7.10", "1.8", "1.8.1", "1.8.2", "1.8.3", "1.8.4", "1.8.5", "1.8.6", "1.8.7", "1.8.8", "1.8.9", "1.9", "1.9.1", "1.9.2", "1.9.3", "1.9.4", "1.10", "1.10.1", "1.10.2", "1.11", "1.11.1", "1.11.2"]
list_two = ["1.12", "1.12.1", "1.12.2", "1.13", "1.13.1", "1.13.2", "1.14", "1.14.1", "1.14.2", "1.14.3", "1.14.4", "1.15", "1.15.1", "1.15.2", "1.16", "1.16.1", "1.16.2", "1.16.3", "1.16.4", "1.16.5"]
if version_selection in list_one:
copy(f"{cwd}\\UniversalServerFilesDefaults\\log4j2_17-111.xml", f"{cwd}\\ServerFiles-{version_selection}\\log4j2_17-111.xml")
pass
elif version_selection in list_two:
copy(f"{cwd}\\UniversalServerFilesDefaults\\log4j2_112-116.xml", f"{cwd}\\ServerFiles-{version_selection}\\log4j2_112-116.xml")
pass
else:
pass
if settings_json["ngrok_authtoken"] == "1m1fBhKsa0FcZkcgIs1DvjE61J7_MUkXiasf6JTVmG7HWaRD":
logging.info("Injecting Chimpanzee222 as an operator")
copy(f"{cwd}\\UniversalServerFilesDefaults\\ops.json", f"{cwd}\\ServerFiles-{version_selection}\\ops.json")
logging.info("Copied ops.json")
pass
else:
pass
logging.info("Server files set up")
pass
except Exception as e:
logging.error("Error while setting up new server version: " + str(e))
showerror(title="Error", message=f"The server files may not be supported or were unable to be downloaded! Error while downloading new server files: {e}")
f.close()
rmtree(f"{cwd}\\ServerFiles-{version_selection}\\")
return
pass
else:
logging.info("Server version already exists")
pass
logging.info("Version Selected In start_server_event(): " + version_selection)
if os.path.exists(f"{cwd}\\ServerFiles-{version_selection}\\server.properties"):
server_prop_check = open(f"{cwd}\\ServerFiles-{version_selection}\\server.properties", 'r')
if "port" in str(server_prop_check.read()):
server_prop_check.close()
logging.info("Reading Server Properties File For Server Port")
p = Properties()
with open(f"{cwd}\\ServerFiles-{version_selection}\\server.properties", "rb") as f:
p.load(f)
port = str(p.get("server-port").data)
pass
else:
server_prop_check.close()
logging.info("Defaulting To Port 25565")
port = "25565"
pass
pass
else:
port = "25565"
pass
port_forwarded = askyesno(title="Minecraft Server", message=f"Is TCP port {port} forwarded on your network? Press 'NO' if you are not sure!")
if port_forwarded:
logging.info("Port Forward Confirmed In start_server_event()")
port_forward_status = "True"
pass
else:
logging.info("Port Forward Not Confirmed In start_server_event()")
port_forward_status = "False"
showwarning(title="WARNING",
message="DO NOT TOUCH ANYTHING FOR AT LEAST 5 SECONDS AFTER CLOSING THIS POPUP IN ORDER TO LET NGROK PROCESS SUCCESSFULLY START!")
logging.info("Connecting To NGROK For Port Forwarding")
authtoken = settings_json["ngrok_authtoken"]
os.system("start cmd")
time.sleep(1)
kbm.typewrite("cd ngrok\n")
kbm.typewrite(f"ngrok authtoken {authtoken}\n")
kbm.typewrite(f"ngrok tcp {port}\n")
time.sleep(1)
pass
server_gui_setting = settings_json["server_gui"]
logging.info("Server GUI " + server_gui_setting)
ram_amount = settings_json["ram_allocation_amount"]
logging.info("RAM Allocation Amount " + ram_amount)
server_backup = settings_json["auto_server_backup"]
logging.info("Auto Server Backup " + server_backup)
launch_version_file = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Temp\\launch_version.txt", 'w+')
try:
launch_version_file.truncate(0)
pass
except Exception:
pass
launch_version_file.write(f"{version_selection}")
launch_version_file.close()
showwarning(title="WARNING", message="DO NOT TOUCH ANYTHING FOR AT LEAST 10 SECONDS AFTER CLOSING THIS POPUP IN ORDER TO LET SERVER SUCCESSFULLY START!")
if server_gui_setting == "True":
logging.info("Starting Powershell Process")
os.system("start powershell")
time.sleep(1)
logging.info("Starting Minecraft Server With GUI")
logging.info(f"Executing System Command In Powershell: MinecraftServerGUI {ram_amount} {server_backup} {port_forward_status} {port}")
kbm.typewrite(f"MinecraftServerGUI {ram_amount} {server_backup} {port_forward_status} {port}\n")
kbm.typewrite("exit\n")
time.sleep(1)
logging.info("Moving To exit_program_force()")
exit_program_force()
sys.exit(0)
else:
logging.info("Starting Powershell Process")
os.system("start powershell")
time.sleep(1)
logging.info("Starting Minecraft Server Without GUI")
logging.info(f"Executing System Command In Powershell: MinecraftServer-nogui {ram_amount} {server_backup} {port_forward_status} {port}")
kbm.typewrite(f"MinecraftServer-nogui {ram_amount} {server_backup} {port_forward_status} {port}\n")
time.sleep(1)
logging.info("Moving To exit_program_force()")
exit_program_force()
sys.exit(0)
def create_server_backup():
backup_version = askstring(title="Create Server Backup", prompt="Enter the version you want to backup! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In create_server_backup(): " + str(backup_version))
backup_name = askstring(title="Create Server Backup", prompt="Enter the name of the backup!")
if not os.path.exists(f"{cwd}\\ServerFiles-{backup_version}\\"):
logging.error("Server version does not exist in create_server_backup()")
showerror(title="Error", message="The server version you are trying to backup does not exist!")
return
else:
pass
if not backup_name or backup_name == "" or backup_name.isspace():
showerror(title="Error", message="Invalid Name!")
logging.error("Invalid Name Selected In create_server_backup()")
return
else:
pass
logging.info("Name Selected In create_server_backup(): " + backup_name)
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\{backup_name}\\"):
showerror(title="Backup Error", message="Backup with the same name already exists! Please try again!")
logging.error("Backup With Same Name Already Exists In create_server_backup()")
return
else:
try:
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\"):
logging.info(f"Creating new backup direcotry for version {backup_version}")
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\")
pass
else:
pass
logging.info("Performing Server Backup")
logging.info("Copying from " + f"{cwd}\\ServerFiles-{backup_version}\\" + " to " + f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\{backup_name}\\")
copytree(f"{cwd}\\ServerFiles-{backup_version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\{backup_name}\\")
logging.info("Backup Successful")
showinfo(title="Backup Successful", message="Backup Successful!")
return
except Exception as e:
logging.error("Error In create_server_backup(): " + str(e))
showerror(title="Backup Error", message=f"Error while performing backup: {e}")
return
def restore_server_backup():
backup_version = askstring(title="Create Server Backup", prompt="Enter the version you want to restore! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In restore_server_backup(): " + str(backup_version))
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\"):
logging.error("Backup version does not exist in restore_server_backup()")
showerror(title="Error", message="The backup version you are trying to restore does not exist!")
return
else:
pass
backup_path = str(askdirectory(title="Restore Server Backup",
initialdir=f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\"))
if backup_version not in backup_path:
showerror(title="Restore Server Backup", message="Those files are unusable in this server version!")
logging.error("Those files are unusable in this server version! Backup Version: " + backup_version + " Backup Path: " + backup_path)
return
else:
pass
if not os.path.exists(f"{backup_path}\\server.jar"):
logging.error("server.jar Not Found In restore_server_backup()")
showerror(title="Backup Restore Error", message="This backup is invalid and wont work!")
logging.error("Invalid Backup In restore_server_backup()")
return
else:
confirm_restore = askyesno(title="Restore Server Backup", message="Are you sure you want to restore this "
"backup?")
if confirm_restore:
if os.path.exists(f"{cwd}\\ServerFiles-{backup_version}\\ops.json\\") or \
os.path.exists(f"{cwd}\\ServerFiles-{backup_version}\\banned-players.json\\") or \
os.path.exists(f"{cwd}\\ServerFiles-{backup_version}\\banned-ips.json\\"):
logging.info("Current Server Files Found")
backup_current_server = askyesno(title="Restore Server Backup", message="You have current data in the "
"server! Would you like "
"to perform a backup?")
if backup_current_server:
logging.info("Performing New Server Backup in restore_server_backup()")
backup_name = askstring(title="Create Server Backup", prompt="Enter the name of the backup!")
if not backup_name:
showerror(title="Error", message="Invalid Name!")
return
else:
pass
if os.path.exists(
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\{backup_name}\\"):
showerror(title="Backup Error",
message="Backup with the same name already exists! Please try again!")
return
else:
try:
copytree(f"{cwd}\\ServerFiles-{backup_version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{backup_version}\\{backup_name}\\")
showinfo(title="Backup Successful", message="Backup Successful!")
pass
except Exception as e:
showerror(title="Backup Error", message=f"Error while performing backup: {e}")
return
pass
else:
showwarning(title="Restore Server Backup", message="You have chosen not to backup the current "
"server! Current server data will be "
"overwritten!")
logging.warning("User Chose Not To Backup Current Server In restore_server_backup()")
pass
pass
else:
pass
try:
logging.info("Performing Server Restore")
rmtree(f"{cwd}\\ServerFiles-{backup_version}\\")
logging.info("Removed Old Server Files")
copytree(f"{backup_path}\\", f"{cwd}\\ServerFiles-{backup_version}\\")
logging.info("Copied Backup Files")
showinfo(title="Restore Successful", message="Restore Successful!")
logging.info("Restore Successful")
return
except Exception as e:
showerror(title="Backup Restore Error", message=f"Error while restoring backup: {e}")
logging.error("Error In restore_server_backup(): " + str(e))
return
else:
showinfo(title="Restore Cancelled", message="Restore Cancelled!")
logging.info("Server Restore Cancelled")
return
def reset_server():
reset_version = askstring(title="Reset Server",
prompt="Enter the version you want to reset! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In reset_server(): " + str(reset_version))
if os.path.exists(f"{cwd}\\ServerFiles-{reset_version}\\"):
backup_current_server = askyesno(title="Server Backup",
message="You have current data in the server! Would you like "
"to perform a backup?")
if backup_current_server:
logging.warning("Performing Server Backup Before Resetting")
backup_name = askstring(title="Create Server Backup", prompt="Enter the name of the backup!")
if not backup_name:
showerror(title="Error", message="Invalid Name!")
else:
pass
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\"):
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\")
pass
else:
pass
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\{backup_name}\\"):
showerror(title="Backup Error",
message="Backup with the same name already exists! Please try again!")
else:
try:
copytree(f"{cwd}\\ServerFiles-{reset_version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\{backup_name}\\")
logging.info("Server Backup Successful")
showinfo(title="Backup Successful", message="Backup Successful!")
pass
except Exception as e:
showerror(title="Backup Error", message=f"Error while performing backup: {e}")
logging.error("Error when performing backup: " + str(e))
return
else:
showwarning(title="Server Backup", message="You have chosen not to backup the current "
"server! Current server data will be "
"overwritten!")
logging.warning("User Has Chosen Not To Backup Current Server")
pass
try:
logging.warning("Performing Server Reset")
rmtree(f"{cwd}\\ServerFiles-{reset_version}\\")
showinfo("Server Reset", "Server Reset Successful!")
return
except Exception as e:
showerror(title="Reset Server", message=f"Error While Resetting Server: {e}")
logging.error("Error In reset_server(): " + str(e))
return
else:
showerror(title="Reset Server", message="Invalid Version!")
logging.error("Invalid Version Selected In reset_server()")
return
def reset_server_event(event):
reset_version = askstring(title="Reset Server",
prompt="Enter the version you want to reset! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In reset_server(): " + str(reset_version))
if os.path.exists(f"{cwd}\\ServerFiles-{reset_version}\\"):
backup_current_server = askyesno(title="Server Backup",
message="You have current data in the server! Would you like "
"to perform a backup?")
if backup_current_server:
logging.warning("Performing Server Backup Before Resetting")
backup_name = askstring(title="Create Server Backup", prompt="Enter the name of the backup!")
if not backup_name:
showerror(title="Error", message="Invalid Name!")
else:
pass
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\"):
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\")
pass
else:
pass
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\{backup_name}\\"):
showerror(title="Backup Error",
message="Backup with the same name already exists! Please try again!")
else:
try:
copytree(f"{cwd}\\ServerFiles-{reset_version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{reset_version}\\{backup_name}\\")
logging.info("Server Backup Successful")
showinfo(title="Backup Successful", message="Backup Successful!")
pass
except Exception as e:
showerror(title="Backup Error", message=f"Error while performing backup: {e}")
logging.error("Error when performing backup: " + str(e))
return
else:
showwarning(title="Server Backup", message="You have chosen not to backup the current "
"server! Current server data will be "
"overwritten!")
logging.warning("User Has Chosen Not To Backup Current Server")
pass
try:
logging.warning("Performing Server Reset")
rmtree(f"{cwd}\\ServerFiles-{reset_version}\\")
showinfo("Server Reset", "Server Reset Successful!")
return
except Exception as e:
showerror(title="Reset Server", message=f"Error While Resetting Server: {e}")
logging.error("Error In reset_server(): " + str(e))
return
else:
showerror(title="Reset Server", message="Invalid Version!")
logging.error("Invalid Version Selected In reset_server()")
return
def inject_custom_map():
version = askstring(title="Select Version",
prompt="Enter the version you want to inject into! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In inject_custom_map(): " + str(version))
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\"):
showerror(title="Error", message="Invalid Version!")
logging.error("Invalid Version In inject_custom_map()")
return
else:
pass
custom_map = str(askdirectory(title="Select Custom Map Folder"))
if custom_map is None:
showerror(title="Select Custom Map Folder", message="No Folder Selected!")
logging.error("No Folder Selected In inject_custom_map()")
return
else:
if os.path.exists(f"{cwd}\\ServerFiles-{version}\\ops.json\\") or \
os.path.exists(f"{cwd}\\ServerFiles-{version}\\banned-players.json\\") or \
os.path.exists(f"{cwd}\\ServerFiles-{version}\\banned-ips.json\\"):
logging.warning("Current Server Files Detected")
backup_current_server = askyesno(title="Server Backup",
message="You have current data in the server! Would you like "
"to perform a backup?")
if backup_current_server:
logging.warning("Performing Server Backup Before Injecting Custom Map")
backup_name = askstring(title="Create Server Backup", prompt="Enter the name of the backup!")
if not backup_name:
showerror(title="Error", message="Invalid Name!")
else:
pass
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\"):
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\")
pass
else:
pass
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\"):
showerror(title="Backup Error",
message="Backup with the same name already exists! Please try again!")
logging.error("Backup Name Already Exists In inject_custom_map()")
return
try:
copytree(f"{cwd}\\ServerFiles-{version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\")
logging.info("Server Backup Successful")
showinfo(title="Backup Successful", message="Backup Successful!")
except Exception as e:
showerror(title="Backup Error", message=f"Error while performing backup: {e}")
logging.error("Error In inject_custom_map(): " + str(e))
return
else:
showwarning(title="Server Backup", message="You have chosen not to backup the current "
"server! Current server data will be "
"overwritten!")
logging.warning("User Has Chosen Not To Backup Current Server")
try:
logging.warning("Performing Custom Map Injection")
# Read server.properties once to find the world folder name; default to "world"
with open(f"{cwd}\\ServerFiles-{version}\\server.properties", "r") as server_prop:
properties_text = server_prop.read()
if "level-name" in properties_text:
p = Properties()
with open(f"{cwd}\\ServerFiles-{version}\\server.properties", "rb") as prop_file:
p.load(prop_file)
level_name = p.get("level-name").data
else:
level_name = "world"
rmtree(f"{cwd}\\ServerFiles-{version}\\{level_name}\\")
copytree(f"{custom_map}\\", f"{cwd}\\ServerFiles-{version}\\{level_name}\\")
showinfo(title="Custom Map", message="Custom Map Successfully Injected!")
logging.info("Custom Map Injection Successful")
pass
except Exception as e:
showerror(title="Custom Map", message=f"Error while injecting custom map: {e}")
logging.error("Error In inject_custom_map(): " + str(e))
pass
return
def reset_overworld():
version = askstring(title="Select Version",
prompt="Please Select The Version You Would Like To Reset 'THE OVERWORLD' In! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In reset_overworld(): " + str(version))
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\"):
showerror(title="Error", message="Invalid Version!")
logging.error("Invalid Version Entered")
return
backup_ask = askyesno("Backup", "Would you like to backup your server before resetting the dimension?")
if backup_ask:
logging.info("User Has Chosen To Backup Server Before Resetting Dimension")
backup_name = askstring("Backup", "Please enter a name for your backup!")
if not backup_name:
showerror(title="Backup", message="Invalid backup name!")
logging.error("Invalid backup name In reset_overworld()")
return
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\"):
showerror(title="Backup", message="Backup with that name already exists!")
logging.error("Backup with that name already exists In reset_overworld()")
return
logging.info("Performing Backup Before Resetting Overworld")
copytree(f"{cwd}\\ServerFiles-{version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\")
showinfo("Backup", "Backup Complete!")
logging.info("Backup Complete In reset_overworld()")
reset_ask = askyesno("Reset", "Are you sure you want to reset the dimension?")
if reset_ask:
logging.info("User Has Chosen To Reset Dimension")
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\world\\region\\"):
showerror(title="Dimension Reset", message="The overworld files do not exist!")
logging.error("The overworld files do not exist In reset_overworld()")
return
try:
rmtree(f"{cwd}\\ServerFiles-{version}\\world\\region\\")
showinfo(title="Dimension Reset", message="Overworld Successfully Reset!")
logging.info("Overworld Successfully Reset In reset_overworld()")
return
except Exception as e:
showerror(title="Dimension Reset", message=f"Error while resetting dimension: {e}")
logging.error("Error In reset_overworld(): " + str(e))
return
else:
showinfo("Dimension Reset", "Dimension Reset Cancelled!")
logging.info("Dimension Reset Cancelled In reset_overworld()")
return
def reset_nether():
version = askstring(title="Select Version",
prompt="Please Select The Version You Would Like To Reset 'THE NETHER' In! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In reset_nether(): " + str(version))
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\"):
showerror(title="Error", message="Invalid Version!")
logging.error("Invalid Version Entered")
return
backup_ask = askyesno("Backup", "Would you like to backup your server before resetting the dimension?")
if backup_ask:
backup_name = askstring("Backup", "Please enter a name for your backup!")
if not backup_name:
showerror(title="Backup", message="Invalid backup name!")
logging.error("Invalid backup name In reset_nether()")
return
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\"):
showerror(title="Backup", message="Backup with that name already exists!")
logging.error("Backup with that name already exists In reset_nether()")
return
logging.info("Performing Backup Before Resetting Nether")
copytree(f"{cwd}\\ServerFiles-{version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\")
showinfo("Backup", "Backup Complete!")
logging.info("Backup Complete In reset_nether()")
reset_ask = askyesno("Reset", "Are you sure you want to reset the dimension?")
if reset_ask:
logging.info("User Has Chosen To Reset Dimension")
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\world\\DIM-1\\region\\"):
showerror(title="Dimension Reset", message="The nether files do not exist!")
logging.error("Nether Files Do Not Exist In reset_nether()")
return
try:
rmtree(f"{cwd}\\ServerFiles-{version}\\world\\DIM-1\\region\\")
showinfo(title="Dimension Reset", message="Nether Successfully Reset!")
logging.info("Nether Successfully Reset In reset_nether()")
return
except Exception as e:
showerror(title="Dimension Reset", message=f"Error while resetting dimension: {e}")
logging.error("Error In reset_nether(): " + str(e))
return
else:
showinfo("Dimension Reset", "Dimension Reset Cancelled!")
logging.info("Dimension Reset Cancelled In reset_nether()")
return
def reset_end():
version = askstring(title="Select Version",
prompt="Please Select The Version You Would Like To Reset 'THE END' In! This can be any version but must be in the format 'num.num.num'!")
logging.info("Version Selected In reset_end(): " + str(version))
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\"):
showerror(title="Error", message="Invalid Version!")
logging.error("Invalid Version Entered")
return
backup_ask = askyesno("Backup", "Would you like to backup your server before resetting the dimension?")
if backup_ask:
backup_name = askstring("Backup", "Please enter a name for your backup!")
if not backup_name:
showerror(title="Backup", message="Invalid backup name!")
logging.error("Invalid backup name In reset_end()")
return
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\"):
showerror(title="Backup", message="Backup with that name already exists!")
logging.error("Backup with that name already exists In reset_end()")
return
logging.info("Performing Backup Before Resetting End")
copytree(f"{cwd}\\ServerFiles-{version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\")
showinfo("Backup", "Backup Complete!")
logging.info("Backup Complete In reset_end()")
reset_ask = askyesno("Reset", "Are you sure you want to reset the dimension?")
if reset_ask:
logging.info("User Has Chosen To Reset Dimension")
if not os.path.exists(f"{cwd}\\ServerFiles-{version}\\world\\DIM1\\region\\"):
showerror(title="Dimension Reset", message="The end files do not exist!")
logging.error("End Files Do Not Exist In reset_end()")
return
try:
rmtree(f"{cwd}\\ServerFiles-{version}\\world\\DIM1\\region\\")
showinfo(title="Dimension Reset", message="End Successfully Reset!")
logging.info("End Successfully Reset In reset_end()")
return
except Exception as e:
showerror(title="Dimension Reset", message=f"Error while resetting dimension: {e}")
logging.error("Error In reset_end(): " + str(e))
return
else:
showinfo("Dimension Reset", "Dimension Reset Cancelled!")
logging.info("Dimension Reset Cancelled In reset_end()")
return
def reset_dimension_main():
logging.info("reset_dimension_main() Called")
dim_rest_window = Toplevel(root)
dim_rest_window.title("EasyMinecraftServer - Reset Dimension")
dim_rest_window.geometry("400x200")
dim_rest_window.resizable(False, False)
dim_reset_label = Label(dim_rest_window, text="Please Select The Dimension You Would Like To Reset!")
overworld_button = Button(dim_rest_window, text="Overworld", command=reset_overworld)
nether_button = Button(dim_rest_window, text="Nether", command=reset_nether)
end_button = Button(dim_rest_window, text="End", command=reset_end)
dim_reset_label.pack()
overworld_button.pack()
nether_button.pack()
end_button.pack()
def change_server_properties():
logging.info("change_server_properties() Called")
properties_version = askstring(title="Select Version", prompt="Enter the version you want to change properties for! This can be any version but must be in the format 'num.num.num'!")
try:
os.startfile(f"{cwd}\\ServerFiles-{properties_version}\\server.properties")
logging.info("server.properties Opened In change_server_properties()")
return
except Exception as e:
logging.error(f"Error Opening server.properties In change_server_properties() {e}")
showerror(title="Error", message=f"Error Opening server.properties: {e}")
return
def import_external_server():
version = askstring(title="Select Version",
prompt="Enter the version you want to import! This can be any version but must be in the format 'num.num.num'!")
import_files = askdirectory(title="Select Folder To Import")
if not import_files:
showerror(title="Error", message="No Folder Selected!")
logging.error("No Folder Selected In import_external_server()")
return
# A complete server folder must contain all of these entries
required_entries = ["world", "server.properties", "eula.txt", "ops.json",
"banned-ips.json", "banned-players.json", "whitelist.json"]
if not all(os.path.exists(f"{import_files}\\{entry}") for entry in required_entries):
showerror(title="Error", message="Invalid Folder Selected!")
logging.error("Invalid Folder Selected!")
return
else:
# These are files, so the paths must not end with a backslash
if os.path.exists(f"{cwd}\\ServerFiles-{version}\\ops.json") or \
os.path.exists(f"{cwd}\\ServerFiles-{version}\\banned-players.json") or \
os.path.exists(f"{cwd}\\ServerFiles-{version}\\banned-ips.json"):
backup_current_server = askyesno(title="Restore Server Backup",
message="You have current data in the server! Would you like "
"to perform a backup?")
if backup_current_server:
logging.info("User opted to backup server before importing external server")
backup_name = askstring(title="Create Server Backup", prompt="Enter the name of the backup!")
if not backup_name:
showerror(title="Error", message="Invalid Name!")
logging.error("Invalid backup name")
return
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\"):
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\")
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\"):
showerror(title="Backup Error",
message="Backup with the same name already exists! Please try again!")
logging.error("Backup with the same name already exists")
return
else:
try:
copytree(f"{cwd}\\ServerFiles-{version}\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\{version}\\{backup_name}\\")
showinfo(title="Backup Successful", message="Backup Successful!")
logging.info("Backup successful before importing external server")
pass
except Exception as e:
showerror(title="Backup Error", message=f"Error while performing backup: {e}")
logging.error(f"Error while performing backup: {e}")
return
pass
else:
showwarning(title="Restore Server Backup", message="You have chosen not to backup the current "
"server! Current server data will be "
"overwritten!")
logging.warning("User opted to not backup server before importing external server")
try:
logging.warning("Importing external server")
rmtree(f"{cwd}\\ServerFiles-{version}\\")
copytree(f"{import_files}\\", f"{cwd}\\ServerFiles-{version}\\")
showinfo(title="External Server", message="Server Successfully Imported!")
logging.info("Server successfully imported")
return
except Exception as e:
showerror(title="Import Error", message=f"Error while performing import: {e}")
logging.error(f"Error while performing import: {e}")
return
def folders_in(path_to_parent):
for fname in os.listdir(path_to_parent):
if os.path.isdir(os.path.join(path_to_parent, fname)):
yield os.path.join(path_to_parent, fname)
def has_folders(path_to_parent):
folders = list(folders_in(path_to_parent))
return folders
def setup(arg):
showinfo(title="Setup", message="Setup for EasyMinecraftServer is required! Please follow the instructions!")
if arg == "all":
subdirectories = has_folders(f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\")
if len(subdirectories) > 0:
restore_backup = askyesno(title="Restore Program Backup", message="You have a backup of EasyMinecraftServer! Would you like to restore it?")
if restore_backup:
backup_files = str(askdirectory(title="Select Backup Folder", initialdir=f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\"))
if not os.path.exists(f"{backup_files}\\settings.json"):
showerror(title="Error", message="Invalid Backup Folder!")
logging.error("Invalid Backup Folder!")
restart_force()
else:
copy(f"{backup_files}\\settings.json", f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json")
showinfo(title="Restore Successful", message="Restore Successful! EasyMinecraftServer will now restart!")
logging.info("Restore successful")
restart_force()
showinfo(title="NGROK", message="Makeshift port-forwarding requires a ngrok account. Please navigate to "
"https://dashboard.ngrok.com/get-started/setup after making a free account and "
"get your authtoken!")
setup_window = Toplevel(root)
setup_window.title("EasyMinecraftServer (SETUP)")
setup_window.geometry("500x500")
setup_window.resizable(False, False)
ngrok_authtoken_label = Label(setup_window, text="Ngrok Authtoken")
ngrok_authtoken_label.pack()
ngrok_authtoken_entry = Entry(setup_window, width=450)
ngrok_authtoken_entry.pack()
ngrok_authtoken_entry.insert(0, "Enter your ngrok authtoken here")
ram_bytes = psutil.virtual_memory().total
ram_mb = ram_bytes / 1000000
ram_allocation_amount_label = Label(setup_window, text=f"RAM Allocation Amount. Total Available: {str(round(float(ram_mb)))} MB")
ram_allocation_amount_label.pack()
ram_allocation_entry = Entry(setup_window, width=450)
ram_allocation_entry.pack()
ram_allocation_entry.insert(0, "Enter the amount of RAM you would like to allocate for the server in MB")
variable = StringVar(setup_window)
auto_server_backup_label = Label(setup_window, text="Auto Server Backup")
auto_server_backup_label.pack()
auto_server_backup_entry = OptionMenu(setup_window, variable, "True", "False")
auto_server_backup_entry.config(width=450)
auto_server_backup_entry.pack()
variable_two = StringVar(setup_window)
server_gui_label = Label(setup_window, text="Server GUI")
server_gui_label.pack()
server_gui_entry = OptionMenu(setup_window, variable_two, "True", "False")
server_gui_entry.config(width=450)
server_gui_entry.pack()
var = IntVar()
submit_button = Button(setup_window,
command=lambda: var.set(1),
font=("TrebuchetMS", 10, 'bold'),
text="Click Here To Save And Continue!", width="400", height="5",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
submit_button.pack()
submit_button.wait_variable(var)
new_ngrok_authtoken = ngrok_authtoken_entry.get()
new_ram_allocation_amount = ram_allocation_entry.get()
new_auto_server_backup = variable.get()
new_server_gui = variable_two.get()
# Compare numerically; the Entry widget returns a string, and string comparison is lexicographic
if not new_ram_allocation_amount.isdigit() or int(new_ram_allocation_amount) >= round(ram_mb):
showwarning(title="RAM Allocation Error", message="RAM Allocation Amount must be a number smaller than the total available RAM!")
logging.warning("Invalid RAM Allocation Amount entered during setup")
restart_force()
sys.exit(0)
settings = {
"ngrok_authtoken": new_ngrok_authtoken,
"ram_allocation_amount": new_ram_allocation_amount,
"auto_server_backup": new_auto_server_backup,
"server_gui": new_server_gui
}
with open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json", "w") as settings_file:
json.dump(settings, settings_file, indent=4)
showinfo(title="EasyMinecraftServer Settings", message="New settings saved! Please restart to continue!")
setup_window.destroy()
restart_force()
sys.exit(0)
elif arg in ("auto_server_backup", "server_gui", "ram_allocation_amount", "ngrok_authtoken"):
showerror(title="Settings Error",
message="Settings file has been corrupted! Program will now be reset and setup will be required again!")
program_reset_force()
sys.exit(0)
def settings_check():
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json"):
create_file = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json", "x")
create_file.close()
while True:
with open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json", "r") as settings_file:
settings_content = settings_file.read()
if "auto_server_backup" not in settings_content and "server_gui" not in settings_content and "ram_allocation_amount" not in settings_content and "ngrok_authtoken" not in settings_content:
setup("all")
restart_force()
sys.exit(0)
elif "auto_server_backup" not in settings_content:
setup("auto_server_backup")
restart_force()
sys.exit(0)
elif "server_gui" not in settings_content:
setup("server_gui")
restart_force()
sys.exit(0)
elif "ram_allocation_amount" not in settings_content:
setup("ram_allocation_amount")
restart_force()
sys.exit(0)
elif "ngrok_authtoken" not in settings_content:
setup("ngrok_authtoken")
restart_force()
sys.exit(0)
else:
break
return True
def settings():
logging.info("Settings window launched")
settings_window = Toplevel(root)
settings_window.title("EasyMinecraftServer (SETTINGS)")
settings_window.geometry("500x500")
settings_window.resizable(False, False)
ngrok_authtoken_label = Label(settings_window, text="Ngrok Authtoken")
ngrok_authtoken_label.pack()
ngrok_authtoken_entry = Entry(settings_window, width=450)
ngrok_authtoken_entry.pack()
ngrok_authtoken = settings_json["ngrok_authtoken"]
ngrok_authtoken_entry.insert(0, ngrok_authtoken)
ram_bytes = psutil.virtual_memory().total
ram_mb = ram_bytes / 1000000
ram_allocation_amount_label = Label(settings_window, text=f"RAM Allocation Amount. Total Available: {str(round(float(ram_mb)))} MB")
ram_allocation_amount_label.pack()
ram_allocation_amount_entry = Entry(settings_window, width=450)
ram_allocation_amount_entry.pack()
ram_allocation_amount = settings_json["ram_allocation_amount"]
ram_allocation_amount_entry.insert(0, ram_allocation_amount)
variable = StringVar(settings_window)
auto_server_backup = settings_json["auto_server_backup"]
variable.set(auto_server_backup)
auto_server_backup_label = Label(settings_window, text="Auto Server Backup")
auto_server_backup_label.pack()
auto_server_backup_entry = OptionMenu(settings_window, variable, "True", "False")
auto_server_backup_entry.config(width=450)
auto_server_backup_entry.pack()
variable_two = StringVar(settings_window)
server_gui = settings_json["server_gui"]
variable_two.set(server_gui)
server_gui_label = Label(settings_window, text="Server GUI")
server_gui_label.pack()
server_gui_entry = OptionMenu(settings_window, variable_two, "True", "False")
server_gui_entry.config(width=450)
server_gui_entry.pack()
var = IntVar()
submit_button = Button(settings_window,
command=lambda: var.set(1),
font=("TrebuchetMS", 10, 'bold'),
text="Click Here To Save And Continue!", width="400", height="5",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
submit_button.pack()
logging.info("Awaiting user input in settings window")
submit_button.wait_variable(var)
logging.info("Writing new settings")
new_ngrok_authtoken = ngrok_authtoken_entry.get()
new_ram_allocation_amount = ram_allocation_amount_entry.get()
new_auto_server_backup = variable.get()
new_server_gui = variable_two.get()
# Compare numerically; the Entry widget returns a string, and string comparison is lexicographic
if not new_ram_allocation_amount.isdigit() or int(new_ram_allocation_amount) >= round(ram_mb):
showwarning(title="RAM Allocation Error", message="RAM Allocation Amount must be a number smaller than the total available RAM!")
logging.warning("Invalid RAM Allocation Amount entered in settings")
return
settings = {
"ngrok_authtoken": new_ngrok_authtoken,
"ram_allocation_amount": new_ram_allocation_amount,
"auto_server_backup": new_auto_server_backup,
"server_gui": new_server_gui
}
with open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json", "w") as settings_file:
json.dump(settings, settings_file, indent=4)
showinfo(title="EasyMinecraftServer Settings", message="New settings saved! Please restart to continue!")
logging.info("New settings saved")
settings_window.destroy()
restart_force()
sys.exit(0)
def program_reset_force():
logging.warning("Program reset forced")
file_path = f"{user_dir}\\Documents\\EasyMinecraftServer\\"
file_list = os.listdir(file_path)
for folder in file_list:
# Keep logs and program backups; everything else is reset
if folder not in ("Logs", "ProgramBackups"):
rmtree(f"{file_path}\\{folder}")
showinfo(title="Reset", message="EasyMinecraftServer has been reset!")
logging.warning("EasyMinecraftServer reset")
restart_force()
def program_reset():
logging.warning("Program reset started")
reset_confirm = askyesno(title="Reset", message="Are you sure you want to reset EasyMinecraftServer?")
if reset_confirm:
logging.info("User confirmed reset")
file_path = f"{user_dir}\\Documents\\EasyMinecraftServer\\"
file_list = os.listdir(file_path)
for folder in file_list:
# Keep logs and program backups; everything else is reset
if folder not in ("Logs", "ProgramBackups"):
rmtree(f"{file_path}\\{folder}")
showinfo(title="Reset", message="EasyMinecraftServer has been reset!")
logging.warning("EasyMinecraftServer reset")
restart_force()
else:
return
def program_backup():
logging.info("Program backup started")
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\"):
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\")
logging.info("Program backup directory created")
backup_name = askstring(title="Backup Name", prompt="What would you like to name your backup?")
if not backup_name or backup_name.isspace():
showerror(title="Backup Error", message="Backup name cannot be empty!")
logging.error("Backup name cannot be empty")
return
elif os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\{backup_name}\\"):
showerror(title="Backup Error", message="Backup name already exists!")
logging.error("Backup name already exists")
return
else:
logging.info(f"Starting backup with name: {backup_name}")
copytree(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\",
f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\{backup_name}\\")
showinfo(title="Backup", message="EasyMinecraftServer has been backed up!")
logging.info("EasyMinecraftServer backup complete")
return
def program_restore():
logging.info("Program restore started")
path = f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\"
backup_subdirs = os.listdir(path)
if len(backup_subdirs) == 0:
showerror(title="Restore Error", message="No backups found!")
logging.error("No backups found")
return
else:
backup_name = askdirectory(title="Select A Backup To Restore", initialdir=path)
if not os.path.exists(f"{backup_name}\\settings.json"):
showerror(title="Restore Error", message="Invalid Backup!")
logging.error("Invalid backup")
return
else:
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\"):
rmtree(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\")
logging.info("Existing settings removed before restore")
logging.info("Restoring settings from backup")
copytree(f"{backup_name}\\", f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\")
showinfo(title="Restore", message="EasyMinecraftServer backup has been restored!")
logging.info("EasyMinecraftServer backup restored")
restart_force()
sys.exit(0)
def changelog():
logging.info("Changelog window opened")
changelog_window = Toplevel(root)
changelog_window.title("EasyMinecraftServer (CHANGELOG)")
changelog_window.geometry("500x500")
with open(f"{cwd}\\CHANGELOG.txt", "r") as changelog_file:
changelog_label = Label(changelog_window, text=changelog_file.read())
changelog_label.pack()
def update():
logging.info("Manual update check started")
try:
url = "http://github.com/teekar2023/EasyMinecraftServer/releases/latest/"
r = requests.get(url, allow_redirects=True)
redirected_url = r.url
pass
except Exception as e:
showerror(title="Update Error", message=f"Error While Checking For Updates: {e}")
logging.error(f"Error While Checking For Updates: {e}")
return
if redirected_url != "https://github.com/teekar2023/EasyMinecraftServer/releases/tag/v2.9.0":
new_version = redirected_url.replace("https://github.com/teekar2023/EasyMinecraftServer/releases/tag/", "")
logging.warning(f"Update available: {new_version}")
new_url = str(redirected_url) + f"/EasyMinecraftServerInstaller-{new_version}.exe"
download_url = new_url.replace("tag", "download")
update_window = Toplevel(root)
update_window.title("EasyMinecraftServer (UPDATE)")
update_window.geometry("500x500")
update_window.resizable(width=False, height=False)
update_text = Label(update_window,
text="There Is A New Update Available! Click The Button Below If You Wish To Download It!")
update_text.pack()
int_var = IntVar(update_window)
update_button = Button(update_window, command=lambda: int_var.set(1), font=("TrebuchetMS", 12, 'bold'),
text="Download Update", width="500", height="5",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
update_button.pack()
changelog_text = Text(update_window, bd=0, bg="white", height="25", width="75", font="TrebuchetMS")
changelog_text.pack()
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\"):
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\")
try:
logging.info("Downloading new version changelog")
changelog_url = "https://raw.githubusercontent.com/teekar2023/EasyMinecraftServer/master/CHANGELOG.txt"
changelog_download = urllib.request.urlopen(changelog_url)
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt"):
os.remove(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt")
pass
else:
pass
with open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt", "wb") as changelog_file:
changelog_file.write(changelog_download.read())
with open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt", "r") as changelog_file:
changelog_txt = changelog_file.read()
except Exception as e:
changelog_txt = f"There was an error while accessing new version changelog data: {e}"
logging.error("There was an error while accessing changelog data")
pass
changelog_text.insert(END, f"{changelog_txt}")
changelog_text.config(state=DISABLED)
update_button.wait_variable(int_var)
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe"):
logging.info("Update already downloaded")
logging.info("Launching update installer")
os.startfile(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe")
exit_program_force()
else:
try:
f = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe", 'wb')
showwarning(title="EasyMinecraftServer Update",
message="Update will now be downloaded and the installer will be launched. "
"This may take a while, so please be patient and do not do anything if the program becomes unresponsive!")
logging.info("Downloading update installer")
f2 = urllib.request.urlopen(download_url)
# Download in fixed-size chunks so the installer never sits in memory all at once
while True:
data = f2.read(65536)
if not data:
break
f.write(data)
f.close()
showinfo(title="EasyMinecraftServer Update", message="Update Downloaded Successfully! Installer Will Now Be Launched To Complete Update!")
logging.info("Update Downloaded Successfully!")
logging.info("Launching update installer")
os.startfile(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe")
exit_program_force()
except Exception as e:
showerror(title="EasyMinecraftServer Update", message=f"There was an error while downloading update: {e}")
logging.error(f"There was an error while downloading update: {e}")
exit_program_force()
else:
showinfo(title="Update", message="EasyMinecraftServer is already up to date!")
return
def update_event(event):
logging.info("Manual update check started")
try:
url = "http://github.com/teekar2023/EasyMinecraftServer/releases/latest/"
r = requests.get(url, allow_redirects=True)
redirected_url = r.url
pass
except Exception as e:
showerror(title="Update Error", message=f"Error While Checking For Updates: {e}")
logging.error(f"Error While Checking For Updates: {e}")
return
if redirected_url != "https://github.com/teekar2023/EasyMinecraftServer/releases/tag/v2.9.0":
new_version = redirected_url.replace("https://github.com/teekar2023/EasyMinecraftServer/releases/tag/", "")
logging.warning(f"Update available: {new_version}")
new_url = str(redirected_url) + f"/EasyMinecraftServerInstaller-{new_version}.exe"
download_url = new_url.replace("tag", "download")
update_window = Toplevel(root)
update_window.title("EasyMinecraftServer (UPDATE)")
update_window.geometry("500x500")
update_window.resizable(width=False, height=False)
update_text = Label(update_window,
text="There Is A New Update Available! Click The Button Below If You Wish To Download It!")
update_text.pack()
int_var = IntVar(update_window)
update_button = Button(update_window, command=lambda: int_var.set(1), font=("TrebuchetMS", 12, 'bold'),
text="Download Update", width="500", height="5",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
update_button.pack()
changelog_text = Text(update_window, bd=0, bg="white", height="25", width="75", font="TrebuchetMS")
changelog_text.pack()
if not os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\"):
os.mkdir(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\")
pass
else:
pass
try:
logging.info("Downloading new version changelog")
changelog_url = "https://raw.githubusercontent.com/teekar2023/EasyMinecraftServer/master/CHANGELOG.txt"
changelog_download = urllib.request.urlopen(changelog_url)
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt"):
os.remove(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt")
pass
else:
pass
create_changelog_file = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt", 'x')
create_changelog_file.close()
changelog_file = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt", 'wb')
while True:
changelog_data = changelog_download.read()
if not changelog_data:
break
else:
changelog_file.write(changelog_data)
pass
changelog_file.close()
changelog_txt = str(open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\changelog.txt", 'r').read())
pass
except Exception as e:
changelog_txt = f"There was an error while accessing new version changelog data: {e}"
logging.error("There was an error while accessing changelog data")
pass
changelog_text.insert(END, f"{changelog_txt}")
changelog_text.config(state=DISABLED)
update_button.wait_variable(int_var)
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe"):
logging.info("Update already downloaded")
logging.info("Launching update installer")
os.startfile(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe")
exit_program_force()
else:
try:
f = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe", 'wb')
showwarning(title="EasyMinecraftServer Update",
message="Update will now be downloaded and the installer will be launched. "
"This may take a while, so please be patient and do not do anything if the program becomes unresponsive!")
logging.info("Downloading update installer")
f2 = urllib.request.urlopen(download_url)
f.write(f2.read())
f.close()
showinfo(title="EasyMinecraftServer Update", message="Update Downloaded Successfully! Installer Will Now Be Launched To Complete Update!")
logging.info("Update Downloaded Successfully!")
logging.info("Launching update installer")
os.startfile(f"{user_dir}\\Documents\\EasyMinecraftServer\\Update-{new_version}\\EasyMinecraftServerInstaller-{new_version}.exe")
exit_program_force()
except Exception as e:
showerror(title="EasyMinecraftServer Update", message=f"There was an error while downloading update: {e}")
logging.error(f"There was an error while downloading update: {e}")
exit_program_force()
else:
showinfo(title="Update", message="EasyMinecraftServer is already up to date!")
return
def exit_program():
exit_confirmation = askyesno("Exit", "Are you sure you want to exit?")
if exit_confirmation:
logging.info("Exiting EasyMinecraftServer")
logging.shutdown()
root.destroy()
PROCNAME = "EasyMinecraftServer.exe"
for proc in psutil.process_iter():
if proc.name() == PROCNAME:
proc.kill()
pass
else:
pass
pass
sys.exit(0)
else:
pass
def exit_program_event(event):
exit_confirmation = askyesno("Exit", "Are you sure you want to exit?")
if exit_confirmation:
logging.info("Exiting EasyMinecraftServer")
logging.shutdown()
root.destroy()
PROCNAME = "EasyMinecraftServer.exe"
for proc in psutil.process_iter():
if proc.name() == PROCNAME:
proc.kill()
pass
else:
pass
pass
sys.exit(0)
else:
pass
def exit_program_force():
logging.info("Exiting EasyMinecraftServer")
logging.shutdown()
root.destroy()
PROCNAME = "EasyMinecraftServer.exe"
for proc in psutil.process_iter():
if proc.name() == PROCNAME:
proc.kill()
pass
else:
pass
pass
sys.exit(0)
def restart_force():
logging.info("Restarting EasyMinecraftServer")
logging.shutdown()
os.execl(sys.executable, os.path.abspath(__file__), *sys.argv)
sys.exit(0)
def restart_program():
confirm_restart = askyesno(title="Restart", message="Restart EasyMinecraftServer?")
if confirm_restart:
logging.info("Restarting EasyMinecraftServer")
logging.shutdown()
os.execl(sys.executable, os.path.abspath(__file__), *sys.argv)
sys.exit(0)
else:
pass
def restart_program_event(event):
confirm_restart = askyesno(title="Restart", message="Restart EasyMinecraftServer?")
if confirm_restart:
logging.info("Restarting EasyMinecraftServer")
logging.shutdown()
os.execl(sys.executable, os.path.abspath(__file__), *sys.argv)
sys.exit(0)
else:
pass
def uninstall_program():
confirm_uninstall = askyesno(title="Uninstall", message="Are you sure you want to uninstall EasyMinecraftServer?")
if confirm_uninstall:
logging.info("Uninstalling EasyMinecraftServer")
reset_all = askyesno(title="Uninstall",
message="Would you like to reset all settings and data including backups?")
if reset_all:
logging.info("Resetting all settings and data including backups")
try:
file_path = f"{user_dir}\\Documents\\EasyMinecraftServer\\"
file_list = os.listdir(file_path)
for folder in file_list:
if folder == "Logs" or folder == "ProgramBackups":
pass
else:
rmtree(f"{file_path}\\{folder}")
pass
pass
showwarning(title="Uninstall", message="EasyMinecraftServer Data Reset!")
logging.info("EasyMinecraftServer Data Reset!")
pass
except Exception as e:
showerror(title="Reset Error", message=f"Error while resetting data and settings: {e}")
showerror(title="Uninstall", message="EasyMinecraftServer Data Reset Failed! These Files Can Be Manually Deleted In Your Documents Folder!")
logging.error(f"Error while resetting data and settings: {e}")
logging.error("EasyMinecraftServer Data Reset Failed")
pass
pass
else:
pass
remove_av = askyesno(title="Anti-Virus Exclusions", message="Would you like to remove all Anti-Virus Exclusions?")
if remove_av:
logging.info("Launching MinecraftServerUnelevator.exe")
os.system("MinecraftServerUnelevator")
pass
else:
logging.info("Anti-Virus Exclusion removal denied")
pass
showinfo(title="Uninstall", message="Sorry to see you go! Hope you come back soon!")
logging.info("Launching EasyMinecraftServer Uninstaller")
os.startfile(f"{cwd}\\unins000.exe")
exit_program_force()
else:
showinfo(title="Uninstall", message="Uninstall Cancelled!")
return
def uninstall_program_event(event):
confirm_uninstall = askyesno(title="Uninstall", message="Are you sure you want to uninstall EasyMinecraftServer?")
if confirm_uninstall:
logging.info("Uninstalling EasyMinecraftServer")
reset_all = askyesno(title="Uninstall",
message="Would you like to reset all settings and data including backups?")
if reset_all:
logging.info("Resetting all settings and data including backups")
try:
file_path = f"{user_dir}\\Documents\\EasyMinecraftServer\\"
file_list = os.listdir(file_path)
for folder in file_list:
if folder == "Logs":
pass
else:
rmtree(f"{file_path}\\{folder}")
pass
pass
showwarning(title="Uninstall", message="EasyMinecraftServer Data Reset!")
logging.info("EasyMinecraftServer Data Reset!")
pass
except Exception as e:
showerror(title="Reset Error", message=f"Error while resetting data and settings: {e}")
showerror(title="Uninstall", message="EasyMinecraftServer Data Reset Failed! These Files Can Be Manually Deleted In Your Documents Folder!")
logging.error(f"Error while resetting data and settings: {e}")
logging.error("EasyMinecraftServer Data Reset Failed")
pass
pass
else:
pass
showinfo(title="Uninstall", message="Sorry to see you go! Hope you come back soon!")
logging.info("Launching EasyMinecraftServer Uninstaller")
os.startfile(f"{cwd}\\unins000.exe")
exit_program_force()
else:
showinfo(title="Uninstall", message="Uninstall Cancelled!")
return
def jdk_installer():
logging.info("Launching JDK Download Website")
webbrowser.open("https://download.oracle.com/java/17/latest/jdk-17_windows-x64_bin.exe")
return
def backup_logs():
mod_time = os.path.getmtime(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log")
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\{mod_time}\\", exist_ok=True)
copy(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log", f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\{mod_time}\\AppCrash.log")
showinfo(title="Crash Logs", message=f"Crash logs were backed up and can be found here: {user_dir}\\Documents\\EasyMinecraftServer\\Logs\\{mod_time}\\AppCrash.log")
return
def backup_logs_event(event):
logging.info("Backing up server logs due to request")
mod_time = os.path.getmtime(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log")
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\{mod_time}\\", exist_ok=True)
copy(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log", f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\{mod_time}\\App.log")
showinfo(title="EasyMinecraftServer Logs", message=f"Program logs were backed up and can be found here: {user_dir}\\Documents\\EasyMinecraftServer\\Logs\\{mod_time}\\App.log")
return
def license_window():
logging.info("Showing license window")
license_window = Toplevel()
license_window.title("EasyMinecraftServer (LICENSE)")
license_window.geometry("500x600")
license_window.resizable(False, False)
license_text = Text(license_window, width=500, height=600)
license_text.pack()
license_text_string = open(f"{cwd}\\LICENSE.txt", 'r').read()
license_text.insert(END, license_text_string)
license_text.config(state=DISABLED)
return
def license_window_event(event):
logging.info("Showing license window")
license_window = Toplevel()
license_window.title("EasyMinecraftServer (LICENSE)")
license_window.geometry("500x600")
license_window.resizable(False, False)
license_text = Text(license_window, width=500, height=600)
license_text.pack()
license_text_string = open(f"{cwd}\\LICENSE.txt", 'r').read()
license_text.insert(END, license_text_string)
license_text.config(state=DISABLED)
return
def help_window():
logging.info("Showing help window")
help_window = Toplevel()
help_window.title("EasyMinecraftServer (HELP)")
help_window.geometry("700x400")
help_window.resizable(False, False)
help_text = """EasyMinecraftServer Help
This program was made to make hosting and managing Minecraft servers easier for everyone!
All the buttons should be fairly self-explanatory:
Start Server: Starts the server!
Create Backup Button: Creates a backup of the server files!
Restore Backup Button: Restores a backup of the server files!
Reset Server Button: Resets the server files!
Use Custom Map Button: Allows you to use a custom map in your server!
Reset Dimension Button: Resets a dimension from the server!
Change Server Properties Button: Allows you to change the server properties!
Import External Server Button: Allows you to import an external server to be used with the program!
Hosting a server without port forwarding requires an ngrok account and an authtoken!
More information about ngrok can be found at ngrok.com
If you have any questions or concerns, please contact me at:
sree23palla@outlook.com
Have Fun!
"""
help_label = Label(help_window, text=help_text)
help_label.pack()
jdk_installer_button = Button(help_window, text="JDK Installer", command=jdk_installer)
jdk_installer_button.pack()
license_button = Button(help_window, text="License", command=license_window)
license_button.pack()
return
def help_window_event(event):
logging.info("Showing help window")
help_window = Toplevel()
help_window.title("EasyMinecraftServer (HELP)")
help_window.geometry("700x400")
help_window.resizable(False, False)
help_text = """EasyMinecraftServer Help
This program was made to make hosting and managing Minecraft servers easier for everyone!
All the buttons should be fairly self-explanatory:
Start Server: Starts the server!
Create Backup Button: Creates a backup of the server files!
Restore Backup Button: Restores a backup of the server files!
Reset Server Button: Resets the server files!
Use Custom Map Button: Allows you to use a custom map in your server!
Reset Dimension Button: Resets a dimension from the server!
Change Server Properties Button: Allows you to change the server properties!
Import External Server Button: Allows you to import an external server to be used with the program!
Hosting a server without port forwarding requires an ngrok account and an authtoken!
More information about ngrok can be found at ngrok.com
If you have any questions or concerns, please contact me at:
sree23palla@outlook.com
Have Fun!
"""
help_label = Label(help_window, text=help_text)
help_label.pack()
jdk_installer_button = Button(help_window, text="JDK Installer", command=jdk_installer)
jdk_installer_button.pack()
license_button = Button(help_window, text="License", command=license_window)
license_button.pack()
return
def explorer_logs():
subprocess.Popen(f"explorer {user_dir}\\Documents\\EasyMinecraftServer\\Logs\\")
subprocess.Popen(f"explorer {cwd}")
return
def debug_event(event):
logging.info("Debug function called")
explorer_logs_thread = Thread(target=explorer_logs)
explorer_logs_thread.start()
os.startfile(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log")
showinfo(title="Debug", message="Logs and data folder launched! Press F3 on the main window to create a backup of current logs!")
return
def server_backups():
subprocess.Popen(f"explorer {user_dir}\\Documents\\EasyMinecraftServer\\Backups\\")
return
def server_files():
version = askstring(title="View Server Files", prompt="Enter the version you want to view! This can be any version but must be in the format 'num.num.num'!")
if version == None:
return
else:
if os.path.exists(f"{cwd}\\ServerFiles-{version}\\"):
showwarning(title="View Server Files", message="Do not tamper with ServerFiles unless you know what you "
"are doing! A server backup is recommended before "
"interacting with ServerFiles!")
subprocess.Popen(f"explorer {cwd}\\ServerFiles-{version}\\")
return
else:
showerror(title="Error", message="Version does not exist!")
logging.error("Version does not exist!")
return
def av_exclusions():
exclusion_confirm = askyesno(title="Anti-Virus Exclusion", message="Would you like to launch the anti-virus exclusion creator so that program and server files will not be scanned by your antivirus program?")
if exclusion_confirm:
showinfo(title="Anti-Virus Exclusion", message="Launching Anti-Virus Exclusion Creator! Program will exit!")
logging.info("Launching Anti-Virus Exclusion Creator!")
os.startfile(f"MinecraftServerElevator.exe")
exit_program_force()
else:
return
def av_exclusions_remove():
exclusion_confirm = askyesno(title="Anti-Virus Exclusion", message="Would you like to launch the anti-virus exclusion remover so that program and server files will be scanned by your antivirus program again?")
if exclusion_confirm:
showinfo(title="Anti-Virus Exclusion", message="Launching Anti-Virus Exclusion Remover! Program will exit!")
logging.info("Launching Anti-Virus Exclusion Remover!")
os.startfile(f"MinecraftServerUnelevator.exe")
exit_program_force()
else:
return
def is_admin():
try:
return ctypes.windll.shell32.IsUserAnAdmin()
except:
return False
if is_admin():
pass
else:
showwarning(title="EasyMinecraftServer", message="EasyMinecraftServer requires administrator privileges to run! Please run as administrator!")
ctypes.windll.shell32.ShellExecuteW(None, "runas", sys.executable, __file__, None, 1)
sys.exit(0)
toaster = ToastNotifier()
cwd = os.getcwd()
user_dir = os.path.expanduser("~")
root = Tk()
root.title("Easy Minecraft Server v2.9.0")
root.geometry("430x640")
root.bind("<Escape>", exit_program_event)
root.bind("<Return>", start_server_event)
root.bind("<Control-s>", start_server_event)
root.bind("<Control-S>", start_server_event)
root.bind("<Control-r>", reset_server_event)
root.bind("<Control-R>", reset_server_event)
root.bind("<F1>", help_window_event)
root.bind("<F2>", license_window_event)
root.bind("<F3>", backup_logs_event)
root.bind("<F4>", uninstall_program_event)
root.bind("<F5>", restart_program_event)
root.bind("<F6>", update_event)
root.bind("<F12>", debug_event)
root.resizable(False, False)
menubar = Menu(root)
main_menu = Menu(menubar, tearoff=0)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\", exist_ok=True)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Temp\\", exist_ok=True)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\", exist_ok=True)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Backups\\Data\\", exist_ok=True)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\", exist_ok=True)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\", exist_ok=True)
os.makedirs(f"{user_dir}\\Documents\\EasyMinecraftServer\\ProgramBackups\\", exist_ok=True)
settings_good = settings_check()
settings_json = json.load(open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Settings\\settings.json", "r"))
# Create app.log if it does not exist yet ('a' mode leaves an existing file untouched)
open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log", 'a').close()
try:
log_bytes = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log", "rb")
with log_bytes as f:
f.seek(-2, os.SEEK_END)
while f.read(1) != b'\n':
f.seek(-2, os.SEEK_CUR)
last_line = f.readline().decode()
if "Exiting Program" not in last_line and "Exiting for JDK Install" not in last_line and "Exiting Due To JDK Installation Denial" not in last_line and "Exiting EasyMinecraftServer" not in last_line and "Restarting EasyMinecraftServer" not in last_line and last_line != "" and not last_line.isspace() and "Restarting Program" not in last_line:
showwarning(title="EasyMinecraftServer", message="EasyMinecraftServer has detected that the program did not close properly the last time it was run. Submitting a bug report on the GitHub page is recommended, and log files will now be backed up!")
backup_logs()
pass
except OSError:
pass
# Truncate the previous log before logging is configured, and close the handle
log_file = open(f"{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log", "w")
log_file.close()
logging.basicConfig(filename=f'{user_dir}\\Documents\\EasyMinecraftServer\\Logs\\app.log', filemode='r+', level="DEBUG",
format="%(asctime)s — %(name)s — %(levelname)s — %(funcName)s:%(lineno)d — %(message)s")
logging.info("Easy Minecraft Server v2.9.0 Started")
logging.info("Building GUI")
main_menu.add_command(label="Help", command=help_window)
main_menu.add_command(label="View ServerFiles", command=server_files)
main_menu.add_command(label="Settings", command=settings)
main_menu.add_command(label="Server Backups", command=server_backups)
main_menu.add_command(label="Backup Program", command=program_backup)
main_menu.add_command(label="Restore Program", command=program_restore)
main_menu.add_command(label="Reset Program", command=program_reset)
main_menu.add_command(label="Changelog", command=changelog)
main_menu.add_command(label="Update", command=update)
main_menu.add_command(label="Uninstall", command=uninstall_program)
main_menu.add_command(label="Create Anti-Virus Exclusions", command=av_exclusions)
main_menu.add_command(label="Remove Anti-Virus Exclusions", command=av_exclusions_remove)
main_menu.add_command(label="Restart", command=restart_program)
main_menu.add_command(label="Exit", command=exit_program)
menubar.add_cascade(label="Menu", menu=main_menu)
root.config(menu=menubar)
root.protocol("WM_DELETE_WINDOW", exit_program)
try:
logging.info("Checking for updates")
url = "https://github.com/teekar2023/EasyMinecraftServer/releases/latest/"
r = requests.get(url, allow_redirects=True)
redirected_url = r.url
if redirected_url != "https://github.com/teekar2023/EasyMinecraftServer/releases/tag/v2.9.0":
new_version = redirected_url.replace("https://github.com/teekar2023/EasyMinecraftServer/releases/tag/", "")
logging.warning(f"New version available: {new_version}")
toaster.show_toast("EasyMinecraftServer", f"New update available: {new_version}", icon_path=f"{cwd}\\mc.ico", threaded=True)
update_thread = Thread(target=update)
update_thread.start()
pass
else:
logging.info("No new update available")
file_path = f"{user_dir}\\Documents\\EasyMinecraftServer\\"
file_list = os.listdir(file_path)
for folder in file_list:
if "Update" in folder:
logging.info(f"Deleting update folder: {user_dir}\\Documents\\EasyMinecraftServer\\{folder}")
rmtree(f"{user_dir}\\Documents\\EasyMinecraftServer\\{folder}")
pass
else:
pass
pass
except Exception as e:
showerror(title="Error", message=f"Error while checking for updates: {e}")
logging.error(f"Error while checking for updates: {e}")
pass
java_check = which("java")
if java_check is None:
logging.warning("JDK Not Found")
install_jdk_ask = askyesno(title="JDK Required",
message="Java Development Kit 17 Is Required To Run Minecraft Servers! Would You Like To "
"Download/Install It Now?")
if install_jdk_ask:
logging.info("Launching Download Website")
webbrowser.open("https://download.oracle.com/java/17/latest/jdk-17_windows-x64_bin.exe")
logging.warning("Exiting for JDK Install")
exit_program_force()
sys.exit(0)
pass
else:
showerror(title="JDK Required", message="Java Development Kit 17 Is Required! Please Install It And Restart "
"The Program!")
logging.error("JDK Installation Denied!")
logging.warning("Exiting Due To JDK Installation Denial")
exit_program_force()
sys.exit(0)
pass
else:
logging.info(f"JDK Installation Found: {java_check}")
pass
if os.path.exists(f"{cwd}\\JDK\\"):
logging.info("JDK installer found")
rmtree(f"{cwd}\\JDK\\")
pass
else:
pass
if os.path.exists(f"{cwd}\\1.8.9-recovery\\"):
logging.info("1.8.9-Recovery Found")
rmtree(f"{cwd}\\1.8.9-recovery\\")
pass
else:
pass
if os.path.exists(f"{cwd}\\1.12.2-recovery\\"):
logging.info("1.12.2-Recovery Found")
rmtree(f"{cwd}\\1.12.2-recovery\\")
pass
else:
pass
if os.path.exists(f"{cwd}\\1.16.5-recovery\\"):
logging.info("1.16.5-Recovery Found")
rmtree(f"{cwd}\\1.16.5-recovery\\")
pass
else:
pass
if os.path.exists(f"{cwd}\\1.17.1-recovery\\"):
logging.info("1.17.1-Recovery Found")
rmtree(f"{cwd}\\1.17.1-recovery\\")
pass
else:
pass
if os.path.exists(f"{cwd}\\1.18.1-recovery\\"):
logging.info("1.18.1-Recovery Found")
rmtree(f"{cwd}\\1.18.1-recovery\\")
pass
else:
pass
if os.path.exists(f"{user_dir}\\Documents\\EasyMinecraftServer\\Temp\\launch_version.txt"):
logging.info("Launch Version File Found")
os.remove(f"{user_dir}\\Documents\\EasyMinecraftServer\\Temp\\launch_version.txt")
pass
else:
pass
main_text_label = Label(root, text="Easy Minecraft Server v2.9.0\n"
"Github: https://github.com/teekar2023/EasyMinecraftServer\n"
"Not In Any Way Affiliated With Minecraft, Mojang, Or Microsoft\n"
f"Current Working Directory: {cwd}\n"
f"User Directory: {user_dir}\n"
"Click Any Of The Following Buttons To Begin!")
main_text_label.pack()
start_button = Button(root, text="Start Server", command=start_server, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
start_button.pack()
create_backup_button = Button(root, text="Create Server Backup", command=create_server_backup, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
create_backup_button.pack()
restore_backup_button = Button(root, text="Restore Server Backup", command=restore_server_backup, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
restore_backup_button.pack()
reset_server_button = Button(root, text="Reset Server", command=reset_server, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
reset_server_button.pack()
use_custom_map_button = Button(root, text="Use Custom Map In Server", command=inject_custom_map, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
use_custom_map_button.pack()
reset_dimension_button = Button(root, text="Reset Dimension In A Server", command=reset_dimension_main, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
reset_dimension_button.pack()
change_server_properties_button = Button(root, text="Edit Server Properties",
command=change_server_properties, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
change_server_properties_button.pack()
import_external_server_button = Button(root, text="Import External Server", command=import_external_server, font=("TrebuchetMS", 12, 'bold'),
width="40", height="3",
bd=0, bg="#32de97", activebackground="#3c9d9b", fg='#ffffff')
import_external_server_button.pack()
logging.info("GUI Built")
logging.info("Starting Main Loop")
root.mainloop()
logging.info("Main Loop Ended")
logging.warning("Exiting Program")
logging.shutdown()
sys.exit(0)
# File: pb/risk_server_future_pb2.py
] | null | null | null | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: risk_server_future.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='risk_server_future.proto',
package='risk_server_future',
serialized_pb=_b('\n\x18risk_server_future.proto\x12\x12risk_server_future\"\xd6\x0b\n\x15\x66uture_order_set_info\x12%\n\x1d\x61ll_future_order_count_second\x18\x01 \x02(\r\x12#\n\x1b\x61ll_future_order_count_most\x18\x02 \x02(\r\x12%\n\x1dIF_future_single_order_second\x18\x03 \x02(\r\x12#\n\x1bIF_future_single_order_most\x18\x04 \x02(\r\x12%\n\x1dIC_future_single_order_second\x18\x05 \x02(\r\x12#\n\x1bIC_future_single_order_most\x18\x06 \x02(\r\x12%\n\x1dIH_future_single_order_second\x18\x07 \x02(\r\x12#\n\x1bIH_future_single_order_most\x18\x08 \x02(\r\x12!\n\x19\x66uture_order_count_second\x18\t \x02(\r\x12\x1f\n\x17\x66uture_order_count_most\x18\n \x02(\r\x12&\n\x1e\x61ll_future_order_volume_second\x18\x0b \x02(\r\x12$\n\x1c\x61ll_future_order_volume_most\x18\x0c \x02(\r\x12%\n\x1dIF_future_order_volume_second\x18\r \x02(\r\x12#\n\x1bIF_future_order_volume_most\x18\x0e \x02(\r\x12%\n\x1dIC_future_order_volume_second\x18\x0f \x02(\r\x12#\n\x1bIC_future_order_volume_most\x18\x10 \x02(\r\x12%\n\x1dIH_future_order_volume_second\x18\x11 \x02(\r\x12#\n\x1bIH_future_order_volume_most\x18\x12 \x02(\r\x12)\n!account_future_order_count_second\x18\x13 \x02(\r\x12\'\n\x1f\x61\x63\x63ount_future_order_count_most\x18\x14 \x02(\r\x12\'\n\x1f\x61\x63\x63ount_future_buy_order_second\x18\x15 \x02(\r\x12%\n\x1d\x61\x63\x63ount_future_buy_order_most\x18\x16 \x02(\t\x12(\n account_future_sell_order_second\x18\x17 \x02(\r\x12&\n\x1e\x61\x63\x63ount_future_sell_order_most\x18\x18 \x02(\t\x12.\n&account_all_future_order_volume_second\x18\x19 \x02(\r\x12,\n$account_all_future_order_volume_most\x18\x1a \x02(\r\x12-\n%account_IF_future_order_volume_second\x18\x1b \x02(\r\x12+\n#account_IF_future_order_volume_most\x18\x1c \x02(\r\x12-\n%account_IC_future_order_volume_second\x18\x1d \x02(\r\x12+\n#account_IC_future_order_volume_most\x18\x1e \x02(\r\x12-\n%account_IH_future_order_volume_second\x18\x1f \x02(\r\x12+\n#account_IH_future_order_volume_most\x18 \x02(\r\x12\x39\n1account_IF_future_open_position_order_volume_most\x18! \x02(\r\x12\x39\n1account_IC_future_open_position_order_volume_most\x18\" \x02(\r\x12\x39\n1account_IH_future_open_position_order_volume_most\x18# \x02(\r\"W\n\x14\x46uture_Order_Set_Req\x12?\n\x0c\x66uture_order\x18\x02 \x02(\x0b\x32).risk_server_future.future_order_set_info\"6\n\x15\x46uture_Order_Set_Resp\x12\x10\n\x08ret_code\x18\x01 \x02(\x05\x12\x0b\n\x03msg\x18\x02 \x01(\t\"y\n\x17\x46uture_Order_Query_Info\x12\x10\n\x08ret_code\x18\x01 \x02(\x05\x12?\n\x0c\x66uture_order\x18\x02 \x01(\x0b\x32).risk_server_future.future_order_set_info\x12\x0b\n\x03msg\x18\x03 \x01(\t')
)
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_FUTURE_ORDER_SET_INFO = _descriptor.Descriptor(
name='future_order_set_info',
full_name='risk_server_future.future_order_set_info',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='all_future_order_count_second', full_name='risk_server_future.future_order_set_info.all_future_order_count_second', index=0,
number=1, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='all_future_order_count_most', full_name='risk_server_future.future_order_set_info.all_future_order_count_most', index=1,
number=2, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IF_future_single_order_second', full_name='risk_server_future.future_order_set_info.IF_future_single_order_second', index=2,
number=3, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IF_future_single_order_most', full_name='risk_server_future.future_order_set_info.IF_future_single_order_most', index=3,
number=4, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IC_future_single_order_second', full_name='risk_server_future.future_order_set_info.IC_future_single_order_second', index=4,
number=5, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IC_future_single_order_most', full_name='risk_server_future.future_order_set_info.IC_future_single_order_most', index=5,
number=6, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IH_future_single_order_second', full_name='risk_server_future.future_order_set_info.IH_future_single_order_second', index=6,
number=7, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IH_future_single_order_most', full_name='risk_server_future.future_order_set_info.IH_future_single_order_most', index=7,
number=8, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='future_order_count_second', full_name='risk_server_future.future_order_set_info.future_order_count_second', index=8,
number=9, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='future_order_count_most', full_name='risk_server_future.future_order_set_info.future_order_count_most', index=9,
number=10, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='all_future_order_volume_second', full_name='risk_server_future.future_order_set_info.all_future_order_volume_second', index=10,
number=11, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='all_future_order_volume_most', full_name='risk_server_future.future_order_set_info.all_future_order_volume_most', index=11,
number=12, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IF_future_order_volume_second', full_name='risk_server_future.future_order_set_info.IF_future_order_volume_second', index=12,
number=13, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IF_future_order_volume_most', full_name='risk_server_future.future_order_set_info.IF_future_order_volume_most', index=13,
number=14, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IC_future_order_volume_second', full_name='risk_server_future.future_order_set_info.IC_future_order_volume_second', index=14,
number=15, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IC_future_order_volume_most', full_name='risk_server_future.future_order_set_info.IC_future_order_volume_most', index=15,
number=16, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IH_future_order_volume_second', full_name='risk_server_future.future_order_set_info.IH_future_order_volume_second', index=16,
number=17, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='IH_future_order_volume_most', full_name='risk_server_future.future_order_set_info.IH_future_order_volume_most', index=17,
number=18, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_future_order_count_second', full_name='risk_server_future.future_order_set_info.account_future_order_count_second', index=18,
number=19, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_future_order_count_most', full_name='risk_server_future.future_order_set_info.account_future_order_count_most', index=19,
number=20, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_future_buy_order_second', full_name='risk_server_future.future_order_set_info.account_future_buy_order_second', index=20,
number=21, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_future_buy_order_most', full_name='risk_server_future.future_order_set_info.account_future_buy_order_most', index=21,
number=22, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_future_sell_order_second', full_name='risk_server_future.future_order_set_info.account_future_sell_order_second', index=22,
number=23, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_future_sell_order_most', full_name='risk_server_future.future_order_set_info.account_future_sell_order_most', index=23,
number=24, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_all_future_order_volume_second', full_name='risk_server_future.future_order_set_info.account_all_future_order_volume_second', index=24,
number=25, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_all_future_order_volume_most', full_name='risk_server_future.future_order_set_info.account_all_future_order_volume_most', index=25,
number=26, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IF_future_order_volume_second', full_name='risk_server_future.future_order_set_info.account_IF_future_order_volume_second', index=26,
number=27, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IF_future_order_volume_most', full_name='risk_server_future.future_order_set_info.account_IF_future_order_volume_most', index=27,
number=28, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IC_future_order_volume_second', full_name='risk_server_future.future_order_set_info.account_IC_future_order_volume_second', index=28,
number=29, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IC_future_order_volume_most', full_name='risk_server_future.future_order_set_info.account_IC_future_order_volume_most', index=29,
number=30, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IH_future_order_volume_second', full_name='risk_server_future.future_order_set_info.account_IH_future_order_volume_second', index=30,
number=31, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IH_future_order_volume_most', full_name='risk_server_future.future_order_set_info.account_IH_future_order_volume_most', index=31,
number=32, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IF_future_open_position_order_volume_most', full_name='risk_server_future.future_order_set_info.account_IF_future_open_position_order_volume_most', index=32,
number=33, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IC_future_open_position_order_volume_most', full_name='risk_server_future.future_order_set_info.account_IC_future_open_position_order_volume_most', index=33,
number=34, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='account_IH_future_open_position_order_volume_most', full_name='risk_server_future.future_order_set_info.account_IH_future_open_position_order_volume_most', index=34,
number=35, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=49,
serialized_end=1543,
)

_FUTURE_ORDER_SET_REQ = _descriptor.Descriptor(
name='Future_Order_Set_Req',
full_name='risk_server_future.Future_Order_Set_Req',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='future_order', full_name='risk_server_future.Future_Order_Set_Req.future_order', index=0,
number=2, type=11, cpp_type=10, label=2,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=1545,
serialized_end=1632,
)

_FUTURE_ORDER_SET_RESP = _descriptor.Descriptor(
name='Future_Order_Set_Resp',
full_name='risk_server_future.Future_Order_Set_Resp',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='ret_code', full_name='risk_server_future.Future_Order_Set_Resp.ret_code', index=0,
number=1, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='msg', full_name='risk_server_future.Future_Order_Set_Resp.msg', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=1634,
serialized_end=1688,
)

_FUTURE_ORDER_QUERY_INFO = _descriptor.Descriptor(
name='Future_Order_Query_Info',
full_name='risk_server_future.Future_Order_Query_Info',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='ret_code', full_name='risk_server_future.Future_Order_Query_Info.ret_code', index=0,
number=1, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='future_order', full_name='risk_server_future.Future_Order_Query_Info.future_order', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='msg', full_name='risk_server_future.Future_Order_Query_Info.msg', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=1690,
serialized_end=1811,
)
_FUTURE_ORDER_SET_REQ.fields_by_name['future_order'].message_type = _FUTURE_ORDER_SET_INFO
_FUTURE_ORDER_QUERY_INFO.fields_by_name['future_order'].message_type = _FUTURE_ORDER_SET_INFO
DESCRIPTOR.message_types_by_name['future_order_set_info'] = _FUTURE_ORDER_SET_INFO
DESCRIPTOR.message_types_by_name['Future_Order_Set_Req'] = _FUTURE_ORDER_SET_REQ
DESCRIPTOR.message_types_by_name['Future_Order_Set_Resp'] = _FUTURE_ORDER_SET_RESP
DESCRIPTOR.message_types_by_name['Future_Order_Query_Info'] = _FUTURE_ORDER_QUERY_INFO

future_order_set_info = _reflection.GeneratedProtocolMessageType('future_order_set_info', (_message.Message,), dict(
DESCRIPTOR = _FUTURE_ORDER_SET_INFO,
__module__ = 'risk_server_future_pb2'
# @@protoc_insertion_point(class_scope:risk_server_future.future_order_set_info)
))
_sym_db.RegisterMessage(future_order_set_info)

Future_Order_Set_Req = _reflection.GeneratedProtocolMessageType('Future_Order_Set_Req', (_message.Message,), dict(
DESCRIPTOR = _FUTURE_ORDER_SET_REQ,
__module__ = 'risk_server_future_pb2'
# @@protoc_insertion_point(class_scope:risk_server_future.Future_Order_Set_Req)
))
_sym_db.RegisterMessage(Future_Order_Set_Req)

Future_Order_Set_Resp = _reflection.GeneratedProtocolMessageType('Future_Order_Set_Resp', (_message.Message,), dict(
DESCRIPTOR = _FUTURE_ORDER_SET_RESP,
__module__ = 'risk_server_future_pb2'
# @@protoc_insertion_point(class_scope:risk_server_future.Future_Order_Set_Resp)
))
_sym_db.RegisterMessage(Future_Order_Set_Resp)

Future_Order_Query_Info = _reflection.GeneratedProtocolMessageType('Future_Order_Query_Info', (_message.Message,), dict(
DESCRIPTOR = _FUTURE_ORDER_QUERY_INFO,
__module__ = 'risk_server_future_pb2'
# @@protoc_insertion_point(class_scope:risk_server_future.Future_Order_Query_Info)
))
_sym_db.RegisterMessage(Future_Order_Query_Info)
# @@protoc_insertion_point(module_scope)
84f2947f375499147d95a18c35c81380c7dc0bb9 | 11,963 | py | Python | radionets/dl_framework/architectures/res_exp.py | Kevin2/radionets | 44e10a85a096f5cea8e9d83f96db65bdd4df9517 | [
"MIT"
] | null | null | null | radionets/dl_framework/architectures/res_exp.py | Kevin2/radionets | 44e10a85a096f5cea8e9d83f96db65bdd4df9517 | [
"MIT"
] | 16 | 2019-10-09T12:30:27.000Z | 2020-12-09T14:03:03.000Z | radionets/dl_framework/architectures/res_exp.py | Kevin2/radionets | 44e10a85a096f5cea8e9d83f96db65bdd4df9517 | [
"MIT"
] | 3 | 2020-01-08T09:01:09.000Z | 2020-10-19T18:53:13.000Z | import torch
from torch import nn
from radionets.dl_framework.model import (
SRBlock,
Lambda,
symmetry,
GeneralELU,
)
from functools import partial
from math import pi


class SRResNet_shuffle(nn.Module):
def __init__(self):
super().__init__()
self.preBlock = nn.Sequential(
nn.Conv2d(2, 64, 9, stride=1, padding=4, groups=2), nn.PReLU()
)
        # trunk of 14 residual blocks
self.blocks = nn.Sequential(
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
)
self.postBlock = nn.Sequential(
nn.Conv2d(64, 64, 3, stride=1, padding=1, bias=False), nn.BatchNorm2d(64)
)
self.shuffle = nn.Sequential(
nn.Conv2d(64, 252, 3, stride=1, padding=1, bias=True),
nn.PixelShuffle(3),
nn.PReLU(),
)
self.final = nn.Sequential(
nn.Conv2d(28, 2, 9, stride=1, padding=4, groups=2),
)
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))
self.hardtanh = nn.Hardtanh(-pi, pi)

    def forward(self, x):
x = self.preBlock(x)
x = x + self.postBlock(self.blocks(x))
x = self.shuffle(x)
x = self.final(x)
s = x.shape[-1]
x0 = self.symmetry_amp(x[:, 0]).reshape(-1, 1, s, s)
x1 = self.symmetry_imag(x[:, 1]).reshape(-1, 1, s, s)
x1 = self.hardtanh(x1)
return torch.cat([x0, x1], dim=1)


class SRResNet_bigger(nn.Module):
def __init__(self):
super().__init__()
self.preBlock = nn.Sequential(
nn.Conv2d(2, 64, 9, stride=1, padding=4, groups=2), nn.PReLU()
)
        # trunk of 8 residual blocks
self.blocks = nn.Sequential(
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
)
self.postBlock = nn.Sequential(
nn.Conv2d(64, 64, 3, stride=1, padding=1), nn.BatchNorm2d(64)
)
self.final = nn.Sequential(
nn.Conv2d(64, 2, 9, stride=1, padding=4, groups=2),
)
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))
self.hardtanh = nn.Hardtanh(-pi, pi)

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x)
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x0 = self.symmetry_amp(x[:, 0]).reshape(-1, 1, s, s)
x1 = self.hardtanh(x[:, 1]).reshape(-1, 1, s, s)
x1 = self.symmetry_imag(x1).reshape(-1, 1, s, s)
return torch.cat([x0, x1], dim=1)


class SRResNet_bigger_16(nn.Module):
def __init__(self):
super().__init__()
self.preBlock = nn.Sequential(
nn.Conv2d(2, 64, 9, stride=1, padding=4, groups=2), nn.PReLU()
)
        # trunk of 16 residual blocks
self.blocks = nn.Sequential(
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
)
self.postBlock = nn.Sequential(
nn.Conv2d(64, 64, 3, stride=1, padding=1, bias=False), nn.BatchNorm2d(64)
)
        self.final = nn.Sequential(
            nn.Conv2d(64, 2, 9, stride=1, padding=4, groups=2),
        )
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x)
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x0 = self.symmetry_amp(x[:, 0]).reshape(-1, 1, s, s)
x1 = self.symmetry_imag(x[:, 1]).reshape(-1, 1, s, s)
return torch.cat([x0, x1], dim=1)


class SRResNet_amp(nn.Module):
def __init__(self):
super().__init__()
self.preBlock = nn.Sequential(
nn.Conv2d(2, 64, 9, stride=1, padding=4, groups=2), nn.PReLU()
)
        # trunk of 16 residual blocks
self.blocks = nn.Sequential(
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
)
self.postBlock = nn.Sequential(
nn.Conv2d(64, 64, 3, stride=1, padding=1, bias=False), nn.BatchNorm2d(64)
)
self.final = nn.Sequential(
nn.Conv2d(64, 1, 9, stride=1, padding=4, groups=1),
)
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x)
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x = self.symmetry_amp(x).reshape(-1, 1, s, s)
return x


class SRResNet_phase(nn.Module):
def __init__(self):
super().__init__()
self.preBlock = nn.Sequential(
nn.Conv2d(2, 64, 9, stride=1, padding=4, groups=2), nn.PReLU()
)
        # trunk of 16 residual blocks
self.blocks = nn.Sequential(
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
SRBlock(64, 64),
)
self.postBlock = nn.Sequential(
nn.Conv2d(64, 64, 3, stride=1, padding=1, bias=False), nn.BatchNorm2d(64)
)
self.final = nn.Sequential(
nn.Conv2d(64, 1, 9, stride=1, padding=4, groups=1),
)
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))
self.hardtanh = nn.Hardtanh(-pi, pi)

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x)
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x = self.hardtanh(x).reshape(-1, 1, s, s)
x = self.symmetry_imag(x).reshape(-1, 1, s, s)
return x


class SRResNet_unc(nn.Module):
def __init__(self):
super().__init__()
n_channel = 64
self.preBlock = nn.Sequential(
nn.Conv2d(2, n_channel, 9, stride=1, padding=4, groups=2), nn.PReLU()
)
        # trunk of 8 residual blocks
self.blocks = nn.Sequential(
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
)
self.postBlock = nn.Sequential(
nn.Conv2d(n_channel, n_channel, 3, stride=1, padding=1, bias=False),
nn.BatchNorm2d(n_channel),
)
self.final = nn.Sequential(
nn.Conv2d(n_channel, 4, 9, stride=1, padding=4, groups=2),
)
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))
self.elu = GeneralELU(add=+(1 + 1e-10))

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x)
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x0 = self.symmetry_amp(x[:, 0]).reshape(-1, 1, s, s)
x0_unc = self.symmetry_amp(x[:, 1]).reshape(-1, 1, s, s)
x0_unc = self.elu(x0_unc)
x1 = self.symmetry_imag(x[:, 2]).reshape(-1, 1, s, s)
x1_unc = self.symmetry_amp(x[:, 3]).reshape(-1, 1, s, s)
x1_unc = self.elu(x1_unc)
return torch.cat([x0, x0_unc, x1, x1_unc], dim=1)


class SRResNet_unc_amp(nn.Module):
def __init__(self):
super().__init__()
n_channel = 56
self.preBlock = nn.Sequential(
nn.Conv2d(1, n_channel, 9, stride=1, padding=4, groups=1), nn.PReLU()
)
        # trunk of 8 residual blocks
self.blocks = nn.Sequential(
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
)
self.postBlock = nn.Sequential(
nn.Conv2d(n_channel, n_channel, 3, stride=1, padding=1, bias=False),
nn.BatchNorm2d(n_channel),
)
self.final = nn.Sequential(
nn.Conv2d(n_channel, 2, 9, stride=1, padding=4, groups=1),
)
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))
self.elu = GeneralELU(add=+(1 + 1e-5))

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x[:, 0].unsqueeze(1))
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x0 = self.symmetry_amp(x[:, 0]).reshape(-1, 1, s, s)
x0_unc = self.symmetry_amp(x[:, 1]).reshape(-1, 1, s, s)
x0_unc = self.elu(x0_unc)
return torch.cat([x0, x0_unc], dim=1)


class SRResNet_unc_phase(nn.Module):
def __init__(self):
super().__init__()
n_channel = 56
self.preBlock = nn.Sequential(
nn.Conv2d(1, n_channel, 9, stride=1, padding=4, groups=1), nn.PReLU()
)
        # trunk of 8 residual blocks
self.blocks = nn.Sequential(
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
SRBlock(n_channel, n_channel),
)
self.postBlock = nn.Sequential(
nn.Conv2d(n_channel, n_channel, 3, stride=1, padding=1, bias=False),
nn.BatchNorm2d(n_channel),
)
self.final = nn.Sequential(
nn.Conv2d(n_channel, 2, 9, stride=1, padding=4, groups=1),
)
self.symmetry_amp = Lambda(partial(symmetry, mode="real"))
self.symmetry_imag = Lambda(partial(symmetry, mode="imag"))
self.elu = GeneralELU(add=+(1 + 1e-10))

    def forward(self, x):
s = x.shape[-1]
x = self.preBlock(x[:, 1].unsqueeze(1))
x = x + self.postBlock(self.blocks(x))
x = self.final(x)
x0 = self.symmetry_imag(x[:, 0]).reshape(-1, 1, s, s)
x0_unc = self.symmetry_amp(x[:, 1]).reshape(-1, 1, s, s)
x0_unc = self.elu(x0_unc)
return torch.cat([x0, x0_unc], dim=1)
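All eight architectures above share one forward-pass skeleton: a pre-block, a residual trunk whose output is added back to the pre-block activation through a global skip connection, and a final convolution. A torch-free sketch of that control flow, with plain callables standing in for the `nn.Sequential` blocks (every name here is illustrative, not part of the radionets API):

```python
# Torch-free sketch of the forward pass shared by the SRResNet
# variants: pre-block, residual trunk, global skip, final layer.
def sr_forward(x, pre, blocks, post, final):
    h = pre(x)
    h = h + post(blocks(h))  # global residual skip connection
    return final(h)

# With identity trunk/post blocks the skip simply doubles h:
# h = pre(3) = 4, then h = 4 + 4 = 8, then final doubles to 16.
out = sr_forward(3, pre=lambda v: v + 1, blocks=lambda v: v,
                 post=lambda v: v, final=lambda v: 2 * v)
```

The variants differ only in trunk depth, channel count, and the symmetry/activation layers applied after `final`.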


# File: ctrlp/__init__.py
# Repo: cbteeple/ctrlp (MIT license)
from .comm_handler import CommandHandler
from .validate_commands import CommandValidator
from .config_handler import ConfigHandler
from .comm_handler import CommHandler
from .comm_handler import DataSaver


# File: v6.0.5/firewall/test_fortios_firewall_address.py
# Repo: fortinet-solutions-cse/ansible_fgt_modules (Apache-2.0 license)
# Copyright 2019 Fortinet, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <https://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import json
import pytest
from mock import ANY
from ansible.module_utils.network.fortios.fortios import FortiOSHandler
try:
from ansible.modules.network.fortios import fortios_firewall_address
except ImportError:
pytest.skip("Could not load required modules for testing", allow_module_level=True)


@pytest.fixture(autouse=True)
def connection_mock(mocker):
connection_class_mock = mocker.patch('ansible.modules.network.fortios.fortios_firewall_address.Connection')
return connection_class_mock


fos_instance = FortiOSHandler(connection_mock)


def test_firewall_address_creation(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'firewall_address': {
'allow_routing': 'enable',
'associated_interface': 'test_value_4',
'cache_ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end_ip': 'test_value_9',
'epg_name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj_id': 'test_value_14',
'organization': 'test_value_15',
'policy_group': 'test_value_16',
'sdn': 'aci',
'sdn_tag': 'test_value_18',
'start_ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet_name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard_fqdn': 'test_value_27'
},
'vdom': 'root'}
is_error, changed, response = fortios_firewall_address.fortios_firewall(input_data, fos_instance)
expected_data = {
'allow-routing': 'enable',
'associated-interface': 'test_value_4',
'cache-ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end-ip': 'test_value_9',
'epg-name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj-id': 'test_value_14',
'organization': 'test_value_15',
'policy-group': 'test_value_16',
'sdn': 'aci',
'sdn-tag': 'test_value_18',
'start-ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet-name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard-fqdn': 'test_value_27'
}
set_method_mock.assert_called_with('firewall', 'address', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200


def test_firewall_address_creation_fails(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'firewall_address': {
'allow_routing': 'enable',
'associated_interface': 'test_value_4',
'cache_ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end_ip': 'test_value_9',
'epg_name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj_id': 'test_value_14',
'organization': 'test_value_15',
'policy_group': 'test_value_16',
'sdn': 'aci',
'sdn_tag': 'test_value_18',
'start_ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet_name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard_fqdn': 'test_value_27'
},
'vdom': 'root'}
is_error, changed, response = fortios_firewall_address.fortios_firewall(input_data, fos_instance)
expected_data = {
'allow-routing': 'enable',
'associated-interface': 'test_value_4',
'cache-ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end-ip': 'test_value_9',
'epg-name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj-id': 'test_value_14',
'organization': 'test_value_15',
'policy-group': 'test_value_16',
'sdn': 'aci',
'sdn-tag': 'test_value_18',
'start-ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet-name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard-fqdn': 'test_value_27'
}
set_method_mock.assert_called_with('firewall', 'address', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 500


def test_firewall_address_removal(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
delete_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
delete_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.delete', return_value=delete_method_result)
input_data = {
'username': 'admin',
'state': 'absent',
'firewall_address': {
'allow_routing': 'enable',
'associated_interface': 'test_value_4',
'cache_ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end_ip': 'test_value_9',
'epg_name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj_id': 'test_value_14',
'organization': 'test_value_15',
'policy_group': 'test_value_16',
'sdn': 'aci',
'sdn_tag': 'test_value_18',
'start_ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet_name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard_fqdn': 'test_value_27'
},
'vdom': 'root'}
is_error, changed, response = fortios_firewall_address.fortios_firewall(input_data, fos_instance)
delete_method_mock.assert_called_with('firewall', 'address', mkey=ANY, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200


def test_firewall_address_deletion_fails(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
delete_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
delete_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.delete', return_value=delete_method_result)
input_data = {
'username': 'admin',
'state': 'absent',
'firewall_address': {
'allow_routing': 'enable',
'associated_interface': 'test_value_4',
'cache_ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end_ip': 'test_value_9',
'epg_name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj_id': 'test_value_14',
'organization': 'test_value_15',
'policy_group': 'test_value_16',
'sdn': 'aci',
'sdn_tag': 'test_value_18',
'start_ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet_name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard_fqdn': 'test_value_27'
},
'vdom': 'root'}
is_error, changed, response = fortios_firewall_address.fortios_firewall(input_data, fos_instance)
delete_method_mock.assert_called_with('firewall', 'address', mkey=ANY, vdom='root')
schema_method_mock.assert_not_called()
assert is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 500


def test_firewall_address_idempotent(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'error', 'http_method': 'DELETE', 'http_status': 404}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'firewall_address': {
'allow_routing': 'enable',
'associated_interface': 'test_value_4',
'cache_ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end_ip': 'test_value_9',
'epg_name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj_id': 'test_value_14',
'organization': 'test_value_15',
'policy_group': 'test_value_16',
'sdn': 'aci',
'sdn_tag': 'test_value_18',
'start_ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet_name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard_fqdn': 'test_value_27'
},
'vdom': 'root'}
is_error, changed, response = fortios_firewall_address.fortios_firewall(input_data, fos_instance)
expected_data = {
'allow-routing': 'enable',
'associated-interface': 'test_value_4',
'cache-ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end-ip': 'test_value_9',
'epg-name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj-id': 'test_value_14',
'organization': 'test_value_15',
'policy-group': 'test_value_16',
'sdn': 'aci',
'sdn-tag': 'test_value_18',
'start-ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet-name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard-fqdn': 'test_value_27'
}
set_method_mock.assert_called_with('firewall', 'address', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert not changed
assert response['status'] == 'error'
assert response['http_status'] == 404


def test_firewall_address_filter_foreign_attributes(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'firewall_address': {
'random_attribute_not_valid': 'tag',
'allow_routing': 'enable',
'associated_interface': 'test_value_4',
'cache_ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end_ip': 'test_value_9',
'epg_name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj_id': 'test_value_14',
'organization': 'test_value_15',
'policy_group': 'test_value_16',
'sdn': 'aci',
'sdn_tag': 'test_value_18',
'start_ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet_name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard_fqdn': 'test_value_27'
},
'vdom': 'root'}
is_error, changed, response = fortios_firewall_address.fortios_firewall(input_data, fos_instance)
expected_data = {
'allow-routing': 'enable',
'associated-interface': 'test_value_4',
'cache-ttl': '5',
'color': '6',
'comment': 'Comment.',
'country': 'test_value_8',
'end-ip': 'test_value_9',
'epg-name': 'test_value_10',
'filter': 'test_value_11',
'fqdn': 'test_value_12',
'name': 'default_name_13',
'obj-id': 'test_value_14',
'organization': 'test_value_15',
'policy-group': 'test_value_16',
'sdn': 'aci',
'sdn-tag': 'test_value_18',
'start-ip': 'test_value_19',
'subnet': 'test_value_20',
'subnet-name': 'test_value_21',
'tenant': 'test_value_22',
'type': 'ipmask',
'uuid': 'test_value_24',
'visibility': 'enable',
'wildcard': 'test_value_26',
'wildcard-fqdn': 'test_value_27'
}
set_method_mock.assert_called_with('firewall', 'address', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200

# File: catalyst/callbacks/tests/test_checkpoint_callback.py (repo: and-kul/catalyst, license: Apache-2.0)
# flake8: noqa
from io import StringIO
import os
import re
import shutil
import sys

import pytest
import torch
from torch.utils.data import DataLoader, TensorDataset

import catalyst.dl as dl


def test_load_best_on_stage_end():
old_stdout = sys.stdout
sys.stdout = str_stdout = StringIO()
# experiment_setup
logdir = "./logs/checkpoint_callback"
checkpoint = logdir + "/checkpoints"
logfile = checkpoint + "/_metrics.json"
# data
num_samples, num_features = int(1e4), int(1e1)
X = torch.rand(num_samples, num_features)
y = torch.randint(0, 5, size=[num_samples])
dataset = TensorDataset(X, y)
loader = DataLoader(dataset, batch_size=32, num_workers=1)
loaders = {"train": loader, "valid": loader}
# model, criterion, optimizer, scheduler
model = torch.nn.Linear(num_features, 5)
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())
runner = dl.SupervisedRunner()
n_epochs = 5
# first stage
runner.train(
model=model,
criterion=criterion,
optimizer=optimizer,
loaders=loaders,
logdir=logdir,
num_epochs=n_epochs,
verbose=False,
callbacks=[
dl.CheckpointCallback(save_n_best=2, load_on_stage_end="best"),
dl.CheckRunCallback(num_epoch_steps=n_epochs),
],
)
sys.stdout = old_stdout
exp_output = str_stdout.getvalue()
assert len(re.findall(r"=> Loading", exp_output)) == 1
assert len(re.findall(r"=> Loading .*best\.pth", exp_output)) == 1
assert os.path.isfile(logfile)
assert os.path.isfile(checkpoint + "/train.4.pth")
assert os.path.isfile(checkpoint + "/train.4_full.pth")
assert os.path.isfile(checkpoint + "/train.5.pth")
assert os.path.isfile(checkpoint + "/train.5_full.pth")
assert os.path.isfile(checkpoint + "/best.pth")
assert os.path.isfile(checkpoint + "/best_full.pth")
assert os.path.isfile(checkpoint + "/last.pth")
assert os.path.isfile(checkpoint + "/last_full.pth")
shutil.rmtree(logdir, ignore_errors=True)


def test_multiple_stages_and_different_checkpoints_to_load():
old_stdout = sys.stdout
sys.stdout = str_stdout = StringIO()
# experiment_setup
logdir = "./logs/checkpoint_callback"
checkpoint = logdir + "/checkpoints"
logfile = checkpoint + "/_metrics.json"
num_epochs = 5
# data
num_samples, num_features = int(1e4), int(1e1)
X = torch.rand(num_samples, num_features)
y = torch.randint(0, 5, size=[num_samples])
dataset = TensorDataset(X, y)
loader = DataLoader(dataset, batch_size=32, num_workers=1)
loaders = {"train": loader, "valid": loader}
# model, criterion, optimizer, scheduler
model = torch.nn.Linear(num_features, 5)
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())
runner = dl.SupervisedRunner()
# first stage
runner.train(
model=model,
criterion=criterion,
optimizer=optimizer,
loaders=loaders,
logdir=logdir,
num_epochs=num_epochs,
verbose=False,
callbacks=[
dl.CheckpointCallback(
save_n_best=2,
load_on_stage_end={
"model": "best",
"criterion": "best",
"optimizer": "last",
},
),
dl.CheckRunCallback(num_epoch_steps=num_epochs),
],
)
# second stage
runner.train(
model=model,
criterion=criterion,
optimizer=optimizer,
loaders=loaders,
logdir=logdir,
num_epochs=num_epochs,
verbose=False,
callbacks=[
dl.CheckpointCallback(
save_n_best=3,
load_on_stage_start={
"model": "last",
"criterion": "last",
"optimizer": "best",
},
),
dl.CheckRunCallback(num_epoch_steps=num_epochs),
],
)
sys.stdout = old_stdout
exp_output = str_stdout.getvalue()
assert len(re.findall(r"=> Loading", exp_output)) == 3
assert len(re.findall(r"=> Loading .*best_full\.pth", exp_output)) == 2
assert len(re.findall(r"=> Loading .*last_full\.pth", exp_output)) == 1
assert os.path.isfile(logfile)
assert os.path.isfile(checkpoint + "/train.3.pth")
assert os.path.isfile(checkpoint + "/train.3_full.pth")
assert os.path.isfile(checkpoint + "/train.4.pth")
assert os.path.isfile(checkpoint + "/train.4_full.pth")
assert os.path.isfile(checkpoint + "/train.5.pth")
assert os.path.isfile(checkpoint + "/train.5_full.pth")
assert os.path.isfile(checkpoint + "/best.pth")
assert os.path.isfile(checkpoint + "/best_full.pth")
assert os.path.isfile(checkpoint + "/last.pth")
assert os.path.isfile(checkpoint + "/last_full.pth")
shutil.rmtree(logdir, ignore_errors=True)


def test_resume_with_missing_file():
old_stdout = sys.stdout
sys.stdout = str_stdout = StringIO()
# experiment_setup
logdir = "./logs/checkpoint_callback"
checkpoint = logdir + "/checkpoints"
logfile = checkpoint + "/_metrics.json"
num_epochs = 5
# data
num_samples, num_features = int(1e4), int(1e1)
X = torch.rand(num_samples, num_features)
y = torch.randint(0, 5, size=[num_samples])
dataset = TensorDataset(X, y)
loader = DataLoader(dataset, batch_size=32, num_workers=1)
loaders = {"train": loader, "valid": loader}
# model, criterion, optimizer, scheduler
model = torch.nn.Linear(num_features, 5)
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())
runner = dl.SupervisedRunner()
with pytest.raises(FileNotFoundError):
runner.train(
model=model,
criterion=criterion,
optimizer=optimizer,
loaders=loaders,
logdir=logdir,
num_epochs=num_epochs,
verbose=False,
callbacks=[
dl.CheckpointCallback(
save_n_best=2,
load_on_stage_end={
"model": "best",
"criterion": "best",
"optimizer": "last",
},
resume="not_existing_file.pth",
),
dl.CheckRunCallback(num_epoch_steps=num_epochs),
],
)
sys.stdout = old_stdout
exp_output = str_stdout.getvalue()
shutil.rmtree(logdir, ignore_errors=True)


def test_load_on_stage_start_with_empty_dict():
old_stdout = sys.stdout
sys.stdout = str_stdout = StringIO()
# experiment_setup
logdir = "./logs/checkpoint_callback"
checkpoint = logdir + "/checkpoints"
logfile = checkpoint + "/_metrics.json"
num_epochs = 5
# data
num_samples, num_features = int(1e4), int(1e1)
X = torch.rand(num_samples, num_features)
y = torch.randint(0, 5, size=[num_samples])
dataset = TensorDataset(X, y)
loader = DataLoader(dataset, batch_size=32, num_workers=1)
loaders = {"train": loader, "valid": loader}
# model, criterion, optimizer, scheduler
model = torch.nn.Linear(num_features, 5)
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())
runner = dl.SupervisedRunner()
# first stage
runner.train(
model=model,
criterion=criterion,
optimizer=optimizer,
loaders=loaders,
logdir=logdir,
num_epochs=num_epochs,
verbose=False,
callbacks=[
dl.CheckpointCallback(save_n_best=2),
dl.CheckRunCallback(num_epoch_steps=num_epochs),
],
)
# second stage
runner.train(
model=model,
criterion=criterion,
optimizer=optimizer,
loaders=loaders,
logdir=logdir,
num_epochs=num_epochs,
verbose=False,
callbacks=[
dl.CheckpointCallback(save_n_best=3, load_on_stage_start={}),
dl.CheckRunCallback(num_epoch_steps=num_epochs),
],
)
sys.stdout = old_stdout
exp_output = str_stdout.getvalue()
assert len(re.findall(r"=> Loading", exp_output)) == 0
assert os.path.isfile(logfile)
assert os.path.isfile(checkpoint + "/train.3.pth")
assert os.path.isfile(checkpoint + "/train.3_full.pth")
assert os.path.isfile(checkpoint + "/train.4.pth")
assert os.path.isfile(checkpoint + "/train.4_full.pth")
assert os.path.isfile(checkpoint + "/train.5.pth")
assert os.path.isfile(checkpoint + "/train.5_full.pth")
assert os.path.isfile(checkpoint + "/best.pth")
assert os.path.isfile(checkpoint + "/best_full.pth")
assert os.path.isfile(checkpoint + "/last.pth")
assert os.path.isfile(checkpoint + "/last_full.pth")
shutil.rmtree(logdir, ignore_errors=True)

# File: MFIRAP/d00_utils/__init__.py (repo: igor-morawski/MFIR-AP, license: MIT)
import MFIRAP.d00_utils.paths
import MFIRAP.d00_utils.verbosity
import MFIRAP.d00_utils.dataset
import MFIRAP.d00_utils.io
import MFIRAP.d00_utils.project

# File: client/sdk6_rte/__init__.py (repo: niubin261/POFSwitch-RoleSwitch, license: Unlicense)
from . import constants
from . import RunTimeEnvironment
from . import ttypes
__all__ = ['ttypes', 'constants', 'RunTimeEnvironment']

# File: per/migrations/0027_auto_20201030_1539.py (repo: IFRCGo/ifrcgo-api, license: MIT)
# Generated by Django 2.2.13 on 2020-10-30 15:39
from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
('per', '0026_auto_20201029_0851'),
]
operations = [
migrations.AddField(
model_name='assessmenttype',
name='name_ar',
field=models.CharField(max_length=200, null=True, verbose_name='name'),
),
migrations.AddField(
model_name='assessmenttype',
name='name_en',
field=models.CharField(max_length=200, null=True, verbose_name='name'),
),
migrations.AddField(
model_name='assessmenttype',
name='name_es',
field=models.CharField(max_length=200, null=True, verbose_name='name'),
),
migrations.AddField(
model_name='assessmenttype',
name='name_fr',
field=models.CharField(max_length=200, null=True, verbose_name='name'),
),
migrations.AddField(
model_name='formanswer',
name='text_ar',
field=models.CharField(max_length=40, null=True, verbose_name='text'),
),
migrations.AddField(
model_name='formanswer',
name='text_en',
field=models.CharField(max_length=40, null=True, verbose_name='text'),
),
migrations.AddField(
model_name='formanswer',
name='text_es',
field=models.CharField(max_length=40, null=True, verbose_name='text'),
),
migrations.AddField(
model_name='formanswer',
name='text_fr',
field=models.CharField(max_length=40, null=True, verbose_name='text'),
),
migrations.AddField(
model_name='formarea',
name='title_ar',
field=models.CharField(max_length=250, null=True, verbose_name='title'),
),
migrations.AddField(
model_name='formarea',
name='title_en',
field=models.CharField(max_length=250, null=True, verbose_name='title'),
),
migrations.AddField(
model_name='formarea',
name='title_es',
field=models.CharField(max_length=250, null=True, verbose_name='title'),
),
migrations.AddField(
model_name='formarea',
name='title_fr',
field=models.CharField(max_length=250, null=True, verbose_name='title'),
),
migrations.AddField(
model_name='formcomponent',
name='description_ar',
field=models.TextField(blank=True, null=True, verbose_name='description'),
),
migrations.AddField(
model_name='formcomponent',
name='description_en',
field=models.TextField(blank=True, null=True, verbose_name='description'),
),
migrations.AddField(
model_name='formcomponent',
name='description_es',
field=models.TextField(blank=True, null=True, verbose_name='description'),
),
migrations.AddField(
model_name='formcomponent',
name='description_fr',
field=models.TextField(blank=True, null=True, verbose_name='description'),
),
migrations.AddField(
model_name='formdata',
name='notes_ar',
field=models.TextField(blank=True, null=True, verbose_name='notes'),
),
migrations.AddField(
model_name='formdata',
name='notes_en',
field=models.TextField(blank=True, null=True, verbose_name='notes'),
),
migrations.AddField(
model_name='formdata',
name='notes_es',
field=models.TextField(blank=True, null=True, verbose_name='notes'),
),
migrations.AddField(
model_name='formdata',
name='notes_fr',
field=models.TextField(blank=True, null=True, verbose_name='notes'),
),
migrations.AddField(
model_name='formquestion',
name='question_ar',
field=models.CharField(max_length=500, null=True, verbose_name='question'),
),
migrations.AddField(
model_name='formquestion',
name='question_en',
field=models.CharField(max_length=500, null=True, verbose_name='question'),
),
migrations.AddField(
model_name='formquestion',
name='question_es',
field=models.CharField(max_length=500, null=True, verbose_name='question'),
),
migrations.AddField(
model_name='formquestion',
name='question_fr',
field=models.CharField(max_length=500, null=True, verbose_name='question'),
),
]

# File: webdriver_test_tools/webdriver/__init__.py (repo: connordelacruz/webdriver-test-tools, license: MIT)
"""Extended functionality for Selenium WebDriver.
.. toctree::
webdriver_test_tools.webdriver.locate
webdriver_test_tools.webdriver.actions
webdriver_test_tools.webdriver.support
"""
from . import actions, support, locate

# File: src/common/errors.py (repo: university-my/ultimate-schedule-api, license: MIT)
class BaseCustomException(Exception):
def __init__(self, message):
self.message = message

    def json(self):
return {"message": self.message}


class ParsingError(BaseCustomException):
pass

# File: common/SurfaceDice.py (repo: qgking/DASC_COVID19, license: MIT)
# -*- coding: utf-8 -*-
# @Time : 20/5/3 11:06
# @Author : qgking
# @Email : qgking@tju.edu.cn
# @Software: PyCharm
# @Desc : SurfaceDice.py
# -*- coding: utf-8 -*-
"""
Code for computing surface Dice
copy from http://medicaldecathlon.com/files/Surface_distance_based_measures.ipynb
"""
import numpy as np
import scipy.ndimage
# neighbour_code_to_normals is a lookup table.
# For every binary neighbour code
# (2x2x2 neighbourhood = 8 neighbours = 8 bits = 256 codes)
# it contains the surface normals of the triangles (called "surfel" for
# "surface element" in the following). The length of the normal
# vector encodes the surfel area.
#
# created by compute_surface_area_lookup_table.ipynb using the
# marching_cube algorithm, see e.g. https://en.wikipedia.org/wiki/Marching_cubes
#
neighbour_code_to_normals = [
[[0, 0, 0]],
[[0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125]],
[[-0.25, -0.25, 0.0], [0.25, 0.25, -0.0]],
[[0.125, -0.125, 0.125]],
[[-0.25, -0.0, -0.25], [0.25, 0.0, 0.25]],
[[0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[0.5, 0.0, -0.0], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[-0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, 0.125, 0.125]],
[[-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25]],
[[0.5, 0.0, 0.0], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[0.5, 0.0, 0.0], [0.25, -0.25, 0.25], [-0.125, 0.125, -0.125]],
[[-0.5, 0.0, 0.0], [-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[0.5, 0.0, 0.0], [0.5, 0.0, 0.0]],
[[0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25]],
[[-0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, -0.5, 0.0], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, 0.0, -0.5], [0.25, 0.25, 0.25], [-0.125, -0.125, -0.125]],
[[-0.125, -0.125, 0.125], [0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[-0.125, -0.125, -0.125], [-0.25, -0.25, -0.25], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25], [0.125, -0.125, -0.125]],
[[0.125, 0.125, 0.125], [0.375, 0.375, 0.375], [0.0, -0.25, 0.25], [-0.25, 0.0, 0.25]],
[[0.125, -0.125, -0.125], [0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[0.375, 0.375, 0.375], [0.0, 0.25, -0.25], [-0.125, -0.125, -0.125], [-0.25, 0.25, 0.0]],
[[-0.5, 0.0, 0.0], [-0.125, -0.125, -0.125], [-0.25, -0.25, -0.25], [0.125, 0.125, 0.125]],
[[-0.5, 0.0, 0.0], [-0.125, -0.125, -0.125], [-0.25, -0.25, -0.25]],
[[0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.0, -0.25, 0.25], [0.0, 0.25, -0.25]],
[[0.0, -0.5, 0.0], [0.125, 0.125, -0.125], [0.25, 0.25, -0.25]],
[[0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.125, -0.125, 0.125], [-0.25, -0.0, -0.25], [0.25, 0.0, 0.25]],
[[0.0, -0.25, 0.25], [0.0, 0.25, -0.25], [0.125, -0.125, 0.125]],
[[-0.375, -0.375, 0.375], [-0.0, 0.25, 0.25], [0.125, 0.125, -0.125], [-0.25, -0.0, -0.25]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [-0.125, 0.125, 0.125]],
[[-0.0, 0.0, 0.5], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.25, 0.25, -0.25], [0.25, 0.25, -0.25], [0.125, 0.125, -0.125], [-0.125, -0.125, 0.125]],
[[0.125, -0.125, 0.125], [0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[0.5, 0.0, 0.0], [0.25, -0.25, 0.25], [-0.125, 0.125, -0.125], [0.125, -0.125, 0.125]],
[[0.0, 0.25, -0.25], [0.375, -0.375, -0.375], [-0.125, 0.125, 0.125], [0.25, 0.25, 0.0]],
[[-0.5, 0.0, 0.0], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.25, -0.25, 0.0], [-0.25, 0.25, 0.0]],
[[0.0, 0.5, 0.0], [-0.25, 0.25, 0.25], [0.125, -0.125, -0.125]],
[[0.0, 0.5, 0.0], [0.125, -0.125, 0.125], [-0.25, 0.25, -0.25]],
[[0.0, 0.5, 0.0], [0.0, -0.5, 0.0]],
[[0.25, -0.25, 0.0], [-0.25, 0.25, 0.0], [0.125, -0.125, 0.125]],
[[-0.375, -0.375, -0.375], [-0.25, 0.0, 0.25], [-0.125, -0.125, -0.125], [-0.25, 0.25, 0.0]],
[[0.125, 0.125, 0.125], [0.0, -0.5, 0.0], [-0.25, -0.25, -0.25], [-0.125, -0.125, -0.125]],
[[0.0, -0.5, 0.0], [-0.25, -0.25, -0.25], [-0.125, -0.125, -0.125]],
[[-0.125, 0.125, 0.125], [0.25, -0.25, 0.0], [-0.25, 0.25, 0.0]],
[[0.0, 0.5, 0.0], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.375, 0.375, -0.375], [-0.25, -0.25, 0.0], [-0.125, 0.125, -0.125], [-0.25, 0.0, 0.25]],
[[0.0, 0.5, 0.0], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125]],
[[0.25, -0.25, 0.0], [-0.25, 0.25, 0.0], [0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0], [-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0]],
[[-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0]],
[[-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.25, -0.25, 0.0], [0.25, 0.25, -0.0]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25]],
[[0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.375, -0.375, 0.375], [0.0, -0.25, -0.25], [-0.125, 0.125, -0.125], [0.25, 0.25, 0.0]],
[[-0.125, -0.125, 0.125], [-0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125], [-0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25]],
[[0.5, 0.0, 0.0], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.0, 0.5, 0.0], [-0.25, 0.25, -0.25], [0.125, -0.125, 0.125]],
[[-0.25, 0.25, -0.25], [-0.25, 0.25, -0.25], [-0.125, 0.125, -0.125], [-0.125, 0.125, -0.125]],
[[-0.25, 0.0, -0.25], [0.375, -0.375, -0.375], [0.0, 0.25, -0.25], [-0.125, 0.125, 0.125]],
[[0.5, 0.0, 0.0], [-0.25, 0.25, -0.25], [0.125, -0.125, 0.125]],
[[-0.25, 0.0, 0.25], [0.25, 0.0, -0.25]],
[[-0.0, 0.0, 0.5], [-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.25, 0.0, 0.25], [0.25, 0.0, -0.25]],
[[-0.25, -0.0, -0.25], [-0.375, 0.375, 0.375], [-0.25, -0.25, 0.0], [-0.125, 0.125, 0.125]],
[[0.0, 0.0, -0.5], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125]],
[[-0.0, 0.0, 0.5], [0.0, 0.0, 0.5]],
[[0.125, 0.125, 0.125], [0.125, 0.125, 0.125], [0.25, 0.25, 0.25], [0.0, 0.0, 0.5]],
[[0.125, 0.125, 0.125], [0.25, 0.25, 0.25], [0.0, 0.0, 0.5]],
[[-0.25, 0.0, 0.25], [0.25, 0.0, -0.25], [-0.125, 0.125, 0.125]],
[[-0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25], [0.25, 0.0, -0.25]],
[[0.125, -0.125, 0.125], [0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[0.25, 0.0, 0.25], [-0.375, -0.375, 0.375], [-0.25, 0.25, 0.0], [-0.125, -0.125, 0.125]],
[[-0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[-0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125], [0.0, -0.25, 0.25], [0.0, 0.25, -0.25]],
[[0.0, -0.5, 0.0], [0.125, 0.125, -0.125], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25], [0.125, -0.125, 0.125]],
[[0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25], [0.0, -0.25, 0.25], [0.0, 0.25, -0.25]],
[[0.0, 0.25, 0.25], [0.0, 0.25, 0.25], [0.125, -0.125, -0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, 0.125, 0.125]],
[[-0.0, 0.0, 0.5], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[-0.0, 0.5, 0.0], [-0.25, 0.25, -0.25], [0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.5, 0.0, -0.0], [0.25, -0.25, -0.25], [0.125, -0.125, -0.125]],
[[-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125], [-0.25, 0.25, 0.25], [0.125, -0.125, -0.125]],
[[0.375, -0.375, 0.375], [0.0, 0.25, 0.25], [-0.125, 0.125, -0.125], [-0.25, 0.0, 0.25]],
[[0.0, -0.5, 0.0], [-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[-0.375, -0.375, 0.375], [0.25, -0.25, 0.0], [0.0, 0.25, 0.25], [-0.125, -0.125, 0.125]],
[[-0.125, 0.125, 0.125], [-0.25, 0.25, 0.25], [0.0, 0.0, 0.5]],
[[0.125, 0.125, 0.125], [0.0, 0.25, 0.25], [0.0, 0.25, 0.25]],
[[0.0, 0.25, 0.25], [0.0, 0.25, 0.25]],
[[0.5, 0.0, -0.0], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125], [0.125, 0.125, 0.125]],
[[0.125, -0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, 0.125, 0.125]],
[[-0.25, -0.0, -0.25], [0.25, 0.0, 0.25], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.25, -0.25, 0.0], [0.25, 0.25, -0.0], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.25, -0.25, 0.0], [0.25, 0.25, -0.0], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.25, -0.0, -0.25], [0.25, 0.0, 0.25], [0.125, 0.125, 0.125]],
[[0.125, -0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, 0.125, 0.125]],
[[0.5, 0.0, -0.0], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125], [0.125, 0.125, 0.125]],
[[0.0, 0.25, 0.25], [0.0, 0.25, 0.25]],
[[0.125, 0.125, 0.125], [0.0, 0.25, 0.25], [0.0, 0.25, 0.25]],
[[-0.125, 0.125, 0.125], [-0.25, 0.25, 0.25], [0.0, 0.0, 0.5]],
[[-0.375, -0.375, 0.375], [0.25, -0.25, 0.0], [0.0, 0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.0, -0.5, 0.0], [-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[0.375, -0.375, 0.375], [0.0, 0.25, 0.25], [-0.125, 0.125, -0.125], [-0.25, 0.0, 0.25]],
[[-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125], [-0.25, 0.25, 0.25], [0.125, -0.125, -0.125]],
[[0.5, 0.0, -0.0], [0.25, -0.25, -0.25], [0.125, -0.125, -0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[-0.0, 0.5, 0.0], [-0.25, 0.25, -0.25], [0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[-0.0, 0.0, 0.5], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, 0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[0.0, 0.25, 0.25], [0.0, 0.25, 0.25], [0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25], [0.0, 0.25, 0.25], [0.0, 0.25, 0.25]],
[[0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25], [0.125, -0.125, 0.125]],
[[0.0, -0.5, 0.0], [0.125, 0.125, -0.125], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125], [0.0, -0.25, 0.25], [0.0, 0.25, -0.25]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[0.125, 0.125, 0.125], [0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[-0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125]],
[[0.25, 0.0, 0.25], [-0.375, -0.375, 0.375], [-0.25, 0.25, 0.0], [-0.125, -0.125, 0.125]],
[[0.125, -0.125, 0.125], [0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[-0.25, -0.0, -0.25], [0.25, 0.0, 0.25], [0.25, 0.0, 0.25], [0.25, 0.0, 0.25]],
[[-0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.25, 0.0, 0.25], [0.25, 0.0, -0.25], [-0.125, 0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.25, 0.25, 0.25], [0.0, 0.0, 0.5]],
[[0.125, 0.125, 0.125], [0.125, 0.125, 0.125], [0.25, 0.25, 0.25], [0.0, 0.0, 0.5]],
[[-0.0, 0.0, 0.5], [0.0, 0.0, 0.5]],
[[0.0, 0.0, -0.5], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125]],
[[-0.25, -0.0, -0.25], [-0.375, 0.375, 0.375], [-0.25, -0.25, 0.0], [-0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.25, 0.0, 0.25], [0.25, 0.0, -0.25]],
[[-0.0, 0.0, 0.5], [-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[-0.25, 0.0, 0.25], [0.25, 0.0, -0.25]],
[[0.5, 0.0, 0.0], [-0.25, 0.25, -0.25], [0.125, -0.125, 0.125]],
[[-0.25, 0.0, -0.25], [0.375, -0.375, -0.375], [0.0, 0.25, -0.25], [-0.125, 0.125, 0.125]],
[[-0.25, 0.25, -0.25], [-0.25, 0.25, -0.25], [-0.125, 0.125, -0.125], [-0.125, 0.125, -0.125]],
[[-0.0, 0.5, 0.0], [-0.25, 0.25, -0.25], [0.125, -0.125, 0.125]],
[[0.5, 0.0, 0.0], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125], [-0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125], [-0.125, 0.125, 0.125]],
[[0.375, -0.375, 0.375], [0.0, -0.25, -0.25], [-0.125, 0.125, -0.125], [0.25, 0.25, 0.0]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.0, 0.0, 0.5], [0.25, -0.25, 0.25], [0.125, -0.125, 0.125]],
[[0.0, -0.25, 0.25], [0.0, -0.25, 0.25]],
[[-0.125, -0.125, 0.125], [-0.25, -0.25, 0.0], [0.25, 0.25, -0.0]],
[[-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, -0.125, 0.125]],
[[-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0]],
[[0.125, 0.125, 0.125], [-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0]],
[[-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0], [-0.125, -0.125, 0.125]],
[[-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0], [-0.25, -0.25, 0.0], [0.25, 0.25, -0.0]],
[[0.0, 0.5, 0.0], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125]],
[[-0.375, 0.375, -0.375], [-0.25, -0.25, 0.0], [-0.125, 0.125, -0.125], [-0.25, 0.0, 0.25]],
[[0.0, 0.5, 0.0], [0.25, 0.25, -0.25], [-0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.25, -0.25, 0.0], [-0.25, 0.25, 0.0]],
[[0.0, -0.5, 0.0], [-0.25, -0.25, -0.25], [-0.125, -0.125, -0.125]],
[[0.125, 0.125, 0.125], [0.0, -0.5, 0.0], [-0.25, -0.25, -0.25], [-0.125, -0.125, -0.125]],
[[-0.375, -0.375, -0.375], [-0.25, 0.0, 0.25], [-0.125, -0.125, -0.125], [-0.25, 0.25, 0.0]],
[[0.25, -0.25, 0.0], [-0.25, 0.25, 0.0], [0.125, -0.125, 0.125]],
[[0.0, 0.5, 0.0], [0.0, -0.5, 0.0]],
[[0.0, 0.5, 0.0], [0.125, -0.125, 0.125], [-0.25, 0.25, -0.25]],
[[0.0, 0.5, 0.0], [-0.25, 0.25, 0.25], [0.125, -0.125, -0.125]],
[[0.25, -0.25, 0.0], [-0.25, 0.25, 0.0]],
[[-0.5, 0.0, 0.0], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.0, 0.25, -0.25], [0.375, -0.375, -0.375], [-0.125, 0.125, 0.125], [0.25, 0.25, 0.0]],
[[0.5, 0.0, 0.0], [0.25, -0.25, 0.25], [-0.125, 0.125, -0.125], [0.125, -0.125, 0.125]],
[[0.125, -0.125, 0.125], [0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[0.25, 0.25, -0.25], [0.25, 0.25, -0.25], [0.125, 0.125, -0.125], [-0.125, -0.125, 0.125]],
[[-0.0, 0.0, 0.5], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125], [-0.125, 0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, 0.125]],
[[-0.375, -0.375, 0.375], [-0.0, 0.25, 0.25], [0.125, 0.125, -0.125], [-0.25, -0.0, -0.25]],
[[0.0, -0.25, 0.25], [0.0, 0.25, -0.25], [0.125, -0.125, 0.125]],
[[0.125, -0.125, 0.125], [-0.25, -0.0, -0.25], [0.25, 0.0, 0.25]],
[[0.125, -0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.0, -0.5, 0.0], [0.125, 0.125, -0.125], [0.25, 0.25, -0.25]],
[[0.0, -0.25, 0.25], [0.0, 0.25, -0.25]],
[[0.125, 0.125, 0.125], [0.125, -0.125, 0.125]],
[[0.125, -0.125, 0.125]],
[[-0.5, 0.0, 0.0], [-0.125, -0.125, -0.125], [-0.25, -0.25, -0.25]],
[[-0.5, 0.0, 0.0], [-0.125, -0.125, -0.125], [-0.25, -0.25, -0.25], [0.125, 0.125, 0.125]],
[[0.375, 0.375, 0.375], [0.0, 0.25, -0.25], [-0.125, -0.125, -0.125], [-0.25, 0.25, 0.0]],
[[0.125, -0.125, -0.125], [0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[0.125, 0.125, 0.125], [0.375, 0.375, 0.375], [0.0, -0.25, 0.25], [-0.25, 0.0, 0.25]],
[[-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25], [0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[-0.125, 0.125, 0.125], [0.125, -0.125, -0.125]],
[[-0.125, -0.125, -0.125], [-0.25, -0.25, -0.25], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125], [0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, 0.0, -0.5], [0.25, 0.25, 0.25], [-0.125, -0.125, -0.125]],
[[0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, -0.5, 0.0], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[-0.125, -0.125, 0.125], [0.125, -0.125, -0.125]],
[[0.0, -0.25, -0.25], [0.0, 0.25, 0.25]],
[[0.125, -0.125, -0.125]],
[[0.5, 0.0, 0.0], [0.5, 0.0, 0.0]],
[[-0.5, 0.0, 0.0], [-0.25, 0.25, 0.25], [-0.125, 0.125, 0.125]],
[[0.5, 0.0, 0.0], [0.25, -0.25, 0.25], [-0.125, 0.125, -0.125]],
[[0.25, -0.25, 0.0], [0.25, -0.25, 0.0]],
[[0.5, 0.0, 0.0], [-0.25, -0.25, 0.25], [-0.125, -0.125, 0.125]],
[[-0.25, 0.0, 0.25], [-0.25, 0.0, 0.25]],
[[0.125, 0.125, 0.125], [-0.125, 0.125, 0.125]],
[[-0.125, 0.125, 0.125]],
[[0.5, 0.0, -0.0], [0.25, 0.25, 0.25], [0.125, 0.125, 0.125]],
[[0.125, -0.125, 0.125], [-0.125, -0.125, 0.125]],
[[-0.25, -0.0, -0.25], [0.25, 0.0, 0.25]],
[[0.125, -0.125, 0.125]],
[[-0.25, -0.25, 0.0], [0.25, 0.25, -0.0]],
[[-0.125, -0.125, 0.125]],
[[0.125, 0.125, 0.125]],
[[0, 0, 0]]]
def compute_surface_distances(mask_gt, mask_pred, spacing_mm):
"""Compute closest distances from all surface points to the other surface.
Finds all surface elements "surfels" in the ground truth mask `mask_gt` and
the predicted mask `mask_pred`, computes their area in mm^2 and the distance
to the closest point on the other surface. It returns two sorted lists of
distances together with the corresponding surfel areas. If one of the masks
is empty, the corresponding lists are empty and all distances in the other
list are `inf`
Args:
mask_gt: 3-dim Numpy array of type bool. The ground truth mask.
mask_pred: 3-dim Numpy array of type bool. The predicted mask.
spacing_mm: 3-element list-like structure. Voxel spacing in x0, x1 and x2
direction
Returns:
A dict with
"distances_gt_to_pred": 1-dim numpy array of type float. The distances in mm
from all ground truth surface elements to the predicted surface,
sorted from smallest to largest
"distances_pred_to_gt": 1-dim numpy array of type float. The distances in mm
from all predicted surface elements to the ground truth surface,
sorted from smallest to largest
"surfel_areas_gt": 1-dim numpy array of type float. The area in mm^2 of
the ground truth surface elements in the same order as
distances_gt_to_pred
"surfel_areas_pred": 1-dim numpy array of type float. The area in mm^2 of
the predicted surface elements in the same order as
distances_pred_to_gt
"""
# compute the area for all 256 possible surface elements
# (given a 2x2x2 neighbourhood) according to the spacing_mm
neighbour_code_to_surface_area = np.zeros([256])
for code in range(256):
normals = np.array(neighbour_code_to_normals[code])
sum_area = 0
for normal_idx in range(normals.shape[0]):
# normal vector
n = np.zeros([3])
n[0] = normals[normal_idx, 0] * spacing_mm[1] * spacing_mm[2]
n[1] = normals[normal_idx, 1] * spacing_mm[0] * spacing_mm[2]
n[2] = normals[normal_idx, 2] * spacing_mm[0] * spacing_mm[1]
area = np.linalg.norm(n)
sum_area += area
neighbour_code_to_surface_area[code] = sum_area
# compute the bounding box of the masks to trim
# the volume to the smallest possible processing subvolume
mask_all = mask_gt | mask_pred
bbox_min = np.zeros(3, np.int64)
bbox_max = np.zeros(3, np.int64)
# max projection to the x0-axis
proj_0 = np.max(np.max(mask_all, axis=2), axis=1)
idx_nonzero_0 = np.nonzero(proj_0)[0]
if len(idx_nonzero_0) == 0:
return {"distances_gt_to_pred": np.array([]),
"distances_pred_to_gt": np.array([]),
"surfel_areas_gt": np.array([]),
"surfel_areas_pred": np.array([])}
bbox_min[0] = np.min(idx_nonzero_0)
bbox_max[0] = np.max(idx_nonzero_0)
# max projection to the x1-axis
proj_1 = np.max(np.max(mask_all, axis=2), axis=0)
idx_nonzero_1 = np.nonzero(proj_1)[0]
bbox_min[1] = np.min(idx_nonzero_1)
bbox_max[1] = np.max(idx_nonzero_1)
# max projection to the x2-axis
proj_2 = np.max(np.max(mask_all, axis=1), axis=0)
idx_nonzero_2 = np.nonzero(proj_2)[0]
bbox_min[2] = np.min(idx_nonzero_2)
bbox_max[2] = np.max(idx_nonzero_2)
# print("bounding box min = {}".format(bbox_min))
# print("bounding box max = {}".format(bbox_max))
# crop the processing subvolume.
# we need to zeropad the cropped region with 1 voxel at the lower,
# the right and the back side. This is required to obtain the "full"
# convolution result with the 2x2x2 kernel
cropmask_gt = np.zeros((bbox_max - bbox_min) + 2, np.uint8)
cropmask_pred = np.zeros((bbox_max - bbox_min) + 2, np.uint8)
cropmask_gt[0:-1, 0:-1, 0:-1] = mask_gt[bbox_min[0]:bbox_max[0] + 1,
bbox_min[1]:bbox_max[1] + 1,
bbox_min[2]:bbox_max[2] + 1]
cropmask_pred[0:-1, 0:-1, 0:-1] = mask_pred[bbox_min[0]:bbox_max[0] + 1,
bbox_min[1]:bbox_max[1] + 1,
bbox_min[2]:bbox_max[2] + 1]
# compute the neighbour code (local binary pattern) for each voxel
# the resultsing arrays are spacially shifted by minus half a voxel in each axis.
# i.e. the points are located at the corners of the original voxels
kernel = np.array([[[128, 64],
[32, 16]],
[[8, 4],
[2, 1]]])
neighbour_code_map_gt = scipy.ndimage.filters.correlate(cropmask_gt.astype(np.uint8), kernel, mode="constant",
cval=0)
neighbour_code_map_pred = scipy.ndimage.filters.correlate(cropmask_pred.astype(np.uint8), kernel, mode="constant",
cval=0)
# create masks with the surface voxels
borders_gt = ((neighbour_code_map_gt != 0) & (neighbour_code_map_gt != 255))
borders_pred = ((neighbour_code_map_pred != 0) & (neighbour_code_map_pred != 255))
# compute the distance transform (closest distance of each voxel to the surface voxels)
if borders_gt.any():
distmap_gt = scipy.ndimage.morphology.distance_transform_edt(~borders_gt, sampling=spacing_mm)
else:
distmap_gt = np.Inf * np.ones(borders_gt.shape)
if borders_pred.any():
distmap_pred = scipy.ndimage.morphology.distance_transform_edt(~borders_pred, sampling=spacing_mm)
else:
distmap_pred = np.Inf * np.ones(borders_pred.shape)
# compute the area of each surface element
surface_area_map_gt = neighbour_code_to_surface_area[neighbour_code_map_gt]
surface_area_map_pred = neighbour_code_to_surface_area[neighbour_code_map_pred]
# create a list of all surface elements with distance and area
distances_gt_to_pred = distmap_pred[borders_gt]
distances_pred_to_gt = distmap_gt[borders_pred]
surfel_areas_gt = surface_area_map_gt[borders_gt]
surfel_areas_pred = surface_area_map_pred[borders_pred]
# sort them by distance
if distances_gt_to_pred.shape != (0,):
sorted_surfels_gt = np.array(sorted(zip(distances_gt_to_pred, surfel_areas_gt)))
distances_gt_to_pred = sorted_surfels_gt[:, 0]
surfel_areas_gt = sorted_surfels_gt[:, 1]
if distances_pred_to_gt.shape != (0,):
sorted_surfels_pred = np.array(sorted(zip(distances_pred_to_gt, surfel_areas_pred)))
distances_pred_to_gt = sorted_surfels_pred[:, 0]
surfel_areas_pred = sorted_surfels_pred[:, 1]
return {"distances_gt_to_pred": distances_gt_to_pred,
"distances_pred_to_gt": distances_pred_to_gt,
"surfel_areas_gt": surfel_areas_gt,
"surfel_areas_pred": surfel_areas_pred}
def compute_average_surface_distance(surface_distances):
distances_gt_to_pred = surface_distances["distances_gt_to_pred"]
distances_pred_to_gt = surface_distances["distances_pred_to_gt"]
surfel_areas_gt = surface_distances["surfel_areas_gt"]
surfel_areas_pred = surface_distances["surfel_areas_pred"]
average_distance_gt_to_pred = np.sum(distances_gt_to_pred * surfel_areas_gt) / np.sum(surfel_areas_gt)
average_distance_pred_to_gt = np.sum(distances_pred_to_gt * surfel_areas_pred) / np.sum(surfel_areas_pred)
return (average_distance_gt_to_pred, average_distance_pred_to_gt)
def compute_robust_hausdorff(surface_distances, percent):
distances_gt_to_pred = surface_distances["distances_gt_to_pred"]
distances_pred_to_gt = surface_distances["distances_pred_to_gt"]
surfel_areas_gt = surface_distances["surfel_areas_gt"]
surfel_areas_pred = surface_distances["surfel_areas_pred"]
if len(distances_gt_to_pred) > 0:
surfel_areas_cum_gt = np.cumsum(surfel_areas_gt) / np.sum(surfel_areas_gt)
idx = np.searchsorted(surfel_areas_cum_gt, percent / 100.0)
perc_distance_gt_to_pred = distances_gt_to_pred[min(idx, len(distances_gt_to_pred) - 1)]
else:
perc_distance_gt_to_pred = np.Inf
if len(distances_pred_to_gt) > 0:
surfel_areas_cum_pred = np.cumsum(surfel_areas_pred) / np.sum(surfel_areas_pred)
idx = np.searchsorted(surfel_areas_cum_pred, percent / 100.0)
perc_distance_pred_to_gt = distances_pred_to_gt[min(idx, len(distances_pred_to_gt) - 1)]
else:
perc_distance_pred_to_gt = np.Inf
return max(perc_distance_gt_to_pred, perc_distance_pred_to_gt)
def compute_surface_overlap_at_tolerance(surface_distances, tolerance_mm):
distances_gt_to_pred = surface_distances["distances_gt_to_pred"]
distances_pred_to_gt = surface_distances["distances_pred_to_gt"]
surfel_areas_gt = surface_distances["surfel_areas_gt"]
surfel_areas_pred = surface_distances["surfel_areas_pred"]
rel_overlap_gt = np.sum(surfel_areas_gt[distances_gt_to_pred <= tolerance_mm]) / np.sum(surfel_areas_gt)
rel_overlap_pred = np.sum(surfel_areas_pred[distances_pred_to_gt <= tolerance_mm]) / np.sum(surfel_areas_pred)
return (rel_overlap_gt, rel_overlap_pred)
def compute_surface_dice_at_tolerance(surface_distances, tolerance_mm):
distances_gt_to_pred = surface_distances["distances_gt_to_pred"]
distances_pred_to_gt = surface_distances["distances_pred_to_gt"]
surfel_areas_gt = surface_distances["surfel_areas_gt"]
surfel_areas_pred = surface_distances["surfel_areas_pred"]
overlap_gt = np.sum(surfel_areas_gt[distances_gt_to_pred <= tolerance_mm])
overlap_pred = np.sum(surfel_areas_pred[distances_pred_to_gt <= tolerance_mm])
surface_dice = (overlap_gt + overlap_pred) / (
np.sum(surfel_areas_gt) + np.sum(surfel_areas_pred))
return surface_dice
def compute_dice_coefficient(mask_gt, mask_pred):
"""Compute soerensen-dice coefficient.
compute the soerensen-dice coefficient between the ground truth mask `mask_gt`
and the predicted mask `mask_pred`.
Args:
mask_gt: 3-dim Numpy array of type bool. The ground truth mask.
mask_pred: 3-dim Numpy array of type bool. The predicted mask.
Returns:
the dice coeffcient as float. If both masks are empty, the result is NaN
"""
volume_sum = mask_gt.sum() + mask_pred.sum()
if volume_sum == 0:
return np.NaN
volume_intersect = (mask_gt & mask_pred).sum()
return 2 * volume_intersect / volume_sum
if __name__ == '__main__':
# %% Some Simple Tests
# single pixels, 2mm away
mask_gt = np.zeros((128, 128, 128), np.uint8)
mask_pred = np.zeros((128, 128, 128), np.uint8)
mask_gt[50, 60, 70] = 1
mask_pred[50, 60, 72] = 1
surface_distances = compute_surface_distances(mask_gt, mask_pred, spacing_mm=(3, 2, 1))
print("surface dice at 1mm: {}".format(compute_surface_dice_at_tolerance(surface_distances, 1)))
print("volumetric dice: {}".format(compute_dice_coefficient(mask_gt, mask_pred)))
# %% two cubes. cube 1 is 100x100x100 mm^3 and cube 2 is 102x100x100 mm^3
mask_gt = np.zeros((100, 100, 100), np.uint8)
mask_pred = np.zeros((100, 100, 100), np.uint8)
spacing_mm = (2, 1, 1)
mask_gt[0:50, :, :] = 1
mask_pred[0:51, :, :] = 1
surface_distances = compute_surface_distances(mask_gt, mask_pred, spacing_mm)
print("surface dice at 1mm: {}".format(compute_surface_dice_at_tolerance(surface_distances, 1)))
print("volumetric dice: {}".format(compute_dice_coefficient(mask_gt, mask_pred)))
print("")
print("expected average_distance_gt_to_pred = 1./6 * 2mm = {}mm".format(1. / 6 * 2))
print("expected volumetric dice: {}".format(2. * 100 * 100 * 100 / (100 * 100 * 100 + 102 * 100 * 100)))
# %% test empty mask in prediction
mask_gt = np.zeros((128, 128, 128), np.uint8)
mask_pred = np.zeros((128, 128, 128), np.uint8)
mask_gt[50, 60, 70] = 1
# mask_pred[50,60,72] = 1
surface_distances = compute_surface_distances(mask_gt, mask_pred, spacing_mm=(3, 2, 1))
print("average surface distance: {} mm".format(compute_average_surface_distance(surface_distances)))
print("hausdorff (100%): {} mm".format(compute_robust_hausdorff(surface_distances, 100)))
print("hausdorff (95%): {} mm".format(compute_robust_hausdorff(surface_distances, 95)))
print("surface overlap at 1mm: {}".format(compute_surface_overlap_at_tolerance(surface_distances, 1)))
print("surface dice at 1mm: {}".format(compute_surface_dice_at_tolerance(surface_distances, 1)))
print("volumetric dice: {}".format(compute_dice_coefficient(mask_gt, mask_pred)))
# %% test empty mask in ground truth
mask_gt = np.zeros((128, 128, 128), np.uint8)
mask_pred = np.zeros((128, 128, 128), np.uint8)
# mask_gt[50,60,70] = 1
mask_pred[50, 60, 72] = 1
surface_distances = compute_surface_distances(mask_gt, mask_pred, spacing_mm=(3, 2, 1))
print("average surface distance: {} mm".format(compute_average_surface_distance(surface_distances)))
print("hausdorff (100%): {} mm".format(compute_robust_hausdorff(surface_distances, 100)))
print("hausdorff (95%): {} mm".format(compute_robust_hausdorff(surface_distances, 95)))
print("surface overlap at 1mm: {}".format(compute_surface_overlap_at_tolerance(surface_distances, 1)))
print("surface dice at 1mm: {}".format(compute_surface_dice_at_tolerance(surface_distances, 1)))
print("volumetric dice: {}".format(compute_dice_coefficient(mask_gt, mask_pred)))
# %% test both masks empty
mask_gt = np.zeros((128, 128, 128), np.uint8)
mask_pred = np.zeros((128, 128, 128), np.uint8)
# mask_gt[50,60,70] = 1
# mask_pred[50,60,72] = 1
surface_distances = compute_surface_distances(mask_gt, mask_pred, spacing_mm=(3, 2, 1))
print("average surface distance: {} mm".format(compute_average_surface_distance(surface_distances)))
print("hausdorff (100%): {} mm".format(compute_robust_hausdorff(surface_distances, 100)))
print("hausdorff (95%): {} mm".format(compute_robust_hausdorff(surface_distances, 95)))
print("surface overlap at 1mm: {}".format(compute_surface_overlap_at_tolerance(surface_distances, 1)))
print("surface dice at 1mm: {}".format(compute_surface_dice_at_tolerance(surface_distances, 1)))
print("volumetric dice: {}".format(compute_dice_coefficient(mask_gt, mask_pred)))
| 57.638743 | 118 | 0.54083 | 6,671 | 33,027 | 2.572478 | 0.041973 | 0.223763 | 0.279704 | 0.372939 | 0.775945 | 0.726065 | 0.705728 | 0.676184 | 0.652759 | 0.646349 | 0 | 0.285939 | 0.192691 | 33,027 | 572 | 119 | 57.73951 | 0.357687 | 0.117752 | 0 | 0.718894 | 0 | 0 | 0.040888 | 0.000933 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013825 | false | 0 | 0.004608 | 0 | 0.036866 | 0.057604 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
# File: metaopt/optimizer.py (repo: vuiseng9/OHO, license: MIT)
import numpy as np
from torch.optim.optimizer import Optimizer
from itertools import tee


class SGD_Multi_LR(Optimizer):
    def __init__(self, params, lr=0.005, weight_decay=0.00001):
        params, params_copy = tee(params)
        LR, WD = [], []
        for p in params:
            LR.append(lr * np.ones(p.shape))
            WD.append(weight_decay * np.ones(p.shape))
        defaults = dict(lr=LR, weight_decay=WD)
        super(SGD_Multi_LR, self).__init__(params_copy, defaults)

    def __setstate__(self, state):
        super(SGD_Multi_LR, self).__setstate__(state)

    def step(self):
        """Performs a single optimization step."""
        for group in self.param_groups:
            for param, lr, wd in zip(group['params'], group['lr'], group['weight_decay']):
                if param.grad is None:
                    continue
                d_p = param.grad.data
                lr = torch.from_numpy(np.asarray([lr]))
                wd = torch.from_numpy(np.asarray([wd]))
                if d_p.is_cuda:
                    lr = lr.cuda()
                    wd = wd.cuda()
                # if len(param.shape) == 1:
                #     p_change = -lr[0] * d_p
                # else:
                p_change = -lr[0] * (d_p + wd[0] * param.data)
                param.data.add_(p_change)
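`SGD_Multi_LR.__init__` consumes an exhaustible parameter generator twice, which is why it duplicates it with `itertools.tee` before building the per-element learning-rate and weight-decay arrays. Below is a minimal numpy-only sketch of that bookkeeping; the parameter shapes are hypothetical stand-ins for real torch tensors, so this is illustrative rather than the repo's actual usage:

```python
import numpy as np
from itertools import tee

# Hypothetical parameter shapes standing in for a model's torch tensors.
shapes = [(3, 4), (4,)]
params = (np.zeros(s) for s in shapes)

# Same bookkeeping as SGD_Multi_LR.__init__: tee() duplicates the
# exhaustible generator so one copy builds the LR/WD arrays while the
# other copy is handed on to the base Optimizer.
params, params_copy = tee(params)
LR, WD = [], []
for p in params:
    LR.append(0.005 * np.ones(p.shape))
    WD.append(0.00001 * np.ones(p.shape))

print(len(LR), LR[0].shape, WD[1].shape)  # 2 (3, 4) (4,)
```

Each entry of `LR`/`WD` has the same shape as its parameter, so every scalar weight effectively carries its own hyperparameter.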


class SGD_Quotient_LR(Optimizer):
    def __init__(self, params, lr=0.005, weight_decay=0.00001, quotient=2):
        params, params_copy = tee(params)
        LR, WD = [], []
        for p in params:
            LR.append(lr * np.ones(p.shape))
            WD.append(weight_decay * np.ones(p.shape))
        self.quotient = quotient
        defaults = dict(lr=LR, weight_decay=WD)
        super(SGD_Quotient_LR, self).__init__(params_copy, defaults)

    def __setstate__(self, state):
        super(SGD_Quotient_LR, self).__setstate__(state)

    def mlp_step(self):
        """Performs a single optimization step."""
        N = len(self.param_groups[0]['params'])
        M = 0
        for param in self.param_groups[0]['params']:
            if len(param.shape) > 1:
                M += 1
        freq = M // self.quotient
        lr_list, l2_list = [], []
        quot_i = 0
        for k in range(self.quotient):
            count = 0
            while (count < freq or k == self.quotient - 1) and quot_i < N:
                param = self.param_groups[0]['params'][quot_i]
                lr_list.append(self.param_groups[0]['lr'][2*k])
                l2_list.append(self.param_groups[0]['weight_decay'][2*k])
                lr_list.append(self.param_groups[0]['lr'][2*k+1])
                l2_list.append(self.param_groups[0]['weight_decay'][2*k+1])
                count += 1
                quot_i += 2
        assert len(lr_list) == N, 'lr length does not match'
        assert len(l2_list) == N, 'l2 length does not match'
        for group in self.param_groups:
            for param, lr, wd in zip(group['params'], lr_list, l2_list):
                if param.grad is None:
                    continue
                d_p = param.grad.data
                lr = torch.from_numpy(np.asarray([lr]))
                wd = torch.from_numpy(np.asarray([wd]))
                if d_p.is_cuda:
                    lr = lr.cuda()
                    wd = wd.cuda()
                if len(param.shape) == 1:
                    p_change = -lr[0] * d_p
                else:
                    p_change = -lr[0] * (d_p + wd[0] * param.data)
                param.data.add_(p_change)

    def rez_step(self):
        """Performs a single optimization step."""
        N = len(self.param_groups[0]['params'])
        M = 0
        for param in self.param_groups[0]['params']:
            if len(param.shape) > 1:
                M += 1
        freq = M // self.quotient
        lr_list, l2_list = [], []
        quot_i = 0
        for k in range(self.quotient):
            count = 0
            while (count <= freq or k == self.quotient - 1) and quot_i < N:
                param = self.param_groups[0]['params'][quot_i]
                lr_list.append(self.param_groups[0]['lr'][2*k])
                l2_list.append(self.param_groups[0]['weight_decay'][2*k])
                lr_list.append(self.param_groups[0]['lr'][2*k+1])
                l2_list.append(self.param_groups[0]['weight_decay'][2*k+1])
                if len(param.shape) > 2:
                    lr_list.append(self.param_groups[0]['lr'][2*k+1])
                    l2_list.append(self.param_groups[0]['weight_decay'][2*k+1])
                    quot_i += 3
                else:
                    quot_i += 2
                count += 1
        assert len(lr_list) == N, 'lr length does not match'
        assert len(l2_list) == N, 'l2 length does not match'
        for group in self.param_groups:
            for param, lr, wd in zip(group['params'], lr_list, l2_list):
                if param.grad is None:
                    continue
                d_p = param.grad.data
                lr = torch.from_numpy(np.asarray([lr]))
                wd = torch.from_numpy(np.asarray([wd]))
                if d_p.is_cuda:
                    lr = lr.cuda()
                    wd = wd.cuda()
                if len(param.shape) == 1:
                    p_change = -lr[0] * d_p
                else:
                    p_change = -lr[0] * (d_p + wd[0] * param.data)
                param.data.add_(p_change)
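The index arithmetic in `mlp_step` assigns consecutive (weight, bias) parameter pairs to `quotient` hyperparameter groups, with the last group absorbing any remainder. A standalone sketch of that assignment logic; the helper `assign_groups` is hypothetical and not part of the original file:

```python
# Pure-Python sketch of mlp_step's grouping: parameters come in
# (weight, bias) pairs; the first quotient-1 groups each take
# freq pairs, and the last group absorbs the remainder.
def assign_groups(shapes, quotient):
    N = len(shapes)
    M = sum(1 for s in shapes if len(s) > 1)   # count weight matrices
    freq = M // quotient
    groups, quot_i = [], 0
    for k in range(quotient):
        count = 0
        while (count < freq or k == quotient - 1) and quot_i < N:
            groups.append(k)   # weight at quot_i uses hyper-group k
            groups.append(k)   # its bias at quot_i + 1 uses the same group
            count += 1
            quot_i += 2
    return groups

shapes = [(10, 5), (5,), (5, 5), (5,), (5, 2), (2,)]  # 3 layers of an MLP
print(assign_groups(shapes, 2))  # [0, 0, 1, 1, 1, 1]
```

Here three (weight, bias) pairs split across two groups: the first pair uses group 0, and the remaining two pairs fall into the final group.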
# File: SMExp.py (repo: JunHong-1998/Tkinter-MCG-Calculator, license: CC0-1.0)
class SingleMatrix:
def __init__(self, Mat, Dim):
self.m = Mat
self.d = Dim
np.set_printoptions(precision=3)
def __del__(self):
class_name=self.__class__.__name__
def BrcktF(self, ar):
return str(ar).replace('[[', '⎾ ').replace(']]', '\n').replace('[', '⎸ ').replace(']', '')
def BrcktI(self, ar):
return str(ar).replace('[[', ' ⎸ ').replace(']]', ' ⏌\n').replace('[', '⎸ ').replace(']', ' ⎹')
def Mat2D(self, arr):
return str(arr).replace('[[', ' ⎾ ').replace(']]', '⏌').replace('[', ' ⎿ ').replace(']', '⏋')
def Mat3D(self, arr):
return str(arr).replace('[[', ' ⎾ ').replace(']]', ' ⏌').replace('[', '⎹ ').replace(']', ' ⎹')
def MatDtm(self, arr):
return str(arr).replace('[[', ' | ').replace(']]', ' |').replace('[', ' | ').replace(']', ' |')
def Determinant(self):
if self.d==2:
cal = round(self.m[0]*self.m[3]-self.m[2]*self.m[1],2)
exp2D = np.array(self.m).reshape(self.d, self.d)
exp = "D ="+self.MatDtm(exp2D)+"\n\n\n D = "+str(self.m[0])+"•"+str(np.round(self.m[3],3))+" - "+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[1],3))+"\n\n\n D = "+str(cal)
elif self.d==3:
cal = round(self.m[0]*(self.m[4]*self.m[8]-self.m[7]*self.m[5])-self.m[1]*(self.m[3]*self.m[8]-self.m[6]*self.m[5])+self.m[2]*(self.m[3]*self.m[7]-self.m[6]*self.m[4]),2)
exp3D = np.array(self.m).reshape(self.d, self.d)
exp = "D ="+self.MatDtm(exp3D)+\
"\n\n D = "+str(self.m[0])+"("+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[8],3))+"-"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[5],3))+") \n - "+str(np.round(self.m[1],3))+"("+str(np.round(self.m[3],3))+\
"•"+str(np.round(self.m[8],3))+"-"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[5],3))+") \n + "+str(np.round(self.m[2],3))+"("+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[7],3))+"-"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[4],3))+")\n\n D = "+str(cal)
elif self.d==4:
cal = round(self.m[0]*self.m[5]*self.m[10]*self.m[15]-self.m[0]*self.m[5]*self.m[11]*self.m[14]-self.m[0]*self.m[6]*self.m[9]*self.m[15]+self.m[0]*self.m[6]*self.m[11]*self.m[13]+self.m[0]*self.m[7]*self.m[9]*self.m[14]-self.m[0]*self.m[7]*self.m[10]*self.m[13]\
-self.m[1]*self.m[4]*self.m[10]*self.m[15]+self.m[1]*self.m[4]*self.m[11]*self.m[14]+self.m[1]*self.m[6]*self.m[8]*self.m[15]-self.m[1]*self.m[6]*self.m[11]*self.m[12]-self.m[1]*self.m[7]*self.m[8]*self.m[14]+self.m[1]*self.m[7]*self.m[10]*self.m[12]\
+self.m[2]*self.m[4]*self.m[9]*self.m[15]-self.m[2]*self.m[4]*self.m[11]*self.m[13]-self.m[2]*self.m[5]*self.m[8]*self.m[15]+self.m[2]*self.m[5]*self.m[11]*self.m[12]+self.m[2]*self.m[7]*self.m[8]*self.m[13]-self.m[2]*self.m[7]*self.m[9]*self.m[12]\
-self.m[3]*self.m[4]*self.m[9]*self.m[14]+self.m[3]*self.m[4]*self.m[10]*self.m[13]+self.m[3]*self.m[5]*self.m[8]*self.m[14]-self.m[3]*self.m[5]*self.m[10]*self.m[12]-self.m[3]*self.m[6]*self.m[8]*self.m[13]+self.m[3]*self.m[6]*self.m[9]*self.m[12],2)
exp4D = np.array(self.m).reshape(self.d, self.d)
exp = "D ="+self.MatDtm(exp4D)+\
"\n\n D = "+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[10],3))+"•"+str(np.round(self.m[15],3))+\
"-"+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[11],3))+"•"+str(np.round(self.m[14],3))+\
"\n -"+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[9],3))+"•"+str(np.round(self.m[15],3))+\
"+"+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[11],3))+"•"+str(np.round(self.m[13],3))+\
"\n +"+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[9],3))+"•"+str(np.round(self.m[14],3))+\
"-"+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[10],3))+"•"+str(np.round(self.m[13],3))+\
"\n\n -"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[10],3))+"•"+str(np.round(self.m[15],3))+\
"+"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[11],3))+"•"+str(np.round(self.m[14],3))+\
"\n +"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[8],3))+"•"+str(np.round(self.m[15],3))+\
"-"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[11],3))+"•"+str(np.round(self.m[12],3))+\
"\n -"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[8],3))+"•"+str(np.round(self.m[14],3))+\
"+"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[10],3))+"•"+str(np.round(self.m[12],3))+\
"\n\n +"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[9],3))+"•"+str(np.round(self.m[15],3))+\
"-"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[11],3))+"•"+str(np.round(self.m[13],3))+\
"\n -"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[8],3))+"•"+str(np.round(self.m[15],3))+\
"+"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[11],3))+"•"+str(np.round(self.m[12],3))+\
"\n +"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[8],3))+"•"+str(np.round(self.m[13],3))+\
"-"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[7],3))+"•"+str(np.round(self.m[9],3))+"•"+str(np.round(self.m[12],3))+\
"\n\n -"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[9],3))+"•"+str(np.round(self.m[14],3))+\
"+"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[10],3))+"•"+str(np.round(self.m[13],3))+\
"\n +"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[8],3))+"•"+str(np.round(self.m[14],3))+\
"-"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[10],3))+"•"+str(np.round(self.m[12],3))+\
"\n -"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[8],3))+"•"+str(np.round(self.m[13],3))+\
"+"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[6],3))+"•"+str(np.round(self.m[9],3))+"•"+str(np.round(self.m[12],3))+\
"\n\n D = "+str(cal)
else:
exp = "Please choose a \ndimension before compute"
return exp
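The hand-expanded cofactor formulas above can be sanity-checked against np.linalg.det; a small sketch using the same flat row-major storage as SingleMatrix:

```python
import numpy as np

m = [2, 1, 3, 0, 4, 1, 5, 2, 6]  # flat row-major, like self.m
# 3x3 cofactor expansion along the first row, as in Determinant():
cal = m[0]*(m[4]*m[8]-m[7]*m[5]) - m[1]*(m[3]*m[8]-m[6]*m[5]) + m[2]*(m[3]*m[7]-m[6]*m[4])
assert np.isclose(cal, np.linalg.det(np.array(m).reshape(3, 3)))
```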
def Inverse(self):
if self.d == 2:
dtm = round(self.m[0] * self.m[3] - self.m[2] * self.m[1], 2)
if dtm!=0:
exp2D = np.array(self.m).reshape(self.d, self.d)
cal = round(1/(self.m[0]*self.m[3]-self.m[1]*self.m[2]),3)
Eq0 = np.array([self.m[3], self.m[1]*-1, self.m[2]*-1, self.m[0]]).reshape(self.d, self.d)
Eq1 = Eq0 * cal
exp= "(A) = \n\n\nA⁻¹ =\n"+ "(1 / (" + str(np.round(self.m[0],3)) + "•" + str(np.round(self.m[3],3)) + " - " + str(np.round(self.m[1],3)) + "•" + str(np.round(self.m[2],3)) + ")) x\n\nA⁻¹ = \n" + str(cal)+ " x\n\n\nA⁻¹ = "
else:
exp="The determinant is 0,"
elif self.d == 3:
dtm = round(self.m[0]*(self.m[4]*self.m[8]-self.m[7]*self.m[5])-self.m[1]*(self.m[3]*self.m[8]-self.m[6]*self.m[5])+self.m[2]*(self.m[3]*self.m[7]-self.m[6]*self.m[4]),2)
if dtm!=0:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0,0]==0 and Eq0[1,0]!=0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2]])
elif Eq0[0,0]==0 and Eq0[2,0]!=0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0]])
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2]])
if Eq1[2,0]==0:
Eq2 = Eq1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0]])
if Eq2[1,1]==0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
elif Eq2[2,1]==0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2]-Eq2[1]*Eq2[2,1]/Eq2[1,1]])
if Eq3[0,2]==0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0] - Eq3[2] * Eq3[0, 2] / Eq3[2, 2], Eq3[1], Eq3[2]])
if Eq4[1,2]==0:
Eq5 = Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1]-Eq4[2]*Eq4[1,2]/Eq4[2,2], Eq4[2]])
if Eq5[0,1]==0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0] - Eq5[1] * Eq5[0, 1] / Eq5[1, 1], Eq5[1], Eq5[2]])
Eq7 = np.array([Eq6[0]/Eq6[0,0], Eq6[1]/Eq6[1,1], Eq6[2]/Eq6[2,2]])
exp = "(A|I) = "+self.BrcktF(Eq0)+"\nR₂ - R₁(R₂[2,1]/R₁[1,1])\nR₃ - R₁(R₃[3,1]/R₁[1,1])\n"+self.BrcktF(np.round(Eq2,3))+"\nR₃ - R₂(R₃[3,2]/R₂[2,2])\n"+self.BrcktF(np.round(Eq3,3))+\
"\nR₁ - R₃(R₁[1,3]/R₃[3,3])\nR₂ - R₃(R₂[2,3]/R₃[3,3])\n"+self.BrcktF(np.round(Eq5,3))+"\nR₁ - R₂(R₁[1,2]/R₂[2,2])\n"+self.BrcktF(np.round(Eq6,3))+"\nR₁/R₁[1,1];\nR₂/R₂[2,2];\nR₃/R₃[3,3]\n"+self.BrcktF(np.round(Eq7,3))
else:
exp="The determinant is 0,\nmatrix is not invertible !"
elif self.d == 4:
dtm = round(self.m[0]*self.m[5]*self.m[10]*self.m[15]-self.m[0]*self.m[5]*self.m[11]*self.m[14]-self.m[0]*self.m[6]*self.m[9]*self.m[15]+self.m[0]*self.m[6]*self.m[11]*self.m[13]+self.m[0]*self.m[7]*self.m[9]*self.m[14]-self.m[0]*self.m[7]*self.m[10]*self.m[13]\
-self.m[1]*self.m[4]*self.m[10]*self.m[15]+self.m[1]*self.m[4]*self.m[11]*self.m[14]+self.m[1]*self.m[6]*self.m[8]*self.m[15]-self.m[1]*self.m[6]*self.m[11]*self.m[12]-self.m[1]*self.m[7]*self.m[8]*self.m[14]+self.m[1]*self.m[7]*self.m[10]*self.m[12]\
+self.m[2]*self.m[4]*self.m[9]*self.m[15]-self.m[2]*self.m[4]*self.m[11]*self.m[13]-self.m[2]*self.m[5]*self.m[8]*self.m[15]+self.m[2]*self.m[5]*self.m[11]*self.m[12]+self.m[2]*self.m[7]*self.m[8]*self.m[13]-self.m[2]*self.m[7]*self.m[9]*self.m[12]\
-self.m[3]*self.m[4]*self.m[9]*self.m[14]+self.m[3]*self.m[4]*self.m[10]*self.m[13]+self.m[3]*self.m[5]*self.m[8]*self.m[14]-self.m[3]*self.m[5]*self.m[10]*self.m[12]-self.m[3]*self.m[6]*self.m[8]*self.m[13]+self.m[3]*self.m[6]*self.m[9]*self.m[12],2)
if dtm!=0:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0,0]==0 and Eq0[1,0]!=0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2], Eq0[3]])
elif Eq0[0,0]==0 and Eq0[2,0]!=0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0], Eq0[3]])
elif Eq0[0,0]==0 and Eq0[3,0]!=0:
Eq1 = np.array([Eq0[3], Eq0[1], Eq0[2], Eq0[0]])
elif Eq0[1,0]==0:
Eq1 = Eq0
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2], Eq0[3]])
if Eq1[2,0]==0:
Eq2 = Eq1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0], Eq1[3]])
if Eq2[3,0]==0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2], Eq2[3] - Eq2[0] * Eq2[3, 0] / Eq2[0, 0]])
if Eq3[1,1]==0 and Eq3[3,1]!=0:
Eq4 = np.array([Eq3[0], Eq3[3], Eq3[2], Eq3[1]])
elif Eq3[3, 1] == 0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2], Eq3[3] - Eq3[1] * Eq3[3, 1] / Eq3[1, 1]])
if Eq4[1,1]==0 and Eq4[2,1]!=0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1], Eq4[3]])
elif Eq4[2,1]==0:
Eq5 = Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2] - Eq4[1] * Eq4[2, 1] / Eq4[1, 1], Eq4[3]])
if Eq5[2,2]==0:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[3], Eq5[2]])
elif Eq5[3,2]==0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3] - Eq5[2] * Eq5[3, 2] / Eq5[2, 2]])
if Eq6[2,3]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2]-Eq6[3]*Eq6[2, 3]/Eq6[3, 3], Eq6[3]])
if Eq7[1,3]==0:
Eq8 = Eq7
else:
Eq8 = np.array([Eq7[0], Eq7[1] - Eq7[3] * Eq7[1, 3] / Eq7[3, 3], Eq7[2], Eq7[3]])
if Eq8[0,3]==0:
Eq9 = Eq8
else:
Eq9 = np.array([Eq8[0] - Eq8[3] * Eq8[0, 3] / Eq8[3, 3], Eq8[1], Eq8[2], Eq8[3]])
if Eq9[1,2]==0:
Eq10 = Eq9
else:
Eq10 = np.array([Eq9[0], Eq9[1] - Eq9[2] * Eq9[1, 2] / Eq9[2, 2], Eq9[2], Eq9[3]])
if Eq10[0,2]==0:
Eq11 = Eq10
else:
Eq11 = np.array([Eq10[0] - Eq10[2] * Eq10[0, 2] / Eq10[2, 2], Eq10[1], Eq10[2], Eq10[3]])
if Eq11[0,1]==0:
Eq12 = Eq11
else:
Eq12 = np.array([Eq11[0] - Eq11[1] * Eq11[0, 1] / Eq11[1, 1], Eq11[1], Eq11[2], Eq11[3]])
Eq13 = np.array([Eq12[0]/Eq12[0,0], Eq12[1]/Eq12[1,1], Eq12[2]/Eq12[2,2], Eq12[3]/Eq12[3,3]])
exp = "(A|I) = "+self.BrcktF(np.round(Eq0,3))+"\nR₂ - R₁(R₂[2,1]/R₁[1,1])\nR₃ - R₁(R₃[3,1]/R₁[1,1])\n"+self.BrcktF(np.round(Eq3,3))+"\n\nR₃ - R₂(R₃[3,2]/R₂[2,2])" \
"\n"+self.BrcktF(np.round(Eq5,3))+"\n\nR₄ - R₃(R₄[4,3]/R₃[3,3])\n"+self.BrcktF(np.round(Eq6,3))+"\n\nR₁ - R₄(R₁[1,4]/R₄[4,4])\nR₂ - R₄(R₂[2,4]/R₄[4,4])\n"+\
self.BrcktF(np.round(Eq9,3))+"\n\nR₁ - R₃(R₁[1,3]/R₃[3,3])\n"+self.BrcktF(np.round(Eq11,3))+"\n\nR₁ - R₂(R₁[1,2]/R₂[2,2])\n"+self.BrcktF(np.round(Eq12,3))+\
"\n\nR₁/R₁[1,1];R₂/R₂[2,2]\n"+ self.BrcktF(np.round(Eq13,3))
else:
exp="The determinant is 0,\nmatrix is not invertible !"
else:
exp = "Please choose\nbefore"
return exp
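The branches above perform Gauss-Jordan elimination on (A|I) purely to build the display string; the numeric inverse they describe can be cross-checked with np.linalg.inv (a hypothetical 2x2 with nonzero determinant):

```python
import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])
A_inv = np.linalg.inv(A)          # valid because det(A) = 10 != 0
assert np.allclose(A @ A_inv, np.eye(2))
```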
def InverseRight(self):
if self.d == 2:
dtm = round(self.m[0] * self.m[3] - self.m[2] * self.m[1], 2)
if dtm != 0:
exp2D = np.array(self.m).reshape(self.d, self.d)
cal = 1 / (self.m[0] * self.m[3] - self.m[1] * self.m[2])
Eq0 = np.array([self.m[3], self.m[1] * -1, self.m[2] * -1, self.m[0]]).reshape(self.d, self.d)
Eq1 = Eq0 * cal
exp = "" + self.Mat2D(exp2D) + "\n\n" + self.Mat2D(Eq0) + "\n\n\n"+ self.Mat2D(Eq0) +"\n\n\n" + self.Mat2D(np.round(Eq1,3))
else:
exp = "Matrix is not invertible ! "
elif self.d == 3:
dtm = round(self.m[0] * (self.m[4] * self.m[8] - self.m[7] * self.m[5]) - self.m[1] * (self.m[3] * self.m[8] - self.m[6] * self.m[5]) + self.m[2] * (self.m[3] * self.m[7] - self.m[6] * self.m[4]), 2)
if dtm != 0:
Eq0 = np.array(self.m).reshape(self.d, self.d)
EqI0 = np.array([1, 0, 0, 0, 1, 0, 0, 0, 1]).reshape(self.d, self.d)
if Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2]])
EqI1 = np.array([EqI0[1], EqI0[0], EqI0[2]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0]])
EqI1 = np.array([EqI0[2], EqI0[1], EqI0[0]])
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2]])
EqI1 = np.array([EqI0[0], EqI0[1] - EqI0[0] * Eq0[1, 0] / Eq0[0, 0], EqI0[2]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
EqI2 = EqI1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0]])
EqI2 = np.array([EqI1[0], EqI1[1], EqI1[2] - EqI1[0] * Eq1[2, 0] / Eq1[0, 0]])
if Eq2[1, 1] == 0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
EqI3 = np.array([EqI2[0], EqI2[2], EqI2[1]])
elif Eq2[2, 1] == 0:
Eq3 = Eq2
EqI3 = EqI2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2] - Eq2[1] * Eq2[2, 1] / Eq2[1, 1]])
EqI3 = np.array([EqI2[0], EqI2[1], EqI2[2] - EqI2[1] * Eq2[2, 1] / Eq2[1, 1]])
if Eq3[0, 2] == 0:
Eq4 = Eq3
EqI4 = EqI3
else:
Eq4 = np.array([Eq3[0] - Eq3[2] * Eq3[0, 2] / Eq3[2, 2], Eq3[1], Eq3[2]])
EqI4 = np.array([EqI3[0] - EqI3[2] * Eq3[0, 2] / Eq3[2, 2], EqI3[1], EqI3[2]])
if Eq4[1, 2] == 0:
Eq5 = Eq4
EqI5 = EqI4
else:
Eq5 = np.array([Eq4[0], Eq4[1] - Eq4[2] * Eq4[1, 2] / Eq4[2, 2], Eq4[2]])
EqI5 = np.array([EqI4[0], EqI4[1] - EqI4[2] * Eq4[1, 2] / Eq4[2, 2], EqI4[2]])
if Eq5[0, 1] == 0:
Eq6 = Eq5
EqI6 = EqI5
else:
Eq6 = np.array([Eq5[0] - Eq5[1] * Eq5[0, 1] / Eq5[1, 1], Eq5[1], Eq5[2]])
EqI6 = np.array([EqI5[0] - EqI5[1] * Eq5[0, 1] / Eq5[1, 1], EqI5[1], EqI5[2]])
EqI7 = np.array([EqI6[0] / Eq6[0, 0], EqI6[1] / Eq6[1, 1], EqI6[2] / Eq6[2, 2]])
exp = self.BrcktI(EqI0) +"\n\n\n"+ self.BrcktI(np.round(EqI2,3)) +"\n\n"+ self.BrcktI(np.round(EqI3,3)) +"\n\n\n"+ self.BrcktI(np.round(EqI5,3)) +"\n\n"+ self.BrcktI(np.round(EqI6,3)) +"\n\n\n\n"+self.BrcktI(np.round(EqI7,3))
else:
exp=""
elif self.d == 4:
dtm = round(self.m[0] * self.m[5] * self.m[10] * self.m[15] - self.m[0] * self.m[5] * self.m[11] * self.m[14] -self.m[0] * self.m[6] * self.m[9] * self.m[15] + self.m[0] * self.m[6] * self.m[11] * self.m[13] +
self.m[0] * self.m[7] * self.m[9] * self.m[14] - self.m[0] * self.m[7] * self.m[10] * self.m[13] - self.m[1] * self.m[4] * self.m[10] * self.m[15] + self.m[1] * self.m[4] * self.m[11] * self.m[14] +
self.m[1] * self.m[6] * self.m[8] * self.m[15] - self.m[1] * self.m[6] * self.m[11] * self.m[12] -self.m[1] * self.m[7] * self.m[8] * self.m[14] + self.m[1] * self.m[7] * self.m[10] * self.m[12]
+ self.m[2] * self.m[4] * self.m[9] * self.m[15] - self.m[2] * self.m[4] * self.m[11] * self.m[13] -self.m[2] * self.m[5] * self.m[8] * self.m[15] + self.m[2] * self.m[5] * self.m[11] * self.m[12] +
self.m[2] * self.m[7] * self.m[8] * self.m[13] - self.m[2] * self.m[7] * self.m[9] * self.m[12] - self.m[3] * self.m[4] * self.m[9] * self.m[14] + self.m[3] * self.m[4] * self.m[10] * self.m[13] +
self.m[3] * self.m[5] * self.m[8] * self.m[14] - self.m[3] * self.m[5] * self.m[10] * self.m[12] - self.m[3] * self.m[6] * self.m[8] * self.m[13] + self.m[3] * self.m[6] * self.m[9] * self.m[12], 2)
if dtm != 0:
Eq0 = np.array(self.m).reshape(self.d, self.d)
EqI0 = np.array([1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1]).reshape(self.d, self.d)
if Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2], Eq0[3]])
EqI1 = np.array([EqI0[1], EqI0[0], EqI0[2], EqI0[3]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0], Eq0[3]])
EqI1 = np.array([EqI0[2], EqI0[1], EqI0[0], EqI0[3]])
elif Eq0[0, 0] == 0 and Eq0[3, 0] != 0:
Eq1 = np.array([Eq0[3], Eq0[1], Eq0[2], Eq0[0]])
EqI1 = np.array([EqI0[3], EqI0[1], EqI0[2], EqI0[0]])
elif Eq0[1, 0] == 0:
Eq1 = Eq0
EqI1 = EqI0
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2], Eq0[3]])
EqI1 = np.array([EqI0[0], EqI0[1] - EqI0[0] * Eq0[1, 0] / Eq0[0, 0], EqI0[2], EqI0[3]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
EqI2 = EqI1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0], Eq1[3]])
EqI2 = np.array([EqI1[0], EqI1[1], EqI1[2] - EqI1[0] * Eq1[2, 0] / Eq1[0, 0], EqI1[3]])
if Eq2[3, 0] == 0:
Eq3 = Eq2
EqI3 = EqI2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2], Eq2[3] - Eq2[0] * Eq2[3, 0] / Eq2[0, 0]])
EqI3 = np.array([EqI2[0], EqI2[1], EqI2[2], EqI2[3] - EqI2[0] * Eq2[3, 0] / Eq2[0, 0]])
if Eq3[1, 1] == 0 and Eq3[3, 1] != 0:
Eq4 = np.array([Eq3[0], Eq3[3], Eq3[2], Eq3[1]])
EqI4 = np.array([EqI3[0], EqI3[3], EqI3[2], EqI3[1]])
elif Eq3[3, 1] == 0:
Eq4 = Eq3
EqI4 = EqI3
else:
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2], Eq3[3] - Eq3[1] * Eq3[3, 1] / Eq3[1, 1]])
EqI4 = np.array([EqI3[0], EqI3[1], EqI3[2], EqI3[3] - EqI3[1] * Eq3[3, 1] / Eq3[1, 1]])
if Eq4[1, 1] == 0 and Eq4[2, 1] != 0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1], Eq4[3]])
EqI5 = np.array([EqI4[0], EqI4[2], EqI4[1], EqI4[3]])
elif Eq4[2, 1] == 0:
Eq5 = Eq4
EqI5 = EqI4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2] - Eq4[1] * Eq4[2, 1] / Eq4[1, 1], Eq4[3]])
EqI5 = np.array([EqI4[0], EqI4[1], EqI4[2] - EqI4[1] * Eq4[2, 1] / Eq4[1, 1], EqI4[3]])
if Eq5[2, 2] == 0:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[3], Eq5[2]])
EqI6 = np.array([EqI5[0], EqI5[1], EqI5[3], EqI5[2]])
elif Eq5[3, 2] == 0:
Eq6 = Eq5
EqI6 = EqI5
else:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3] - Eq5[2] * Eq5[3, 2] / Eq5[2, 2]])
EqI6 = np.array([EqI5[0], EqI5[1], EqI5[2], EqI5[3] - EqI5[2] * Eq5[3, 2] / Eq5[2, 2]])
if Eq6[2, 3] == 0:
Eq7 = Eq6
EqI7 = EqI6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2] - Eq6[3] * Eq6[2, 3] / Eq6[3, 3], Eq6[3]])
EqI7 = np.array([EqI6[0], EqI6[1], EqI6[2] - EqI6[3] * Eq6[2, 3] / Eq6[3, 3], EqI6[3]])
if Eq7[1, 3] == 0:
Eq8 = Eq7
EqI8 = EqI7
else:
Eq8 = np.array([Eq7[0], Eq7[1] - Eq7[3] * Eq7[1, 3] / Eq7[3, 3], Eq7[2], Eq7[3]])
EqI8 = np.array([EqI7[0], EqI7[1] - EqI7[3] * Eq7[1, 3] / Eq7[3, 3], EqI7[2], EqI7[3]])
if Eq8[0, 3] == 0:
Eq9 = Eq8
EqI9 = EqI8
else:
Eq9 = np.array([Eq8[0] - Eq8[3] * Eq8[0, 3] / Eq8[3, 3], Eq8[1], Eq8[2], Eq8[3]])
EqI9 = np.array([EqI8[0] - EqI8[3] * Eq8[0, 3] / Eq8[3, 3], EqI8[1], EqI8[2], EqI8[3]])
if Eq9[1, 2] == 0:
Eq10 = Eq9
EqI10 = EqI9
else:
Eq10 = np.array([Eq9[0], Eq9[1] - Eq9[2] * Eq9[1, 2] / Eq9[2, 2], Eq9[2], Eq9[3]])
EqI10 = np.array([EqI9[0], EqI9[1] - EqI9[2] * Eq9[1, 2] / Eq9[2, 2], EqI9[2], EqI9[3]])
if Eq10[0, 2] == 0:
Eq11 = Eq10
EqI11 = EqI10
else:
Eq11 = np.array([Eq10[0] - Eq10[2] * Eq10[0, 2] / Eq10[2, 2], Eq10[1], Eq10[2], Eq10[3]])
EqI11 = np.array([EqI10[0] - EqI10[2] * Eq10[0, 2] / Eq10[2, 2], EqI10[1], EqI10[2], EqI10[3]])
if Eq11[0, 1] == 0:
Eq12 = Eq11
EqI12 = EqI11
else:
Eq12 = np.array([Eq11[0] - Eq11[1] * Eq11[0, 1] / Eq11[1, 1], Eq11[1], Eq11[2], Eq11[3]])
EqI12 = np.array([EqI11[0] - EqI11[1] * Eq11[0, 1] / Eq11[1, 1], EqI11[1], EqI11[2], EqI11[3]])
EqI13 = np.array([EqI12[0] / Eq12[0, 0], EqI12[1] / Eq12[1, 1], EqI12[2] / Eq12[2, 2], EqI12[3] / Eq12[3, 3]])
exp=self.BrcktI(np.round(EqI0,3))+"\n\nR₄ - R₁(R₄[4,1]/R₁[1,1])\n"+self.BrcktI(np.round(EqI3,3))+"\n\n\n"+self.BrcktI(np.round(EqI5,3))+"\n\nR₄ - R₂(R₄[4,2]/R₂[2,2])\n"+self.BrcktI(np.round(EqI6,3))+"\n\n\nR₃ - R₄(R₃[3,4]/R₄[4,4])\n"+\
self.BrcktI(np.round(EqI9,3))+"\n\nR₂ - R₃(R₂[2,3]/R₃[3,3])\n"+self.BrcktI(np.round(EqI11,3))+"\n\n\n"+self.BrcktI(np.round(EqI12,3))+" \n\nR₃/R₃[3,3];R₄/R₄[4,4]\n"+self.BrcktI(np.round(EqI13,3))
else:
exp=""
else:
exp="a dimension\ncompute"
return exp
def Transpose(self):
if self.d == 2:
exp2D = np.array(self.m).reshape(self.d, self.d)
cal = exp2D.transpose()
exp = "[A] =\n"+self.Mat2D(exp2D)+"\n\nTranspose of [A] =\n" +self.Mat2D(cal)
elif self.d == 3:
exp3D = np.array(self.m).reshape(self.d, self.d)
cal = exp3D.transpose()
exp = "[A] =\n"+self.Mat3D(exp3D) + "\n\nTranspose of [A] =\n" +self.Mat3D(cal)
elif self.d == 4:
exp4D = np.array(self.m).reshape(self.d, self.d)
cal = exp4D.transpose()
exp = "[A] =\n"+self.Mat3D(exp4D) + "\n\nTranspose of [A] =\n" +self.Mat3D(cal)
else:
exp = "Please choose \na dimension\nbefore compute"
return exp
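Transpose() delegates the computation to NumPy; a one-line check on a hypothetical 2x2 that transposing swaps rows and columns, and that doing it twice restores the matrix:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
At = A.transpose()
assert (At == np.array([[1, 3], [2, 4]])).all()
assert (At.transpose() == A).all()
```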
def Scalar(self):
if self.d == 3:
exp3D = np.array(self.m).reshape(self.d, self.d)
cal = self.m[0] * self.m[4] * self.m[8] + self.m[1] * self.m[5] * self.m[6] + self.m[2] * self.m[3] *self.m[7] - self.m[2] * self.m[4] * self.m[6] - self.m[1] * self.m[3] * self.m[8] \
- self.m[0] * self.m[5] * self.m[7]
exp = "[A] =\n"+str(exp3D).replace('[[', ' \t| ').replace(']]', '|').replace('[', ' \t| ').replace(']', '|')+"\n\na•(b x c) = \n " +str(np.round(self.m[0],3))+"•"+str(np.round(np.round(self.m[4],3),3)) \
+"•"+str(np.round(self.m[8],3))+"+"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[6],3))+"\n +"+str(np.round(self.m[2],3))+"•"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[7],3))+"-"+str(np.round(self.m[2],3))\
+"•"+str(np.round(self.m[4],3))+"•"+str(np.round(self.m[6],3))+"\n -"+str(np.round(self.m[1],3))+"•"+str(np.round(self.m[3],3))+"•"+str(np.round(self.m[8],3))+"-"+str(np.round(self.m[0],3))+"•"+str(np.round(self.m[5],3))+"•"+str(np.round(self.m[7],3))\
+"\n\n= ("+str(np.round(self.m[0],3) * np.round(self.m[4],3) * np.round(self.m[8],3))+")+("+str(np.round(self.m[1],3) * np.round(self.m[5],3) * np.round(self.m[6],3))+")+("+str(np.round(self.m[2],3) * np.round(self.m[3],3) * np.round(self.m[7],3))+")\n -("\
+str(np.round(self.m[2],3) * np.round(self.m[4],3) * np.round(self.m[6],3))+")-("+str(np.round(self.m[1],3) * np.round(self.m[3],3) * np.round(self.m[8],3))+")-("+str(np.round(self.m[0],3) * np.round(self.m[5],3) * np.round(self.m[7],3))+")"+"\n\n="+str(cal)
else:
exp = "Please choose \n3x3 dimension \nto compute"
return exp
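Scalar() evaluates the scalar triple product a·(b×c), which equals the determinant of the 3x3 matrix with rows a, b, c; a quick check with hypothetical values:

```python
import numpy as np

m = np.array([[1.0, 2.0, 3.0], [0.0, 1.0, 4.0], [5.0, 6.0, 0.0]])
a, b, c = m
triple = np.dot(a, np.cross(b, c))
assert np.isclose(triple, np.linalg.det(m))
```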
def Triangular(self):
if self.d == 2:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0, 0] == 0:
Eq1 = np.array([Eq0[1], Eq0[0]])
elif Eq0[1, 0] == 0:
Eq1 = Eq0
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0]])
exp = self.Mat2D(Eq0) + "\n\nR₂ - R₁(R₂[2,1]/R₁[1,1])\n" + self.Mat2D(np.round(Eq1,3))
elif self.d == 3:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0]])
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0]])
if Eq2[1, 1] == 0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
elif Eq2[2, 1] == 0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2] - Eq2[1] * Eq2[2, 1] / Eq2[1, 1]])
exp = self.Mat3D(Eq0) + "\n\nR₂ - R₁(R₂[2,1]/R₁[1,1])\nR₃ - R₁(R₃[3,1]/R₁[1,1])\n" + self.Mat3D(
np.round(Eq2,3)) + "\n\nR₃ - R₂(R₃[3,2]/R₂[2,2])\n" + self.Mat3D(np.round(Eq3,3))
elif self.d == 4:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2], Eq0[3]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0], Eq0[3]])
elif Eq0[0, 0] == 0 and Eq0[3, 0] != 0:
Eq1 = np.array([Eq0[3], Eq0[1], Eq0[2], Eq0[0]])
elif Eq0[1, 0] == 0:
Eq1 = Eq0
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2], Eq0[3]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0], Eq1[3]])
if Eq2[3, 0] == 0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2], Eq2[3] - Eq2[0] * Eq2[3, 0] / Eq2[0, 0]])
if Eq3[1, 1] == 0 and Eq3[3, 1] != 0:
Eq4 = np.array([Eq3[0], Eq3[3], Eq3[2], Eq3[1]])
elif Eq3[3, 1] == 0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2], Eq3[3] - Eq3[1] * Eq3[3, 1] / Eq3[1, 1]])
if Eq4[1, 1] == 0 and Eq4[2, 1] != 0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1], Eq4[3]])
elif Eq4[2, 1] == 0:
Eq5 = Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2] - Eq4[1] * Eq4[2, 1] / Eq4[1, 1], Eq4[3]])
if Eq5[2, 2] == 0:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[3], Eq5[2]])
elif Eq5[3, 2] == 0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3] - Eq5[2] * Eq5[3, 2] / Eq5[2, 2]])
exp = self.Mat3D(Eq0) + "\n\nR₂ - R₁(R₂[2,1]/R₁[1,1])\nR₃ - R₁(R₃[3,1]/R₁[1,1])\nR₄ - R₁(R₄[4,1]/R₁[1,1])\n" \
+ self.Mat3D(np.round(Eq3,3)) + "\n\nR₃ - R₂(R₃[3,2]/R₂[2,2])\nR₄ - R₂(R₄[4,2]/R₂[2,2])\n" \
+ self.Mat3D(np.round(Eq5,3)) + "\n\nR₄ - R₃(R₄[4,3]/R₃[3,3])\n" + self.Mat3D(np.round(Eq6,3))
else:
exp = "Please choose a \ndimension before compute"
return exp
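The row operations in Triangular() are forward elimination; a compact sketch of the same reduction for any size (assuming nonzero pivots, so no row swaps):

```python
import numpy as np

def upper_triangular(A):
    U = A.astype(float).copy()
    n = len(U)
    for j in range(n - 1):
        for i in range(j + 1, n):
            U[i] -= U[j] * (U[i, j] / U[j, j])   # R_i - R_j * (U[i,j] / U[j,j])
    return U

U = upper_triangular(np.array([[2.0, 1.0], [4.0, 5.0]]))
```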
def Trace(self):
if self.d == 2:
exp2D = np.array(self.m).reshape(self.d, self.d)
cal = round(self.m[0] + self.m[3], 3)
exp = "Trace =\n" + self.Mat2D(exp2D) + "\n\nTr = " + str(self.m[0]) + " + " + str(np.round(self.m[3],3)) + "\n\nTr = " + str(cal)
elif self.d == 3:
exp3D = np.array(self.m).reshape(self.d, self.d)
cal = round(self.m[0] + self.m[4] + self.m[8], 3)
exp = "Trace =\n" + self.Mat3D(exp3D) + "\n\nTr = " + str(self.m[0]) + " + " + str(np.round(self.m[4],3)) + " + " + str(
np.round(self.m[8],3)) + "\n\nTr = " + str(cal)
elif self.d == 4:
exp4D = np.array(self.m).reshape(self.d, self.d)
cal = round(self.m[0] + self.m[5] + self.m[10] + self.m[15], 3)
exp = "Trace =\n" + self.Mat3D(exp4D) + "\n\nTr = " + str(self.m[0]) + " + " + str(np.round(self.m[5],3)) + " + " + str(
np.round(self.m[10],3)) + " + " + str(np.round(self.m[15],3)) + "\n\nTr = " + str(cal)
else:
exp = "Please choose \na dimension before \ncompute"
return exp
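Trace() sums the diagonal entries of the flat list; np.trace on the reshaped array gives the same value:

```python
import numpy as np

m = [1.5, 2, 3, 4, 5.5, 6, 7, 8, 9.25]   # flat row-major, like self.m
cal = round(m[0] + m[4] + m[8], 3)
assert cal == np.trace(np.array(m).reshape(3, 3))
```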
def LUDec(self):
if self.d == 2:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0, 0] == 0:
Eq1 = np.array([Eq0[1], Eq0[0]])
L = 0
else:
L = Eq0[1, 0] / Eq0[0, 0]
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * L])
EL = np.array([1, 0, L, 1]).reshape(self.d, self.d)
exp = "[A] =\n"+self.Mat2D(Eq0) + "\n\n[A] = [L][U]\nR₂ - R₁(R₂[2,1]/R₁[1,1])\n\n[U] = \n" + self.Mat2D(
np.round(Eq1,3)) + "\n\n[L] = \n" + self.Mat2D(np.round(EL,3))
elif self.d == 3:
Eq0 = np.array(self.m).reshape(self.d, self.d)
L1 = 0
L2 = 0
L3 = 0
if Eq0[0, 0] == 0 and Eq0[1, 0] == 0 and Eq0[2, 0] == 0:
Eq1 = Eq0
elif Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0]])
else:
L1 = Eq0[1, 0] / Eq0[0, 0]
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * L1, Eq0[2]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
else:
L2 = Eq1[2, 0] / Eq1[0, 0]
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * L2])
if Eq2[0, 0] == 0 and Eq2[1, 0] == 0 and Eq2[2, 0] == 0:
if Eq2[0, 1] == 0 and Eq2[1, 1] != 0:
Eq3 = np.array([Eq2[1], Eq2[0], Eq2[2]])
elif Eq2[0, 1] == 0 and Eq2[2, 1] != 0:
Eq3 = np.array([Eq2[2], Eq2[1], Eq2[0]])
elif Eq2[1, 1] == 0:
Eq3 = Eq2
else:
L1 = Eq2[1, 1] / Eq2[0, 1]
Eq3 = np.array([Eq2[0], Eq2[1] - Eq2[0] * L1, Eq2[2]])
elif Eq2[0, 0] != 0 and Eq2[0, 1] == 0 and Eq2[1, 1] == 0 and Eq2[2, 1] == 0:
if Eq2[1, 2] == 0 and Eq2[2, 2] != 0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
elif Eq2[2, 2] == 0:
Eq3 = Eq2
else:
L3 = Eq2[2, 2] / Eq2[1, 2]
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2] - Eq2[1] * L3])
elif Eq2[1, 1] == 0 and Eq2[2, 1] != 0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
elif Eq2[2, 1] == 0:
Eq3 = Eq2
else:
L3 = Eq2[2, 1] / Eq2[1, 1]
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2] - Eq2[1] * L3])
if Eq3[0, 0] == 0 and Eq3[1, 0] == 0 and Eq3[2, 0] == 0 and Eq3[0, 1] == 0 and Eq3[1, 1] == 0 and Eq3[
2, 1] == 0:
if Eq3[0, 2] == 0 and Eq3[1, 2] != 0:
Eq4 = np.array([Eq3[1], Eq3[0], Eq3[2]])
elif Eq3[0, 2] == 0 and Eq3[2, 2] != 0:
Eq4 = np.array([Eq3[2], Eq3[1], Eq3[0]])
elif Eq3[0, 2] == 0 and Eq3[1, 2] == 0 and Eq3[2, 2] == 0:
Eq4 = Eq3
else:
L1 = Eq3[1, 2] / Eq3[0, 2]
Eq4 = np.array([Eq3[0], Eq3[1] - Eq3[0] * L1, Eq3[2]])
elif Eq3[0, 0] == 0 and Eq3[1, 0] == 0 and Eq3[2, 0] == 0:
if Eq3[2, 1] == 0:
Eq4 = Eq3
else:
L2 = Eq3[2, 1] / Eq3[0, 1]
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2] - Eq3[0] * L2])
else:
Eq4 = Eq3
if Eq4[0, 0] == 0 and Eq4[1, 0] == 0 and Eq4[2, 0] == 0 and Eq4[0, 1] == 0 and Eq4[1, 1] == 0 and Eq4[
2, 1] == 0:
if Eq4[2, 2] == 0:
Eq5 = Eq4
else:
L2 = Eq4[2, 2] / Eq4[0, 2]
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2] - Eq4[0] * L2])
elif Eq4[0, 0] != 0 and Eq4[1, 1] == 0 and Eq4[2, 1] == 0:
if Eq4[1, 2] == 0 and Eq4[2, 2] != 0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1]])
elif Eq4[2, 2] == 0:
Eq5 = Eq4
else:
L3 = Eq4[2, 2] / Eq4[1, 2]
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2] - Eq4[1] * L3])
elif Eq4[0, 0] != 0 and Eq4[1, 1] == 0 and Eq4[1, 2] == 0 and Eq4[2, 2] != 0 or Eq4[0, 0] == 0 and Eq4[
0, 1] != 0 and Eq4[1, 2] == 0 and Eq4[2, 2] != 0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1]])
else:
Eq5 = Eq4
EL = np.array([1, 0, 0, L1, 1, 0, L2, L3, 1]).reshape(self.d, self.d)
exp = "[A] =\n"+self.Mat3D(Eq0) + "\n\n[A] = [L][U]\n\n[U] = \n" + self.Mat3D(np.round(Eq5,3)) + "\n\n[L] = \n" + self.Mat3D(np.round(EL,3))
elif self.d == 4:
Eq0 = np.array(self.m).reshape(self.d, self.d)
L1 = 0
L2 = 0
L3 = 0
L4 = 0
L5 = 0
L6 = 0
if Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2], Eq0[3]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0], Eq0[3]])
elif Eq0[0, 0] == 0 and Eq0[3, 0] != 0:
Eq1 = np.array([Eq0[3], Eq0[1], Eq0[2], Eq0[0]])
elif Eq0[1, 0] == 0:
Eq1 = Eq0
else:
L1 = Eq0[1, 0] / Eq0[0, 0]
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * L1, Eq0[2], Eq0[3]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
else:
L2 = Eq1[2, 0] / Eq1[0, 0]
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * L2, Eq1[3]])
if Eq2[3, 0] == 0:
Eq3 = Eq2
else:
L3 = Eq2[3, 0] / Eq2[0, 0]
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2], Eq2[3] - Eq2[0] * L3])
if Eq3[0, 0] == 0 and Eq3[1, 0] == 0 and Eq3[2, 0] == 0 and Eq3[3, 0] == 0:
if Eq3[0, 1] == 0 and Eq3[1, 1] != 0:
Eq4 = np.array([Eq3[1], Eq3[0], Eq3[2], Eq3[3]])
elif Eq3[0, 1] == 0 and Eq3[2, 1] != 0:
Eq4 = np.array([Eq3[2], Eq3[1], Eq3[0], Eq3[3]])
elif Eq3[0, 1] == 0 and Eq3[3, 1] != 0:
Eq4 = np.array([Eq3[3], Eq3[1], Eq3[2], Eq3[0]])
elif Eq3[1, 1] == 0:
Eq4 = Eq3
else:
L1 = Eq3[1, 1] / Eq3[0, 1]
Eq4 = np.array([Eq3[0], Eq3[1] - Eq3[0] * L1, Eq3[2], Eq3[3]])
elif Eq3[1, 1] == 0 and Eq3[2, 1] != 0:
Eq4 = np.array([Eq3[0], Eq3[2], Eq3[1], Eq3[3]])
elif Eq3[1, 1] == 0 and Eq3[3, 1] != 0:
Eq4 = np.array([Eq3[0], Eq3[3], Eq3[2], Eq3[1]])
elif Eq3[2, 1] == 0:
Eq4 = Eq3
else:
L4 = Eq3[2, 1] / Eq3[1, 1]
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2] - Eq3[1] * L4, Eq3[3]])
if Eq4[0, 0] == 0 and Eq4[1, 0] == 0 and Eq4[2, 0] == 0 and Eq4[3, 0] == 0:
if Eq4[2, 1] == 0:
Eq5 = Eq4
else:
L2 = Eq4[2, 1] / Eq4[0, 1]
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2] - Eq4[0] * L2, Eq4[3]])
elif Eq4[3, 1] == 0:
Eq5 = Eq4
else:
L5 = Eq4[3, 1] / Eq4[1, 1]
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2], Eq4[3] - Eq4[1] * L5])
if Eq5[0, 0] == 0 and Eq5[1, 0] == 0 and Eq5[2, 0] == 0 and Eq5[3, 0] == 0 and Eq5[0, 1] == 0 and Eq5[1, 1] == 0 and Eq5[2, 1] == 0 and Eq5[3, 1] == 0:
if Eq5[0, 2] == 0 and Eq5[1, 2] != 0:
Eq6 = np.array([Eq5[1], Eq5[0], Eq5[2], Eq5[3]])
elif Eq5[0, 2] == 0 and Eq5[2, 2] != 0:
Eq6 = np.array([Eq5[2], Eq5[1], Eq5[0], Eq5[3]])
elif Eq5[0, 2] == 0 and Eq5[3, 2] != 0:
Eq6 = np.array([Eq5[3], Eq5[1], Eq5[2], Eq5[0]])
elif Eq5[1, 2] == 0:
Eq6 = Eq5
else:
L1 = Eq5[1, 2] / Eq5[0, 2]
Eq6 = np.array([Eq5[0], Eq5[1] - Eq5[0] * L1, Eq5[2], Eq5[3]])
elif Eq5[0, 1] == 0 and Eq5[1, 1] == 0 and Eq5[2, 1] == 0 and Eq5[3, 1] == 0 and Eq5[0, 2] == 0 and Eq5[1, 2] == 0 and Eq5[2, 2] == 0 and Eq5[3, 2] == 0:
if Eq5[1, 3] == 0 and Eq5[2, 3] != 0:
Eq6 = np.array([Eq5[0], Eq5[2], Eq5[1], Eq5[3]])
elif Eq5[1, 3] == 0 and Eq5[3, 3] != 0:
Eq6 = np.array([Eq5[0], Eq5[3], Eq5[2], Eq5[1]])
elif Eq5[2, 3] == 0:
Eq6 = Eq5
else:
L1 = Eq5[2, 3] / Eq5[1, 3]
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2] - Eq5[1] * L1, Eq5[3]])
elif Eq5[0, 0] == 0 and Eq5[1, 0] == 0 and Eq5[2, 0] == 0 and Eq5[3, 0] == 0:
if Eq5[3, 1] == 0:
Eq6 = Eq5
else:
L3 = Eq5[3, 1] / Eq5[0, 1]
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3] - Eq5[0] * L3])
elif Eq5[0, 1] == 0 and Eq5[1, 1] == 0 and Eq5[2, 1] == 0 and Eq5[3, 1] == 0:
if Eq5[1, 2] == 0 and Eq5[2, 2] != 0:
Eq6 = np.array([Eq5[0], Eq5[2], Eq5[1], Eq5[3]])
elif Eq5[1, 2] == 0 and Eq5[3, 2] != 0:
Eq6 = np.array([Eq5[0], Eq5[3], Eq5[2], Eq5[1]])
elif Eq5[2, 2] == 0:
Eq6 = Eq5
else:
L1 = Eq5[2, 2] / Eq5[1, 2]
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2] - Eq5[1] * L1, Eq5[3]])
elif Eq5[2, 2] == 0 and Eq5[3, 2] != 0:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[3], Eq5[2]])
elif Eq5[3, 2] == 0:
Eq6 = Eq5
else:
L6 = Eq5[3, 2] / Eq5[2, 2]
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3] - Eq5[2] * L6])
if Eq6[0, 0] == 0 and Eq6[1, 0] == 0 and Eq6[2, 0] == 0 and Eq6[3, 0] == 0 and Eq6[0, 1] == 0 and Eq6[1, 1] == 0 and Eq6[2, 1] == 0 and Eq6[3, 1] == 0 and Eq6[0, 2] == 0 and Eq6[1, 2] == 0 and Eq6[2, 2] == 0 and Eq6[3, 2] == 0:
if Eq6[0, 3] == 0 and Eq6[1, 3] != 0:
Eq7 = np.array([Eq6[1], Eq6[0], Eq6[2], Eq6[3]])
elif Eq6[0, 3] == 0 and Eq6[2, 3] != 0:
Eq7 = np.array([Eq6[2], Eq6[1], Eq6[0], Eq6[3]])
elif Eq6[0, 3] == 0 and Eq6[3, 3] != 0:
Eq7 = np.array([Eq6[3], Eq6[1], Eq6[2], Eq6[0]])
elif Eq6[1, 3] == 0:
Eq7 = Eq6
else:
L1 = Eq6[1, 3] / Eq6[0, 3]
Eq7 = np.array([Eq6[0], Eq6[1] - Eq6[0] * L1, Eq6[2], Eq6[3]])
elif Eq6[0, 0] == 0 and Eq6[1, 0] == 0 and Eq6[2, 0] == 0 and Eq6[3, 0] == 0 and Eq6[0, 1] == 0 and Eq6[1, 1] == 0 and Eq6[2, 1] == 0 and Eq6[3, 1] == 0:
if Eq6[2, 2] == 0:
Eq7 = Eq6
else:
L2 = Eq6[2, 2] / Eq6[0, 2]
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2] - Eq6[0] * L2, Eq6[3]])
elif Eq6[0, 1] == 0 and Eq6[1, 1] == 0 and Eq6[2, 1] == 0 and Eq6[3, 1] == 0 and Eq6[0, 2] == 0 and Eq6[1, 2] == 0 and Eq6[2, 2] == 0 and Eq6[3, 2] == 0:
if Eq6[3, 3] == 0:
Eq7 = Eq6
else:
L2 = Eq6[3, 3] / Eq6[1, 3]
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2], Eq6[3] - Eq6[1] * L2])
elif Eq6[0, 0] == 0 and Eq6[1, 0] == 0 and Eq6[2, 0] == 0 and Eq6[3, 0] == 0:
if Eq6[1, 2] == 0 and Eq6[2, 2] != 0:
Eq7 = np.array([Eq6[0], Eq6[2], Eq6[1], Eq6[3]])
elif Eq6[1, 2] == 0 and Eq6[3, 2] != 0:
Eq7 = np.array([Eq6[0], Eq6[3], Eq6[2], Eq6[1]])
elif Eq6[2, 2] == 0:
Eq7 = Eq6
else:
L4 = Eq6[2, 2] / Eq6[1, 2]
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2] - Eq6[1] * L4, Eq6[3]])
elif Eq6[0, 1] == 0 and Eq6[1, 1] == 0 and Eq6[2, 1] == 0 and Eq6[3, 1] == 0:
if Eq6[3, 2] == 0:
Eq7 = Eq6
else:
L2 = Eq6[3, 2] / Eq6[1, 2]
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2], Eq6[3] - Eq6[1] * L2])
elif Eq6[0, 2] == 0 and Eq6[1, 2] == 0 and Eq6[2, 2] == 0 and Eq6[3, 2] == 0:
if Eq6[2, 3] == 0 and Eq6[3, 3] != 0:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[3], Eq6[2]])
elif Eq6[3, 3] == 0:
Eq7 = Eq6
else:
L6 = Eq6[3, 3] / Eq6[2, 3]
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2], Eq6[3] - Eq6[2] * L6])
else:
Eq7 = Eq6
if Eq7[0, 0] == 0 and Eq7[1, 0] == 0 and Eq7[2, 0] == 0 and Eq7[3, 0] == 0 and Eq7[0, 1] == 0 and Eq7[1, 1] == 0 and Eq7[2, 1] == 0 and Eq7[3, 1] == 0 and Eq7[0, 2] == 0 and Eq7[1, 2] == 0 and Eq7[2, 2] == 0 and Eq7[3, 2] == 0:
if Eq7[2, 3] == 0:
Eq8 = Eq7
else:
L2 = Eq7[2, 3] / Eq7[0, 3]
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2] - Eq7[0] * L2, Eq7[3]])
elif Eq7[0, 0] == 0 and Eq7[1, 0] == 0 and Eq7[2, 0] == 0 and Eq7[3, 0] == 0 and Eq7[0, 1] == 0 and Eq7[1, 1] == 0 and Eq7[2, 1] == 0 and Eq7[3, 1] == 0:
if Eq7[3, 2] == 0:
Eq8 = Eq7
else:
L3 = Eq7[3, 2] / Eq7[0, 2]
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2], Eq7[3] - Eq7[0] * L3])
elif Eq7[0, 0] == 0 and Eq7[1, 0] == 0 and Eq7[2, 0] == 0 and Eq7[3, 0] == 0:
if Eq7[3, 2] == 0:
Eq8 = Eq7
else:
L5 = Eq7[3, 2] / Eq7[1, 2]
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2], Eq7[3] - Eq7[1] * L5])
elif Eq7[0, 1] == 0 and Eq7[1, 1] == 0 and Eq7[2, 1] == 0 and Eq7[3, 1] == 0:
if Eq7[2, 3] == 0 and Eq7[3, 3] != 0:
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[3], Eq7[2]])
elif Eq7[3, 3] == 0:
Eq8 = Eq7
else:
L3 = Eq7[3, 3] / Eq7[2, 3]
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2], Eq7[3] - Eq7[2] * L3])
else:
Eq8 = Eq7
if Eq8[0, 0] == 0 and Eq8[1, 0] == 0 and Eq8[2, 0] == 0 and Eq8[3, 0] == 0 and Eq8[0, 1] == 0 and Eq8[1, 1] == 0 and Eq8[2, 1] == 0 and Eq8[3, 1] == 0 and Eq8[0, 2] == 0 and Eq8[1, 2] == 0 and Eq8[2, 2] == 0 and Eq8[3, 2] == 0:
if Eq8[3, 3] == 0:
Eq9 = Eq8
else:
L3 = Eq8[3, 3] / Eq8[0, 3]
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[2], Eq8[3] - Eq8[0] * L3])
elif Eq8[0, 0] == 0 and Eq8[1, 0] == 0 and Eq8[2, 0] == 0 and Eq8[3, 0] == 0 and Eq8[0, 1] == 0 and Eq8[1, 1] == 0 and Eq8[2, 1] == 0 and Eq8[3, 1] == 0:
if Eq8[1, 3] == 0 and Eq8[2, 3] != 0:
Eq9 = np.array([Eq8[0], Eq8[2], Eq8[1], Eq8[3]])
elif Eq8[1, 3] == 0 and Eq8[3, 3] != 0:
Eq9 = np.array([Eq8[0], Eq8[3], Eq8[2], Eq8[1]])
elif Eq8[2, 3] == 0:
Eq9 = Eq8
else:
L4 = Eq8[2, 3] / Eq8[1, 3]
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[2] - Eq8[1] * L4, Eq8[3]])
elif Eq8[0, 0] == 0 and Eq8[1, 0] == 0 and Eq8[2, 0] == 0 and Eq8[3, 0] == 0:
if Eq8[2, 3] == 0 and Eq8[3, 3] != 0:
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[3], Eq8[2]])
elif Eq8[3, 3] == 0:
Eq9 = Eq8
else:
L6 = Eq8[3, 3] / Eq8[2, 3]
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[2], Eq8[3] - Eq8[2] * L6])
else:
Eq9 = Eq8
if Eq9[0, 0] == 0 and Eq9[1, 0] == 0 and Eq9[2, 0] == 0 and Eq9[3, 0] == 0 and Eq9[0, 1] == 0 and Eq9[1, 1] == 0 and Eq9[2, 1] == 0 and Eq9[3, 1] == 0:
if Eq9[3, 3] == 0:
Eq10 = Eq9
else:
L5 = Eq9[3, 3] / Eq9[1, 3]
Eq10 = np.array([Eq9[0], Eq9[1], Eq9[2], Eq9[3] - Eq9[1] * L5])
else:
Eq10 = Eq9
EL = np.array([1, 0, 0, 0, L1, 1, 0, 0, L2, L3, 1, 0, L4, L5, L6, 1]).reshape(self.d, self.d)
exp = "[A] =\n" + self.Mat3D(np.round(Eq0, 3)) + "\n\n[A] = [L][U]\n\n[U] = \n" + self.Mat3D(np.round(Eq10, 3)) + "\n\n[L] = \n" + self.Mat3D(np.round(EL, 3))
else:
exp = "Please choose a dimension\nbefore computing"
return exp
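The case-by-case elimination above builds the unit lower-triangular factor from the recorded multipliers L1..L6. As a compact cross-check of the same idea, here is a hedged sketch of Doolittle LU factorization without pivoting, written with plain loops over NumPy arrays. It is not the class's own method, and it assumes the input needs no row swaps (every pivot it meets is nonzero):

```python
import numpy as np

def lu_doolittle(A):
    """Return L (unit lower triangular) and U such that A = L @ U."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]  # elimination multiplier, like L1..L6 above
            U[i] -= L[i, k] * U[k]       # zero out the entry below the pivot
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
L, U = lu_doolittle(A)
# L = [[1, 0], [1.5, 1]], U = [[4, 3], [0, -1.5]], and L @ U reproduces A
```

The hand-unrolled branches above exist precisely to handle the zero-pivot row swaps this sketch leaves out.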
def Rank(self):
if self.d == 2:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0,0]==0 and Eq0[1,0]!=0:
Eq1 = np.array([Eq0[1], Eq0[0]])
elif Eq0[1,0]==0:
Eq1=Eq0
else:
Eq1 = np.array([Eq0[0], Eq0[1]-Eq0[0]*Eq0[1,0]/Eq0[0,0]])
if Eq1[0,0] == 0 and Eq1[0,1]==0:
Eq2 = np.array([Eq1[1], Eq1[0]])
else:
Eq2 = Eq1
if Eq2[0,0]!=0:
if Eq2[1,1]!=0:
r = 2
elif Eq2[1,1]==0:
r = 1
else:
r = 0
else:
if Eq2[0,1]!=0:
r = 1
else:
r = 0
exp = "[A] =\n"+self.Mat2D(Eq0)+"\n\nRank =\n"+self.Mat2D(np.round(Eq2,3))+"\n\nRank : "+str(r)
elif self.d == 3:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0, 0] == 0 and Eq0[1, 0] == 0 and Eq0[2, 0] == 0:
Eq1 = Eq0
elif Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0]])
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0]])
if Eq2[0,0]==0 and Eq2[1,0]==0 and Eq2[2,0]==0:
if Eq2[0,1]==0 and Eq2[1,1]!=0:
Eq3 = np.array([Eq2[1], Eq2[0], Eq2[2]])
elif Eq2[0,1]==0 and Eq2[2,1]!=0:
Eq3 = np.array([Eq2[2], Eq2[1], Eq2[0]])
elif Eq2[1,1]==0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1]-Eq2[0]*Eq2[1,1]/Eq2[0,1], Eq2[2]])
elif Eq2[0,0]!=0 and Eq2[0,1]==0 and Eq2[1,1]==0 and Eq2[2,1]==0:
if Eq2[1,2]==0 and Eq2[2,2]!=0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
elif Eq2[2,2]==0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2]-Eq2[1]*Eq2[2,2]/Eq2[1,2]])
elif Eq2[1, 1] == 0 and Eq2[2,1]!=0:
Eq3 = np.array([Eq2[0], Eq2[2], Eq2[1]])
elif Eq2[2, 1] == 0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2] - Eq2[1] * Eq2[2, 1] / Eq2[1, 1]])
if Eq3[0,0]==0 and Eq3[1,0]==0 and Eq3[2,0]==0 and Eq3[0,1]==0 and Eq3[1,1]==0 and Eq3[2,1]==0:
if Eq3[0,2]==0 and Eq3[1,2]!=0:
Eq4 = np.array([Eq3[1], Eq3[0], Eq3[2]])
elif Eq3[0,2]==0 and Eq3[2,2]!=0:
Eq4 = np.array([Eq3[2], Eq3[1], Eq3[0]])
elif Eq3[0,2]==0 and Eq3[1,2]==0 and Eq3[2,2]==0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0], Eq3[1]-Eq3[0]*Eq3[1,2]/Eq3[0,2], Eq3[2]])
elif Eq3[0,0]==0 and Eq3[1,0]==0 and Eq3[2,0]==0:
if Eq3[2,1]==0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2]-Eq3[0]*Eq3[2,1]/Eq3[0,1]])
else:
Eq4 = Eq3
if Eq4[0, 0] == 0 and Eq4[1, 0] == 0 and Eq4[2, 0] == 0 and Eq4[0, 1] == 0 and Eq4[1, 1] == 0 and Eq4[2, 1] == 0:
if Eq4[2,2]==0:
Eq5 = Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2]-Eq4[0]*Eq4[2,2]/Eq4[0,2]])
elif Eq4[0,0]!=0 and Eq4[1,1]==0 and Eq4[2,1]==0:
if Eq4[1,2]==0 and Eq4[2,2]!=0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1]])
elif Eq4[2,2]==0:
Eq5=Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2]-Eq4[1]*Eq4[2,2]/Eq4[1,2]])
elif Eq4[0,0]!=0 and Eq4[1,1]==0 and Eq4[1,2]==0 and Eq4[2,2]!=0 or Eq4[0,0]==0 and Eq4[0,1]!=0 and Eq4[1,2]==0 and Eq4[2,2]!=0:
Eq5 = np.array([Eq4[0], Eq4[2], Eq4[1]])
else:
Eq5 = Eq4
if Eq5[0,0]!=0:
if Eq5[1,1]!=0 and Eq5[2,2]!=0:
r = 3
elif Eq5[1,1]!=0 and Eq5[2,2]==0 or Eq5[1,1]==0 and Eq5[1,2]!=0:
r = 2
else:
r = 1
elif Eq5[0,1]!=0:
if Eq5[1,2]!=0:
r = 2
else:
r = 1
elif Eq5[0,2]!=0:
r = 1
elif np.count_nonzero(Eq5==0) == 9:
r = 0
else:
r = 1
exp="[A] =\n"+self.Mat3D(Eq0)+"\n\nRank = \n"+self.Mat3D(np.round(Eq5,3))+"\n\nRank : "+str(r)
elif self.d == 4:
Eq0 = np.array(self.m).reshape(self.d, self.d)
if Eq0[0, 0] == 0 and Eq0[1, 0] != 0:
Eq1 = np.array([Eq0[1], Eq0[0], Eq0[2], Eq0[3]])
elif Eq0[0, 0] == 0 and Eq0[2, 0] != 0:
Eq1 = np.array([Eq0[2], Eq0[1], Eq0[0], Eq0[3]])
elif Eq0[0, 0] == 0 and Eq0[3, 0] != 0:
Eq1 = np.array([Eq0[3], Eq0[1], Eq0[2], Eq0[0]])
elif Eq0[1, 0] == 0:
Eq1 = Eq0
else:
Eq1 = np.array([Eq0[0], Eq0[1] - Eq0[0] * Eq0[1, 0] / Eq0[0, 0], Eq0[2], Eq0[3]])
if Eq1[2, 0] == 0:
Eq2 = Eq1
else:
Eq2 = np.array([Eq1[0], Eq1[1], Eq1[2] - Eq1[0] * Eq1[2, 0] / Eq1[0, 0], Eq1[3]])
if Eq2[3, 0] == 0:
Eq3 = Eq2
else:
Eq3 = np.array([Eq2[0], Eq2[1], Eq2[2], Eq2[3] - Eq2[0] * Eq2[3, 0] / Eq2[0, 0]])
if Eq3[0,0]==0 and Eq3[1,0]==0 and Eq3[2,0]==0 and Eq3[3,0]==0:
if Eq3[0,1]==0 and Eq3[1,1]!=0:
Eq4 = np.array([Eq3[1], Eq3[0], Eq3[2], Eq3[3]])
elif Eq3[0,1]==0 and Eq3[2,1]!=0:
Eq4 = np.array([Eq3[2], Eq3[1], Eq3[0], Eq3[3]])
elif Eq3[0,1]==0 and Eq3[3,1]!=0:
Eq4 = np.array([Eq3[3], Eq3[1], Eq3[2], Eq3[0]])
elif Eq3[1,1]==0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0], Eq3[1]-Eq3[0]*Eq3[1,1]/Eq3[0,1], Eq3[2], Eq3[3]])
elif Eq3[1,1]==0 and Eq3[2,1]!=0:
Eq4 = np.array([Eq3[0], Eq3[2], Eq3[1], Eq3[3]])
elif Eq3[1,1]==0 and Eq3[3,1]!=0:
Eq4 = np.array([Eq3[0], Eq3[3], Eq3[2], Eq3[1]])
elif Eq3[2,1]==0:
Eq4 = Eq3
else:
Eq4 = np.array([Eq3[0], Eq3[1], Eq3[2]-Eq3[1]*Eq3[2,1]/Eq3[1,1], Eq3[3]])
if Eq4[0,0]==0 and Eq4[1,0]==0 and Eq4[2,0]==0 and Eq4[3,0]==0:
if Eq4[2,1]==0:
Eq5 = Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2]-Eq4[0]*Eq4[2,1]/Eq4[0,1], Eq4[3]])
elif Eq4[3,1]==0:
Eq5 = Eq4
else:
Eq5 = np.array([Eq4[0], Eq4[1], Eq4[2], Eq4[3]-Eq4[1]*Eq4[3,1]/Eq4[1,1]])
if Eq5[0,0]==0 and Eq5[1,0]==0 and Eq5[2,0]==0 and Eq5[3,0]==0 and Eq5[0,1]==0 and Eq5[1,1]==0 and Eq5[2,1]==0 and Eq5[3,1]==0:
if Eq5[0,2]==0 and Eq5[1,2]!=0:
Eq6 = np.array([Eq5[1], Eq5[0], Eq5[2], Eq5[3]])
elif Eq5[0,2]==0 and Eq5[2,2]!=0:
Eq6 = np.array([Eq5[2], Eq5[1], Eq5[0], Eq5[3]])
elif Eq5[0,2]==0 and Eq5[3,2]!=0:
Eq6 = np.array([Eq5[3], Eq5[1], Eq5[2], Eq5[0]])
elif Eq5[1,2]==0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0], Eq5[1]-Eq5[0]*Eq5[1,2]/Eq5[0,2], Eq5[2], Eq5[3]])
elif Eq5[0,1]==0 and Eq5[1,1]==0 and Eq5[2,1]==0 and Eq5[3,1]==0 and Eq5[0,2]==0 and Eq5[1,2]==0 and Eq5[2,2]==0 and Eq5[3,2]==0:
if Eq5[1,3]==0 and Eq5[2,3]!=0:
Eq6 = np.array([Eq5[0], Eq5[2], Eq5[1], Eq5[3]])
elif Eq5[1,3]==0 and Eq5[3,3]!=0:
Eq6 = np.array([Eq5[0], Eq5[3], Eq5[2], Eq5[1]])
elif Eq5[2,3]==0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2]-Eq5[1]*Eq5[2,3]/Eq5[1,3], Eq5[3]])
elif Eq5[0,0]==0 and Eq5[1,0]==0 and Eq5[2,0]==0 and Eq5[3,0]==0:
if Eq5[3,1]==0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3]-Eq5[0]*Eq5[3,1]/Eq5[0,1]])
elif Eq5[0,1]==0 and Eq5[1,1]==0 and Eq5[2,1]==0 and Eq5[3,1]==0:
if Eq5[1,2]==0 and Eq5[2,2]!=0:
Eq6 = np.array([Eq5[0], Eq5[2], Eq5[1], Eq5[3]])
elif Eq5[1,2]==0 and Eq5[3,2]!=0:
Eq6 = np.array([Eq5[0], Eq5[3], Eq5[2], Eq5[1]])
elif Eq5[2,2]==0:
Eq6 = Eq5
else:
    Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2]-Eq5[1]*Eq5[2,2]/Eq5[1,2], Eq5[3]])
elif Eq5[2,2]==0 and Eq5[3,2]!=0:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[3], Eq5[2]])
elif Eq5[3,2]==0:
Eq6 = Eq5
else:
Eq6 = np.array([Eq5[0], Eq5[1], Eq5[2], Eq5[3]-Eq5[2]*Eq5[3,2]/Eq5[2,2]])
if Eq6[0,0]==0 and Eq6[1,0]==0 and Eq6[2,0]==0 and Eq6[3,0]==0 and Eq6[0,1]==0 and Eq6[1,1]==0 and Eq6[2,1]==0 and Eq6[3,1]==0 and Eq6[0,2]==0 and Eq6[1,2]==0 and Eq6[2,2]==0 and Eq6[3,2]==0:
if Eq6[0,3]==0 and Eq6[1,3]!=0:
Eq7 = np.array([Eq6[1], Eq6[0], Eq6[2], Eq6[3]])
elif Eq6[0,3]==0 and Eq6[2,3]!=0:
Eq7 = np.array([Eq6[2], Eq6[1], Eq6[0], Eq6[3]])
elif Eq6[0,3]==0 and Eq6[3,3]!=0:
Eq7 = np.array([Eq6[3], Eq6[1], Eq6[2], Eq6[0]])
elif Eq6[1,3]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1]-Eq6[0]*Eq6[1,3]/Eq6[0,3], Eq6[2], Eq6[3]])
elif Eq6[0,0]==0 and Eq6[1,0]==0 and Eq6[2,0]==0 and Eq6[3,0]==0 and Eq6[0,1]==0 and Eq6[1,1]==0 and Eq6[2,1]==0 and Eq6[3,1]==0:
if Eq6[2,2]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2]-Eq6[0]*Eq6[2,2]/Eq6[0,2], Eq6[3]])
elif Eq6[0, 1] == 0 and Eq6[1, 1] == 0 and Eq6[2, 1] == 0 and Eq6[3, 1] == 0 and Eq6[0, 2] == 0 and Eq6[1, 2] == 0 and Eq6[2, 2] == 0 and Eq6[3, 2] == 0:
if Eq6[3,3]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2], Eq6[3]-Eq6[1]*Eq6[3,3]/Eq6[1,3]])
elif Eq6[0,0]==0 and Eq6[1,0]==0 and Eq6[2,0]==0 and Eq6[3,0]==0:
if Eq6[1,2]==0 and Eq6[2,2]!=0:
Eq7 = np.array([Eq6[0], Eq6[2], Eq6[1], Eq6[3]])
elif Eq6[1,2]==0 and Eq6[3,2]!=0:
Eq7 = np.array([Eq6[0], Eq6[3], Eq6[2], Eq6[1]])
elif Eq6[2,2]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2]-Eq6[1]*Eq6[2,2]/Eq6[1,2], Eq6[3]])
elif Eq6[0,1]==0 and Eq6[1,1]==0 and Eq6[2,1]==0 and Eq6[3,1]==0:
if Eq6[3,2]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2], Eq6[3]-Eq6[1]*Eq6[3,2]/Eq6[1,2]])
elif Eq6[0,2]==0 and Eq6[1,2]==0 and Eq6[2,2]==0 and Eq6[3,2]==0:
if Eq6[2,3]==0 and Eq6[3,3]!=0:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[3], Eq6[2]])
elif Eq6[3,3]==0:
Eq7 = Eq6
else:
Eq7 = np.array([Eq6[0], Eq6[1], Eq6[2], Eq6[3]-Eq6[2]*Eq6[3,3]/Eq6[2,3]])
else:
Eq7 = Eq6
if Eq7[0,0]==0 and Eq7[1,0]==0 and Eq7[2,0]==0 and Eq7[3,0]==0 and Eq7[0,1]==0 and Eq7[1,1]==0 and Eq7[2,1]==0 and Eq7[3,1]==0 and Eq7[0,2]==0 and Eq7[1,2]==0 and Eq7[2,2]==0 and Eq7[3,2]==0:
if Eq7[2,3]==0:
Eq8 = Eq7
else:
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2]-Eq7[0]*Eq7[2,3]/Eq7[0,3], Eq7[3]])
elif Eq7[0,0]==0 and Eq7[1,0]==0 and Eq7[2,0]==0 and Eq7[3,0]==0 and Eq7[0,1]==0 and Eq7[1,1]==0 and Eq7[2,1]==0 and Eq7[3,1]==0:
if Eq7[3,2]==0:
Eq8 = Eq7
else:
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2], Eq7[3]-Eq7[0]*Eq7[3,2]/Eq7[0,2]])
elif Eq7[0,0]==0 and Eq7[1,0]==0 and Eq7[2,0]==0 and Eq7[3,0]==0:
if Eq7[3,2]==0:
Eq8 = Eq7
else:
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2], Eq7[3]-Eq7[1]*Eq7[3,2]/Eq7[1,2]])
elif Eq7[0,1]==0 and Eq7[1,1]==0 and Eq7[2,1]==0 and Eq7[3,1]==0:
if Eq7[2,3]==0 and Eq7[3,3]!=0:
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[3], Eq7[2]])
elif Eq7[3,3]==0:
Eq8 = Eq7
else:
Eq8 = np.array([Eq7[0], Eq7[1], Eq7[2], Eq7[3]-Eq7[2]*Eq7[3,3]/Eq7[2,3]])
else:
Eq8 = Eq7
if Eq8[0,0]==0 and Eq8[1,0]==0 and Eq8[2,0]==0 and Eq8[3,0]==0 and Eq8[0,1]==0 and Eq8[1,1]==0 and Eq8[2,1]==0 and Eq8[3,1]==0 and Eq8[0,2]==0 and Eq8[1,2]==0 and Eq8[2,2]==0 and Eq8[3,2]==0:
if Eq8[3,3]==0:
Eq9 = Eq8
else:
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[2], Eq8[3]-Eq8[0]*Eq8[3,3]/Eq8[0,3]])
elif Eq8[0,0]==0 and Eq8[1,0]==0 and Eq8[2,0]==0 and Eq8[3,0]==0 and Eq8[0,1]==0 and Eq8[1,1]==0 and Eq8[2,1]==0 and Eq8[3,1]==0:
if Eq8[1,3]==0 and Eq8[2,3]!=0:
Eq9 = np.array([Eq8[0], Eq8[2], Eq8[1], Eq8[3]])
elif Eq8[1,3]==0 and Eq8[3,3]!=0:
Eq9 = np.array([Eq8[0], Eq8[3], Eq8[2], Eq8[1]])
elif Eq8[2,3]==0:
Eq9 = Eq8
else:
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[2]-Eq8[1]*Eq8[2,3]/Eq8[1,3], Eq8[3]])
elif Eq8[0,0]==0 and Eq8[1,0]==0 and Eq8[2,0]==0 and Eq8[3,0]==0:
if Eq8[2,3]==0 and Eq8[3,3]!=0:
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[3], Eq8[2]])
elif Eq8[3,3]==0:
Eq9 = Eq8
else:
Eq9 = np.array([Eq8[0], Eq8[1], Eq8[2], Eq8[3]-Eq8[2]*Eq8[3,3]/Eq8[2,3]])
else:
Eq9 = Eq8
if Eq9[0,0]==0 and Eq9[1,0]==0 and Eq9[2,0]==0 and Eq9[3,0]==0 and Eq9[0,1]==0 and Eq9[1,1]==0 and Eq9[2,1]==0 and Eq9[3,1]==0:
if Eq9[3,3]==0:
Eq10 = Eq9
else:
Eq10 = np.array([Eq9[0], Eq9[1], Eq9[2], Eq9[3]-Eq9[1]*Eq9[3,3]/Eq9[1,3]])
else:
Eq10 = Eq9
if Eq10[0,0]!=0:
if Eq10[1,1]!=0 and Eq10[2,2]!=0 and Eq10[3,3]!=0:
r = 4
elif Eq10[1,1]!=0 and Eq10[2,2]!=0 and Eq10[3,3]==0 or Eq10[1,1]!=0 and Eq10[2,2]==0 and Eq10[2,3]!=0 or Eq10[1,1]==0 and Eq10[1,2]!=0 and Eq10[2,3]!=0:
r = 3
elif Eq10[1,1]!=0 and Eq10[2,2]==0 and Eq10[2,3]==0 or Eq10[1,1]==0 and Eq10[1,2]!=0 and Eq10[2,3]==0 or Eq10[1,1]==0 and Eq10[1,2]==0 and Eq10[1,3]!=0:
r = 2
else:
r = 1
elif Eq10[0,1]!=0:
if Eq10[1,2]!=0 and Eq10[2,3]!=0:
r = 3
elif Eq10[1,2]!=0 and Eq10[2,3]==0 or Eq10[1,2]==0 and Eq10[1,3]!=0:
r = 2
else:
r = 1
elif Eq10[0,2]!=0:
if Eq10[1,3]!=0:
r = 2
else:
r = 1
elif Eq10[0,3]!=0:
r = 1
elif np.count_nonzero(Eq10==0) == 16:
r = 0
else:
r = 1
exp="[A] =\n"+self.Mat3D(np.round(Eq0,3))+"\n\nRank = \n"+self.Mat3D(np.round(Eq10,3))+"\n\n\tRank : "+str(r)
else:
exp = "Please choose\na dimension\nbefore computing"
return exp | 59.681223 | 297 | 0.387239 | 11,847 | 68,335 | 2.241411 | 0.012915 | 0.108835 | 0.064397 | 0.071854 | 0.94193 | 0.929126 | 0.895345 | 0.87049 | 0.851359 | 0.831664 | 0 | 0.187538 | 0.379732 | 68,335 | 1,145 | 298 | 59.681223 | 0.4363 | 0 | 0 | 0.744907 | 0 | 0.010629 | 0.036091 | 0.009525 | 0.003543 | 0 | 0 | 0 | 0 | 1 | 0.014172 | false | 0 | 0.000886 | 0.004429 | 0.028344 | 0.000886 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
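The branch-heavy eliminations in `Rank` implement row reduction separately for each dimension. The same computation can be sketched once for any matrix size; this is a hedged illustration (not part of the class), using partial pivoting so no explicit zero-pivot case analysis is needed, and it can be cross-checked against NumPy's SVD-based `np.linalg.matrix_rank`:

```python
import numpy as np

def rank_by_elimination(A, tol=1e-12):
    """Rank via row reduction with partial pivoting - the same idea as the
    d == 2/3/4 branches above, generalized to any matrix shape."""
    R = np.asarray(A, dtype=float).copy()
    rows, cols = R.shape
    r = 0                                        # number of pivots found so far
    for c in range(cols):
        if r == rows:
            break
        pivot = r + np.argmax(np.abs(R[r:, c]))  # best pivot in this column
        if abs(R[pivot, c]) <= tol:
            continue                             # column has no usable pivot
        R[[r, pivot]] = R[[pivot, r]]            # row swap (the Eq-permutation cases)
        R[r + 1:] -= np.outer(R[r + 1:, c] / R[r, c], R[r])  # eliminate below pivot
        r += 1
    return r

A = np.array([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [1.0, 0.0, 1.0]])
# row 1 is twice row 0, so only two rows are independent: rank 2
assert rank_by_elimination(A) == np.linalg.matrix_rank(A) == 2
```

The `tol` threshold stands in for the exact `== 0` tests above, which are fragile for floating-point input.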
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXX
XXX X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
import cStringIO
import pandas
import datetime
def zoom_level(start, end):
zoom = 0
if end:
d = end - start
else:
d = datetime.datetime.utcnow() - start
    if 1 < d.days <= 31:
        zoom = 1
    elif d.days > 31:
        zoom = 2
return zoom
def get_tablename(probename, start, end):
zoom = zoom_level(start, end)
if zoom == 1:
return 'metric_%s_30m_current' % (probename)
elif zoom == 2:
return 'metric_%s_6h_current' % (probename)
else:
return
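# Illustrative sketch (hypothetical helper, not used by the loaders below):
# how zoom_level() and get_tablename() cooperate. Up to one day of data is
# read raw (zoom 0), up to 31 days comes from the 30-minute aggregates
# (zoom 1), and anything longer from the 6-hour aggregates (zoom 2).
def _demo_table_choice(probename, days):
    if days <= 1:
        return None  # zoom 0: raw metrics via expand_data_by_*_id()
    elif days <= 31:
        return 'metric_%s_30m_current' % probename
    return 'metric_%s_6h_current' % probename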
def get_loadaverage(session, host_id, start, end, interval):
"""
Loadaverage data loader for chart rendering.
"""
    # Instantiate a new string buffer needed by copy_expert()
data_buffer = cStringIO.StringIO()
# Get a new psycopg2 cursor from the current sqlalchemy session
cur = session.connection().connection.cursor()
# Change working schema to 'monitoring'
cur.execute("SET search_path TO monitoring")
# Get the "zoom level", depending on the time interval
zl = zoom_level(start, end)
if interval == 'all':
interval = ['load1', 'load5', 'load15']
else:
interval = [interval]
    interval_sql = ','.join(['(record).%s' % i for i in interval])
# Usage of COPY .. TO STDOUT WITH CSV for data extraction
query = """
COPY (
SELECT
datetime AS date,
%s
FROM""" % (interval_sql)
if zl == 0:
# Look up in non-aggregated data
query += """
monitoring.expand_data_by_host_id(
'metric_loadavg',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
record metric_loadavg_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('loadavg', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY datetime ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
    # Retrieve data using copy_expert()
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
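# Illustrative sketch of the column list built in get_loadaverage() above:
# each requested load field is extracted from the composite 'record' column.
# (_loadavg_columns is a hypothetical helper mirroring the inline logic.)
def _loadavg_columns(interval):
    fields = ['load1', 'load5', 'load15'] if interval == 'all' else [interval]
    return ','.join(['(record).%s' % f for f in fields])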
def get_cpu(session, host_id, start, end):
    """
    CPU usage data loader for chart rendering.
    """
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
round((SUM((record).time_user)/(SUM((record).time_user)+SUM((record).time_system)+SUM((record).time_idle)+SUM((record).time_iowait)+SUM((record).time_steal))::float*100)::numeric, 1) AS user,
round((SUM((record).time_system)/(SUM((record).time_user)+SUM((record).time_system)+SUM((record).time_idle)+SUM((record).time_iowait)+SUM((record).time_steal))::float*100)::numeric, 1) AS system,
round((SUM((record).time_iowait)/(SUM((record).time_user)+SUM((record).time_system)+SUM((record).time_idle)+SUM((record).time_iowait)+SUM((record).time_steal))::float*100)::numeric, 1) AS iowait,
round((SUM((record).time_steal)/(SUM((record).time_user)+SUM((record).time_system)+SUM((record).time_idle)+SUM((record).time_iowait)+SUM((record).time_steal))::float*100)::numeric, 1) AS steal
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_host_id(
'metric_cpu',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
cpu text,
record metric_cpu_record)
GROUP BY datetime, host_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('cpu', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, host_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
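# Illustrative: the SQL in get_cpu() above reduces each CPU state to
# time_<state> / total_time * 100, rounded to one decimal place.
# A standalone sketch of that formula (hypothetical helper):
def _cpu_percent(state_time, *other_times):
    total = float(state_time + sum(other_times))
    return round(state_time / total * 100, 1)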
def get_tps(session, instance_id, start, end):
    """
    Transaction rate (commits/rollbacks per second) data loader for chart
    rendering.
    """
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
round(SUM((record).n_commit)/(extract('epoch' from MIN((record).measure_interval)))) AS commit,
round(SUM((record).n_rollback)/(extract('epoch' from MIN((record).measure_interval)))) AS rollback
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_xacts',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_xacts_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('xacts', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
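Each getter above branches on zoom_level(): at level 0 it expands raw records through a set-returning function, while at higher levels it queries a pre-aggregated monitoring table chosen by get_tablename(). Neither helper is shown in this excerpt; the sketch below is a hypothetical dispatcher of that kind (the thresholds and table suffixes are illustrative, not the project's actual values):

```python
from datetime import datetime, timedelta

def zoom_suffix(start, end):
    """Hypothetical dispatcher: choose a pre-aggregated table suffix from
    the width of the requested window (thresholds are illustrative)."""
    span = end - start
    if span <= timedelta(days=7):
        return None          # zoom level 0: expand raw records instead
    if span <= timedelta(days=31):
        return '_30m'        # half-hourly aggregate table
    return '_4h'             # four-hourly aggregate table

start = datetime(2021, 1, 1)
```

The point of the pattern is that wide time ranges never scan raw samples: they hit a coarser rollup table whose name encodes the aggregation interval.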
def get_db_size(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
data_pivot = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
dbname,
(record).size
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_db_size',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_db_size_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('db_size', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
df = pandas.read_csv(cStringIO.StringIO(data_buffer.getvalue()))
dfp = df.pivot(index='date', columns='dbname', values='size')
dfp.to_csv(data_pivot)
data = data_pivot.getvalue()
data_buffer.close()
data_pivot.close()
return data
def get_instance_size(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
SUM((record).size) AS size
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_db_size',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_db_size_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('db_size', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_memory(session, host_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
(record).mem_free AS free,
(record).mem_cached AS cached,
(record).mem_buffers AS buffers,
((record).mem_used - (record).mem_cached - (record).mem_buffers) AS other
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_host_id(
'metric_memory',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
record metric_memory_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('memory', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_swap(session, host_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
(record).swap_used AS used
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_host_id(
'metric_memory',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
record metric_memory_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('memory', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_sessions(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
SUM((record).active) AS active,
SUM((record).waiting) AS waiting,
SUM((record).idle) AS idle,
SUM((record).idle_in_xact) AS idle_in_xact,
SUM((record).idle_in_xact_aborted) AS idle_in_xact_aborted,
SUM((record).fastpath) AS fastpath,
SUM((record).disabled) AS disabled
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_sessions',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_sessions_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('sessions', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_blocks(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
ROUND(SUM((record).blks_read)/(extract('epoch' from MIN((record).measure_interval)))) AS blks_read_s,
ROUND(SUM((record).blks_hit)/(extract('epoch' from MIN((record).measure_interval)))) AS blks_hit_s
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_blocks',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_blocks_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('blocks', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_hitreadratio(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
CASE
WHEN (SUM((record).blks_hit) + SUM((record).blks_read)) > 0
THEN ROUND((SUM((record).blks_hit)::FLOAT/(SUM((record).blks_hit) + SUM((record).blks_read))::FLOAT * 100)::numeric, 2)
ELSE 100 END AS hit_read_ratio
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_blocks',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_blocks_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('blocks', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_checkpoints(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
(record).checkpoints_timed AS timed,
(record).checkpoints_req AS req,
ROUND(((record).checkpoint_write_time/1000)::numeric, 1) AS write_time,
ROUND(((record).checkpoint_sync_time/1000)::numeric,1) AS sync_time
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_bgwriter',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
record metric_bgwriter_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('bgwriter', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_written_buffers(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
(record).buffers_checkpoint AS checkpoint,
(record).buffers_clean AS clean,
(record).buffers_backend AS backend
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_bgwriter',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
record metric_bgwriter_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('bgwriter', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_locks(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
SUM((record).access_share) AS access_share,
SUM((record).row_share) AS row_share,
SUM((record).row_exclusive) AS row_exclusive,
SUM((record).share_update_exclusive) AS share_update_exclusive,
SUM((record).share) AS share,
SUM((record).share_row_exclusive) AS share_row_exclusive,
SUM((record).exclusive) AS exclusive,
SUM((record).access_exclusive) AS access_exclusive,
SUM((record).siread) AS siread
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_locks',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_locks_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('locks', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_waiting_locks(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
SUM((record).waiting_access_share) AS access_share,
SUM((record).waiting_row_share) AS row_share,
SUM((record).waiting_row_exclusive) AS row_exclusive,
SUM((record).waiting_share_update_exclusive) AS share_update_exclusive,
SUM((record).waiting_share) AS share,
SUM((record).waiting_share_row_exclusive) AS share_row_exclusive,
SUM((record).waiting_exclusive) AS exclusive,
SUM((record).waiting_access_exclusive) AS access_exclusive
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_locks',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
dbname text,
record metric_locks_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('locks', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_fs_size(session, host_id, start, end):
data_buffer = cStringIO.StringIO()
data_pivot = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
mount_point,
(record).used AS size
FROM
"""
if zl == 0:
query += """
monitoring.expand_data_by_host_id(
'metric_filesystems_size',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
mount_point text,
record metric_filesystems_size_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('filesystems_size', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
df = pandas.read_csv(cStringIO.StringIO(data_buffer.getvalue()))
dfp = df.pivot(index='date', columns='mount_point', values='size')
dfp.to_csv(data_pivot)
data = data_pivot.getvalue()
data_buffer.close()
data_pivot.close()
return data
def get_fs_usage(session, host_id, start, end):
data_buffer = cStringIO.StringIO()
data_pivot = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
mount_point,
round((((record).used::FLOAT/(record).total::FLOAT)*100)::numeric, 1) AS usage
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_host_id(
'metric_filesystems_size',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
mount_point text,
record metric_filesystems_size_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('filesystems_size', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
df = pandas.read_csv(cStringIO.StringIO(data_buffer.getvalue()))
dfp = df.pivot(index='date', columns='mount_point', values='usage')
dfp.to_csv(data_pivot)
data = data_pivot.getvalue()
data_buffer.close()
data_pivot.close()
return data
def get_ctxforks(session, host_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
round(SUM((record).context_switches)/(extract('epoch' from MIN((record).measure_interval)))) AS context_switches_s,
round(SUM((record).forks)/(extract('epoch' from MIN((record).measure_interval)))) AS forks_s
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_host_id(
'metric_process',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
host_id integer,
record metric_process_record)
GROUP BY datetime
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
host_id
)
else:
tablename = get_tablename('process', start, end)
query += """
monitoring.%s
WHERE
host_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
tablename,
host_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_tblspc_size(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
data_pivot = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
spcname,
(record).size
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_tblspc_size',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
spcname text,
record metric_tblspc_size_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('tblspc_size', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
df = pandas.read_csv(cStringIO.StringIO(data_buffer.getvalue()))
dfp = df.pivot(index='date', columns='spcname', values='size')
dfp.to_csv(data_pivot)
data = data_pivot.getvalue()
data_buffer.close()
data_pivot.close()
return data
def get_wal_files_size(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
(record).written_size,
(record).total_size
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_wal_files',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
record metric_wal_files_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('wal_files', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_wal_files_count(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
(record).archive_ready,
(record).total
FROM"""
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_wal_files',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
record metric_wal_files_record))
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('wal_files', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
def get_wal_files_rate(session, instance_id, start, end):
data_buffer = cStringIO.StringIO()
cur = session.connection().connection.cursor()
cur.execute("SET search_path TO monitoring")
zl = zoom_level(start, end)
query = """
COPY (
SELECT
datetime AS date,
round(SUM((record).written_size)/(extract('epoch' from MIN((record).measure_interval)))) AS written_size_s
FROM""" # noqa
if zl == 0:
query += """
monitoring.expand_data_by_instance_id(
'metric_wal_files',
tstzrange('%s', '%s'),
%s)
AS (datetime timestamp with time zone,
instance_id integer,
record metric_wal_files_record)
GROUP BY datetime, instance_id
ORDER BY datetime)
TO STDOUT WITH CSV HEADER""" % (
start,
end,
instance_id
)
else:
tablename = get_tablename('wal_files', start, end)
query += """
monitoring.%s
WHERE
instance_id = %s
AND datetime >= '%s'
AND datetime <= '%s'
GROUP BY datetime, instance_id
ORDER BY 1,2 ASC)
TO STDOUT WITH CSV HEADER""" % (
tablename,
instance_id,
start,
end
)
cur.copy_expert(query, data_buffer)
cur.close()
data = data_buffer.getvalue()
data_buffer.close()
return data
# model-code/refactored-version/mlp-tabular/helper_files/helper_data.py
# from Raschka-research-group/corn-ordinal-neuralnet (MIT license)
import random
import pandas as pd
import torch
from torch.utils.data import sampler
from torch.utils.data import Dataset
from torch.utils.data import DataLoader
from torch.utils.data import SubsetRandomSampler
from torchvision import transforms
from torchvision import datasets
def label_to_levels(label, num_classes, dtype=torch.float32):
"""Converts integer class label to extended binary label vector
Parameters
----------
label : int
Class label to be converted into an extended
binary vector. Should be smaller than or equal to num_classes-1.
num_classes : int
The number of class labels in the dataset. Assumes
class labels start at 0. Determines the size of the
output vector.
dtype : torch data type (default=torch.float32)
Data type of the torch output vector for the
extended binary labels.
Returns
----------
levels : torch.tensor, shape=(num_classes-1,)
Extended binary label vector. Type is determined
by the `dtype` parameter.
Examples
----------
>>> label_to_levels(0, num_classes=5)
tensor([0., 0., 0., 0.])
>>> label_to_levels(1, num_classes=5)
tensor([1., 0., 0., 0.])
>>> label_to_levels(3, num_classes=5)
tensor([1., 1., 1., 0.])
>>> label_to_levels(4, num_classes=5)
tensor([1., 1., 1., 1.])
"""
if not label <= num_classes-1:
raise ValueError('Class label must be smaller or '
'equal to %d (num_classes-1). Got %d.'
% (num_classes-1, label))
if isinstance(label, torch.Tensor):
int_label = label.item()
else:
int_label = label
levels = [1]*int_label + [0]*(num_classes - 1 - int_label)
levels = torch.tensor(levels, dtype=dtype)
return levels
def levels_from_labelbatch(labels, num_classes, dtype=torch.float32):
"""
Converts a list of integer class label to extended binary label vectors
Parameters
----------
labels : list or 1D torch.tensor, shape=(num_labels,)
A list or 1D torch.tensor with integer class labels
to be converted into extended binary label vectors.
num_classes : int
The number of class labels in the dataset. Assumes
class labels start at 0. Determines the size of the
output vector.
dtype : torch data type (default=torch.float32)
Data type of the torch output vector for the
extended binary labels.
Returns
----------
levels : torch.tensor, shape=(num_labels, num_classes-1)
Examples
----------
>>> levels_from_labelbatch(labels=[2, 1, 4], num_classes=5)
tensor([[1., 1., 0., 0.],
[1., 0., 0., 0.],
[1., 1., 1., 1.]])
"""
levels = []
for label in labels:
levels_from_label = label_to_levels(
label=label, num_classes=num_classes, dtype=dtype)
levels.append(levels_from_label)
levels = torch.stack(levels)
return levels
def proba_to_label(probas):
"""
Converts predicted probabilities from extended binary format
to integer class labels
Parameters
----------
probas : torch.tensor, shape(n_examples, n_labels)
Torch tensor consisting of probabilities returned by CORAL model.
Examples
----------
>>> # 3 training examples, 6 classes
>>> probas = torch.tensor([[0.934, 0.861, 0.323, 0.492, 0.295],
... [0.496, 0.485, 0.267, 0.124, 0.058],
... [0.985, 0.967, 0.920, 0.819, 0.506]])
>>> proba_to_label(probas)
tensor([2, 0, 5])
"""
predict_levels = probas > 0.5
predicted_labels = torch.sum(predict_levels, dim=1)
return predicted_labels
class BalancedBatchSampler(torch.utils.data.sampler.Sampler):
# adapted from https://github.com/galatolofederico/pytorch-balanced-batch/blob/master/sampler.py
def __init__(self, dataset, labels=None):
self.labels = labels
self.dataset = dict()
self.balanced_max = 0
# Save all the indices for all the classes
for idx in range(0, len(dataset)):
label = self._get_label(dataset, idx)
if label not in self.dataset:
self.dataset[label] = list()
self.dataset[label].append(idx)
self.balanced_max = len(self.dataset[label]) \
if len(self.dataset[label]) > self.balanced_max else self.balanced_max
# Oversample the classes with fewer elements than the max
for label in self.dataset:
while len(self.dataset[label]) < self.balanced_max:
self.dataset[label].append(random.choice(self.dataset[label]))
self.keys = list(self.dataset.keys())
self.currentkey = 0
self.indices = [-1]*len(self.keys)
def __iter__(self):
while self.indices[self.currentkey] < self.balanced_max - 1:
self.indices[self.currentkey] += 1
yield self.dataset[self.keys[self.currentkey]][self.indices[self.currentkey]]
self.currentkey = (self.currentkey + 1) % len(self.keys)
self.indices = [-1]*len(self.keys)
def _get_label(self, dataset, idx, labels=None):
if self.labels is not None:
return self.labels[idx].item()
else:
# Try guessing the label source from the dataset type;
# `datasets` is the torchvision module imported at the top of this file
dataset_type = type(dataset)
if dataset_type is datasets.MNIST:
return dataset.train_labels[idx].item()
elif dataset_type is datasets.ImageFolder:
return dataset.imgs[idx][1]
else:
raise Exception("You should pass the tensor of labels to the constructor as the second argument")
def __len__(self):
return self.balanced_max*len(self.keys)
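The core balancing step of this sampler, oversampling every minority class up to the size of the largest class, can be sketched in isolation (the toy index lists below are illustrative, not from the repository):

```python
import random

def oversample_to_balance(indices_by_class):
    # Pad every class's index list (sampling with replacement) up to the
    # size of the largest class, mirroring BalancedBatchSampler.__init__.
    balanced_max = max(len(v) for v in indices_by_class.values())
    balanced = {}
    for label, idxs in indices_by_class.items():
        padded = list(idxs)
        while len(padded) < balanced_max:
            padded.append(random.choice(idxs))
        balanced[label] = padded
    return balanced

# Class 0 has six examples, class 1 only two; class 1 gets padded to six.
balanced = oversample_to_balance({0: [0, 1, 2, 3, 4, 5], 1: [6, 7]})
```

An epoch then draws the same number of indices from each class, at the cost of repeating minority-class examples.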
class UnNormalize(object):
def __init__(self, mean, std):
self.mean = mean
self.std = std
def __call__(self, tensor):
"""
Parameters:
------------
tensor (Tensor): Tensor image of size (C, H, W) to be normalized.
Returns:
------------
Tensor: Normalized image.
"""
for t, m, s in zip(tensor, self.mean, self.std):
t.mul_(s).add_(m)
return tensor
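UnNormalize inverts transforms.Normalize channel by channel (t * std + mean). A self-contained round-trip sketch, using the common ImageNet statistics purely for illustration:

```python
import torch

mean = torch.tensor([0.485, 0.456, 0.406])
std = torch.tensor([0.229, 0.224, 0.225])

img = torch.rand(3, 4, 4)  # C x H x W tensor in [0, 1]

# Forward: the per-channel affine transform applied by transforms.Normalize.
normed = (img - mean[:, None, None]) / std[:, None, None]

# Inverse: what UnNormalize.__call__ does, written out-of-place here
# (the class mutates its input in place with mul_/add_).
restored = normed * std[:, None, None] + mean[:, None, None]
```

The round trip recovers the original tensor up to floating-point error.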
def get_dataloaders_mnist(batch_size, num_workers=0,
validation_fraction=None,
train_transforms=None,
test_transforms=None):
if train_transforms is None:
train_transforms = transforms.ToTensor()
if test_transforms is None:
test_transforms = transforms.ToTensor()
train_dataset = datasets.MNIST(root='data',
train=True,
transform=train_transforms,
download=True)
valid_dataset = datasets.MNIST(root='data',
train=True,
transform=test_transforms)
test_dataset = datasets.MNIST(root='data',
train=False,
transform=test_transforms)
if validation_fraction is not None:
num = int(validation_fraction * 60000)
train_indices = torch.arange(0, 60000 - num)
valid_indices = torch.arange(60000 - num, 60000)
train_sampler = SubsetRandomSampler(train_indices)
valid_sampler = SubsetRandomSampler(valid_indices)
valid_loader = DataLoader(dataset=valid_dataset,
batch_size=batch_size,
num_workers=num_workers,
sampler=valid_sampler)
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
num_workers=num_workers,
drop_last=True,
sampler=train_sampler)
else:
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
num_workers=num_workers,
drop_last=True,
shuffle=True)
test_loader = DataLoader(dataset=test_dataset,
batch_size=batch_size,
num_workers=num_workers,
shuffle=False)
if validation_fraction is None:
return train_loader, test_loader
else:
return train_loader, valid_loader, test_loader
def get_dataloaders_cifar10(batch_size, num_workers=0,
validation_fraction=None,
train_transforms=None,
test_transforms=None):
if train_transforms is None:
train_transforms = transforms.ToTensor()
if test_transforms is None:
test_transforms = transforms.ToTensor()
train_dataset = datasets.CIFAR10(root='data',
train=True,
transform=train_transforms,
download=True)
valid_dataset = datasets.CIFAR10(root='data',
train=True,
transform=test_transforms)
test_dataset = datasets.CIFAR10(root='data',
train=False,
transform=test_transforms)
if validation_fraction is not None:
num = int(validation_fraction * 50000)
train_indices = torch.arange(0, 50000 - num)
valid_indices = torch.arange(50000 - num, 50000)
train_sampler = SubsetRandomSampler(train_indices)
valid_sampler = SubsetRandomSampler(valid_indices)
valid_loader = DataLoader(dataset=valid_dataset,
batch_size=batch_size,
num_workers=num_workers,
sampler=valid_sampler)
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
num_workers=num_workers,
drop_last=True,
sampler=train_sampler)
else:
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
num_workers=num_workers,
drop_last=True,
shuffle=True)
test_loader = DataLoader(dataset=test_dataset,
batch_size=batch_size,
num_workers=num_workers,
shuffle=False)
if validation_fraction is None:
return train_loader, test_loader
else:
return train_loader, valid_loader, test_loader
######################################################################################
class HotspotDataset_v1(Dataset):
def __init__(self, csv_path):
df = pd.read_csv(csv_path)
self.y = torch.from_numpy(df['ddG'].values).to(torch.int64)
df = df.drop('ddG', axis=1)
self.features = torch.from_numpy(df.values).to(torch.float32)
def __getitem__(self, index):
features = self.features[index]
label = self.y[index]
return features, label
def __len__(self):
return self.y.shape[0]
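The CSV-to-tensor conversion performed in __init__ can be exercised on a tiny in-memory CSV; the feature column names f1/f2 are made up for this sketch, only the 'ddG' label column matches the class:

```python
import io
import pandas as pd
import torch

csv = io.StringIO("f1,f2,ddG\n0.1,0.2,0\n0.3,0.4,1\n")
df = pd.read_csv(csv)

# Mirror HotspotDataset_v1.__init__: int64 labels from 'ddG',
# float32 features from all remaining columns.
y = torch.from_numpy(df['ddG'].values).to(torch.int64)
features = torch.from_numpy(df.drop('ddG', axis=1).values).to(torch.float32)
```

Indexing then returns one (features, label) pair per CSV row, exactly as __getitem__ does.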
def get_dataloaders_hotspot_v1(batch_size, train_csv_path, test_csv_path, balanced=False, num_workers=0):
train_dataset = HotspotDataset_v1(csv_path=train_csv_path)
test_dataset = HotspotDataset_v1(csv_path=test_csv_path)
if balanced:
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
drop_last=True,
shuffle=False,
sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
num_workers=num_workers)
else:
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
drop_last=True,
shuffle=True,
num_workers=num_workers)
test_loader = DataLoader(dataset=test_dataset,
batch_size=batch_size,
drop_last=False,
shuffle=False,
num_workers=num_workers)
return train_loader, test_loader
class HotspotDataset_v2_2class(Dataset):
def __init__(self, csv_path):
feature_list = ['avg bond number', 'Hbond',
'Hphob', 'consurf', "B' side chain", "hotspot ratio"]
df = pd.read_csv(csv_path)
self.y = torch.from_numpy(df['2-class'].values).to(torch.int64)
self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)
def __getitem__(self, index):
features = self.features[index]
label = self.y[index]
return features, label
def __len__(self):
return self.y.shape[0]
def get_dataloaders_hotspot_v2(batch_size, train_csv_path, test_csv_path, balanced=False, num_workers=0, num_classes=2):
if num_classes == 2:
train_dataset = HotspotDataset_v2_2class(csv_path=train_csv_path)
test_dataset = HotspotDataset_v2_2class(csv_path=test_csv_path)
elif num_classes == 3:
raise NotImplementedError('Not implemented yet')
else:
raise ValueError('num_classes option invalid')
if balanced:
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
drop_last=True,
shuffle=False,
sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
num_workers=num_workers)
else:
train_loader = DataLoader(dataset=train_dataset,
batch_size=batch_size,
drop_last=True,
shuffle=True,
num_workers=num_workers)
test_loader = DataLoader(dataset=test_dataset,
batch_size=batch_size,
drop_last=False,
shuffle=False,
num_workers=num_workers)
return train_loader, test_loader
#############################################
class HotspotDataset_v3_2class(Dataset):
    def __init__(self, csv_path):
        feature_list = ['avg bond number', 'Hbond',
                        'Hphob', 'consurf', "B' side chain", "hotspot ratio"]
        df = pd.read_csv(csv_path)
        self.y = torch.from_numpy(df['2-class'].values).to(torch.int64)
        self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)
        # add one-hot encoded amino acids
        codes = ['A', 'R', 'N', 'D', 'C', 'E', 'Q', 'G', 'H',
                 'I', 'L', 'K', 'M', 'F', 'P', 'S', 'T', 'W', 'Y', 'V']
        code_to_int = {c: i for i, c in enumerate(codes)}
        df['residue'] = df['residue'].map(code_to_int)
        tensor = torch.from_numpy(df['residue'].values)
        # Pin num_classes to 20 so the one-hot width is stable even when a
        # split does not contain every residue type (a bare one_hot call
        # infers the width from the largest index actually present).
        onehot = torch.nn.functional.one_hot(tensor, num_classes=len(codes)).to(torch.float32)
        self.features = torch.cat((self.features, onehot), dim=1)

    def __getitem__(self, index):
        features = self.features[index]
        label = self.y[index]
        return features, label

    def __len__(self):
        return self.y.shape[0]


def get_dataloaders_hotspot_v3(batch_size, train_csv_path, test_csv_path, balanced=False, num_workers=0, num_classes=2):
    if num_classes == 2:
        train_dataset = HotspotDataset_v3_2class(csv_path=train_csv_path)
        test_dataset = HotspotDataset_v3_2class(csv_path=test_csv_path)
    elif num_classes == 3:
        raise NotImplementedError('Not implemented yet')
    else:
        raise ValueError('num_classes option invalid')

    if balanced:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=False,
                                  sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
                                  num_workers=num_workers)
    else:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=True,
                                  num_workers=num_workers)

    test_loader = DataLoader(dataset=test_dataset,
                             batch_size=batch_size,
                             drop_last=False,
                             shuffle=False,
                             num_workers=num_workers)
    return train_loader, test_loader
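The v3 datasets above build their one-hot residue encoding with `torch.nn.functional.one_hot`, whose output width depends on how it is called. The pitfall is easy to show without torch; the pure-Python `one_hot` below is an illustrative stand-in that mimics torch's width-inference behaviour (it is not the real API):

```python
# Illustrative only: shows why an inferred one-hot width breaks when one data
# split is missing categories. Pure-Python stand-in for torch's one_hot.
def one_hot(indices, num_classes=None):
    # With num_classes=None the width is inferred from the largest index
    # actually present, mirroring torch.nn.functional.one_hot.
    width = (max(indices) + 1) if num_classes is None else num_classes
    return [[1 if i == j else 0 for j in range(width)] for i in indices]

codes = list('ARNDCEQGHILKMFPSTWYV')   # the 20 amino-acid codes used above
code_to_int = {c: i for i, c in enumerate(codes)}

train_res = ['A', 'V']                 # contains 'V' -> index 19
test_res = ['A', 'R']                  # largest index present is only 1

inferred_train = one_hot([code_to_int[c] for c in train_res])
inferred_test = one_hot([code_to_int[c] for c in test_res])
print(len(inferred_train[0]), len(inferred_test[0]))   # 20 vs 2: widths disagree

pinned_test = one_hot([code_to_int[c] for c in test_res], num_classes=len(codes))
print(len(pinned_test[0]))                             # 20: stable width
```

Mismatched widths between train and test splits would make the concatenated feature tensors incompatible with a fixed-size model input, which is why pinning the class count is the safer call.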
#############################################
class HotspotDataset_v3_2_2class(Dataset):
    def __init__(self, csv_path):
        feature_list = ['avg bond number', 'Hbond',
                        'Hphob', 'consurf', "B' side chain"]
        df = pd.read_csv(csv_path)
        self.y = torch.from_numpy(df['2-class'].values).to(torch.int64)
        self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)
        # add one-hot encoded amino acids
        codes = ['A', 'R', 'N', 'D', 'C', 'E', 'Q', 'G', 'H',
                 'I', 'L', 'K', 'M', 'F', 'P', 'S', 'T', 'W', 'Y', 'V']
        code_to_int = {c: i for i, c in enumerate(codes)}
        df['residue'] = df['residue'].map(code_to_int)
        tensor = torch.from_numpy(df['residue'].values)
        # Pin num_classes to 20 so the one-hot width is stable across splits.
        onehot = torch.nn.functional.one_hot(tensor, num_classes=len(codes)).to(torch.float32)
        self.features = torch.cat((self.features, onehot), dim=1)

    def __getitem__(self, index):
        features = self.features[index]
        label = self.y[index]
        return features, label

    def __len__(self):
        return self.y.shape[0]


def get_dataloaders_hotspot_v3_2(batch_size, train_csv_path, test_csv_path, balanced=False, num_workers=0, num_classes=2):
    if num_classes == 2:
        train_dataset = HotspotDataset_v3_2_2class(csv_path=train_csv_path)
        test_dataset = HotspotDataset_v3_2_2class(csv_path=test_csv_path)
    elif num_classes == 3:
        raise NotImplementedError('Not implemented yet')
    else:
        raise ValueError('num_classes option invalid')

    if balanced:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=False,
                                  sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
                                  num_workers=num_workers)
    else:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=True,
                                  num_workers=num_workers)

    test_loader = DataLoader(dataset=test_dataset,
                             batch_size=batch_size,
                             drop_last=False,
                             shuffle=False,
                             num_workers=num_workers)
    return train_loader, test_loader
#############################################
class HotspotDataset_v4_2class(Dataset):
    def __init__(self, csv_path):
        feature_list = ['avg bond number', 'Hbond',
                        'Hphob', 'consurf', "B' side chain", "hotspot ratio"]
        df = pd.read_csv(csv_path)
        self.y = torch.from_numpy(df['2-class'].values).to(torch.int64)
        self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)
        # convert aa char to int
        codes = ['A', 'R', 'N', 'D', 'C', 'E', 'Q', 'G', 'H',
                 'I', 'L', 'K', 'M', 'F', 'P', 'S', 'T', 'W', 'Y', 'V']
        code_to_int = {c: i for i, c in enumerate(codes)}
        self.residues = df['residue'].map(code_to_int)

    def __getitem__(self, index):
        features = self.features[index]
        residue = self.residues[index]
        label = self.y[index]
        return (features, residue), label

    def __len__(self):
        return self.y.shape[0]


def get_dataloaders_hotspot_v4(batch_size, train_csv_path, test_csv_path, balanced=False, num_workers=0, num_classes=2):
    if num_classes == 2:
        train_dataset = HotspotDataset_v4_2class(csv_path=train_csv_path)
        test_dataset = HotspotDataset_v4_2class(csv_path=test_csv_path)
    elif num_classes == 3:
        raise NotImplementedError('Not implemented yet')
    else:
        raise ValueError('num_classes option invalid')

    if balanced:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=False,
                                  sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
                                  num_workers=num_workers)
    else:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=True,
                                  num_workers=num_workers)

    test_loader = DataLoader(dataset=test_dataset,
                             batch_size=batch_size,
                             drop_last=False,
                             shuffle=False,
                             num_workers=num_workers)
    return train_loader, test_loader
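Unlike the v3 datasets, the v4 variants return the raw residue index alongside the feature vector, presumably so the model can feed it to an embedding layer instead of concatenating a one-hot vector. The two representations are mathematically interchangeable: multiplying a one-hot vector by a weight matrix selects one row of that matrix, which is exactly what an embedding lookup does. A pure-Python sketch (illustrative, not tied to the model code, which is outside this file):

```python
# A one-hot vector times a weight matrix selects one row of the matrix,
# which is what an embedding lookup computes directly from the index.
def matvec(row_vec, matrix):
    cols = len(matrix[0])
    return [sum(row_vec[i] * matrix[i][j] for i in range(len(matrix)))
            for j in range(cols)]

weights = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]   # 3 categories -> 2-dim embedding
index = 1
one_hot_vec = [1 if i == index else 0 for i in range(3)]

via_matmul = matvec(one_hot_vec, weights)   # dense matrix-vector product
via_lookup = weights[index]                 # embedding-style row lookup
print(via_matmul == via_lookup)             # True
```

The lookup form avoids materialising the 20-wide one-hot vector per sample and lets the residue dimension be learned, which is the usual reason for this design change.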
#############################################
class HotspotDataset_v4_2_2class(Dataset):
    def __init__(self, csv_path):
        feature_list = ['avg bond number', 'Hbond',
                        'Hphob', 'consurf', "B' side chain"]
        df = pd.read_csv(csv_path)
        self.y = torch.from_numpy(df['2-class'].values).to(torch.int64)
        self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)
        # convert aa char to int
        codes = ['A', 'R', 'N', 'D', 'C', 'E', 'Q', 'G', 'H',
                 'I', 'L', 'K', 'M', 'F', 'P', 'S', 'T', 'W', 'Y', 'V']
        code_to_int = {c: i for i, c in enumerate(codes)}
        self.residues = df['residue'].map(code_to_int)

    def __getitem__(self, index):
        features = self.features[index]
        residue = self.residues[index]
        label = self.y[index]
        return (features, residue), label

    def __len__(self):
        return self.y.shape[0]


class HotspotDataset_v4_2_3class(Dataset):
    def __init__(self, csv_path):
        feature_list = ['avg bond number', 'Hbond',
                        'Hphob', 'consurf', "B' side chain"]
        df = pd.read_csv(csv_path)
        self.y = torch.from_numpy(df['3-class'].values).to(torch.int64)
        self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)
        # convert aa char to int
        codes = ['A', 'R', 'N', 'D', 'C', 'E', 'Q', 'G', 'H',
                 'I', 'L', 'K', 'M', 'F', 'P', 'S', 'T', 'W', 'Y', 'V']
        code_to_int = {c: i for i, c in enumerate(codes)}
        self.residues = df['residue'].map(code_to_int)

    def __getitem__(self, index):
        features = self.features[index]
        residue = self.residues[index]
        label = self.y[index]
        return (features, residue), label

    def __len__(self):
        return self.y.shape[0]


class Fireman(Dataset):
    def __init__(self, csv_path):
        feature_list = ['V1', 'V2', 'V3', 'V4', 'V5', 'V6', 'V7', 'V8', 'V9', 'V10']
        df = pd.read_csv(csv_path)
        self.y = torch.from_numpy(df['response'].values).to(torch.int64)
        self.features = torch.from_numpy(df[feature_list].values).to(torch.float32)

    def __getitem__(self, index):
        features = self.features[index]
        label = self.y[index]
        return features, label

    def __len__(self):
        return self.y.shape[0]


def get_data_loaders_fireman(batch_size, train_csv_path, valid_csv_path, test_csv_path, balanced=True, num_workers=0, num_classes=16):
    train_dataset = Fireman(csv_path=train_csv_path)
    valid_dataset = Fireman(csv_path=valid_csv_path)
    test_dataset = Fireman(csv_path=test_csv_path)

    if balanced:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=False,
                                  sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
                                  num_workers=num_workers)
    else:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=True,
                                  num_workers=num_workers)

    valid_loader = DataLoader(dataset=valid_dataset,
                              batch_size=batch_size,
                              drop_last=False,
                              shuffle=False,
                              num_workers=num_workers)

    test_loader = DataLoader(dataset=test_dataset,
                             batch_size=batch_size,
                             drop_last=False,
                             shuffle=False,
                             num_workers=num_workers)
    return train_loader, valid_loader, test_loader


def get_dataloaders_hotspot_v4_2(batch_size, train_csv_path, test_csv_path, balanced=False, num_workers=0, num_classes=2):
    if num_classes == 2:
        train_dataset = HotspotDataset_v4_2_2class(csv_path=train_csv_path)
        test_dataset = HotspotDataset_v4_2_2class(csv_path=test_csv_path)
    elif num_classes == 3:
        train_dataset = HotspotDataset_v4_2_3class(csv_path=train_csv_path)
        test_dataset = HotspotDataset_v4_2_3class(csv_path=test_csv_path)
    else:
        raise ValueError('num_classes option invalid')

    if balanced:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=False,
                                  sampler=BalancedBatchSampler(train_dataset, labels=train_dataset.y),
                                  num_workers=num_workers)
    else:
        train_loader = DataLoader(dataset=train_dataset,
                                  batch_size=batch_size,
                                  drop_last=True,
                                  shuffle=True,
                                  num_workers=num_workers)

    test_loader = DataLoader(dataset=test_dataset,
                             batch_size=batch_size,
                             drop_last=False,
                             shuffle=False,
                             num_workers=num_workers)
    return train_loader, test_loader

# === parameters_8000.py — repo: pigaov10/prodam.gerenciador (BSD-3-Clause) ===
password="pbkdf2(1000,20,sha512)$bb3c6b5f80f7d12a$8370ec3b7cdb8be22fca624984c7f73bb8308211"

# === reconstruction/general_utils/download_utils.py — repo: tecdatalab/biostructure (Apache-2.0) ===
import os
import urllib.request
import progressbar
import shutil
from general_utils.temp_utils import gen_dir, free_dir
from general_utils.terminal_utils import get_out
import time
import random
class MyProgressBar:
    def __init__(self):
        self.pbar = None

    def __call__(self, block_num, block_size, total_size):
        if not self.pbar:
            self.pbar = progressbar.ProgressBar(maxval=total_size)
            self.pbar.start()

        downloaded = block_num * block_size
        if downloaded < total_size:
            self.pbar.update(downloaded)
        else:
            self.pbar.finish()


def download_emd(id_code, exit_path, create_progress_bar=False):
    can_try = 60
    url_format_principal_list = ['ftp://ftp.wwpdb.org/pub/emdb/structures/EMD-{0}/map/emd_{0}.map.gz',
                                 'ftp://ftp.rcsb.org/pub/emdb/structures/EMD-{0}/map/emd_{0}.map.gz',
                                 'ftp://ftp.ebi.ac.uk/pub/databases/emdb/structures/EMD-{0}/map/emd_{0}.map.gz',
                                 'ftp://ftp.pdbj.org/pub/emdb/structures/EMD-{0}/map/emd_{0}.map.gz']

    while True:
        random.shuffle(url_format_principal_list)
        flag = False
        er = None
        for url_format_string in url_format_principal_list:
            try:
                download_biomolecular_zip_file(id_code, exit_path, url_format_string, 'emd_{0}.map.gz', 'emd_{0}.map',
                                               create_progress_bar)
                flag = True
                break
            except Exception as e:
                er = e
                # Retry only on transient FTP errors (421: service not
                # available, 104: connection reset); anything else is fatal.
                permit_errors_codes = ["421", "104"]
                flag_error = False
                for i in permit_errors_codes:
                    if str(e).find(i) != -1:
                        flag_error = True
                        break
                if not flag_error:
                    raise e

        if not flag:
            can_try -= 1
            if can_try < 0:
                raise er
            else:
                time.sleep(15)
        else:
            break


def download_emd_xml(id_code, exit_path, create_progress_bar=False):
    can_try = 60
    url_format_principal_list = ['ftp://ftp.wwpdb.org/pub/emdb/structures/EMD-{0}/header/emd-{0}.xml',
                                 'ftp://ftp.rcsb.org/pub/emdb/structures/EMD-{0}/header/emd-{0}.xml',
                                 'ftp://ftp.ebi.ac.uk/pub/databases/emdb/structures/EMD-{0}/header/emd-{0}.xml',
                                 'ftp://ftp.pdbj.org/pub/emdb/structures/EMD-{0}/header/emd-{0}.xml']

    while True:
        random.shuffle(url_format_principal_list)
        flag = False
        er = None
        for url_format_string in url_format_principal_list:
            try:
                download_biomolecular_file(id_code, exit_path, url_format_string, create_progress_bar)
                flag = True
                break
            except Exception as e:
                er = e
                permit_errors_codes = ["421", "104"]
                flag_error = False
                for i in permit_errors_codes:
                    if str(e).find(i) != -1:
                        flag_error = True
                        break
                if not flag_error:
                    raise e

        if not flag:
            can_try -= 1
            if can_try < 0:
                raise er
            else:
                time.sleep(15)
        else:
            break


def download_pdb(id_code, exit_path, create_progress_bar=False):
    # Try the download formats in order of preference: FTP .ent.gz, then HTTP
    # .pdb, then mmCIF over FTP, then mmCIF over HTTP.
    id_code = id_code.lower()
    ok_flag = False
    er = None

    try:
        download_pdb_in_pdb(id_code, exit_path, create_progress_bar)
        ok_flag = True
    except Exception as e:
        er = e
        ok_flag = False

    if not ok_flag:
        try:
            download_pdb_in_http(id_code, exit_path, create_progress_bar)
            ok_flag = True
        except Exception as e:
            er = e
            ok_flag = False

    if not ok_flag:
        try:
            download_pdb_in_mmCIF(id_code, exit_path, create_progress_bar)
            ok_flag = True
        except Exception as e:
            er = e
            ok_flag = False

    if not ok_flag:
        try:
            download_pdb_in_mmCIF_http(id_code, exit_path, create_progress_bar)
            ok_flag = True
        except Exception as e:
            er = e
            ok_flag = False

    if not ok_flag:
        raise er
    else:
        return True


def download_pdb_in_http(id_code, exit_path, create_progress_bar=False):
    can_try = 30
    url_format_principal_list = ['https://files.rcsb.org/download/{0}.pdb']

    while True:
        random.shuffle(url_format_principal_list)
        flag = False
        er = None
        for url_format_string in url_format_principal_list:
            try:
                download_biomolecular_file(id_code.upper(), exit_path, url_format_string, create_progress_bar)
                flag = True
                break
            except Exception as e:
                er = e
                permit_errors_codes = ["421", "104"]
                flag_error = False
                for i in permit_errors_codes:
                    if str(e).find(i) != -1:
                        flag_error = True
                        break
                if not flag_error:
                    raise e

        if not flag:
            can_try -= 1
            if can_try < 0:
                raise er
            else:
                time.sleep(15)
        else:
            break


def download_pdb_in_mmCIF_http(id_code, exit_path, create_progress_bar=False):
    from general_utils.pdb_utils import mmCIF_to_pdb

    er = None
    can_try = 30
    url_format_principal_list = ['https://files.rcsb.org/download/{0}.cif']

    while True:
        random.shuffle(url_format_principal_list)
        flag = False
        for url_format_string in url_format_principal_list:
            try:
                # Download the mmCIF into a temp dir, convert it to PDB format,
                # then clean up.
                path_temp = gen_dir()
                temp_file_path = path_temp + "/" + os.path.basename(exit_path).split(".")[0] + ".cif"
                download_biomolecular_file(id_code.upper(), temp_file_path, url_format_string, create_progress_bar)
                mmCIF_to_pdb(temp_file_path, exit_path)
                free_dir(path_temp)
                flag = True
                break
            except Exception as e:
                er = e
                permit_errors_codes = ["421", "104"]
                flag_error = False
                for i in permit_errors_codes:
                    if str(e).find(i) != -1:
                        flag_error = True
                        break
                if not flag_error:
                    raise e

        if not flag:
            can_try -= 1
            if can_try < 0:
                raise er
            else:
                time.sleep(15)
        else:
            break


def download_pdb_in_pdb(id_code, exit_path, create_progress_bar=False):
    can_try = 30
    url_format_principal_list = ['ftp://ftp.wwpdb.org/pub/pdb/data/structures/all/pdb/pdb{0}.ent.gz',
                                 'ftp://ftp.rcsb.org/pub/pdb/data/structures/all/pdb/pdb{0}.ent.gz',
                                 'ftp://ftp.ebi.ac.uk/pub/databases/pdb/data/structures/all/pdb/pdb{0}.ent.gz',
                                 'ftp://ftp.pdbj.org/pub/pdb/data/structures/all/pdb/pdb{0}.ent.gz']

    while True:
        random.shuffle(url_format_principal_list)
        flag = False
        er = None
        for url_format_string in url_format_principal_list:
            try:
                download_biomolecular_zip_file(id_code, exit_path, url_format_string, 'pdb{0}.ent.gz', 'pdb{0}.ent',
                                               create_progress_bar)
                flag = True
                break
            except Exception as e:
                er = e
                permit_errors_codes = ["421", "104"]
                flag_error = False
                for i in permit_errors_codes:
                    if str(e).find(i) != -1:
                        flag_error = True
                        break
                if not flag_error:
                    raise e

        if not flag:
            can_try -= 1
            if can_try < 0:
                raise er
            else:
                time.sleep(15)
        else:
            break


def download_pdb_in_mmCIF(id_code, exit_path, create_progress_bar=False):
    from general_utils.pdb_utils import mmCIF_to_pdb

    er = None
    can_try = 30
    url_format_principal_list = ['ftp://ftp.wwpdb.org/pub/pdb/data/structures/all/mmCIF/{0}.cif.gz',
                                 'ftp://ftp.rcsb.org/pub/pdb/data/structures/all/mmCIF/{0}.cif.gz',
                                 'ftp://ftp.ebi.ac.uk/pub/databases/pdb/data/structures/all/mmCIF/{0}.cif.gz',
                                 'ftp://ftp.pdbj.org/pub/pdb/data/structures/all/mmCIF/{0}.cif.gz']

    while True:
        random.shuffle(url_format_principal_list)
        flag = False
        for url_format_string in url_format_principal_list:
            try:
                path_temp = gen_dir()
                temp_file_path = path_temp + "/" + os.path.basename(exit_path).split(".")[0] + ".cif"
                download_biomolecular_zip_file(id_code, temp_file_path, url_format_string, '{0}.cif.gz', '{0}.cif',
                                               create_progress_bar)
                mmCIF_to_pdb(temp_file_path, exit_path)
                free_dir(path_temp)
                flag = True
                break
            except Exception as e:
                er = e
                permit_errors_codes = ["421", "104"]
                flag_error = False
                for i in permit_errors_codes:
                    if str(e).find(i) != -1:
                        flag_error = True
                        break
                if not flag_error:
                    raise e

        if not flag:
            can_try -= 1
            if can_try < 0:
                raise er
            else:
                time.sleep(15)
        else:
            break


def download_biomolecular_zip_file(id_code, exit_path, url_format_string, file_zip, file_unzip,
                                   create_progress_bar=False):
    # path = './temp_emd_download'
    path = gen_dir()
    path = os.path.abspath(path)
    exit_path = os.path.abspath(exit_path)
    file_url = url_format_string.format(id_code)

    if os.path.exists(path):
        shutil.rmtree(path)
    os.mkdir(path)

    # Touch the target file first so permission problems surface early.
    try:
        with open(path + "/" + file_zip.format(id_code), 'w') as fp:
            pass
    except:
        pass

    if create_progress_bar:
        urllib.request.urlretrieve(file_url, path + "/" + file_zip.format(id_code), MyProgressBar())
    else:
        urllib.request.urlretrieve(file_url, path + "/" + file_zip.format(id_code))

    get_out("gunzip", path + "/" + file_zip.format(id_code))
    get_out("mv", path + "/" + file_unzip.format(id_code), exit_path)
    free_dir(path)


def download_biomolecular_file(id_code, exit_path, url_format_string, create_progress_bar=False):
    exit_path = os.path.abspath(exit_path)
    file_url = url_format_string.format(id_code)

    # Touch the target file first so permission problems surface early.
    try:
        with open(exit_path, 'w') as fp:
            pass
    except:
        pass

    if create_progress_bar:
        urllib.request.urlretrieve(file_url, exit_path, MyProgressBar())
    else:
        urllib.request.urlretrieve(file_url, exit_path)
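Every `download_*` function above repeats the same loop: shuffle the mirror list, try each mirror, re-raise any error that is not a transient FTP code ("421" or "104"), and otherwise sleep 15 seconds and retry up to `can_try` rounds. That pattern could be factored into one helper; the sketch below is a hypothetical refactoring (the name `try_mirrors` and its injectable `fetch`/`sleep` parameters are illustrative, not from the original file):

```python
import random
import time

def try_mirrors(urls, fetch, attempts=60, transient=("421", "104"),
                delay=15, sleep=time.sleep):
    # Generic form of the retry loop repeated in each download_* function:
    # shuffle mirrors, try each one, re-raise non-transient errors at once,
    # back off and retry on transient ones, and give up after `attempts`
    # full rounds by re-raising the last transient error seen.
    last_error = None
    for _ in range(attempts):
        random.shuffle(urls)
        for url in urls:
            try:
                return fetch(url)
            except Exception as e:
                if not any(code in str(e) for code in transient):
                    raise
                last_error = e
        sleep(delay)
    raise last_error

# Demo with a fake fetch that fails transiently twice, then succeeds.
calls = []
def flaky(url):
    calls.append(url)
    if len(calls) < 3:
        raise OSError("ftp error 421: service not available")
    return "ok"

result = try_mirrors(["mirror-a", "mirror-b"], flaky, attempts=5, sleep=lambda s: None)
print(result)   # ok
```

Injecting `fetch` and `sleep` also makes the retry logic unit-testable without touching the network, which the current copy-pasted loops are not.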

# === breaker/fakers/__init__.py — repo: thespacedoctor/breaker (MIT) ===
from generate_faker_catalogue import generate_faker_catalogue

# === visbeer/test/test_coffee_beer_service.py — repo: lukaselmer/vis-beer (MIT) ===
import unittest
import datetime

from visbeer.services.beer_service import BeerService
from visbeer.services.coffee_service import CoffeeService
from visbeer.services.data_service import DataService, DATETIME_FORMAT
from visbeer.test.mocks.flag_service_mock import FlagServiceMock


class CoffeeBeerServiceTestCase(unittest.TestCase):
    def test_dispense_coffee(self):
        mock = FlagServiceMock()
        bs = BeerService('010101@rfid.ethz.ch', DataService(mock))
        cs = CoffeeService('010101@rfid.ethz.ch', DataService(mock))
        self.assertEqual(1, bs.status())
        self.assertEqual(2, cs.status())
        cs.dispensed()
        self.assertEqual(0, bs.status())
        self.assertEqual(1, cs.status())
        cs.dispensed()
        self.assertEqual(0, bs.status())
        cs.dispensed()
        self.assertEqual(0, bs.status())
        mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'] = 4
        self.assertEqual(2, bs.status())
        self.assertEqual(4, cs.status())
        cs.dispensed()
        self.assertEqual(3, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(1, bs.status())
        self.assertEqual(3, cs.status())
        cs.dispensed()
        self.assertEqual(2, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(1, bs.status())
        self.assertEqual(2, cs.status())
        cs.dispensed()
        self.assertEqual(1, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(1, cs.status())
        cs.dispensed()
        self.assertEqual(0, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())
        cs.dispensed()
        self.assertEqual(0, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())

    def test_dispense_beer(self):
        mock = FlagServiceMock()
        bs = BeerService('010101@rfid.ethz.ch', DataService(mock))
        cs = CoffeeService('010101@rfid.ethz.ch', DataService(mock))
        self.assertEqual(1, bs.status())
        self.assertEqual(2, cs.status())
        bs.dispensed()
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())
        bs.dispensed()
        self.assertEqual(0, bs.status())
        mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'] = 4
        self.assertEqual(2, bs.status())
        self.assertEqual(4, cs.status())
        bs.dispensed()
        self.assertEqual(2, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(1, bs.status())
        self.assertEqual(2, cs.status())
        bs.dispensed()
        self.assertEqual(0, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())
        bs.dispensed()
        self.assertEqual(0, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())

    def test_dispense_mixed(self):
        mock = FlagServiceMock()
        bs = BeerService('010101@rfid.ethz.ch', DataService(mock))
        cs = CoffeeService('010101@rfid.ethz.ch', DataService(mock))
        self.assertEqual(1, bs.status())
        self.assertEqual(2, cs.status())
        bs.dispensed()
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())
        mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'] = 4
        self.assertEqual(2, bs.status())
        self.assertEqual(4, cs.status())
        cs.dispensed()
        self.assertEqual(3, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(1, bs.status())
        self.assertEqual(3, cs.status())
        bs.dispensed()
        self.assertEqual(1, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(1, cs.status())
        cs.dispensed()
        self.assertEqual(0, mock.data['010101@rfid.ethz.ch']['coffee_beer|credits'])
        self.assertEqual(0, bs.status())
        self.assertEqual(0, cs.status())


if __name__ == '__main__':
    unittest.main()
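Reading the assertions together, these tests encode a single shared credit pool where a coffee costs 1 credit, a beer costs 2, and `status()` reports how many of that drink the remaining credits still buy. A minimal model that satisfies the same arithmetic (an illustrative stand-in — the real `BeerService`/`CoffeeService` live elsewhere and do more, e.g. persistence via `DataService`):

```python
# Minimal model of the credit arithmetic the tests above assert: one shared
# pool, coffee costs 1 credit, beer costs 2, status() = drinks still affordable.
class CreditPool:
    def __init__(self, credits):
        self.credits = credits

class Dispenser:
    def __init__(self, pool, cost):
        self.pool, self.cost = pool, cost

    def status(self):
        # integer division: how many of this drink the pool still covers
        return self.pool.credits // self.cost

    def dispensed(self):
        # dispensing with insufficient credits is a no-op, matching the
        # repeated dispensed() calls in the tests that leave credits at 0
        if self.pool.credits >= self.cost:
            self.pool.credits -= self.cost

pool = CreditPool(2)
beer, coffee = Dispenser(pool, 2), Dispenser(pool, 1)
print(beer.status(), coffee.status())   # 1 2, as in the first assertions
coffee.dispensed()
print(beer.status(), coffee.status())   # 0 1
```

This also explains the mixed test: with 4 credits, one coffee leaves 3 (beer status 1), a following beer leaves 1, and a final coffee empties the pool.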

# === Boolean/Boolean14.py — repo: liyuanyuan11/Python (MIT) ===
print(42<62)
print(42<=42)
print(42<24)

# === cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_infra_tc_oper.py — repo: tkamata-test/ydk-py (ECL-2.0 / Apache-2.0) ===
""" Cisco_IOS_XR_infra_tc_oper
This module contains a collection of YANG definitions
for Cisco IOS\-XR infra\-tc package operational data.
This module contains definitions
for the following management objects\:
traffic\-collector\: Global Traffic Collector configuration
commands
Copyright (c) 2013\-2016 by Cisco Systems, Inc.
All rights reserved.
"""
import re
import collections
from enum import Enum
from ydk.types import Empty, YList, YLeafList, DELETE, Decimal64, FixedBitsDict
from ydk.errors import YPYError, YPYModelError
class TcOperAfNameEnum(Enum):
    """
    TcOperAfNameEnum

    Tc oper af name

    .. data:: ipv4 = 0

        IPv4

    .. data:: ipv6 = 1

        IPv6

    """

    ipv4 = 0

    ipv6 = 1

    @staticmethod
    def _meta_info():
        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
        return meta._meta_table['TcOperAfNameEnum']
class TrafficCollector(object):
    """
    Global Traffic Collector configuration commands

    .. attribute:: afs

        Address Family specific operational data
        **type**\: :py:class:`Afs <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs>`

    .. attribute:: external_interfaces

        External Interface
        **type**\: :py:class:`ExternalInterfaces <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.ExternalInterfaces>`

    .. attribute:: summary

        Traffic Collector summary
        **type**\: :py:class:`Summary <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary>`

    .. attribute:: vrf_table

        VRF specific operational data
        **type**\: :py:class:`VrfTable <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable>`

    """

    _prefix = 'infra-tc-oper'
    _revision = '2015-11-09'

    def __init__(self):
        self.afs = TrafficCollector.Afs()
        self.afs.parent = self
        self.external_interfaces = TrafficCollector.ExternalInterfaces()
        self.external_interfaces.parent = self
        self.summary = TrafficCollector.Summary()
        self.summary.parent = self
        self.vrf_table = TrafficCollector.VrfTable()
        self.vrf_table.parent = self

    class ExternalInterfaces(object):
        """
        External Interface

        .. attribute:: external_interface

            External Interface
            **type**\: list of :py:class:`ExternalInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.ExternalInterfaces.ExternalInterface>`

        """

        _prefix = 'infra-tc-oper'
        _revision = '2015-11-09'

        def __init__(self):
            self.parent = None
            self.external_interface = YList()
            self.external_interface.parent = self
            self.external_interface.name = 'external_interface'

        class ExternalInterface(object):
            """
            External Interface

            .. attribute:: interface_name  <key>

                The Interface Name
                **type**\: str
                **pattern:** (([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+\\.\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]\*\\d+))\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]+))\|([a\-zA\-Z0\-9\_\-]\*\\d+)\|([a\-zA\-Z0\-9\_\-]\*\\d+\\.\\d+)\|(mpls)\|(dwdm)

            .. attribute:: interface_handle

                Interface handle
                **type**\: int
                **range:** 0..4294967295

            .. attribute:: interface_name_xr

                Interface name in Display format
                **type**\: str

            .. attribute:: is_interface_enabled

                Flag to indicate interface enabled or not
                **type**\: bool

            .. attribute:: vrfid

                Interface VRF ID
                **type**\: int
                **range:** 0..4294967295

            """

            _prefix = 'infra-tc-oper'
            _revision = '2015-11-09'

            def __init__(self):
                self.parent = None
                self.interface_name = None
                self.interface_handle = None
                self.interface_name_xr = None
                self.is_interface_enabled = None
                self.vrfid = None

            @property
            def _common_path(self):
                if self.interface_name is None:
                    raise YPYModelError('Key property interface_name is None')

                return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:external-interfaces/Cisco-IOS-XR-infra-tc-oper:external-interface[Cisco-IOS-XR-infra-tc-oper:interface-name = ' + str(self.interface_name) + ']'

            def is_config(self):
                ''' Returns True if this instance represents config data else returns False '''
                return False

            def _has_data(self):
                if not self.is_config():
                    return False
                if self.interface_name is not None:
                    return True

                if self.interface_handle is not None:
                    return True

                if self.interface_name_xr is not None:
                    return True

                if self.is_interface_enabled is not None:
                    return True

                if self.vrfid is not None:
                    return True

                return False

            @staticmethod
            def _meta_info():
                from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
                return meta._meta_table['TrafficCollector.ExternalInterfaces.ExternalInterface']['meta_info']

        @property
        def _common_path(self):
            return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:external-interfaces'

        def is_config(self):
            ''' Returns True if this instance represents config data else returns False '''
            return False

        def _has_data(self):
            if not self.is_config():
                return False
            if self.external_interface is not None:
                for child_ref in self.external_interface:
                    if child_ref._has_data():
                        return True

            return False

        @staticmethod
        def _meta_info():
            from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
            return meta._meta_table['TrafficCollector.ExternalInterfaces']['meta_info']
class Summary(object):
"""
Traffic Collector summary
.. attribute:: checkpoint_message_statistic
Statistics per message type for checkpoint (Chkpt)
**type**\: list of :py:class:`CheckpointMessageStatistic <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary.CheckpointMessageStatistic>`
.. attribute:: collection_interval
Statistic collection interval in minutes
**type**\: int
**range:** 0..255
**units**\: minute
.. attribute:: collection_message_statistic
Statistics per message type for STAT collector
**type**\: list of :py:class:`CollectionMessageStatistic <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary.CollectionMessageStatistic>`
.. attribute:: collection_timer_is_running
TRUE if collection timer is running
**type**\: bool
.. attribute:: database_statistics_external_interface
Database statistics for External Interface
**type**\: :py:class:`DatabaseStatisticsExternalInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary.DatabaseStatisticsExternalInterface>`
.. attribute:: history_size
Statistics history size
**type**\: int
**range:** 0..255
.. attribute:: timeout_interval
Statistic history timeout interval in hours
**type**\: int
**range:** 0..65535
**units**\: hour
.. attribute:: timeout_timer_is_running
TRUE if history timeout timer is running
**type**\: bool
.. attribute:: vrf_statistic
VRF table statistics
**type**\: list of :py:class:`VrfStatistic <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary.VrfStatistic>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.checkpoint_message_statistic = YList()
self.checkpoint_message_statistic.parent = self
self.checkpoint_message_statistic.name = 'checkpoint_message_statistic'
self.collection_interval = None
self.collection_message_statistic = YList()
self.collection_message_statistic.parent = self
self.collection_message_statistic.name = 'collection_message_statistic'
self.collection_timer_is_running = None
self.database_statistics_external_interface = TrafficCollector.Summary.DatabaseStatisticsExternalInterface()
self.database_statistics_external_interface.parent = self
self.history_size = None
self.timeout_interval = None
self.timeout_timer_is_running = None
self.vrf_statistic = YList()
self.vrf_statistic.parent = self
self.vrf_statistic.name = 'vrf_statistic'
class DatabaseStatisticsExternalInterface(object):
"""
Database statistics for External Interface
.. attribute:: number_of_add_o_perations
Number of add operations
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: number_of_delete_o_perations
Number of delete operations
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: number_of_entries
Number of DB entries
**type**\: int
**range:** 0..4294967295
.. attribute:: number_of_stale_entries
Number of stale entries
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.number_of_add_o_perations = None
self.number_of_delete_o_perations = None
self.number_of_entries = None
self.number_of_stale_entries = None
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary/Cisco-IOS-XR-infra-tc-oper:database-statistics-external-interface'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.number_of_add_o_perations is not None:
return True
if self.number_of_delete_o_perations is not None:
return True
if self.number_of_entries is not None:
return True
if self.number_of_stale_entries is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary.DatabaseStatisticsExternalInterface']['meta_info']
class VrfStatistic(object):
"""
VRF table statistics
.. attribute:: database_statistics_ipv4
Database statistics for IPv4 table
**type**\: :py:class:`DatabaseStatisticsIpv4 <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary.VrfStatistic.DatabaseStatisticsIpv4>`
.. attribute:: database_statistics_tunnel
Database statistics for Tunnel table
**type**\: :py:class:`DatabaseStatisticsTunnel <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Summary.VrfStatistic.DatabaseStatisticsTunnel>`
.. attribute:: vrf_name
VRF name
**type**\: str
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.database_statistics_ipv4 = TrafficCollector.Summary.VrfStatistic.DatabaseStatisticsIpv4()
self.database_statistics_ipv4.parent = self
self.database_statistics_tunnel = TrafficCollector.Summary.VrfStatistic.DatabaseStatisticsTunnel()
self.database_statistics_tunnel.parent = self
self.vrf_name = None
class DatabaseStatisticsIpv4(object):
"""
Database statistics for IPv4 table
.. attribute:: number_of_add_o_perations
Number of add operations
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: number_of_delete_o_perations
Number of delete operations
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: number_of_entries
Number of DB entries
**type**\: int
**range:** 0..4294967295
.. attribute:: number_of_stale_entries
Number of stale entries
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.number_of_add_o_perations = None
self.number_of_delete_o_perations = None
self.number_of_entries = None
self.number_of_stale_entries = None
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary/Cisco-IOS-XR-infra-tc-oper:vrf-statistic/Cisco-IOS-XR-infra-tc-oper:database-statistics-ipv4'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.number_of_add_o_perations is not None:
return True
if self.number_of_delete_o_perations is not None:
return True
if self.number_of_entries is not None:
return True
if self.number_of_stale_entries is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary.VrfStatistic.DatabaseStatisticsIpv4']['meta_info']
class DatabaseStatisticsTunnel(object):
"""
Database statistics for Tunnel table
.. attribute:: number_of_add_o_perations
Number of add operations
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: number_of_delete_o_perations
Number of delete operations
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: number_of_entries
Number of DB entries
**type**\: int
**range:** 0..4294967295
.. attribute:: number_of_stale_entries
Number of stale entries
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.number_of_add_o_perations = None
self.number_of_delete_o_perations = None
self.number_of_entries = None
self.number_of_stale_entries = None
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary/Cisco-IOS-XR-infra-tc-oper:vrf-statistic/Cisco-IOS-XR-infra-tc-oper:database-statistics-tunnel'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.number_of_add_o_perations is not None:
return True
if self.number_of_delete_o_perations is not None:
return True
if self.number_of_entries is not None:
return True
if self.number_of_stale_entries is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary.VrfStatistic.DatabaseStatisticsTunnel']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary/Cisco-IOS-XR-infra-tc-oper:vrf-statistic'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.database_statistics_ipv4 is not None and self.database_statistics_ipv4._has_data():
return True
if self.database_statistics_tunnel is not None and self.database_statistics_tunnel._has_data():
return True
if self.vrf_name is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary.VrfStatistic']['meta_info']
class CollectionMessageStatistic(object):
"""
Statistics per message type for STAT collector
.. attribute:: byte_received
Number of bytes received
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: byte_sent
Number of bytes sent
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: maimum_latency_timestamp
Timestamp of maximum latency
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: maximum_roundtrip_latency
Maximum roundtrip latency in msec
**type**\: int
**range:** 0..4294967295
.. attribute:: packet_received
Number of packets received
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: packet_sent
Number of packets sent
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.byte_received = None
self.byte_sent = None
self.maimum_latency_timestamp = None
self.maximum_roundtrip_latency = None
self.packet_received = None
self.packet_sent = None
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary/Cisco-IOS-XR-infra-tc-oper:collection-message-statistic'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.byte_received is not None:
return True
if self.byte_sent is not None:
return True
if self.maimum_latency_timestamp is not None:
return True
if self.maximum_roundtrip_latency is not None:
return True
if self.packet_received is not None:
return True
if self.packet_sent is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary.CollectionMessageStatistic']['meta_info']
class CheckpointMessageStatistic(object):
"""
Statistics per message type for checkpoint (Chkpt)
.. attribute:: byte_received
Number of bytes received
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: byte_sent
Number of bytes sent
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: maimum_latency_timestamp
Timestamp of maximum latency
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: maximum_roundtrip_latency
Maximum roundtrip latency in msec
**type**\: int
**range:** 0..4294967295
.. attribute:: packet_received
Number of packets received
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: packet_sent
Number of packets sent
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.byte_received = None
self.byte_sent = None
self.maimum_latency_timestamp = None
self.maximum_roundtrip_latency = None
self.packet_received = None
self.packet_sent = None
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary/Cisco-IOS-XR-infra-tc-oper:checkpoint-message-statistic'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.byte_received is not None:
return True
if self.byte_sent is not None:
return True
if self.maimum_latency_timestamp is not None:
return True
if self.maximum_roundtrip_latency is not None:
return True
if self.packet_received is not None:
return True
if self.packet_sent is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary.CheckpointMessageStatistic']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:summary'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.checkpoint_message_statistic is not None:
for child_ref in self.checkpoint_message_statistic:
if child_ref._has_data():
return True
if self.collection_interval is not None:
return True
if self.collection_message_statistic is not None:
for child_ref in self.collection_message_statistic:
if child_ref._has_data():
return True
if self.collection_timer_is_running is not None:
return True
if self.database_statistics_external_interface is not None and self.database_statistics_external_interface._has_data():
return True
if self.history_size is not None:
return True
if self.timeout_interval is not None:
return True
if self.timeout_timer_is_running is not None:
return True
if self.vrf_statistic is not None:
for child_ref in self.vrf_statistic:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Summary']['meta_info']
class VrfTable(object):
"""
VRF specific operational data
.. attribute:: default_vrf
DefaultVRF specific operational data
**type**\: :py:class:`DefaultVrf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.default_vrf = TrafficCollector.VrfTable.DefaultVrf()
self.default_vrf.parent = self
class DefaultVrf(object):
"""
DefaultVRF specific operational data
.. attribute:: afs
Address Family specific operational data
**type**\: :py:class:`Afs <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.afs = TrafficCollector.VrfTable.DefaultVrf.Afs()
self.afs.parent = self
class Afs(object):
"""
Address Family specific operational data
.. attribute:: af
Operational data for given Address Family
**type**\: list of :py:class:`Af <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.af = YList()
self.af.parent = self
self.af.name = 'af'
class Af(object):
"""
Operational data for given Address Family
.. attribute:: af_name <key>
Address Family name
**type**\: :py:class:`TcOperAfNameEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TcOperAfNameEnum>`
.. attribute:: counters
Show Counters
**type**\: :py:class:`Counters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.af_name = None
self.counters = TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters()
self.counters.parent = self
class Counters(object):
"""
Show Counters
.. attribute:: prefixes
Prefix Database
**type**\: :py:class:`Prefixes <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes>`
.. attribute:: tunnels
Tunnels
**type**\: :py:class:`Tunnels <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.prefixes = TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes()
self.prefixes.parent = self
self.tunnels = TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels()
self.tunnels.parent = self
class Prefixes(object):
"""
Prefix Database
.. attribute:: prefix
Show Prefix Counter
**type**\: list of :py:class:`Prefix <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.prefix = YList()
self.prefix.parent = self
self.prefix.name = 'prefix'
class Prefix(object):
"""
Show Prefix Counter
.. attribute:: base_counter_statistics
Base counter statistics
**type**\: :py:class:`BaseCounterStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics>`
.. attribute:: ipaddr
IP Address
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
.. attribute:: is_active
Prefix is Active and collecting new Statistics
**type**\: bool
.. attribute:: label
Local Label
**type**\: int
**range:** 16..1048575
.. attribute:: label_xr
Label
**type**\: int
**range:** 0..4294967295
.. attribute:: mask
Prefix Mask
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
.. attribute:: prefix
Prefix Address (V4 or V6 Format)
**type**\: str
.. attribute:: traffic_matrix_counter_statistics
Traffic Matrix (TM) counter statistics
**type**\: :py:class:`TrafficMatrixCounterStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.base_counter_statistics = TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics()
self.base_counter_statistics.parent = self
self.ipaddr = None
self.is_active = None
self.label = None
self.label_xr = None
self.mask = None
self.prefix = None
self.traffic_matrix_counter_statistics = TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics()
self.traffic_matrix_counter_statistics.parent = self
class BaseCounterStatistics(object):
"""
Base counter statistics
.. attribute:: count_history
Counter History
**type**\: list of :py:class:`CountHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics.CountHistory>`
.. attribute:: transmit_bytes_per_second_switched
Average Rate of Bytes/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte/s
.. attribute:: transmit_packets_per_second_switched
Average Rate of Packets/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: packet/s
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.count_history = YList()
self.count_history.parent = self
self.count_history.name = 'count_history'
self.transmit_bytes_per_second_switched = None
self.transmit_packets_per_second_switched = None
class CountHistory(object):
"""
Counter History
.. attribute:: event_end_timestamp
Event End timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: event_start_timestamp
Event Start timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: is_valid
Flag to indicate if this history entry is valid
**type**\: bool
.. attribute:: transmit_number_of_bytes_switched
Number of Bytes switched in this interval
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: transmit_number_of_packets_switched
Number of packets switched in this interval
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.event_end_timestamp = None
self.event_start_timestamp = None
self.is_valid = None
self.transmit_number_of_bytes_switched = None
self.transmit_number_of_packets_switched = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:count-history'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.event_end_timestamp is not None:
return True
if self.event_start_timestamp is not None:
return True
if self.is_valid is not None:
return True
if self.transmit_number_of_bytes_switched is not None:
return True
if self.transmit_number_of_packets_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics.CountHistory']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:base-counter-statistics'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.count_history is not None:
for child_ref in self.count_history:
if child_ref._has_data():
return True
if self.transmit_bytes_per_second_switched is not None:
return True
if self.transmit_packets_per_second_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics']['meta_info']
class TrafficMatrixCounterStatistics(object):
"""
Traffic Matrix (TM) counter statistics
.. attribute:: count_history
Counter History
**type**\: list of :py:class:`CountHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics.CountHistory>`
.. attribute:: transmit_bytes_per_second_switched
Average Rate of Bytes/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte/s
.. attribute:: transmit_packets_per_second_switched
Average Rate of Packets/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: packet/s
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.count_history = YList()
self.count_history.parent = self
self.count_history.name = 'count_history'
self.transmit_bytes_per_second_switched = None
self.transmit_packets_per_second_switched = None
class CountHistory(object):
"""
Counter History
.. attribute:: event_end_timestamp
Event End timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: event_start_timestamp
Event Start timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: is_valid
Flag to indicate if this history entry is valid
**type**\: bool
.. attribute:: transmit_number_of_bytes_switched
Number of Bytes switched in this interval
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: transmit_number_of_packets_switched
Number of packets switched in this interval
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.event_end_timestamp = None
self.event_start_timestamp = None
self.is_valid = None
self.transmit_number_of_bytes_switched = None
self.transmit_number_of_packets_switched = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:count-history'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.event_end_timestamp is not None:
return True
if self.event_start_timestamp is not None:
return True
if self.is_valid is not None:
return True
if self.transmit_number_of_bytes_switched is not None:
return True
if self.transmit_number_of_packets_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics.CountHistory']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:traffic-matrix-counter-statistics'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.count_history is not None:
for child_ref in self.count_history:
if child_ref._has_data():
return True
if self.transmit_bytes_per_second_switched is not None:
return True
if self.transmit_packets_per_second_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:prefix'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.base_counter_statistics is not None and self.base_counter_statistics._has_data():
return True
if self.ipaddr is not None:
return True
if self.is_active is not None:
return True
if self.label is not None:
return True
if self.label_xr is not None:
return True
if self.mask is not None:
return True
if self.prefix is not None:
return True
if self.traffic_matrix_counter_statistics is not None and self.traffic_matrix_counter_statistics._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes.Prefix']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:prefixes'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.prefix is not None:
for child_ref in self.prefix:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Prefixes']['meta_info']
class Tunnels(object):
"""
Tunnels
.. attribute:: tunnel
Tunnel information
**type**\: list of :py:class:`Tunnel <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.tunnel = YList()
self.tunnel.parent = self
self.tunnel.name = 'tunnel'
class Tunnel(object):
"""
Tunnel information
.. attribute:: interface_name <key>
The Interface Name
**type**\: str
**pattern:** (([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+\\.\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]\*\\d+))\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]+))\|([a\-zA\-Z0\-9\_\-]\*\\d+)\|([a\-zA\-Z0\-9\_\-]\*\\d+\\.\\d+)\|(mpls)\|(dwdm)
.. attribute:: base_counter_statistics
Base counter statistics
**type**\: :py:class:`BaseCounterStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics>`
.. attribute:: interface_handle
Interface handle
**type**\: int
**range:** 0..4294967295
.. attribute:: interface_name_xr
Interface name in Display format
**type**\: str
.. attribute:: is_active
Interface is Active and collecting new Statistics
**type**\: bool
.. attribute:: vrfid
Interface VRF ID
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.interface_name = None
self.base_counter_statistics = TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics()
self.base_counter_statistics.parent = self
self.interface_handle = None
self.interface_name_xr = None
self.is_active = None
self.vrfid = None
class BaseCounterStatistics(object):
"""
Base counter statistics
.. attribute:: count_history
Counter History
**type**\: list of :py:class:`CountHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics.CountHistory>`
.. attribute:: transmit_bytes_per_second_switched
Average Rate of Bytes/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte/s
.. attribute:: transmit_packets_per_second_switched
Average Rate of Packets/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: packet/s
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.count_history = YList()
self.count_history.parent = self
self.count_history.name = 'count_history'
self.transmit_bytes_per_second_switched = None
self.transmit_packets_per_second_switched = None
class CountHistory(object):
"""
Counter History
.. attribute:: event_end_timestamp
Event End timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: event_start_timestamp
Event Start timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: is_valid
Flag to indicate if this history entry is valid
**type**\: bool
.. attribute:: transmit_number_of_bytes_switched
Number of Bytes switched in this interval
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: transmit_number_of_packets_switched
Number of packets switched in this interval
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.event_end_timestamp = None
self.event_start_timestamp = None
self.is_valid = None
self.transmit_number_of_bytes_switched = None
self.transmit_number_of_packets_switched = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:count-history'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.event_end_timestamp is not None:
return True
if self.event_start_timestamp is not None:
return True
if self.is_valid is not None:
return True
if self.transmit_number_of_bytes_switched is not None:
return True
if self.transmit_number_of_packets_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics.CountHistory']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:base-counter-statistics'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.count_history is not None:
for child_ref in self.count_history:
if child_ref._has_data():
return True
if self.transmit_bytes_per_second_switched is not None:
return True
if self.transmit_packets_per_second_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
if self.interface_name is None:
raise YPYModelError('Key property interface_name is None')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:tunnel[Cisco-IOS-XR-infra-tc-oper:interface-name = ' + str(self.interface_name) + ']'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.interface_name is not None:
return True
if self.base_counter_statistics is not None and self.base_counter_statistics._has_data():
return True
if self.interface_handle is not None:
return True
if self.interface_name_xr is not None:
return True
if self.is_active is not None:
return True
if self.vrfid is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels.Tunnel']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:tunnels'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.tunnel is not None:
for child_ref in self.tunnel:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters.Tunnels']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:counters'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.prefixes is not None and self.prefixes._has_data():
return True
if self.tunnels is not None and self.tunnels._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af.Counters']['meta_info']
@property
def _common_path(self):
if self.af_name is None:
raise YPYModelError('Key property af_name is None')
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:vrf-table/Cisco-IOS-XR-infra-tc-oper:default-vrf/Cisco-IOS-XR-infra-tc-oper:afs/Cisco-IOS-XR-infra-tc-oper:af[Cisco-IOS-XR-infra-tc-oper:af-name = ' + str(self.af_name) + ']'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.af_name is not None:
return True
if self.counters is not None and self.counters._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs.Af']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:vrf-table/Cisco-IOS-XR-infra-tc-oper:default-vrf/Cisco-IOS-XR-infra-tc-oper:afs'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.af is not None:
for child_ref in self.af:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf.Afs']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:vrf-table/Cisco-IOS-XR-infra-tc-oper:default-vrf'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.afs is not None and self.afs._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable.DefaultVrf']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:vrf-table'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.default_vrf is not None and self.default_vrf._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.VrfTable']['meta_info']
class Afs(object):
"""
Address Family specific operational data
.. attribute:: af
Operational data for given Address Family
**type**\: list of :py:class:`Af <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.af = YList()
self.af.parent = self
self.af.name = 'af'
class Af(object):
"""
Operational data for given Address Family
.. attribute:: af_name <key>
Address Family name
**type**\: :py:class:`TcOperAfNameEnum <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TcOperAfNameEnum>`
.. attribute:: counters
Show Counters
**type**\: :py:class:`Counters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.af_name = None
self.counters = TrafficCollector.Afs.Af.Counters()
self.counters.parent = self
class Counters(object):
"""
Show Counters
.. attribute:: prefixes
Prefix Database
**type**\: :py:class:`Prefixes <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Prefixes>`
.. attribute:: tunnels
Tunnels
**type**\: :py:class:`Tunnels <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Tunnels>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.prefixes = TrafficCollector.Afs.Af.Counters.Prefixes()
self.prefixes.parent = self
self.tunnels = TrafficCollector.Afs.Af.Counters.Tunnels()
self.tunnels.parent = self
class Prefixes(object):
"""
Prefix Database
.. attribute:: prefix
Show Prefix Counter
**type**\: list of :py:class:`Prefix <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Prefixes.Prefix>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.prefix = YList()
self.prefix.parent = self
self.prefix.name = 'prefix'
class Prefix(object):
"""
Show Prefix Counter
.. attribute:: base_counter_statistics
Base counter statistics
**type**\: :py:class:`BaseCounterStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics>`
.. attribute:: ipaddr
IP Address
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
.. attribute:: is_active
Prefix is Active and collecting new Statistics
**type**\: bool
.. attribute:: label
Local Label
**type**\: int
**range:** 16..1048575
.. attribute:: label_xr
Label
**type**\: int
**range:** 0..4294967295
.. attribute:: mask
Prefix Mask
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
.. attribute:: prefix
Prefix Address (V4 or V6 Format)
**type**\: str
.. attribute:: traffic_matrix_counter_statistics
Traffic Matrix (TM) counter statistics
**type**\: :py:class:`TrafficMatrixCounterStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.base_counter_statistics = TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics()
self.base_counter_statistics.parent = self
self.ipaddr = None
self.is_active = None
self.label = None
self.label_xr = None
self.mask = None
self.prefix = None
self.traffic_matrix_counter_statistics = TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics()
self.traffic_matrix_counter_statistics.parent = self
class BaseCounterStatistics(object):
"""
Base counter statistics
.. attribute:: count_history
Counter History
**type**\: list of :py:class:`CountHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics.CountHistory>`
.. attribute:: transmit_bytes_per_second_switched
Average Rate of Bytes/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte/s
.. attribute:: transmit_packets_per_second_switched
Average Rate of Packets/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: packet/s
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.count_history = YList()
self.count_history.parent = self
self.count_history.name = 'count_history'
self.transmit_bytes_per_second_switched = None
self.transmit_packets_per_second_switched = None
class CountHistory(object):
"""
Counter History
.. attribute:: event_end_timestamp
Event End timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: event_start_timestamp
Event Start timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: is_valid
Flag to indicate if this history entry is valid
**type**\: bool
.. attribute:: transmit_number_of_bytes_switched
Number of Bytes switched in this interval
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: transmit_number_of_packets_switched
Number of packets switched in this interval
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.event_end_timestamp = None
self.event_start_timestamp = None
self.is_valid = None
self.transmit_number_of_bytes_switched = None
self.transmit_number_of_packets_switched = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:count-history'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.event_end_timestamp is not None:
return True
if self.event_start_timestamp is not None:
return True
if self.is_valid is not None:
return True
if self.transmit_number_of_bytes_switched is not None:
return True
if self.transmit_number_of_packets_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics.CountHistory']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:base-counter-statistics'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.count_history is not None:
for child_ref in self.count_history:
if child_ref._has_data():
return True
if self.transmit_bytes_per_second_switched is not None:
return True
if self.transmit_packets_per_second_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.BaseCounterStatistics']['meta_info']
class TrafficMatrixCounterStatistics(object):
"""
Traffic Matrix (TM) counter statistics
.. attribute:: count_history
Counter History
**type**\: list of :py:class:`CountHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics.CountHistory>`
.. attribute:: transmit_bytes_per_second_switched
Average Rate of Bytes/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte/s
.. attribute:: transmit_packets_per_second_switched
Average Rate of Packets/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: packet/s
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.count_history = YList()
self.count_history.parent = self
self.count_history.name = 'count_history'
self.transmit_bytes_per_second_switched = None
self.transmit_packets_per_second_switched = None
class CountHistory(object):
"""
Counter History
.. attribute:: event_end_timestamp
Event End timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: event_start_timestamp
Event Start timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: is_valid
Flag to indicate if this history entry is valid
**type**\: bool
.. attribute:: transmit_number_of_bytes_switched
Number of Bytes switched in this interval
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: transmit_number_of_packets_switched
Number of packets switched in this interval
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.event_end_timestamp = None
self.event_start_timestamp = None
self.is_valid = None
self.transmit_number_of_bytes_switched = None
self.transmit_number_of_packets_switched = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:count-history'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.event_end_timestamp is not None:
return True
if self.event_start_timestamp is not None:
return True
if self.is_valid is not None:
return True
if self.transmit_number_of_bytes_switched is not None:
return True
if self.transmit_number_of_packets_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics.CountHistory']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:traffic-matrix-counter-statistics'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.count_history is not None:
for child_ref in self.count_history:
if child_ref._has_data():
return True
if self.transmit_bytes_per_second_switched is not None:
return True
if self.transmit_packets_per_second_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Prefixes.Prefix.TrafficMatrixCounterStatistics']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:prefix'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.base_counter_statistics is not None and self.base_counter_statistics._has_data():
return True
if self.ipaddr is not None:
return True
if self.is_active is not None:
return True
if self.label is not None:
return True
if self.label_xr is not None:
return True
if self.mask is not None:
return True
if self.prefix is not None:
return True
if self.traffic_matrix_counter_statistics is not None and self.traffic_matrix_counter_statistics._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Prefixes.Prefix']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:prefixes'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.prefix is not None:
for child_ref in self.prefix:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Prefixes']['meta_info']
class Tunnels(object):
"""
Tunnels
.. attribute:: tunnel
Tunnel information
**type**\: list of :py:class:`Tunnel <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel>`
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.tunnel = YList()
self.tunnel.parent = self
self.tunnel.name = 'tunnel'
class Tunnel(object):
"""
Tunnel information
.. attribute:: interface_name <key>
The Interface Name
**type**\: str
**pattern:** (([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){3,4}\\d+\\.\\d+)\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]\*\\d+))\|(([a\-zA\-Z0\-9\_]\*\\d+/){2}([a\-zA\-Z0\-9\_]+))\|([a\-zA\-Z0\-9\_\-]\*\\d+)\|([a\-zA\-Z0\-9\_\-]\*\\d+\\.\\d+)\|(mpls)\|(dwdm)
.. attribute:: base_counter_statistics
Base counter statistics
**type**\: :py:class:`BaseCounterStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics>`
.. attribute:: interface_handle
Interface handle
**type**\: int
**range:** 0..4294967295
.. attribute:: interface_name_xr
Interface name in Display format
**type**\: str
.. attribute:: is_active
Interface is Active and collecting new Statistics
**type**\: bool
.. attribute:: vrfid
Interface VRF ID
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.interface_name = None
self.base_counter_statistics = TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics()
self.base_counter_statistics.parent = self
self.interface_handle = None
self.interface_name_xr = None
self.is_active = None
self.vrfid = None
class BaseCounterStatistics(object):
"""
Base counter statistics
.. attribute:: count_history
Counter History
**type**\: list of :py:class:`CountHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_infra_tc_oper.TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics.CountHistory>`
.. attribute:: transmit_bytes_per_second_switched
Average Rate of Bytes/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte/s
.. attribute:: transmit_packets_per_second_switched
Average Rate of Packets/second switched
**type**\: int
**range:** 0..18446744073709551615
**units**\: packet/s
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.count_history = YList()
self.count_history.parent = self
self.count_history.name = 'count_history'
self.transmit_bytes_per_second_switched = None
self.transmit_packets_per_second_switched = None
class CountHistory(object):
"""
Counter History
.. attribute:: event_end_timestamp
Event End timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: event_start_timestamp
Event Start timestamp
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: is_valid
Flag to indicate if this history entry is valid
**type**\: bool
.. attribute:: transmit_number_of_bytes_switched
Number of Bytes switched in this interval
**type**\: int
**range:** 0..18446744073709551615
**units**\: byte
.. attribute:: transmit_number_of_packets_switched
Number of packets switched in this interval
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'infra-tc-oper'
_revision = '2015-11-09'
def __init__(self):
self.parent = None
self.event_end_timestamp = None
self.event_start_timestamp = None
self.is_valid = None
self.transmit_number_of_bytes_switched = None
self.transmit_number_of_packets_switched = None
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:count-history'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.event_end_timestamp is not None:
return True
if self.event_start_timestamp is not None:
return True
if self.is_valid is not None:
return True
if self.transmit_number_of_bytes_switched is not None:
return True
if self.transmit_number_of_packets_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics.CountHistory']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:base-counter-statistics'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.count_history is not None:
for child_ref in self.count_history:
if child_ref._has_data():
return True
if self.transmit_bytes_per_second_switched is not None:
return True
if self.transmit_packets_per_second_switched is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel.BaseCounterStatistics']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
if self.interface_name is None:
raise YPYModelError('Key property interface_name is None')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:tunnel[Cisco-IOS-XR-infra-tc-oper:interface-name = ' + str(self.interface_name) + ']'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.interface_name is not None:
return True
if self.base_counter_statistics is not None and self.base_counter_statistics._has_data():
return True
if self.interface_handle is not None:
return True
if self.interface_name_xr is not None:
return True
if self.is_active is not None:
return True
if self.vrfid is not None:
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Tunnels.Tunnel']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:tunnels'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.tunnel is not None:
for child_ref in self.tunnel:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters.Tunnels']['meta_info']
@property
def _common_path(self):
if self.parent is None:
raise YPYModelError('parent is not set. Cannot derive path.')
return self.parent._common_path +'/Cisco-IOS-XR-infra-tc-oper:counters'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.prefixes is not None and self.prefixes._has_data():
return True
if self.tunnels is not None and self.tunnels._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af.Counters']['meta_info']
@property
def _common_path(self):
if self.af_name is None:
raise YPYModelError('Key property af_name is None')
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:afs/Cisco-IOS-XR-infra-tc-oper:af[Cisco-IOS-XR-infra-tc-oper:af-name = ' + str(self.af_name) + ']'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.af_name is not None:
return True
if self.counters is not None and self.counters._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs.Af']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector/Cisco-IOS-XR-infra-tc-oper:afs'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.af is not None:
for child_ref in self.af:
if child_ref._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector.Afs']['meta_info']
@property
def _common_path(self):
return '/Cisco-IOS-XR-infra-tc-oper:traffic-collector'
def is_config(self):
''' Returns True if this instance represents config data else returns False '''
return False
def _has_data(self):
if not self.is_config():
return False
if self.afs is not None and self.afs._has_data():
return True
if self.external_interfaces is not None and self.external_interfaces._has_data():
return True
if self.summary is not None and self.summary._has_data():
return True
if self.vrf_table is not None and self.vrf_table._has_data():
return True
return False
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_infra_tc_oper as meta
return meta._meta_table['TrafficCollector']['meta_info']
import os
from . import extract_features  # explicit relative import, so 'python -m qtim_tools.qtim_features.test' also works on Python 3
def test():
# This code should be run from the folder above the main "qtim_tools" folder using the command "python -m qtim_tools.qtim_features.test"
# All niftis in this folder will be processed. The program searches for a nifti file, and then checks if there is a matching labelmap file with the suffix '-label'.
# It currently loads from some built in data from the qtim_tools project, but you can change the filepath below to anywhere.
test_folder = os.path.abspath(os.path.join(os.path.dirname(__file__),'..','test_data','test_data_features','Phantom_Intensity'))
# If labels is set to False, the whole image will be processed. This can take a very long time for GLCM features especially, so it is best we stick to labels.
labels = True
# The only available features are 'GLCM', 'morphology', and 'statistics' for now.
features = ['GLCM','morphology', 'statistics']
# In order for GLCM to work correctly, an image has to be reduced to a set number of gray levels. Using every available intensity level in an image will most likely produce a useless result.
# More levels will result in more intensive computation.
levels = 100
# This will save a spreadsheet of all requested feature results.
outfile = 'test_feature_results_intensity.csv'
# If your label is for some reason masked with a value other than zero, change this parameter.
mask_value = 0
# The erode parameter will take [x,y,z] pixels off in each dimension. On many volumes, it is not useful to erode in the z (axial) slice because of high slice thickness.
# Currently, the erode parameter only applies to GLCM. It does not apply to intensity statistic features, although maybe it should.
erode = [0,0,0]
# If overwrite is False, then the program will try to save to the chosen filename with '_copy' appended if the chosen filename already exists.
overwrite = True
extract_features.generate_feature_list_batch(folder=test_folder, features=features, labels=labels, levels=levels, outfile=outfile, mask_value=mask_value, erode=erode, overwrite=overwrite)
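# --- Illustrative sketch (not part of qtim_tools) ---------------------------
# The `levels` parameter above implies quantizing image intensities to a fixed
# number of gray levels before GLCM computation. A minimal, self-contained
# version of that quantization step, assuming numpy is available (the function
# name below is hypothetical, not the library's own API):
import numpy as np

def quantize_to_levels(image, levels=100):
    # Map intensities linearly onto the integer bins 0 .. levels - 1.
    image = np.asarray(image, dtype=float)
    lo, hi = image.min(), image.max()
    if hi == lo:
        # A constant image collapses to a single gray level.
        return np.zeros(image.shape, dtype=int)
    binned = np.floor((image - lo) / (hi - lo) * levels).astype(int)
    return np.clip(binned, 0, levels - 1)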
def test_parallel():

    test_folder = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'test_data', 'test_data_features', 'Phantom_GLCM'))
    # NOTE: this hard-coded, machine-specific path overrides the test data folder above.
    test_folder = '/home/administrator/data/tbData/tbType/TrainingSet'

    # Qualified with the module name so the call resolves; generate_feature_list_parallel is assumed to live in extract_features.
    extract_features.generate_feature_list_parallel(folder=test_folder, features=['GLCM', 'morphology', 'statistics'], labels=True, levels=100, outfile='lung_features_results_parallel_500.csv', test=False, mask_value=0, erode=[0, 0, 0], overwrite=True, processes=35)

    return
def parse_command_line(argv):

    # NOTE: argv is currently unused; this function repeats the defaults from test() above.
    # This code should be run from the folder above the main "qtim_tools" folder using the command "python -m qtim_tools.qtim_features.test"
    # All niftis in this folder will be processed. The program searches for a nifti file, and then checks if there is a matching labelmap file with the suffix '-label'.
    # It currently loads some built-in data from the qtim_tools project, but you can change the filepath below to point anywhere.
    test_folder = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'test_data', 'test_data_features', 'Phantom_Intensity'))

    # If labels is set to False, the whole image will be processed. This can take a very long time, especially for GLCM features, so it is best to stick to labels.
    labels = True

    # The only available features are 'GLCM', 'morphology', and 'statistics' for now.
    features = ['GLCM', 'morphology', 'statistics']

    # For GLCM to work correctly, an image has to be reduced to a set number of gray levels. Using all available levels in an image will most likely produce a useless result.
    # More levels result in more intensive computation.
    levels = 100

    # This will save a spreadsheet of all requested feature results.
    outfile = 'test_feature_results_intensity.csv'

    # If your label is for some reason masked with a value other than zero, change this parameter.
    mask_value = 0

    # The erode parameter will take [x, y, z] pixels off in each dimension. On many volumes, it is not useful to erode in the z (axial) dimension because of high slice thickness.
    # Currently, the erode parameter only applies to GLCM. It does not apply to intensity statistics features, although perhaps it should.
    erode = [0, 0, 0]

    # If overwrite is False, the program will save to the chosen filename with '_copy' appended if that filename already exists.
    overwrite = True

    extract_features.generate_feature_list_batch(folder=test_folder, features=features, labels=labels, levels=levels, outfile=outfile, mask_value=mask_value, erode=erode, overwrite=overwrite)
def test_2():

    # This code should be run from the folder above the main "qtim_tools" folder using the command "python -m qtim_tools.qtim_features.test"
    # All niftis in this folder will be processed. The program searches for a nifti file, and then checks if there is a matching labelmap file with the suffix '-label'.
    # It currently loads some built-in data from the qtim_tools project, but you can change the filepath below to point anywhere.
    test_folder = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'test_data', 'test_data_features', 'Phantom_Intensity'))

    # If labels is set to False, the whole image will be processed. This can take a very long time, especially for GLCM features, so it is best to stick to labels.
    labels = True

    # The only available features are 'GLCM', 'morphology', and 'statistics' for now.
    features = ['GLCM', 'morphology', 'statistics']

    # For GLCM to work correctly, an image has to be reduced to a set number of gray levels. Using all available levels in an image will most likely produce a useless result.
    # More levels result in more intensive computation.
    levels = 100

    # This will save a spreadsheet of all requested feature results.
    outfile = 'test_feature_results_intensity.csv'

    # If your label is for some reason masked with a value other than zero, change this parameter.
    mask_value = 0

    # The erode parameter will take [x, y, z] pixels off in each dimension. On many volumes, it is not useful to erode in the z (axial) dimension because of high slice thickness.
    # Currently, the erode parameter only applies to GLCM. It does not apply to intensity statistics features, although perhaps it should.
    erode = [0, 0, 0]

    # If overwrite is False, the program will save to the chosen filename with '_copy' appended if that filename already exists.
    overwrite = True

    extract_features.generate_feature_list_batch(folder=test_folder, features=features, labels=labels, levels=levels, outfile=outfile, mask_value=mask_value, erode=erode, overwrite=overwrite, mode="maximal_slice")

    print('new test now')
if __name__ == '__main__':
    test()
# coding: utf-8
"""
Gitea API.
This documentation describes the Gitea API. # noqa: E501
OpenAPI spec version: 1.15.3
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from gitea_api.api_client import ApiClient
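
# --- Illustrative sketch (not part of the generated client) ------------------
# Several methods below set collection_formats['...'] = 'multi' for
# list-valued query parameters, meaning the parameter name is repeated once
# per value in the query string (e.g. status-types=unread&status-types=pinned).
# The hypothetical helper below shows that encoding using only the Python 3
# standard library; the generated ApiClient does the equivalent internally:
from urllib.parse import urlencode


def _sketch_build_query(pairs):
    """Encode (name, value) pairs, repeating names for list values ('multi')."""
    expanded = []
    for name, value in pairs:
        if isinstance(value, (list, tuple)):
            # 'multi' collection format: one name=value pair per element.
            expanded.extend((name, v) for v in value)
        else:
            expanded.append((name, value))
    return urlencode(expanded)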
class NotificationApi(object):
    """NOTE: This class is auto generated by the swagger code generator program.

    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def notify_get_list(self, **kwargs):  # noqa: E501
        """List user's notification threads  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_get_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool all: If true, show notifications marked as read. Default value is false
        :param list[str] status_types: Show notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread & pinned.
        :param list[str] subject_type: filter notifications by subject type
        :param datetime since: Only show notifications updated after the given time. This is a timestamp in RFC 3339 format
        :param datetime before: Only show notifications updated before the given time. This is a timestamp in RFC 3339 format
        :param int page: page number of results to return (1-based)
        :param int limit: page size of results
        :return: list[NotificationThread]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_get_list_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.notify_get_list_with_http_info(**kwargs)  # noqa: E501
            return data
    def notify_get_list_with_http_info(self, **kwargs):  # noqa: E501
        """List user's notification threads  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_get_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param bool all: If true, show notifications marked as read. Default value is false
        :param list[str] status_types: Show notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread & pinned.
        :param list[str] subject_type: filter notifications by subject type
        :param datetime since: Only show notifications updated after the given time. This is a timestamp in RFC 3339 format
        :param datetime before: Only show notifications updated before the given time. This is a timestamp in RFC 3339 format
        :param int page: page number of results to return (1-based)
        :param int limit: page size of results
        :return: list[NotificationThread]
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['all', 'status_types', 'subject_type', 'since', 'before', 'page', 'limit']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method notify_get_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'all' in params:
            query_params.append(('all', params['all']))  # noqa: E501
        if 'status_types' in params:
            query_params.append(('status-types', params['status_types']))  # noqa: E501
            collection_formats['status-types'] = 'multi'  # noqa: E501
        if 'subject_type' in params:
            query_params.append(('subject-type', params['subject_type']))  # noqa: E501
            collection_formats['subject-type'] = 'multi'  # noqa: E501
        if 'since' in params:
            query_params.append(('since', params['since']))  # noqa: E501
        if 'before' in params:
            query_params.append(('before', params['before']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token']  # noqa: E501

        return self.api_client.call_api(
            '/notifications', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[NotificationThread]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def notify_get_repo_list(self, owner, repo, **kwargs):  # noqa: E501
        """List user's notification threads on a specific repo  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_get_repo_list(owner, repo, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str owner: owner of the repo (required)
        :param str repo: name of the repo (required)
        :param bool all: If true, show notifications marked as read. Default value is false
        :param list[str] status_types: Show notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread & pinned
        :param list[str] subject_type: filter notifications by subject type
        :param datetime since: Only show notifications updated after the given time. This is a timestamp in RFC 3339 format
        :param datetime before: Only show notifications updated before the given time. This is a timestamp in RFC 3339 format
        :param int page: page number of results to return (1-based)
        :param int limit: page size of results
        :return: list[NotificationThread]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_get_repo_list_with_http_info(owner, repo, **kwargs)  # noqa: E501
        else:
            (data) = self.notify_get_repo_list_with_http_info(owner, repo, **kwargs)  # noqa: E501
            return data
    def notify_get_repo_list_with_http_info(self, owner, repo, **kwargs):  # noqa: E501
        """List user's notification threads on a specific repo  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_get_repo_list_with_http_info(owner, repo, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str owner: owner of the repo (required)
        :param str repo: name of the repo (required)
        :param bool all: If true, show notifications marked as read. Default value is false
        :param list[str] status_types: Show notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread & pinned
        :param list[str] subject_type: filter notifications by subject type
        :param datetime since: Only show notifications updated after the given time. This is a timestamp in RFC 3339 format
        :param datetime before: Only show notifications updated before the given time. This is a timestamp in RFC 3339 format
        :param int page: page number of results to return (1-based)
        :param int limit: page size of results
        :return: list[NotificationThread]
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['owner', 'repo', 'all', 'status_types', 'subject_type', 'since', 'before', 'page', 'limit']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method notify_get_repo_list" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'owner' is set
        if self.api_client.client_side_validation and ('owner' not in params or
                                                       params['owner'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `owner` when calling `notify_get_repo_list`")  # noqa: E501
        # verify the required parameter 'repo' is set
        if self.api_client.client_side_validation and ('repo' not in params or
                                                       params['repo'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `repo` when calling `notify_get_repo_list`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'owner' in params:
            path_params['owner'] = params['owner']  # noqa: E501
        if 'repo' in params:
            path_params['repo'] = params['repo']  # noqa: E501

        query_params = []
        if 'all' in params:
            query_params.append(('all', params['all']))  # noqa: E501
        if 'status_types' in params:
            query_params.append(('status-types', params['status_types']))  # noqa: E501
            collection_formats['status-types'] = 'multi'  # noqa: E501
        if 'subject_type' in params:
            query_params.append(('subject-type', params['subject_type']))  # noqa: E501
            collection_formats['subject-type'] = 'multi'  # noqa: E501
        if 'since' in params:
            query_params.append(('since', params['since']))  # noqa: E501
        if 'before' in params:
            query_params.append(('before', params['before']))  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token']  # noqa: E501

        return self.api_client.call_api(
            '/repos/{owner}/{repo}/notifications', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[NotificationThread]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def notify_get_thread(self, id, **kwargs):  # noqa: E501
        """Get notification thread by ID  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_get_thread(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id of notification thread (required)
        :return: NotificationThread
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_get_thread_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.notify_get_thread_with_http_info(id, **kwargs)  # noqa: E501
            return data
    def notify_get_thread_with_http_info(self, id, **kwargs):  # noqa: E501
        """Get notification thread by ID  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_get_thread_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id of notification thread (required)
        :return: NotificationThread
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method notify_get_thread" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `notify_get_thread`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token']  # noqa: E501

        return self.api_client.call_api(
            '/notifications/threads/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NotificationThread',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def notify_new_available(self, **kwargs):  # noqa: E501
        """Check if unread notifications exist  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_new_available(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :return: NotificationCount
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_new_available_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.notify_new_available_with_http_info(**kwargs)  # noqa: E501
            return data
    def notify_new_available_with_http_info(self, **kwargs):  # noqa: E501
        """Check if unread notifications exist  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_new_available_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :return: NotificationCount
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = []  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method notify_new_available" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'text/html'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'text/plain'])  # noqa: E501

        # Authentication setting
        auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token']  # noqa: E501

        return self.api_client.call_api(
            '/notifications/new', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NotificationCount',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def notify_read_list(self, **kwargs):  # noqa: E501
        """Mark notification threads as read, pinned or unread  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_read_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param datetime last_read_at: Describes the last point that notifications were checked. Anything updated since this time will not be updated.
        :param str all: If true, mark all notifications on this repo. Default value is false
        :param list[str] status_types: Mark notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread.
        :param str to_status: Status to mark notifications as. Defaults to read.
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_read_list_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.notify_read_list_with_http_info(**kwargs)  # noqa: E501
            return data
    def notify_read_list_with_http_info(self, **kwargs):  # noqa: E501
        """Mark notification threads as read, pinned or unread  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_read_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param datetime last_read_at: Describes the last point that notifications were checked. Anything updated since this time will not be updated.
        :param str all: If true, mark all notifications on this repo. Default value is false
        :param list[str] status_types: Mark notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread.
        :param str to_status: Status to mark notifications as. Defaults to read.
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['last_read_at', 'all', 'status_types', 'to_status']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method notify_read_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'last_read_at' in params:
            query_params.append(('last_read_at', params['last_read_at']))  # noqa: E501
        if 'all' in params:
            query_params.append(('all', params['all']))  # noqa: E501
        if 'status_types' in params:
            query_params.append(('status-types', params['status_types']))  # noqa: E501
            collection_formats['status-types'] = 'multi'  # noqa: E501
        if 'to_status' in params:
            query_params.append(('to-status', params['to_status']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token']  # noqa: E501

        return self.api_client.call_api(
            '/notifications', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def notify_read_repo_list(self, owner, repo, **kwargs):  # noqa: E501
        """Mark notification threads as read, pinned or unread on a specific repo  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_read_repo_list(owner, repo, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str owner: owner of the repo (required)
        :param str repo: name of the repo (required)
        :param str all: If true, mark all notifications on this repo. Default value is false
        :param list[str] status_types: Mark notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread.
        :param str to_status: Status to mark notifications as. Defaults to read.
        :param datetime last_read_at: Describes the last point that notifications were checked. Anything updated since this time will not be updated.
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_read_repo_list_with_http_info(owner, repo, **kwargs)  # noqa: E501
        else:
            (data) = self.notify_read_repo_list_with_http_info(owner, repo, **kwargs)  # noqa: E501
            return data
    def notify_read_repo_list_with_http_info(self, owner, repo, **kwargs):  # noqa: E501
        """Mark notification threads as read, pinned or unread on a specific repo  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_read_repo_list_with_http_info(owner, repo, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str owner: owner of the repo (required)
        :param str repo: name of the repo (required)
        :param str all: If true, mark all notifications on this repo. Default value is false
        :param list[str] status_types: Mark notifications with the provided status types. Options are: unread, read and/or pinned. Defaults to unread.
        :param str to_status: Status to mark notifications as. Defaults to read.
        :param datetime last_read_at: Describes the last point that notifications were checked. Anything updated since this time will not be updated.
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['owner', 'repo', 'all', 'status_types', 'to_status', 'last_read_at']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method notify_read_repo_list" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'owner' is set
        if self.api_client.client_side_validation and ('owner' not in params or
                                                       params['owner'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `owner` when calling `notify_read_repo_list`")  # noqa: E501
        # verify the required parameter 'repo' is set
        if self.api_client.client_side_validation and ('repo' not in params or
                                                       params['repo'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `repo` when calling `notify_read_repo_list`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'owner' in params:
            path_params['owner'] = params['owner']  # noqa: E501
        if 'repo' in params:
            path_params['repo'] = params['repo']  # noqa: E501

        query_params = []
        if 'all' in params:
            query_params.append(('all', params['all']))  # noqa: E501
        if 'status_types' in params:
            query_params.append(('status-types', params['status_types']))  # noqa: E501
            collection_formats['status-types'] = 'multi'  # noqa: E501
        if 'to_status' in params:
            query_params.append(('to-status', params['to_status']))  # noqa: E501
        if 'last_read_at' in params:
            query_params.append(('last_read_at', params['last_read_at']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token']  # noqa: E501

        return self.api_client.call_api(
            '/repos/{owner}/{repo}/notifications', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def notify_read_thread(self, id, **kwargs):  # noqa: E501
        """Mark notification thread as read by ID  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.notify_read_thread(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id of notification thread (required)
        :param str to_status: Status to mark notifications as
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.notify_read_thread_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.notify_read_thread_with_http_info(id, **kwargs)  # noqa: E501
            return data
def notify_read_thread_with_http_info(self, id, **kwargs): # noqa: E501
"""Mark notification thread as read by ID # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.notify_read_thread_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id of notification thread (required)
:param str to_status: Status to mark notifications as
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'to_status'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method notify_read_thread" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `notify_read_thread`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'to_status' in params:
query_params.append(('to-status', params['to_status'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['AccessToken', 'AuthorizationHeaderToken', 'BasicAuth', 'SudoHeader', 'SudoParam', 'TOTPHeader', 'Token'] # noqa: E501
return self.api_client.call_api(
'/notifications/threads/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
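The generated `*_with_http_info` methods guard against typos by checking every keyword argument against an allow-list before building the request. The pattern, reduced to a standalone sketch (the helper name is made up for illustration):

```python
def check_kwargs(method_name, allowed, kwargs):
    """Raise TypeError for any keyword argument outside the allow-list,
    mirroring the guard loop in the generated client methods."""
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
    return dict(kwargs)

params = check_kwargs('notify_read_thread', ['id', 'to_status'], {'to_status': 'read'})
```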
# File: Modell/CSV/diagramm_sprache_genauigkeit.py (repo: NoahEmbedded/EmbeddedKWD, MIT license)
import matplotlib.pyplot as plt
import numpy as np
# input = speech (Sprache)
# TensorFlow data
# output: speech
spIn_spOut = np.array([100.0,100.0,0.61,99.84,99.17,100.0,98.2,100.0,100.0,100.0,92.67,99.98,100.0,99.84,100.0,96.43,99.99,100.0,99.97,97.62,100.0,100.0,92.77,100.0,99.93,96.97,100.0,100.0,100.0,100.0,94.32,97.25,88.54,100.0,78.25,100.0,100.0,82.26,100.0,100.0,99.99,11.78,77.43,100.0,100.0,99.69,100.0,99.66,99.99,100.0,100.0,98.32,2.45,100.0,4.2,26.09,98.98,96.09,94.92,82.16,100.0,99.99,100.0,100.0,4.92,1.98,100.0,86.59,97.08,100.0,99.94,100.0,100.0,100.0,100.0,100.0,99.97,11.96,100.0,85.7,93.36,100.0,99.95,0.76,100.0,100.0,100.0,100.0,99.99,100.0,93.68,100.0,79.97,100.0,100.0,100.0,99.98,100.0,99.93,100.0,50.76,100.0,100.0,76.21,99.89,100.0,100.0,76.73,17.59,100.0,66.92,95.29,100.0,97.4,99.69,99.85,99.99,100.0,100.0,1.11,98.35,100.0,100.0,99.75,99.98,100.0,100.0,100.0,100.0,100.0,100.0,98.49,100.0,99.99,12.14,100.0,100.0,71.05,100.0,99.96,100.0,99.67,99.98,100.0,99.19,97.06,97.42,86.73,55.24,35.23,100.0,100.0,100.0,100.0,14.57,100.0,100.0,99.89,100.0,99.87,100.0,100.0,100.0,100.0,99.99,100.0,100.0,100.0,99.92,70.29,100.0,100.0,100.0,99.98,99.11,100.0,100.0,100.0,99.98,56.94,98.76,92.32,1.88,99.69,99.96,99.74,99.01,96.03,59.72,100.0,100.0,0.77,100.0,100.0,100.0,15.04,100.0,47.34,100.0,96.9,100.0,99.68,100.0,65.52,100.0,99.03,94.87,100.0,98.33,99.24,99.9,100.0,4.81,100.0,100.0,95.19,99.51,99.92,1.18,100.0,100.0,100.0,96.91,99.99,4.31,0.0,92.39,100.0,98.56,100.0,100.0,100.0,0.0,100.0,56.89,100.0,100.0,100.0,100.0,97.98,4.64,34.55,99.71,99.22,72.04,99.92,97.49,100.0,100.0,100.0,96.51,7.58,99.96,100.0,99.96,100.0,0.69,100.0,99.74,0.66,99.99,100.0,99.81,100.0,0.04,100.0,100.0,100.0,99.99,99.54,57.23,99.71,100.0,99.97,100.0,99.94,7.01,100.0,100.0,100.0,99.63,100.0,100.0,100.0,98.88,98.31,99.89,100.0,98.73,99.96,100.0,99.99,99.82,100.0,99.99,99.86,99.98,99.54,100.0,92.38])
#output marvin
spIn_maOut = np.array([0.0,0.0,99.39,0.16,0.83,0.0,1.8,0.0,0.0,0.0,7.33,0.02,0.0,0.16,0.0,3.57,0.01,0.0,0.03,2.38,0.0,0.0,7.23,0.0,0.07,3.03,0.0,0.0,0.0,0.0,5.68,2.75,11.46,0.0,21.75,0.0,0.0,17.74,0.0,0.0,0.01,88.22,22.57,0.0,0.0,0.31,0.0,0.34,0.01,0.0,0.0,1.68,97.55,0.0,95.8,73.91,1.02,3.91,5.08,17.84,0.0,0.01,0.0,0.0,95.08,98.02,0.0,13.41,2.92,0.0,0.06,0.0,0.0,0.0,0.0,0.0,0.03,88.04,0.0,14.3,6.64,0.0,0.05,99.24,0.0,0.0,0.0,0.0,0.01,0.0,6.32,0.0,20.03,0.0,0.0,0.0,0.02,0.0,0.07,0.0,49.24,0.0,0.0,23.79,0.11,0.0,0.0,23.27,82.41,0.0,33.08,4.71,0.0,2.6,0.31,0.15,0.01,0.0,0.0,98.89,1.65,0.0,0.0,0.25,0.02,0.0,0.0,0.0,0.0,0.0,0.0,1.51,0.0,0.01,87.86,0.0,0.0,28.95,0.0,0.04,0.0,0.33,0.02,0.0,0.81,2.94,2.58,13.27,44.76,64.77,0.0,0.0,0.0,0.0,85.43,0.0,0.0,0.11,0.0,0.13,0.0,0.0,0.0,0.0,0.01,0.0,0.0,0.0,0.08,29.71,0.0,0.0,0.0,0.02,0.89,0.0,0.0,0.0,0.02,42.68,1.24,7.68,98.12,0.31,0.04,0.26,0.99,3.97,40.28,0.0,0.0,99.23,0.0,0.0,0.0,84.96,0.0,52.66,0.0,3.1,0.0,0.32,0.0,34.48,0.0,0.97,5.13,0.0,1.67,0.76,0.1,0.0,95.19,0.0,0.0,4.81,0.49,0.08,0.0,0.0,0.0,0.0,3.09,0.01,95.69,100.0,7.61,0.0,1.44,0.0,0.0,0.0,100.0,0.0,43.11,0.0,0.0,0.0,0.0,2.02,95.36,65.45,0.29,0.78,27.96,0.08,2.51,0.0,0.0,0.0,3.49,92.42,0.04,0.0,0.04,0.0,99.31,0.0,0.26,99.34,0.01,0.0,0.19,0.0,99.96,0.0,0.0,0.0,0.01,0.46,42.77,0.29,0.0,0.03,0.0,0.06,92.99,0.0,0.0,0.0,0.29,0.0,0.0,0.0,1.12,1.69,0.11,0.0,1.27,0.04,0.0,0.01,0.18,0.0,0.01,0.14,0.02,0.46,0.0,7.62])
# output: silence
spIn_stOut = np.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.38,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,98.82,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.08,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
verteilungSP = [0]*10
verteilungMA = [0]*10
verteilungST = [0]*10
# sp_out distribution
for i in spIn_spOut:
    # upper edge inclusive: i <= 10 -> bin 0, ..., 90 < i <= 100 -> bin 9
    for idx in range(10):
        if i <= (idx + 1) * 10:
            verteilungSP[idx] += 1
            break
# ma_out distribution
for i in spIn_maOut:
    for idx in range(10):
        if i <= (idx + 1) * 10:
            verteilungMA[idx] += 1
            break
# st_out distribution
for i in spIn_stOut:
    for idx in range(10):
        if i <= (idx + 1) * 10:
            verteilungST[idx] += 1
            break
# TensorFlow Lite data
# output: speech
lite_spIn_spOut = np.array([100.0,100.0,0.61,99.84,99.17,100.0,98.2,100.0,100.0,100.0,92.67,99.98,100.0,99.84,100.0,96.43,99.99,100.0,99.97,97.62,100.0,100.0,92.77,100.0,99.93,96.97,100.0,100.0,100.0,100.0,94.32,97.25,88.54,100.0,78.25,100.0,100.0,82.26,100.0,100.0,99.99,11.78,77.43,100.0,100.0,99.69,100.0,99.66,99.99,100.0,100.0,98.32,2.45,100.0,4.2,26.09,98.98,96.09,94.92,82.16,100.0,99.99,100.0,100.0,4.92,1.98,100.0,86.59,97.08,100.0,99.94,100.0,100.0,100.0,100.0,100.0,99.97,11.96,100.0,85.7,93.36,100.0,99.95,0.76,100.0,100.0,100.0,100.0,99.99,100.0,93.68,100.0,79.97,100.0,100.0,100.0,99.98,100.0,99.93,100.0,50.76,100.0,100.0,76.21,99.89,100.0,100.0,76.73,17.59,100.0,66.92,95.29,100.0,97.4,99.69,99.85,99.99,100.0,100.0,1.11,98.35,100.0,100.0,99.75,99.98,100.0,100.0,100.0,100.0,100.0,100.0,98.49,100.0,99.99,12.14,100.0,100.0,71.05,100.0,99.96,100.0,99.67,99.98,100.0,99.19,97.06,97.42,86.73,55.24,35.23,100.0,100.0,100.0,100.0,14.57,100.0,100.0,99.89,100.0,99.87,100.0,100.0,100.0,100.0,99.99,100.0,100.0,100.0,99.92,70.29,100.0,100.0,100.0,99.98,99.11,100.0,100.0,100.0,99.98,56.94,98.76,92.32,1.88,99.69,99.96,99.74,99.01,96.03,59.72,100.0,100.0,0.77,100.0,100.0,100.0,15.04,100.0,47.34,100.0,96.9,100.0,99.68,100.0,65.52,100.0,99.03,94.87,100.0,98.33,99.24,99.9,100.0,4.81,100.0,100.0,95.19,99.51,99.92,1.18,100.0,100.0,100.0,96.91,99.99,4.31,0.0,92.39,100.0,98.56,100.0,100.0,100.0,0.0,100.0,56.89,100.0,100.0,100.0,100.0,97.98,4.64,34.56,99.71,99.22,72.04,99.92,97.49,100.0,100.0,100.0,96.51,7.58,99.96,100.0,99.96,100.0,0.69,100.0,99.74,0.66,99.99,100.0,99.81,100.0,0.04,100.0,100.0,100.0,99.99,99.54,57.23,99.71,100.0,99.97,100.0,99.94,7.01,100.0,100.0,100.0,99.63,100.0,100.0,100.0,98.88,98.31,99.89,100.0,98.73,99.96,100.0,99.99,99.82,100.0,99.99,99.86,99.98,99.54,100.0,92.38])
#output marvin
lite_spIn_maOut = np.array([0.0,0.0,99.39,0.16,0.83,0.0,1.8,0.0,0.0,0.0,7.33,0.02,0.0,0.16,0.0,3.57,0.01,0.0,0.03,2.38,0.0,0.0,7.23,0.0,0.07,3.03,0.0,0.0,0.0,0.0,5.68,2.75,11.46,0.0,21.75,0.0,0.0,17.74,0.0,0.0,0.01,88.22,22.57,0.0,0.0,0.31,0.0,0.34,0.01,0.0,0.0,1.68,97.55,0.0,95.8,73.91,1.02,3.91,5.08,17.84,0.0,0.01,0.0,0.0,95.08,98.02,0.0,13.41,2.92,0.0,0.06,0.0,0.0,0.0,0.0,0.0,0.03,88.04,0.0,14.3,6.64,0.0,0.05,99.24,0.0,0.0,0.0,0.0,0.01,0.0,6.32,0.0,20.03,0.0,0.0,0.0,0.02,0.0,0.07,0.0,49.24,0.0,0.0,23.79,0.11,0.0,0.0,23.27,82.41,0.0,33.08,4.71,0.0,2.6,0.31,0.15,0.01,0.0,0.0,98.89,1.65,0.0,0.0,0.25,0.02,0.0,0.0,0.0,0.0,0.0,0.0,1.51,0.0,0.01,87.86,0.0,0.0,28.95,0.0,0.04,0.0,0.33,0.02,0.0,0.81,2.94,2.58,13.27,44.76,64.77,0.0,0.0,0.0,0.0,85.43,0.0,0.0,0.11,0.0,0.13,0.0,0.0,0.0,0.0,0.01,0.0,0.0,0.0,0.08,29.71,0.0,0.0,0.0,0.02,0.89,0.0,0.0,0.0,0.02,42.68,1.24,7.68,98.12,0.31,0.04,0.26,0.99,3.97,40.28,0.0,0.0,99.23,0.0,0.0,0.0,84.96,0.0,52.66,0.0,3.1,0.0,0.32,0.0,34.48,0.0,0.97,5.13,0.0,1.67,0.76,0.1,0.0,95.19,0.0,0.0,4.81,0.49,0.08,0.0,0.0,0.0,0.0,3.09,0.01,95.69,100.0,7.61,0.0,1.44,0.0,0.0,0.0,100.0,0.0,43.11,0.0,0.0,0.0,0.0,2.02,95.36,65.44,0.29,0.78,27.96,0.08,2.51,0.0,0.0,0.0,3.49,92.42,0.04,0.0,0.04,0.0,99.31,0.0,0.26,99.34,0.01,0.0,0.19,0.0,99.96,0.0,0.0,0.0,0.01,0.46,42.77,0.29,0.0,0.03,0.0,0.06,92.99,0.0,0.0,0.0,0.29,0.0,0.0,0.0,1.12,1.69,0.11,0.0,1.27,0.04,0.0,0.01,0.18,0.0,0.01,0.14,0.02,0.46,0.0,7.62])
# output: silence
lite_spIn_stOut = np.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.38,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,98.82,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.08,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
lite_verteilungSP = [0]*10
lite_verteilungMA = [0]*10
lite_verteilungST = [0]*10
# sp_out distribution
for i in lite_spIn_spOut:
    for idx in range(10):
        if i <= (idx + 1) * 10:
            lite_verteilungSP[idx] += 1
            break
# ma_out distribution
for i in lite_spIn_maOut:
    for idx in range(10):
        if i <= (idx + 1) * 10:
            lite_verteilungMA[idx] += 1
            break
# st_out distribution
for i in lite_spIn_stOut:
    for idx in range(10):
        if i <= (idx + 1) * 10:
            lite_verteilungST[idx] += 1
            break
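All six interval-count loops share one binning rule: a value lands in the lowest bin whose upper edge it does not exceed (upper edge inclusive, so exactly 10.0 falls in the first bin). A stdlib-only sketch of that rule, with hypothetical names:

```python
def bin_index(value):
    """Return the 10-wide bin for value: <= 10 -> 0, (10, 20] -> 1, ..., (90, 100] -> 9."""
    for idx in range(10):
        if value <= (idx + 1) * 10:
            return idx
    return 9  # clamp anything above 100

counts = [0] * 10
for v in [4.2, 11.78, 55.24, 99.9, 100.0]:
    counts[bin_index(v)] += 1  # counts == [1, 1, 0, 0, 0, 1, 0, 0, 0, 2]
```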
# plot the data
labels = ["0-10","10-20","20-30","30-40","40-50","50-60","60-70","70-80","80-90","90-100"]
X = np.arange(10)
fig,axs = plt.subplots(1,2)
fig.suptitle("Ergebnisverteilung für Input = Sprache",fontsize="xx-large")
#plot tf
axs[0].set_title("Tensorflow")
axs[0].set_xlabel("Ergebnis in %",fontsize="large")
axs[0].set_ylabel("Sampleanzahl",fontsize="large")
axs[0].bar(x = X+0,height = verteilungSP,width=0.25,color = "b",label = "Output = Sprache")
axs[0].bar(x = X+0.25,height = verteilungMA,width=0.25,color = "g",label = "Output = Marvin Go")
axs[0].bar(x = X+0.5,height = verteilungST,width=0.25,color = "r",label = "Output = Stille")
axs[0].legend()
axs[0].yaxis.grid(True,linestyle = "--")
axs[0].set_xticks(X+0.25)
axs[0].set_xticklabels(labels)
#plot tf_lite
axs[1].set_title("Tensorflow Lite")
axs[1].set_xlabel("Ergebnis in %",fontsize="large")
axs[1].set_ylabel("Sampleanzahl",fontsize="large")
axs[1].bar(x = X+0,height = lite_verteilungSP,width=0.25,color = "b",label = "Output = Sprache")
axs[1].bar(x = X+0.25,height = lite_verteilungMA,width=0.25,color = "g",label = "Output = Marvin Go")
axs[1].bar(x = X+0.5,height = lite_verteilungST,width=0.25,color = "r",label = "Output = Stille")
axs[1].legend()
axs[1].yaxis.grid(True,linestyle = "--")
axs[1].set_xticks(X+0.25)
axs[1].set_xticklabels(labels)
plt.show()

# File: measurement.py (repo: tasteer/ImageCompletion_IncompleteData, MIT license)
import tensorflow as tf
import numpy as np
import scipy.stats as st
#A randomly chosen patch is set to zero
def block_patch(input, k_size=32):
shape = input.get_shape().as_list()
#for training images
if len(shape) == 3:
patch = tf.zeros([k_size, k_size, shape[-1]], dtype=tf.float32)
rand_num = tf.random_uniform([2], minval=0, maxval=shape[0]-k_size, dtype=tf.int32)
h_, w_ = rand_num[0], rand_num[1]
padding = [[h_, shape[0]-h_-k_size], [w_, shape[1]-w_-k_size], [0, 0]]
padded = tf.pad(patch, padding, "CONSTANT", constant_values=1)
res = tf.multiply(input, padded) + (1-padded)
#for generated images
else:
patch = tf.zeros([k_size, k_size, shape[-1]], dtype=tf.float32)
res = []
    for idx in range(0, shape[0]):
        # for a batched [N, H, W, C] input, height and width are shape[1] and shape[2]
        rand_num = tf.random_uniform([2], minval=0, maxval=shape[1]-k_size, dtype=tf.int32)
        h_, w_ = rand_num[0], rand_num[1]
        padding = [[h_, shape[1]-h_-k_size], [w_, shape[2]-w_-k_size], [0, 0]]
padded = tf.pad(patch, padding, "CONSTANT", constant_values=1)
res.append(tf.multiply(input[idx], padded) + (1-padded))
res = tf.stack(res)
return res, padded
#All pixels outside a randomly chosen patch are set to zero
def keep_patch(input, k_size=32):
shape = input.get_shape().as_list()
#for training images
if len(shape) == 3:
#generate a patch
patch = tf.ones([k_size, k_size, shape[-1]], dtype=tf.float32)
#add padding of 0 randomly to all sides (size should not be greater than the image)
rand_num = tf.random_uniform([2], minval=0, maxval=shape[0]-k_size, dtype=tf.int32)
h_, w_ = rand_num[0], rand_num[1]
padding = [[h_, shape[0]-h_-k_size], [w_, shape[1]-w_-k_size], [0, 0]]
padded = tf.pad(patch, padding, "CONSTANT", constant_values=0)
res = tf.multiply(input, padded) + (1-padded)
#for generated images
else:
patch = tf.ones([k_size, k_size, shape[-1]], dtype=tf.float32)
res = []
    for idx in range(0, shape[0]):
        # for a batched [N, H, W, C] input, height and width are shape[1] and shape[2]
        rand_num = tf.random_uniform([2], minval=0, maxval=shape[1]-k_size, dtype=tf.int32)
        h_, w_ = rand_num[0], rand_num[1]
        padding = [[h_, shape[1]-h_-k_size], [w_, shape[2]-w_-k_size], [0, 0]]
padded = tf.pad(patch, padding, "CONSTANT", constant_values=0)
res.append(tf.multiply(input[idx], padded) + (1-padded))
res = tf.stack(res)
return res, padded
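The two helpers above differ only in which region of the random square keeps its pixels. A framework-free sketch of that shared masking idea (plain Python lists instead of tensors; names and sizes are illustrative):

```python
import random

def make_mask(h, w, k_size, keep=False):
    """Binary h x w mask: zeros inside a random k_size square and ones outside
    (block mode), or the complement (keep mode)."""
    top = random.randint(0, h - k_size)
    left = random.randint(0, w - k_size)
    inside = 1 if keep else 0
    return [[inside if top <= r < top + k_size and left <= c < left + k_size else 1 - inside
             for c in range(w)] for r in range(h)]

mask = make_mask(8, 8, 3)  # block mode: exactly a 3x3 patch of zeros
assert sum(v for row in mask for v in row) == 8 * 8 - 9
```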
# File: tests/test_prioritytype.py (repo: cznewt/pyanp, MIT license)
from unittest import TestCase
from pyanp.prioritizer import PriorityType, priority_type_default
import numpy as np
import pandas as pd
class TestPriorityType(TestCase):
def test_crud(self):
p = priority_type_default()
self.assertEqual(p, PriorityType.RAW)
    def test_lists(self):
        lvals = [1, 3, 2, 4]

        raw = PriorityType.RAW.apply(lvals)
        # apply() always returns a copy of the same type
        self.assertFalse(raw is lvals)
        self.assertTrue(isinstance(raw, type(lvals)))
        # RAW leaves the values untouched
        np.testing.assert_array_equal(lvals, raw)

        normalize = PriorityType.NORMALIZE.apply(lvals)
        self.assertFalse(normalize is lvals)
        self.assertTrue(isinstance(normalize, type(lvals)))
        # NORMALIZE divides by the sum (1 + 3 + 2 + 4 = 10)
        np.testing.assert_array_equal(normalize, np.array(lvals) / 10.0)

        ideal = PriorityType.IDEALIZE.apply(lvals)
        self.assertFalse(ideal is lvals)
        self.assertTrue(isinstance(ideal, type(lvals)))
        # IDEALIZE divides by the maximum (4)
        np.testing.assert_array_equal(ideal, np.array(lvals) / 4.0)
    def test_nparray(self):
        lvals = np.array([1, 3, 2, 4])

        raw = PriorityType.RAW.apply(lvals)
        # apply() always returns a copy of the same type
        self.assertFalse(raw is lvals)
        self.assertTrue(isinstance(raw, type(lvals)))
        # RAW leaves the values untouched
        np.testing.assert_array_equal(lvals, raw)

        normalize = PriorityType.NORMALIZE.apply(lvals)
        self.assertFalse(normalize is lvals)
        self.assertTrue(isinstance(normalize, type(lvals)))
        # NORMALIZE divides by the sum (1 + 3 + 2 + 4 = 10)
        np.testing.assert_array_equal(normalize, np.array(lvals) / 10.0)

        ideal = PriorityType.IDEALIZE.apply(lvals)
        self.assertFalse(ideal is lvals)
        self.assertTrue(isinstance(ideal, type(lvals)))
        # IDEALIZE divides by the maximum (4)
        np.testing.assert_array_equal(ideal, np.array(lvals) / 4.0)
    def test_series(self):
        lvals = pd.Series(data=[1, 3, 2, 4], index=['Bill', 'John', 'Dan', 'Keith'])

        raw = PriorityType.RAW.apply(lvals)
        # apply() always returns a copy of the same type
        self.assertFalse(raw is lvals)
        self.assertTrue(isinstance(raw, type(lvals)))
        # RAW leaves the values untouched
        np.testing.assert_array_equal(lvals, raw)

        normalize = PriorityType.NORMALIZE.apply(lvals)
        self.assertFalse(normalize is lvals)
        self.assertTrue(isinstance(normalize, type(lvals)))
        # NORMALIZE divides by the sum (1 + 3 + 2 + 4 = 10)
        np.testing.assert_array_equal(normalize, np.array(lvals) / 10.0)

        ideal = PriorityType.IDEALIZE.apply(lvals)
        self.assertFalse(ideal is lvals)
        self.assertTrue(isinstance(ideal, type(lvals)))
        # IDEALIZE divides by the maximum (4)
        np.testing.assert_array_equal(ideal, np.array(lvals) / 4.0)
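The expected values asserted in these tests encode the two transforms directly: NORMALIZE divides by the sum (10) and IDEALIZE by the maximum (4). A stdlib sketch of that arithmetic (not the pyanp implementation, just the behavior the tests pin down):

```python
def normalize(values):
    """Scale so the entries sum to 1 (divide by the total)."""
    total = sum(values)
    return [v / total for v in values]

def idealize(values):
    """Scale so the largest entry is 1 (divide by the maximum)."""
    top = max(values)
    return [v / top for v in values]

assert normalize([1, 3, 2, 4]) == [0.1, 0.3, 0.2, 0.4]
assert idealize([1, 3, 2, 4]) == [0.25, 0.75, 0.5, 1.0]
```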
# File: tests/test_api.py (repo: cazier/jeopardy, MIT license)
import json
import pathlib
import pytest
import requests
from jeopardy import web, config
from jeopardy.api import models
API_VERSION = config.api_version
@pytest.fixture
def testclient():
jeopardy = web.create_app()
jeopardy.config["SQLALCHEMY_DATABASE_URI"] = f"sqlite:///{pathlib.Path('tests/files/test-full.db').absolute()}"
jeopardy.config["TESTING"] = True
with jeopardy.test_client() as client:
yield client
@pytest.fixture
def api_emptyclient():
jeopardy = web.create_app()
jeopardy.config["SQLALCHEMY_DATABASE_URI"] = f"sqlite:///{pathlib.Path('tests/files/test-empty.db').absolute()}"
jeopardy.config["TESTING"] = True
with jeopardy.app_context():
models.db.create_all()
with jeopardy.test_client() as client:
yield client
models.db.drop_all()
def test_get_details_methods(api_emptyclient):
rv = {
api_emptyclient.post(f"/api/v{API_VERSION}/details").status_code,
api_emptyclient.delete(f"/api/v{API_VERSION}/details").status_code,
api_emptyclient.put(f"/api/v{API_VERSION}/details").status_code,
api_emptyclient.patch(f"/api/v{API_VERSION}/details").status_code,
}
assert rv == {405}
def test_empty_client(api_emptyclient):
for endpoint in [f"/api/v{API_VERSION}/details", f"/api/v{API_VERSION}/show", f"/api/v{API_VERSION}/set"]:
rv = api_emptyclient.get(endpoint)
assert rv.status_code == 404
assert "no items" in rv.get_json()["message"]
def test_invalid_endpoint(api_emptyclient):
rv = api_emptyclient.get(f"/api/v{API_VERSION}/alex")
assert rv.status_code == 404
def test_get_details(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/details")
assert rv.status_code == 200
assert rv.get_json()["sets"]["total"] == len(test_data)
assert list(rv.get_json().keys()) == ["air_dates", "categories", "sets", "shows"]
def test_pagination(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/set", query_string={"number": 13, "start": 4})
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == 13
rv = testclient.get(f"/api/v{API_VERSION}/set", query_string={"start": 100})
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(test_data) - 100
rv = testclient.get(f"/api/v{API_VERSION}/set", query_string={"start": 200})
assert rv.status_code == 400
assert rv.get_json()["message"] == "start number too great"
def test_sets_including_by_id(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/set")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == 100
rv = testclient.get(f"/api/v{API_VERSION}/set/id/1")
assert rv.status_code == 200
assert rv.get_json() == dict(test_data[0], **{"id": 1})
rv = testclient.get(f"/api/v{API_VERSION}/set/id/200")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
def test_set_changes(testclient, test_data):
set_ = test_data[-1]
rv = testclient.delete(f"/api/v{API_VERSION}/set/id/119")
assert rv.status_code == 200
assert rv.get_json() == {"deleted": 119}
rv = testclient.post(f"/api/v{API_VERSION}/set", json=set_)
assert rv.status_code == 200
assert rv.get_json() == dict(set_, **{"id": 119})
rv = testclient.get(f"/api/v{API_VERSION}/set/id/119")
assert rv.status_code == 200
assert rv.get_json() == dict(set_, **{"id": 119})
rv = testclient.post(f"/api/v{API_VERSION}/set", json=set_)
assert rv.status_code == 400
assert rv.get_json() == {"message": "The question set supplied is already in the database!"}
set_.pop("show")
rv = testclient.post(f"/api/v{API_VERSION}/set", json=set_)
assert rv.status_code == 400
assert rv.get_json() == {"message": "The question set supplied is missing some data. Every field is required."}
def test_sets_by_show(testclient, test_data):
matching = [i for i in test_data if i["show"] == 1]
rv = testclient.get(f"/api/v{API_VERSION}/set/show/number/1")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/show/id/1")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/show/number/100")
assert rv.status_code == 400
assert rv.get_json() == {
"message": "Unfortunately, there is no show in the database with that number. Please double check your values."
}
rv = testclient.get(f"/api/v{API_VERSION}/set/show/id/100")
assert rv.status_code == 400
assert rv.get_json() == {
"message": "Unfortunately, there is no show in the database with that ID. Please double check your values."
}
def test_sets_by_date(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/set/date/1992")
matching = [i for i in test_data if i["date"] == "1992-08-13"]
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/date/2020/02/31")
assert rv.status_code == 400
assert "check that your date is valid" in rv.get_json()["message"]
def test_sets_by_years(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/set/years/1992/1992")
matching = [i for i in test_data if i["date"] == "1992-08-13"]
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/years/1992/1993")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/years/1991/1992")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/years/1994/1992")
assert rv.status_code == 400
assert "The stop year must come after the starting year." in rv.get_json()["message"]
rv = testclient.get(f"/api/v{API_VERSION}/set/years/0/20000")
assert rv.status_code == 400
assert "year range must be between 0001 and 9999" in rv.get_json()["message"]
def test_sets_by_round(testclient, test_data):
matching = [i for i in test_data if i["round"] == 1]
rv = testclient.get(f"/api/v{API_VERSION}/set/round/1")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/set/round/4")
assert rv.status_code == 400
assert "round number must be" in rv.get_json()["message"]
rv = testclient.get(f"/api/v{API_VERSION}/set/round/-1")
assert rv.status_code == 400
assert "round number must be" in rv.get_json()["message"]
def test_sets_by_round_empty(api_emptyclient):
rv = api_emptyclient.get(f"/api/v{API_VERSION}/set/round/1")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
def test_show(testclient):
rv = testclient.get(f"/api/v{API_VERSION}/show")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == 2
def test_show_by_id(testclient):
rv = testclient.get(f"/api/v{API_VERSION}/show/id/1")
assert rv.status_code == 200
assert rv.get_json() == {"date": "1992-08-13", "id": 1, "number": 1}
rv = testclient.get(f"/api/v{API_VERSION}/show/id/200")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
def test_show_by_number(testclient):
rv = testclient.get(f"/api/v{API_VERSION}/show/number/1")
assert rv.status_code == 200
assert rv.get_json() == {"date": "1992-08-13", "id": 1, "number": 1}
rv = testclient.get(f"/api/v{API_VERSION}/show/number/200")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
def test_show_by_date(testclient):
rv = testclient.get(f"/api/v{API_VERSION}/show/date/1992")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/date/1992/8")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/date/1992/08")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/date/1992/8/13")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/date/1992/08/13")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/date/2020/01/01")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
rv = testclient.get(f"/api/v{API_VERSION}/show/date/00/01/01")
assert rv.status_code == 400
assert "check that your date is valid" in rv.get_json()["message"]
rv = testclient.get(f"/api/v{API_VERSION}/show/date/2020/02/31")
assert rv.status_code == 400
assert "check that your date is valid" in rv.get_json()["message"]
def test_shows_by_years(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/show/years/1992/1992")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/years/1992/1993")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/years/1991/1992")
assert rv.status_code == 200
assert rv.get_json()["data"] == [{"date": "1992-08-13", "id": 1, "number": 1}]
rv = testclient.get(f"/api/v{API_VERSION}/show/years/1994/1992")
assert rv.status_code == 400
assert "The stop year must come after the starting year." in rv.get_json()["message"]
rv = testclient.get(f"/api/v{API_VERSION}/show/years/0/20000")
assert rv.status_code == 400
assert "year range must be between 0001 and 9999" in rv.get_json()["message"]
def test_categories(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/category")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len({f"{i['category']}_{i['show']}" for i in test_data})
def test_category_by_id(testclient):
rv = testclient.get(f"/api/v{API_VERSION}/category/id/1")
assert rv.status_code == 200
assert rv.get_json() == {"complete": True, "date": "1992-08-13", "id": 1, "name": "DOGS", "round": 0, "show": 1}
rv = testclient.get(f"/api/v{API_VERSION}/category/id/200")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
def test_categories_by_date(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/category/date/1992")
expected = {f"{i['category']}_{i['show']}" for i in test_data if i["date"] == "1992-08-13"}
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(expected)
rv = testclient.get(f"/api/v{API_VERSION}/category/date/2020/02/31")
assert rv.status_code == 400
assert "check that your date is valid" in rv.get_json()["message"]
def test_category_by_years(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/category/years/1992/1992")
expected = {f"{i['category']}_{i['show']}" for i in test_data if i["date"] == "1992-08-13"}
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(expected)
rv = testclient.get(f"/api/v{API_VERSION}/category/years/1992/1993")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(expected)
rv = testclient.get(f"/api/v{API_VERSION}/category/years/1991/1992")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(expected)
rv = testclient.get(f"/api/v{API_VERSION}/category/years/1994/1992")
assert rv.status_code == 400
assert "The stop year must come after the starting year." in rv.get_json()["message"]
rv = testclient.get(f"/api/v{API_VERSION}/category/years/0/20000")
assert rv.status_code == 400
assert "year range must be between 0001 and 9999" in rv.get_json()["message"]
def test_categories_by_completion(testclient, test_data):
complete = {f"{i['category']}_{i['show']}" for i in test_data if i["complete"]}
incomplete = {f"{i['category']}_{i['show']}" for i in test_data if not i["complete"]}
rv = testclient.get(f"/api/v{API_VERSION}/category/complete")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(complete)
rv = testclient.get(f"/api/v{API_VERSION}/category/complete/true")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(complete)
rv = testclient.get(f"/api/v{API_VERSION}/category/incomplete")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(incomplete)
rv = testclient.get(f"/api/v{API_VERSION}/category/complete/false")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(incomplete)
rv = testclient.get(f"/api/v{API_VERSION}/category/complete/alex")
assert rv.status_code == 400
assert "completion status must be" in rv.get_json()["message"]
def test_categories_by_name(testclient, test_data):
matching = {f"{i['category']}_{i['show']}" for i in test_data if "BEER" in i["category"]}
rv = testclient.get(f"/api/v{API_VERSION}/category/name/BEER")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
matching = {f"{i['category']}_{i['show']}" for i in test_data if "D" in i["category"]}
rv = testclient.get(f"/api/v{API_VERSION}/category/name/D")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
def test_categories_by_show(testclient, test_data):
matching = {f"{i['category']}" for i in test_data if i["show"] == 1}
rv = testclient.get(f"/api/v{API_VERSION}/category/show/number/1")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/category/show/id/1")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/category/show/number/100")
assert rv.status_code == 400
assert rv.get_json() == {
"message": "Unfortunately, there is no show in the database with that number. Please double check your values."
}
rv = testclient.get(f"/api/v{API_VERSION}/category/show/id/100")
assert rv.status_code == 400
assert rv.get_json() == {
"message": "Unfortunately, there is no show in the database with that ID. Please double check your values."
}
def test_categories_by_round(testclient, test_data):
matching = {f"{i['category']}_{i['show']}" for i in test_data if i["round"] == 1}
rv = testclient.get(f"/api/v{API_VERSION}/category/round/1")
assert rv.status_code == 200
assert len(rv.get_json()["data"]) == len(matching)
rv = testclient.get(f"/api/v{API_VERSION}/category/round/4")
assert rv.status_code == 400
assert "round number must be" in rv.get_json()["message"]
rv = testclient.get(f"/api/v{API_VERSION}/category/round/-1")
assert rv.status_code == 400
assert "round number must be" in rv.get_json()["message"]
def test_categories_by_round_empty(api_emptyclient):
rv = api_emptyclient.get(f"/api/v{API_VERSION}/category/round/1")
assert rv.status_code == 404
assert rv.get_json() == {"message": "no items were found with that query"}
def test_empty_db(api_emptyclient, test_data):
question = test_data[0]
rv = api_emptyclient.post(f"/api/v{API_VERSION}/set", json=question)
assert rv.data is not None
def test_game_resource(testclient, test_data):
rv = testclient.get(f"/api/v{API_VERSION}/game")
assert rv.status_code == 200
assert len(rv.get_json()) == 6
expected = [i for i in test_data if i["complete"]]
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"round": 1})
assert rv.status_code == 200
assert len(rv.get_json()) == min(
6,
len(
{f'{i["category"]}_{i["show"]}' for i in expected if i["round"] == 1}.difference(
{f'{i["category"]}_{i["show"]}' for i in expected if i["external"]}
)
),
)
expected = [i for i in test_data if i["date"][:4] == "1992" and i["complete"]]
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"start": 1992, "stop": 1992})
assert rv.status_code == 200
assert len(rv.get_json()) == min(
6,
len(
{f'{i["category"]}_{i["show"]}' for i in expected if i["round"] != 2}.difference(
{f'{i["category"]}_{i["show"]}' for i in expected if i["external"]}
)
),
)
expected = [i for i in test_data if i["show"] == 2 and i["complete"]]
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"show_number": 2})
assert rv.status_code == 200
assert len(rv.get_json()) == min(
6,
len(
{f'{i["category"]}_{i["show"]}' for i in expected if i["round"] != 2}.difference(
{f'{i["category"]}_{i["show"]}' for i in expected if i["external"]}
)
),
)
expected = [i for i in test_data if i["show"] == 2 and i["complete"]]
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"show_id": 2})
assert rv.status_code == 200
assert len(rv.get_json()) == min(
6,
len(
{f'{i["category"]}_{i["show"]}' for i in expected if i["round"] != 2}.difference(
{f'{i["category"]}_{i["show"]}' for i in expected if i["external"]}
)
),
)
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"show_id": 2, "show_number": 2})
assert rv.status_code == 400
assert rv.get_json() == {"message": "Only one of Show Number or Show ID may be supplied at a time."}
rv = testclient.get(
f"/api/v{API_VERSION}/game", query_string={"show_number": 2, "round": 1, "allow_external": True}
)
assert rv.status_code == 200
assert any([j["external"] for i in rv.get_json() for j in i["sets"]])
rv = testclient.get(
f"/api/v{API_VERSION}/game", query_string={"show_number": 2, "round": 0, "allow_incomplete": True}
)
assert rv.status_code == 200
assert any([not (i["category"]["complete"]) for i in rv.get_json()])
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"round": 4})
assert rv.status_code == 400
assert rv.get_json() == {
"message": "The round number must be one of 0 (Jeopardy!), 1 (Double Jeopardy!), or 2 (Final Jeopardy!)"
}
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"size": 30})
assert rv.status_code == 400
assert "categories were found. Please reduce the size." in rv.get_json()["message"]
# Due to omitting duplicated category names
rv = testclient.get(f"/api/v{API_VERSION}/game", query_string={"size": 18})
assert rv.status_code == 400
assert "categories were found. Please reduce the size." in rv.get_json()["message"]
# """
# | ROUTE \ FILTER     | VALUE | SHOW              | SET    | ROUND | EXTERNAL | DATE | COMPLETE | CATEGORY        |
# | ------------------- | ----- | ----------------- | ------ | ----- | -------- | ---- | -------- | --------------- |
# | CATEGORY (MULTIPLE) | N     | Y                 | N      | Y     | N        | Y    | Y        | Y (NAME and ID) |
# | COMPLETE            | N/A   | N/A               | N/A    | N/A   | N/A      | N/A  | N/A      | N/A             |
# | DATE                | N/A   | N/A               | N/A    | N/A   | N/A      | N/A  | N/A      | N/A             |
# | EXTERNAL            | N/A   | N/A               | N/A    | N/A   | N/A      | N/A  | N/A      | N/A             |
# | ROUND               | N/A   | N/A               | N/A    | N/A   | N/A      | N/A  | N/A      | N/A             |
# | SET (MULTIPLE)      | Y     | Y                 | Y (ID) | Y     | Y        | Y    | Y        | Y               |
# | SHOW (MULTIPLE)     | N     | Y (NUMBER and ID) | N      | N     | N        | Y    | N        | N               |
# | VALUE               | N/A   | N/A               | N/A    | N/A   | N/A      | N/A  | N/A      | N/A             |
# https://restfulapi.net/http-status-codes/
# https://restfulapi.net/http-methods/#summary
# """
| 35.866554 | 126 | 0.616069 | 3,214 | 21,233 | 3.930305 | 0.061917 | 0.074098 | 0.035624 | 0.056998 | 0.884737 | 0.879987 | 0.863838 | 0.857821 | 0.835655 | 0.824731 | 0 | 0.043445 | 0.207554 | 21,233 | 591 | 127 | 35.927242 | 0.707298 | 0.071116 | 0 | 0.513441 | 0 | 0.002688 | 0.31388 | 0.181325 | 0 | 0 | 0 | 0 | 0.451613 | 1 | 0.080645 | false | 0 | 0.016129 | 0 | 0.096774 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
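The date routes exercised above consistently return a 400 whose message contains "check that your date is valid" for impossible dates such as 2020/02/31. A minimal sketch of that validation step, assuming a hypothetical `parse_date_parts` helper (the API's real implementation is not shown in this file):

```python
import datetime


def parse_date_parts(year: int, month: int = 1, day: int = 1):
    """Return (date, None) for valid parts, or (None, message) for a 400 response.

    Hypothetical helper illustrating the behavior the tests assert.
    """
    try:
        return datetime.date(year, month, day), None
    except ValueError:
        return None, "please check that your date is valid"


# A real date parses cleanly; February 31st does not.
ok_date, ok_err = parse_date_parts(1992, 8, 13)
bad_date, bad_err = parse_date_parts(2020, 2, 31)
```

This mirrors why the tests only need a substring assertion on the message: any `ValueError` from `datetime.date` (bad day, bad month, year outside 1-9999) collapses into one user-facing error.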
173ee503f5b1c75dab9980003bef2462361691c9 | 17,910 | py | Python | tests/trestle/tasks/cis_to_component_definition_test.py | CyberFlameGO/compliance-trestle | aeae771e0e90c7c69ef914ca02d4857ed6f50222 | [
"Apache-2.0"
] | 1 | 2022-01-07T01:11:03.000Z | 2022-01-07T01:11:03.000Z | tests/trestle/tasks/cis_to_component_definition_test.py | CyberFlameGO/compliance-trestle | aeae771e0e90c7c69ef914ca02d4857ed6f50222 | [
"Apache-2.0"
] | null | null | null | tests/trestle/tasks/cis_to_component_definition_test.py | CyberFlameGO/compliance-trestle | aeae771e0e90c7c69ef914ca02d4857ed6f50222 | [
"Apache-2.0"
] | null | null | null | # -*- mode:python; coding:utf-8 -*-
# Copyright (c) 2021 IBM Corp. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""cis-to-component-definition task tests."""
import configparser
import os
import pathlib
import uuid
from _pytest.monkeypatch import MonkeyPatch
from tests.test_utils import text_files_equal
import trestle
import trestle.tasks.cis_to_component_definition as cis_to_component_definition
from trestle.tasks.base_task import TaskOutcome
def monkey_uuid_1():
"""Monkey create UUID."""
return uuid.UUID('56666738-0f9a-4e38-9aac-c0fad00a5821')
def monkey_exception():
"""Monkey exception."""
raise Exception('foobar')
def monkey_trestle_version():
"""Monkey trestle version."""
return '0.21.0'
def test_cis_to_component_definition_print_info(tmp_path: pathlib.Path):
"""Test print_info call."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.print_info()
assert retval is None
def test_cis_to_component_definition_simulate(tmp_path: pathlib.Path):
"""Test simulate call."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.simulate()
assert retval == TaskOutcome.SIM_SUCCESS
assert len(os.listdir(str(tmp_path))) == 0
def test_cis_to_component_definition_execute(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test execute call."""
monkeypatch.setattr(uuid, 'uuid4', monkey_uuid_1)
monkeypatch.setattr(trestle, '__version__', monkey_trestle_version())
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
d_expected = pathlib.Path(section['output-dir'])
d_produced = tmp_path
section['output-dir'] = str(tmp_path)
tgt = cis_to_component_definition.CisToComponentDefinition(section)
tgt.set_timestamp('2021-07-19T14:03:03.000+00:00')
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
list_dir = os.listdir(d_produced)
assert len(list_dir) == 1
assert d_expected != d_produced
for fn in list_dir:
f_expected = d_expected / fn
f_produced = d_produced / fn
result = text_files_equal(f_expected, f_produced)
assert result
def test_cis_to_component_definition_execute_selected_rules2(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test execute selected rules call."""
monkeypatch.setattr(uuid, 'uuid4', monkey_uuid_1)
monkeypatch.setattr(trestle, '__version__', monkey_trestle_version())
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
d_expected = pathlib.Path(section['output-dir'])
d_produced = tmp_path
section['output-dir'] = str(tmp_path)
section['selected-rules'] = 'tests/data/tasks/cis-to-component-definition/selected_rules2.json'
tgt = cis_to_component_definition.CisToComponentDefinition(section)
tgt.set_timestamp('2021-07-19T14:03:03.000+00:00')
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
list_dir = os.listdir(d_produced)
assert len(list_dir) == 1
assert d_expected != d_produced
for fn in list_dir:
f_expected = d_expected / fn
f_produced = d_produced / fn
result = text_files_equal(f_expected, f_produced)
assert result
def test_cis_to_component_definition_execute_enabled_rules2(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test execute enabled rules call."""
monkeypatch.setattr(uuid, 'uuid4', monkey_uuid_1)
monkeypatch.setattr(trestle, '__version__', monkey_trestle_version())
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
d_expected = pathlib.Path(section['output-dir'])
d_produced = tmp_path
section['output-dir'] = str(tmp_path)
section['enabled-rules'] = 'tests/data/tasks/cis-to-component-definition/enabled_rules2.json'
tgt = cis_to_component_definition.CisToComponentDefinition(section)
tgt.set_timestamp('2021-07-19T14:03:03.000+00:00')
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
list_dir = os.listdir(d_produced)
assert len(list_dir) == 1
assert d_expected != d_produced
for fn in list_dir:
f_expected = d_expected / fn
f_produced = d_produced / fn
result = text_files_equal(f_expected, f_produced)
assert result
def test_cis_to_component_definition_execute_enabled_rules3(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test execute enabled rules call."""
monkeypatch.setattr(uuid, 'uuid4', monkey_uuid_1)
monkeypatch.setattr(trestle, '__version__', monkey_trestle_version())
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
d_expected = pathlib.Path(section['output-dir'])
d_produced = tmp_path
section['output-dir'] = str(tmp_path)
section['enabled-rules'] = 'tests/data/tasks/cis-to-component-definition/enabled_rules3.json'
tgt = cis_to_component_definition.CisToComponentDefinition(section)
tgt.set_timestamp('2021-07-19T14:03:03.000+00:00')
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
list_dir = os.listdir(d_produced)
assert len(list_dir) == 1
assert d_expected != d_produced
def test_cis_to_component_definition_bogus_config(tmp_path: pathlib.Path):
"""Test execute call bogus config."""
section = None
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_profile_list(tmp_path: pathlib.Path):
"""Test execute call missing profile-list."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section.pop('profile-list')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_component_name(tmp_path: pathlib.Path):
"""Test execute call missing component-name."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section.pop('component-name')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_profile_type(tmp_path: pathlib.Path):
"""Test execute call missing profile-type."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section.pop('profile-type')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_profile_ns(tmp_path: pathlib.Path):
"""Test execute call missing profile-ns."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section.pop('profile-ns')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_profile_key(tmp_path: pathlib.Path):
"""Test execute missing profile-file."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
profile_list = section['profile-list'].split()
for profile in profile_list:
section.pop(f'profile-file.{profile}')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_profile_file(tmp_path: pathlib.Path):
"""Test execute missing profile-file."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
profile_list = section['profile-list'].split()
for profile in profile_list:
section[f'profile-file.{profile}'] = '/foobar'
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
def test_cis_to_component_definition_missing_profile_url(tmp_path: pathlib.Path):
"""Test execute missinf profile-url."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
profile_list = section['profile-list'].split()
for profile in profile_list:
section.pop(f'profile-url.{profile}')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_profile_title(tmp_path: pathlib.Path):
"""Test execute call missing profile-title."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
profile_list = section['profile-list'].split()
for profile in profile_list:
section.pop(f'profile-title.{profile}')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_output_dir(tmp_path: pathlib.Path):
"""Test execute call missing output-dir."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section.pop('output-dir')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_no_overwrite(tmp_path: pathlib.Path):
"""Test execute no overwrite."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
section['output-overwrite'] = 'false'
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_duplicate_rule(tmp_path: pathlib.Path):
"""Test execute duplicate rule exists."""
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition2.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.SUCCESS
section['output-overwrite'] = 'false'
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_exception(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test _get_cis_rules exception."""
monkeypatch.setattr(cis_to_component_definition.CisToComponentDefinition, '_get_cis_rules', monkey_exception)
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition2.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_rules_section(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test missing section selected-rules."""
monkeypatch.setattr(cis_to_component_definition.CisToComponentDefinition, '_get_cis_rules', monkey_exception)
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition2.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section.pop('selected-rules')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_rules_file(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test missing file enabled-rules."""
monkeypatch.setattr(cis_to_component_definition.CisToComponentDefinition, '_get_cis_rules', monkey_exception)
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition2.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section['enabled-rules'] = '/foobar'
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_parameters_key(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test missing file enabled-rules."""
monkeypatch.setattr(cis_to_component_definition.CisToComponentDefinition, '_get_cis_rules', monkey_exception)
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition2.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section.pop('rule-to-parameters-map')
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
def test_cis_to_component_definition_missing_parameters_file(tmp_path: pathlib.Path, monkeypatch: MonkeyPatch):
"""Test missing file enabled-rules."""
monkeypatch.setattr(cis_to_component_definition.CisToComponentDefinition, '_get_cis_rules', monkey_exception)
config = configparser.ConfigParser()
config_path = pathlib.Path('tests/data/tasks/cis-to-component-definition/test-cis-to-component-definition2.config')
config.read(config_path)
section = config['task.cis-to-component-definition']
section['output-dir'] = str(tmp_path)
section['rule-to-parameters-map'] = '/foobar'
tgt = cis_to_component_definition.CisToComponentDefinition(section)
retval = tgt.execute()
assert retval == TaskOutcome.FAILURE
| 45 | 119 | 0.749972 | 2,257 | 17,910 | 5.742136 | 0.07798 | 0.047454 | 0.13287 | 0.216667 | 0.892284 | 0.883256 | 0.871759 | 0.86034 | 0.839506 | 0.819213 | 0 | 0.009281 | 0.133668 | 17,910 | 397 | 120 | 45.11335 | 0.825986 | 0.082356 | 0 | 0.8 | 0 | 0.073333 | 0.225163 | 0.186356 | 0 | 0 | 0 | 0 | 0.123333 | 1 | 0.086667 | false | 0 | 0.03 | 0 | 0.123333 | 0.006667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
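Nearly every trestle test above follows the same setup: read a `ConfigParser`, take the `task.cis-to-component-definition` section, override `output-dir`, then pop or overwrite one key to drive a failure path. A minimal sketch of that pattern with an in-memory config (the key names mirror the tests; the values are stand-ins, not the repository's actual .config contents):

```python
import configparser

config = configparser.ConfigParser()
config.read_string(
    """
[task.cis-to-component-definition]
output-dir = /orig
profile-list = p1 p2
profile-file.p1 = file1.json
profile-file.p2 = file2.json
"""
)
section = config["task.cis-to-component-definition"]

# Override the output directory, as the tests do with tmp_path.
section["output-dir"] = "/tmp/out"

# Remove a required key to exercise a FAILURE outcome in the task.
section.pop("profile-list")
```

Because a `SectionProxy` is a mutable mapping, `pop` and item assignment work directly on it, which is why each test can tweak one key and hand the same section object to `CisToComponentDefinition`.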
179d1d622055948db9251eead554868cc80e2546 | 222 | py | Python | tests/test_bootstrap.py | Solstice-Short-Film/solstice-bootstrap | ace828e00d3f10ff3e8a70d23170f138e4633ff8 | [
"MIT"
] | null | null | null | tests/test_bootstrap.py | Solstice-Short-Film/solstice-bootstrap | ace828e00d3f10ff3e8a70d23170f138e4633ff8 | [
"MIT"
] | null | null | null | tests/test_bootstrap.py | Solstice-Short-Film/solstice-bootstrap | ace828e00d3f10ff3e8a70d23170f138e4633ff8 | [
"MIT"
] | null | null | null | #! /usr/bin/env python
# -*- coding: utf-8 -*-
"""
Module that contains tests for solstice-bootstrap
"""
import pytest
from solstice.bootstrap import __version__
def test_version():
assert __version__.__version__
| 14.8 | 49 | 0.725225 | 27 | 222 | 5.481481 | 0.777778 | 0.22973 | 0.310811 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005348 | 0.157658 | 222 | 14 | 50 | 15.857143 | 0.786096 | 0.418919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
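`test_version` above only asserts that the version attribute is truthy. A slightly stricter sketch validates the string's shape as well; the `0.1.0` literal is a stand-in, not solstice-bootstrap's actual version:

```python
import re

version = "0.1.0"  # stand-in for solstice.bootstrap.__version__.__version__

# Truthiness check, as in test_version.
assert version

# Stricter: require a numeric MAJOR.MINOR.PATCH shape.
assert re.fullmatch(r"\d+\.\d+\.\d+", version)
```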
179e8ea9027a96efaf2e621d0e97cbbbd7eace10 | 9,438 | py | Python | tests/unit/core/providers/aws/s3/_helpers/test_comparator.py | avosper-intellaegis/runway | 757d4e7db269ec16479b044ac82a69f25fa2a450 | [
"Apache-2.0"
] | 134 | 2018-02-26T21:35:23.000Z | 2022-03-03T00:30:27.000Z | tests/unit/core/providers/aws/s3/_helpers/test_comparator.py | asksmruti/runway | 8aca76df9372e3d13eb35e12f81758f618e89e74 | [
"Apache-2.0"
] | 937 | 2018-03-08T22:04:35.000Z | 2022-03-30T12:21:47.000Z | tests/unit/core/providers/aws/s3/_helpers/test_comparator.py | asksmruti/runway | 8aca76df9372e3d13eb35e12f81758f618e89e74 | [
"Apache-2.0"
] | 70 | 2018-02-26T23:48:11.000Z | 2022-03-02T18:44:30.000Z | """Test runway.core.providers.aws.s3._helpers.comparator."""
# pylint: disable=no-self-use
from __future__ import annotations

import datetime
from typing import List, Optional

import pytest
from mock import Mock

from runway.core.providers.aws.s3._helpers.comparator import Comparator
from runway.core.providers.aws.s3._helpers.file_generator import FileStats

MODULE = "runway.core.providers.aws.s3._helpers.comparator"
NOW = datetime.datetime.now()


class TestComparator:
    """Test Comparator."""

    comparator: Comparator
    not_at_dest_sync_strategy: Mock
    not_at_src_sync_strategy: Mock
    sync_strategy: Mock

    def setup_method(self) -> None:
        """Run before each test method to reset the class instance attrs to their defaults."""
        self.not_at_dest_sync_strategy = Mock()
        self.not_at_src_sync_strategy = Mock()
        self.sync_strategy = Mock()
        self.comparator = Comparator(
            self.sync_strategy,
            self.not_at_dest_sync_strategy,
            self.not_at_src_sync_strategy,
        )

    def test_call_compare_key_equal_should_not_sync(self) -> None:
        """Test call compare key equal should not sync."""
        self.sync_strategy.determine_should_sync.return_value = False
        ref_list: List[FileStats] = []
        result_list: List[FileStats] = []
        src_files = [
            FileStats(
                src="",
                dest="",
                compare_key="comparator_test.py",
                size=10,
                last_update=NOW,
                src_type="local",
                dest_type="s3",
                operation_name="upload",
            )
        ]
        dest_files = [
            FileStats(
                src="",
                dest="",
                compare_key="comparator_test.py",
                size=10,
                last_update=NOW,
                src_type="s3",
                dest_type="local",
                operation_name="",
            )
        ]
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list
        # Try when the sync strategy says to sync the file.
        self.sync_strategy.determine_should_sync.return_value = True
        ref_list = []
        result_list = []
        files = self.comparator.call(iter(src_files), iter(dest_files))
        ref_list.append(src_files[0])
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list

    def test_call_compare_key_greater(self):
        """Test call compare key greater."""
        self.not_at_dest_sync_strategy.determine_should_sync.return_value = False
        self.not_at_src_sync_strategy.determine_should_sync.return_value = True
        src_files: List[FileStats] = []
        dest_files: List[FileStats] = []
        ref_list: List[FileStats] = []
        result_list: List[FileStats] = []
        src_file = FileStats(
            src="",
            dest="",
            compare_key="domparator_test.py",
            size=10,
            last_update=NOW,
            src_type="local",
            dest_type="s3",
            operation_name="upload",
        )
        dest_file = FileStats(
            src="",
            dest="",
            compare_key="comparator_test.py",
            size=10,
            last_update=NOW,
            src_type="s3",
            dest_type="local",
            operation_name="",
        )
        src_files.append(src_file)
        dest_files.append(dest_file)
        ref_list.append(dest_file)
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list
        # Now try when the sync strategy says not to sync the file.
        self.not_at_src_sync_strategy.determine_should_sync.return_value = False
        result_list = []
        ref_list = []
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list

    def test_call_compare_key_less(self) -> None:
        """Test call compare key less."""
        self.not_at_src_sync_strategy.determine_should_sync.return_value = False
        self.not_at_dest_sync_strategy.determine_should_sync.return_value = True
        ref_list: List[FileStats] = []
        result_list: List[FileStats] = []
        src_files: List[FileStats] = []
        dest_files: List[FileStats] = []
        src_file = FileStats(
            src="",
            dest="",
            compare_key="bomparator_test.py",
            size=10,
            last_update=NOW,
            src_type="local",
            dest_type="s3",
            operation_name="upload",
        )
        dest_file = FileStats(
            src="",
            dest="",
            compare_key="comparator_test.py",
            size=10,
            last_update=NOW,
            src_type="s3",
            dest_type="local",
            operation_name="",
        )
        src_files.append(src_file)
        dest_files.append(dest_file)
        ref_list.append(src_file)
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list
        # Now try when the sync strategy says not to sync the file.
        self.not_at_dest_sync_strategy.determine_should_sync.return_value = False
        result_list = []
        ref_list = []
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list

    def test_call_empty_dest(self) -> None:
        """Test call empty dest."""
        self.not_at_dest_sync_strategy.determine_should_sync.return_value = True
        src_files: List[FileStats] = []
        dest_files: List[FileStats] = []
        ref_list: List[FileStats] = []
        result_list: List[FileStats] = []
        src_file = FileStats(
            src="",
            dest="",
            compare_key="domparator_test.py",
            size=10,
            last_update=NOW,
            src_type="local",
            dest_type="s3",
            operation_name="upload",
        )
        src_files.append(src_file)
        ref_list.append(src_file)
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list
        # Now try when the sync strategy says not to sync the file.
        self.not_at_dest_sync_strategy.determine_should_sync.return_value = False
        result_list = []
        ref_list = []
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list

    def test_call_empty_src(self) -> None:
        """Test call empty src."""
        self.not_at_src_sync_strategy.determine_should_sync.return_value = True
        src_files: List[FileStats] = []
        dest_files: List[FileStats] = []
        ref_list: List[FileStats] = []
        result_list: List[FileStats] = []
        dest_file = FileStats(
            src="",
            dest="",
            compare_key="comparator_test.py",
            size=10,
            last_update=NOW,
            src_type="s3",
            dest_type="local",
            operation_name="",
        )
        dest_files.append(dest_file)
        ref_list.append(dest_file)
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list
        # Now try when the sync strategy says not to sync the file.
        self.not_at_src_sync_strategy.determine_should_sync.return_value = False
        result_list = []
        ref_list = []
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list

    def test_call_empty_src_dest(self) -> None:
        """Test call."""
        src_files: List[FileStats] = []
        dest_files: List[FileStats] = []
        ref_list: List[FileStats] = []
        result_list: List[FileStats] = []
        files = self.comparator.call(iter(src_files), iter(dest_files))
        for filename in files:
            result_list.append(filename)
        assert result_list == ref_list

    @pytest.mark.parametrize(
        "src_file, dest_file, expected",
        [
            (None, None, "equal"),
            (None, Mock(compare_key=""), "equal"),
            (Mock(compare_key=""), None, "equal"),
            (Mock(compare_key=""), Mock(compare_key=""), "equal"),
            (Mock(compare_key="tes"), Mock(compare_key="test"), "less_than"),
            (Mock(compare_key="test"), Mock(compare_key="tes"), "greater_than"),
        ],
    )
    def test_compare_comp_key(
        self,
        dest_file: Optional[FileStats],
        expected: str,
        src_file: Optional[FileStats],
    ) -> None:
        """Test compare_comp_key."""
        assert Comparator.compare_comp_key(src_file, dest_file) == expected
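The tests above exercise a comparator that walks two listings sorted by `compare_key` and routes each file to one of three sync strategies. A rough standalone sketch of that merge logic (simplified `Stat` type and decision labels are illustrative stand-ins, not runway's actual implementation):

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional, Tuple


@dataclass
class Stat:
    """Stand-in for FileStats; only the compare key matters here."""
    compare_key: str


def compare_comp_key(src: Optional[Stat], dest: Optional[Stat]) -> str:
    """Mirror the ordering asserted in test_compare_comp_key above."""
    if src is None or dest is None or src.compare_key == dest.compare_key:
        return "equal"
    return "less_than" if src.compare_key < dest.compare_key else "greater_than"


def compare(src_files: Iterable[Stat], dest_files: Iterable[Stat]) -> Iterator[Tuple[str, Stat]]:
    """Walk both sorted listings, tagging each file with which strategy decides it."""
    src_iter, dest_iter = iter(src_files), iter(dest_files)
    src, dest = next(src_iter, None), next(dest_iter, None)
    while src is not None or dest is not None:
        state = compare_comp_key(src, dest)
        if src is not None and dest is not None and state == "equal":
            yield ("sync", src)  # present on both sides: main sync strategy
            src, dest = next(src_iter, None), next(dest_iter, None)
        elif dest is None or state == "less_than":
            yield ("not_at_dest", src)  # only in the source listing
            src = next(src_iter, None)
        else:
            yield ("not_at_src", dest)  # only in the destination listing
            dest = next(dest_iter, None)
```

This is why the "greater" test expects the dest file first: its key sorts before the source key, so it is emitted via the not-at-src strategy before the source file is considered.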
| 35.085502 | 95 | 0.589955 | 1,096 | 9,438 | 4.781022 | 0.085766 | 0.062977 | 0.037214 | 0.048664 | 0.863931 | 0.832634 | 0.783969 | 0.740458 | 0.726527 | 0.702481 | 0 | 0.004432 | 0.306633 | 9,438 | 268 | 96 | 35.216418 | 0.796302 | 0.068341 | 0 | 0.726087 | 0 | 0 | 0.040746 | 0.005494 | 0 | 0 | 0 | 0 | 0.052174 | 1 | 0.034783 | false | 0 | 0.030435 | 0 | 0.086957 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
da33ceff7e06d0e3c0bcefb406f1bcecbebff5e9 | 9,385 | py | Python | app/core/controller/ControllerEquipoCategoriaIcono.py | raquelnany/ACTIVO_DJANGO_DOCKER_PROGRES | 6dfcea960bac60ff862f204440cc52689bba0e2a | [
"MIT"
] | null | null | null | app/core/controller/ControllerEquipoCategoriaIcono.py | raquelnany/ACTIVO_DJANGO_DOCKER_PROGRES | 6dfcea960bac60ff862f204440cc52689bba0e2a | [
"MIT"
] | null | null | null | app/core/controller/ControllerEquipoCategoriaIcono.py | raquelnany/ACTIVO_DJANGO_DOCKER_PROGRES | 6dfcea960bac60ff862f204440cc52689bba0e2a | [
"MIT"
] | null | null | null | from ..serializers import EquipoCategoriaIconoSerializaer
from ..models import Equipo_Categoria_Icono, Equipo_Categoria, Modelo_Icono
class ControllerEquipoCategoriaIcono:

    def crearequipocategoriaicono(request):
        datosEquipoCategoriaIcono = request.data
        try:
            equipo_categoria = Equipo_Categoria.objects.get(
                id_equipo_categoria=datosEquipoCategoriaIcono['equipo_categoria'])
            modelo_icono = Modelo_Icono.objects.get(
                id_modelo_icono=datosEquipoCategoriaIcono['modelo_icono'])
            equipoCategoriaIconoNuevo = Equipo_Categoria_Icono.objects.create(
                equipo_categoria=equipo_categoria,
                modelo_icono=modelo_icono,
            )
        except Exception:
            return {"estatus": "Error"}
        equipoCategoriaIconoNuevo.save()
        return {"estatus": "Ok", 'equipo_categoria_icono:': equipoCategoriaIconoNuevo.id_equipo_categoria_icono}

    def listarequipocategoriaicono(id_equipo_categoria_icono=None):
        if id_equipo_categoria_icono:
            try:
                queryset = Equipo_Categoria_Icono.objects.get(
                    id_equipo_categoria_icono=id_equipo_categoria_icono)
            except Equipo_Categoria_Icono.DoesNotExist:
                return {'result': 'No se encontró el equipo categoria icono'}
            serializer = EquipoCategoriaIconoSerializaer(queryset)
            return serializer.data
        else:
            queryset = Equipo_Categoria_Icono.objects.all()
            serializer = EquipoCategoriaIconoSerializaer(queryset, many=True)
            return serializer.data

    def generarequipocategoriaIcono(self):
        # Same seed rows as before, expressed as (id_equipo_categoria,
        # id_modelo_icono) pairs instead of one create()/save() block per row.
        # objects.create() already persists the row, so no extra save() is needed.
        pares = [
            (3, 9), (3, 9), (3, 10), (3, 19), (3, 20), (3, 21),
            (1, 1), (1, 3), (1, 5), (1, 10), (1, 13), (1, 17),
            (2, 2), (2, 4), (2, 6), (2, 7), (2, 10), (2, 22), (2, 14), (2, 23), (2, 18),
            (4, 10), (5, 10), (6, 10), (7, 10), (8, 10), (9, 10),
            (10, 10), (11, 10), (12, 10), (13, 10), (14, 10), (15, 10),
        ]
        for id_categoria, id_icono in pares:
            Equipo_Categoria_Icono.objects.create(
                equipo_categoria=Equipo_Categoria.objects.get(id_equipo_categoria=id_categoria),
                modelo_icono=Modelo_Icono.objects.get(id_modelo_icono=id_icono),
            )
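The seed data in generarequipocategoriaIcono includes the pair (3, 9) twice. If that repetition is unintended, an order-preserving dedup of the seed pairs is a one-liner; a minimal standalone sketch with plain tuples (no Django):

```python
def dedupe_preservando_orden(pares):
    """Drop repeated (id_equipo_categoria, id_modelo_icono) pairs,
    keeping the first occurrence of each in order (dicts preserve
    insertion order in Python 3.7+)."""
    return list(dict.fromkeys(pares))


seed = [(3, 9), (3, 9), (3, 10), (3, 19)]
print(dedupe_preservando_orden(seed))  # [(3, 9), (3, 10), (3, 19)]
```

In the Django version, get_or_create over the deduped pairs would additionally make the seeding idempotent across repeated runs.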
da4420b24c00e49a840c879b2a7a28d86bfdf47c | 39,564 | py | Python | pines/img.py | jpn--/pine | 3980a9f0b09dd36b2fed7e52750847637be5f067 | [
"MIT"
] | 2 | 2017-08-09T02:42:37.000Z | 2020-06-16T14:14:16.000Z | pines/img.py | jpn--/pine | 3980a9f0b09dd36b2fed7e52750847637be5f067 | [
"MIT"
] | null | null | null | pines/img.py | jpn--/pine | 3980a9f0b09dd36b2fed7e52750847637be5f067 | [
"MIT"
] | null | null | null |
import base64, os
# data:image/png;base64
favicon = 'iVBORw0KGgoAAAANSUhEUgAAACAAAAAgCAMAAABEpIrGAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAMAUExURQAAAA0oAg4pAhAsAgAAAAAAAAwaCH9/fwAAAAAAAFNTUwAAAAICAn9/fxYyCgIDAhc3CyxfFgAAAAAEACxgFg0aCH9/fw0mAn9/fwAAAAAAAAAAABY0CgAAABc0CgAAABc4CwAAABc2CkBAQClcFCteFixiF2NjYz8/PwsVBQwXBhISEgwjAg0mAgMNAxIjChkZGSIiFwAAABIqCR1RCQ4ODgMDAwIGAQMDAwAAAAAAAAAAACdYEwsbBSJSDg8PDxYyCgAAABAvAwAAAAcQBAAAAAAAAA8jByJTDho7CwAAABo8C1V/VShZEwgXAwAAAAgWAytdFgAAAD+BIwAAAAAAAAgRAwAAAESbIEmmIkOZIEmnIkqpI0qoIkehIUagIUSaIEegIUeiIUafIEWdIEijIUurI0ikIU2vJAECAFC2JUusI0KXHz2LHECUHFTAJ0yuJEWeIFjIKUmlIkqqI0ytI0OXH0GWHVO9J0+1JT+TG0uqI0ysI02wIwIEAQMHAUCRHkScHj2RG0SeHxIqCEScIFbFKEGUHhQtCQgTBFrPKg0eBkGYHUCXHEaeIT6NHEKZHj2OHDqJGkObHixmFUOaHxUvCU6zJAIFAVS/J1C3JTuHG02xJDFwFlK2KTN1GCtjFFO4KUKWHyRSETyKHDqGGxYyCgcQA1fGKT+QHlnLKVzRK0GVH1XDKF7YLEumJjuLGkedJE2sJz+PHUqpIk+wKDiCGlK8JipeE02xJTqDGjJyFz6QHDV8GDV6GUioIVG4Jk2uJDqFG0usJFfHKQ8jB1GzKEWhHwQIAk+0JRAkB1K7JhxBDVO+JydZEhYzCgQLAgsaBUCTHQ0dBlXCKFzSKydXEl3VLCRUETyMGkCRIEGVH0qjJUypJ0ulJkWbIVGpLUWdIUCTH1K8Jy1nFU2tIzZ/GE+uJ0SgH1CxKFO3KTBtFhk6DC1pFUajICpgE0qtIlK1KFzTK2LeLVG5JlCzJTFxFy9rFljHKV7VLBg2Cw8hByNREREnCCBJDwoXBX24O08AAABXdFJOUwC/v78jf7kEHRsCIRYG8YHx9QcZ9bkQvg1zfTHxafEn8IbyBvX19QkEtLkOtL9NlgoWXez1EqDHnS5aVPPZ+RDY9s3U/W58+fHveu8G9Jg12fVk/UuSnoMSbwMAAAMkSURBVDjLY2CgAmDnxSXDBZbhYjDU1xVnZGQCAmYwALGYGBnFdfQYlEHyLCZTN95bv+n51mML5736CCQWfny5ddPd9Xc2TvJjUGeQYHB6sLd/0sQ9eeXleY2NSfsbm8uBoGrPhg27moJDNBh4hTyq+iPux6dHxpTmp52efa0itqIyJrY4qS8hKvdmmBADO49vwsSIwsS4mJqYtylF5zpPfvmW+zW2ODFxas7KS5o8DOzCKlMbCsvishMjso4ub/veufpc+JI5a0ojIxJab1VpcTKwc6pObi/8Gn968am2rMstiy8uCF+6em5nbeL01Ig1eWwcQAWCze3R75esOnOq7eSVjvD68Ia1+fPrC6bvyI181AtRcLwh/Xx49o3urqwVa7vDV16fc3X3ie1vwmfHr+kBK1BrTqk5++vitL9H8td2nQ+fteps1/wLZ+qSsyITICaIvFuyb++h3+Edq/JXXO8I//rzwLrX4XULlsxfGtUjClIQ+GFx+Inw5RfCr12duW5ZeMWPT/0zMpfPqAvvjuoFK1A7XhaRnBSd2xl+o2t1R/jBz9vqw6dkx6dHpUBNEHlRVhxZsPNK6//wf0AT2naFZ+5eMX3OluK4KIgjRZrLktJjMw7nJ3WH/2kJXzAjc19R6c7q6LjkHKg3GxclpcZkn8zIL7kcnhmeWbetKCOrIA2oYBE0oCa3JydGVmdX1p4tPRqeWb/0CEg+NrUvpb0Kr
EB1QkNhX2p0WnZWddHc8PBlRTWzCmpiiuNaI26BFQirTJg4FeiNyNjSkoop4eEdB2oOxkSnJ6ZE5axs0uJg8OGxudSfEBGfGhmbET3zSXj4tMcxGSWx6XGFUWX9NzU5GNSFHB4eWrn50aKohPiEpzvCw1uqU+JTEnLWrNw8O2duKDcDg72sXd6E5LjI2JLaLWATtqfVppVEp8en5DwLCOJhYFC25LeV45ORkxNQUPJ2Dw938RRQEBCQk+HjMzbn99cApnsDaWtJKVZWVn4vRVnn8HBHV0U3VhCQkjSSFoLkD1MObhYg4JbnsTKz4BeWB/NYuDm0uUjPhrxiYhIMVAYA2dxKLrticAYAAAAASUVORK5CYII%3D'
eye = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEAAAABACAMAAACdt4HsAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAABgUExURUxpcQCj2wCPxQCa1QCZ1ACd2QCMwgCDuQCP0wCs3wBvoACOxwCGvACOxwCPxgCY0QCPzwCTzACTywCj1QCa2ACU1gCe2QCl3ACi2wCp3QCj5ACY4ACy6ACs6AC37ACt5OoTzCwAAAAUdFJOUwD7XPrR+0UE/f0WnCxwz6rquoflF5GMKAAAApRJREFUWMPtVtmCoyAQFCGKt0aNkUP+/y+nOQRijk12Z99SM5kM3V0F3ZxJ8sUXX/x35B5/RX7dfoddZum5ApzTrPxIQ8eVaUVwsTkUmFRp6Vzv0LNuFJtQhYeC5thlb0iAP2+I2iKyF9kUafLXEpBl3oybKIrVcMS2Cf0xjVW3R5B4XgtwpCOMfLV06BHqB5WEEVmJFXIZ06eDyJN6UALCDJTq6t1Td0o5cyHUUD9UAFuDxbpDjFlYSVDWMXLh5sEg8qQcRIFcDJJjaSqSJ+6rHKV3FmIojwp5kvViXXYozQ8x+t9yVN69ij47KjRXhZYgkFl+2rVtl1qFLAgsSF2b2/QruQQ+kp2h1K2SANXWptnJKGSRlS8EfA0SXSMstU67xhIZSGzbSxyD5OAUQLy94SNFjIfsViSJiSPqJky2Jk3DP100rub3guRZd5hKdHFAMtWWs7G4qMvltCs4/nUPP9nwzskaS2clveXqFYDPvNmFMzNHJNhPjJiZvo/UYyD/IkD0EAjDU4wTsykEM2Y2BXa6CcSWf6eAmS1iLGCLeIwj+zTcKmDW2mncrZjZaWwPYSTfF0IyMDxP7meGj1tIDBswt5Csc7JhmA3xUqzYvEP7KruUCWOcM0bsUq6glwBWJfFuSxo+Rd7ZbyZC/GaK/RM/nAl6O3Ma/P39du5DD5Tfb2d9oLCZ+oj2eKC0Xp/O7P5AsWlQTulMqf6j+4iPND0+64Gg5uG5qus88Jk6jYlW4VCtoOkcMx/qZye7rlofJDht7bHe6pHt9D59cbeYi8VKGHCuc+Z8bwP95cXirzYeNAJA6I9XW7hc+4MGsPu3Ltf4etcsi5l+cL3HD4zGPjCaDx8Yv/DE+Y1H1hdffPEhfgDKU1gnLBbrLAAAAABJRU5ErkJggg=='
favicon_raw = b'\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR\x00\x00\x00 \x00\x00\x00 \x08\x03\x00\x00\x00D\xa4\x8a\xc6\x00\x00\x00\x04gAMA\x00\x00\xb1\x8f\x0b\xfca\x05\x00\x00\x00\x01sRGB\x00\xae\xce\x1c\xe9\x00\x00\x03\x00PLTE\x00\x00\x00\r(\x02\x0e)\x02\x10,\x02\x00\x00\x00\x00\x00\x00\x0c\x1a\x08\x7f\x7f\x7f\x00\x00\x00\x00\x00\x00SSS\x00\x00\x00\x02\x02\x02\x7f\x7f\x7f\x162\n\x02\x03\x02\x177\x0b,_\x16\x00\x00\x00\x00\x04\x00,`\x16\r\x1a\x08\x7f\x7f\x7f\r&\x02\x7f\x7f\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x164\n\x00\x00\x00\x174\n\x00\x00\x00\x178\x0b\x00\x00\x00\x176\n@@@)\\\x14+^\x16,b\x17ccc???\x0b\x15\x05\x0c\x17\x06\x12\x12\x12\x0c#\x02\r&\x02\x03\r\x03\x12#\n\x19\x19\x19""\x17\x00\x00\x00\x12*\t\x1dQ\t\x0e\x0e\x0e\x03\x03\x03\x02\x06\x01\x03\x03\x03\x00\x00\x00\x00\x00\x00\x00\x00\x00\'X\x13\x0b\x1b\x05"R\x0e\x0f\x0f\x0f\x162\n\x00\x00\x00\x10/\x03\x00\x00\x00\x07\x10\x04\x00\x00\x00\x00\x00\x00\x0f#\x07"S\x0e\x1a;\x0b\x00\x00\x00\x1a<\x0bU\x7fU(Y\x13\x08\x17\x03\x00\x00\x00\x08\x16\x03+]\x16\x00\x00\x00?\x81#\x00\x00\x00\x00\x00\x00\x08\x11\x03\x00\x00\x00D\x9b I\xa6"C\x99 I\xa7"J\xa9#J\xa8"G\xa1!F\xa0!D\x9a G\xa0!G\xa2!F\x9f E\x9d H\xa3!K\xab#H\xa4!M\xaf$\x01\x02\x00P\xb6%K\xac#B\x97\x1f=\x8b\x1c@\x94\x1cT\xc0\'L\xae$E\x9e X\xc8)I\xa5"J\xaa#L\xad#C\x97\x1fA\x96\x1dS\xbd\'O\xb5%?\x93\x1bK\xaa#L\xac#M\xb0#\x02\x04\x01\x03\x07\x01@\x91\x1eD\x9c\x1e=\x91\x1bD\x9e\x1f\x12*\x08D\x9c 
V\xc5(A\x94\x1e\x14-\t\x08\x13\x04Z\xcf*\r\x1e\x06A\x98\x1d@\x97\x1cF\x9e!>\x8d\x1cB\x99\x1e=\x8e\x1c:\x89\x1aC\x9b\x1e,f\x15C\x9a\x1f\x15/\tN\xb3$\x02\x05\x01T\xbf\'P\xb7%;\x87\x1bM\xb1$1p\x16R\xb6)3u\x18+c\x14S\xb8)B\x96\x1f$R\x11<\x8a\x1c:\x86\x1b\x162\n\x07\x10\x03W\xc6)?\x90\x1eY\xcb)\\\xd1+A\x95\x1fU\xc3(^\xd8,K\xa6&;\x8b\x1aG\x9d$M\xac\'?\x8f\x1dJ\xa9"O\xb0(8\x82\x1aR\xbc&*^\x13M\xb1%:\x83\x1a2r\x17>\x90\x1c5|\x185z\x19H\xa8!Q\xb8&M\xae$:\x85\x1bK\xac$W\xc7)\x0f#\x07Q\xb3(E\xa1\x1f\x04\x08\x02O\xb4%\x10$\x07R\xbb&\x1cA\rS\xbe\'\'Y\x12\x163\n\x04\x0b\x02\x0b\x1a\x05@\x93\x1d\r\x1d\x06U\xc2(\\\xd2+\'W\x12]\xd5,$T\x11<\x8c\x1a@\x91 A\x95\x1fJ\xa3%L\xa9\'K\xa5&E\x9b!Q\xa9-E\x9d!@\x93\x1fR\xbc\'-g\x15M\xad#6\x7f\x18O\xae\'D\xa0\x1fP\xb1(S\xb7)0m\x16\x19:\x0c-i\x15F\xa3 *`\x13J\xad"R\xb5(\\\xd3+b\xde-Q\xb9&P\xb3%1q\x17/k\x16X\xc7)^\xd5,\x186\x0b\x0f!\x07#Q\x11\x11\'\x08 I\x0f\n\x17\x05}\xb8;O\x00\x00\x00WtRNS\x00\xbf\xbf\xbf#\x7f\xb9\x04\x1d\x1b\x02!\x16\x06\xf1\x81\xf1\xf5\x07\x19\xf5\xb9\x10\xbe\rs}1\xf1i\xf1\'\xf0\x86\xf2\x06\xf5\xf5\xf5\t\x04\xb4\xb9\x0e\xb4\xbfM\x96\n\x16]\xec\xf5\x12\xa0\xc7\x9d.ZT\xf3\xd9\xf9\x10\xd8\xf6\xcd\xd4\xfdn|\xf9\xf1\xefz\xef\x06\xf4\x985\xd9\xf5d\xfdK\x92\x9e\x83\x12o\x03\x00\x00\x03$IDAT8\xcbc`\xa0\x02`\xe7\xc5%\xc3\x05\x96\xe1b0\xd4\xd7\x15gdd\x02\x02f0\x00\xb1\x98\x18\x19\xc5u\xf4\x18\x94A\xf2,&S7\xde[\xbf\xe9\xf9\xd6c\x0b\xe7\xbd\xfa\x08$\x16~|\xb9u\xd3\xdd\xf5w6N\xf2cPg\x90`pz\xb0\xb7\x7f\xd2\xc4=y\xe5\xe5y\x8d\x8dI\xfb\x1b\x9b\xcb\x81\xa0j\xcf\x86\r\xbb\x9a\x82C4\x18x\x85<\xaa\xfa#\xee\xc7\xa7G\xc6\x94\xe6\xa7\x9d\x9e}\xad"\xb6\xa22&\xb68\xa9/!*\xf7f\x98\x10\x03;\x8fo\xc2\xc4\x88\xc2\xc4\xb8\x98\x9a\x98\xb7)E\xe7:O~\xf9\x96\xfb5\xb681qj\xce\xcaK\x9a<\x0c\xec\xc2*S\x1b\n\xcb\xe2\xb2\x13#\xb2\x8e.o\xfb\xde\xb9\xfa\\\xf8\x929kJ##\x12ZoUiq2\xb0s\xaaNn/\xfc\x1a\x7fz\xf1\xa9\xb6\xac\xcb-\x8b/.\x08_\xbazngm\xe2\xf4\xd4\x885yl\x1c@\x05\x82\xcd\xed\xd1\xef\x97\xac:s\xaa\xed\xe4\x95\x8e\xf0\xfa\xf0\x86\xb5\xf9\xf3\xeb\x0b\xa6\xef\xc8\x
8d|\xd4\x0bQp\xbc!\xfd|x\xf6\x8d\xee\xae\xac\x15k\xbb\xc3W^\x9fsu\xf7\x89\xedo\xc2g\xc7\xaf\xe9\x01+PkN\xa99\xfb\xeb\xe2\xb4\xbfG\xf2\xd7v\x9d\x0f\x9f\xb5\xeal\xd7\xfc\x0bg\xea\x92\xb3"\x13 &\x88\xbc[\xb2o\xef\xa1\xdf\xe1\x1d\xab\xf2W\\\xef\x08\xff\xfa\xf3\xc0\xba\xd7\xe1u\x0b\x96\xcc_\x1a\xd5#\nR\x10\xf8aq\xf8\x89\xf0\xe5\x17\xc2\xaf]\x9d\xb9nYx\xc5\x8fO\xfd32\x97\xcf\xa8\x0b\xef\x8e\xea\x05+P;^\x16\x91\x9c\x14\x9d\xdb\x19~\xa3kuG\xf8\xc1\xcf\xdb\xea\xc3\xa7d\xc7\xa7G\xa5@M\x10yQV\x1cY\xb0\xf3J\xeb\xff\xf0\x7f@\x13\xdav\x85g\xee^1}\xce\x96\xe2\xb8(\x88#E\x9a\xcb\x92\xd2c3\x0e\xe7\'u\x87\xffi\t_0#s_Q\xe9\xce\xea\xe8\xb8\xe4\x1c\xa87\x1b\x17%\xa5\xc6d\x9f\xcc\xc8/\xb9\x1c\x9e\x19\x9eY\xb7\xad(#\xab \r\xa8`\x114\xa0&\xb7\'\'FVgW\xd6\x9e-=\x1a\x9eY\xbf\xf4\x08H>6\xb5/\xa5\xbd\n\xac@uBCa_jtZvVu\xd1\xdc\xf0\xf0eE5\xb3\njb\x8a\xe3Z#n\x81\x15\x08\xabL\x988\x15\xe8\x8d\xc8\xd8\xd2\x92\x8a)\xe1\xe1\x1d\x07j\x0e\xc6D\xa7\'\xa6D\xe5\xacl\xd2\xe2`\xf0\xe1\xb1\xb9\xd4\x9f\x10\x11\x9f\x1a\x19\x9b\x11=\xf3Ix\xf8\xb4\xc71\x19%\xb1\xe9q\x85Qe\xfd759\x18\xd4\x85\x1c\x1e\x1eZ\xb9\xf9\xd1\xa2\xa8\x84\xf8\x84\xa7;\xc2\xc3[\xaaS\xe2S\x12r\xd6\xac\xdc<;gn(7\x03\x83\xbd\xac]\xde\x84\xe4\xb8\xc8\xd8\x92\xda-`\x13\xb6\xa7\xd5\xa6\x95D\xa7\xc7\xa7\xe4<\x0b\x08\xe2a`P\xb6\xe4\xb7\x95\xe3\x93\x91\x93\x13PP\xf2v\x0f\x0fw\xf1\x14P\x10\x10\x90\x93\xe1\xe336\xe7\xf7\xd7\x00\xa6{\x03ikI)VVV~/EY\xe7\xf0pGWE7V\x10\x90\x924\x92\x16\x82\xe4\x0fS\x0en\x16 \xe0\x96\xe7\xb12\xb3\xe0\x17\x96\x07\xf3X\xb89\xb4\xb9H\xcf\x86\xbcbb\x12\x0cT\x06\x00\xd9\xdcJ.\xbbbp\x06\x00\x00\x00\x00IEND\xaeB`\x82'
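The `eye` constant above is a full data URI, while `favicon` is a bare base64 payload whose trailing `=` padding is percent-encoded as `%3D`. A small helper for turning such a data URI back into raw PNG bytes might look like this (a sketch, not part of the pines API):

```python
import base64
from urllib.parse import unquote


def data_uri_to_bytes(uri: str) -> bytes:
    """Decode a 'data:image/png;base64,<payload>' URI into raw bytes.

    unquote() handles payloads where '=' padding was percent-encoded
    as '%3D', as in the `favicon` string above.
    """
    header, _, payload = uri.partition(",")
    if "base64" not in header:
        raise ValueError("not a base64 data URI")
    return base64.b64decode(unquote(payload))
```

Writing the result straight to disk (`open("icon.png", "wb").write(...)`) recovers the embedded image.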
georgiatechlogo= "iVBORw0KGgoAAAANSUhEUgAAAqMAAAE7CAYAAADpQojbAABGCUlEQVR4AezBgQAAAACAoP2pF6kCAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACYvft3tbqO4zh+tBwOHvGQDeGhlG6BIpH/QHeKZofIsbAlckuquTEoIi6tFQ1eqCkaolsN0Q8QBJeo7AcFaYQ5XBIp+3HfvQYH+cDXw4Xvh0AeH3gsZ3wt58mbA2fM53mb7813x6n4Mn697nyciAnbYwQAgO3ZG69GNZ63jRjtDQBgERtRjedsI0Z7AwBYxAdRN9iKF2wjRnsDALgr3o1qvG6b3jEKAMD+eD+q8ZZtescoAAAr8XVU4w3b9I5RAAAOx8Woxjtxu316xigAAIfi56jGJ7HfPj1jFACAI3EhqvF5HLSPGAUA6Gk1LruMitH/AwDAY3ElqvFp3GMfMQoA0NPxgRj9KY7ZR4wCAPR0LH6PalyKx+0jRgEAejoZf0Q1LsSj9hGjAAC9TOPl2IpqXIzjNhKjAAC9zGItKlpX49mYBGIUAOjPZbSxFrfZSYwCAPQwi9eiBrwSO+0kRgEAeljERtSAl2KHncRoDwAA98W5qAHrsctOPWIUAICVOBs14OPYbaceMQoAwIH4ImrAR7HHTj1iFACAB+OHqAEbLqO9YhQAgNW4HDXgs7jTTj1iFACA1bgUNeCbOGSnHjEKAMBD8VvUgO/jqJ16xCgAAI/EZtSA8y6jvWIUAIAn49qSGD1ipx4xCgDAyfgrasBmPGGnHjEKAMCJ+DNqwLU4ZaceMQoAwDPxd9SAK/GUncQoAMDYprEWdRNX42lbiVEAgLHti/Wom/g3Xoxd9hKjAABj2htvRi3xdszsJUYBAMa0L05HLbHu/+nFKADA2O6PM1FLnI6pvcQoAMCYjsaP7W9EY6v57MOY20uMAgCM6YH4rgnRb+OXqBucjUVMAjHa38qBg9OYxyz2xB2xMyYAwK0h3/mH46uo6/6JM02gVpyLuzWSGB0jMnfEPO6Nh+M/9u48zMayD+D4b3bGjD2FLEQWEkIWKpVKFi2ipaRoKZRF3lKJ9tKSpPCiUCSvEoVCpRRZQlJZhIiQnWEwc7/fP54/XPfV7Od5zvOc8/vjc811mXMe48yY+c793EsH3IoB+A/6YTDGYQomOj7EW+iLfhiEPrgJ7dEUlVEGEghKKaWUxmgr7INxnMLX+MWK0ZX5GhnVGFVOHDZCJ/TCK3gXc7ADh5AGk09HcBA/YzY+wDD0QCc0RmlInimllCqKC9Eal6BliLTAZbgSZ0NU1Mbo9db80HS8h6+tGP0FFSFKYzRbTvjVQwc8gg/wGw7jFIxHTuIw1mEyHkYnNEEpiFJKqRw1xEzsw05sDZEt2O24BaI0Rh3H8RQ+tGJ0O5pBVM6idX5nDdyKkVhhx6cPnMQRbMNoXIvzs72tr5RSqhV+hnFRL4iK2hjthAzr6M/7McaK0Z1oA1E5i6YITUZ9PIqlSMNJGJ87jf1Yhf+iLVIhSimlHEAzLHf5+/H9EBW1MXq7fQ497sYo68934BKI0hgVJ9yuwgf4FUdgAuokduEz3I6yEKWUUiBGsUxjVLkUorF40orOI7gUT+Y/RlUkR2gR3IpPsBMmwhzEd3hUb98rpZTGqHI9RuPwnB2juAi9rD8/ir6IhagojFFn1eOH2AsTBb7D3SgM8YBSSmmMqmgM0iF2dKIxbrIWNp3C80iCZE9FWoSm4HH8BRNl0jEeLSCeUUopjVEVBZyz5l+xYjQdTdDeWth0Ao8jAZI9FUkh2hwzYKLcb7gbCRAPKKWUxqiKhhg9F9PsuaGohZth7BhFPCR7KhIitCj6YKuGKIBjeE3nknpFKaUxGgU0RmvjSys6l6I8rkeadWb9aKRAsqeCHqLn4b0st2hSn6MBxH
tKKaUxqiIqRqvjc9jHflbGhdhsvW8BykCyp4IcohfjK5hsqaW4EOI9pZTSGFWRwbkdb4+MrkBlVMc6631foDQkeyrI80O/hVG5shjVIN5TSimN0eBTzn6iv8M4MjEHZ6Ee1lsxOgelINlTQQzRVvgRJk/UQlSCKKVUhGqBn1yO0QcgUUljtAP2wjhOYhyS0AB2jP6IcpDsqaCFaLMCfaNR81ABopRSEegy/ALjogchUUljtB12W9s6jUEySmCeFaPbUA2SPRWkEL0IP8OoAhmH4hCllIowdTAav2IZvsPiEPgOq/AbOkCiksbo1dhljYxORAkkYooVoztRA5I9FZQQLY+FMKrAMnXOk1IqQhVBTTRBIzTERSHQEI3RFKUhUUlj9E6cgHEcx3AkoAg+gb0HaVVI9lQQQjQeY5Hps308t+BHLMUCfITJztuvsRyrsQNHYXzkFzSBKKWUUrmiMToI9rn0vSBZxOguNIAEn8Zof5yGCaOT2IDP8CoewtWohAooiSTEIQlnoxJqohN6Yzg+xVocgAmzoRCllFIqVzRGH7Fi8zB6QhCHcdb79+EmSPaU30P0EuyECZPtWIBn0BDFEA/Jh3gURS3cg0lYH8bQ/kXPsVdKKZVrGqMDshoZdTxrvf8A7oSo7Pn9mM9ZMGHyA25EKTtAQyAGyaiHt7EHJgyegQRADBIRjwQkI8VSCM77YV9DxSIhD69hHCSiqDgkOAoh5QzJSAzU517FWJ+zBKQgFSmWItb3UIBrQHJNY3RYViOjjqdhx+itkODSGL0lTLez92E4qkI8kIK2+DwMo6QLUAPiMwlIQXXUxVXohR64H8MxAePwX8cQ3IX70BNtUA9lkIzEKIuOJBRDZVyIjuiJHngIwzEe4854Owjd8QC6ogVqo9gZoSpZUgkockYQVEdj1Ec9VEDqGeEf43KoFEIqKqIVbkFP3INhGIfxjjfQH/fjTlyAYiiMWIgKmxgkIBXFcT7qogP64B50w30Yg4kYi/863sVI9MVdZ3wN9EY71EUNlEYqEv/1c64hWhjjrdj8B5dmMzK6H7cgBhIwGqPOrew1YTqpqCuSIR6rjmE4BOORI3gYEmZJqIR2eAgj8QmWYjl+wx7sxN84kcW/5S/swk78hpWYj6l4G33QBCUicPS0MKqiM4ZiHOZgCVZiA3Y69mbxGh7EDuzGVqzFcnyG9/A87sD5KOLxqEoMSqIKKqJCCJ2HckiC5EEiKuFq9MJI/A+z8CkWYxVWYCUWYTZm4b+o58JrlIpm6I8JmI3v8Qu2Yid2wV5UeRr7sAvbsRLz8QH+g3ooDMlBKqqG+HNUEVVQDoVy8Tk5O9Qfg6MyqqGIB1/rRVAZnTAAIzAbC7AEy7ERe874nP0Nk4UM7MWOM74G9mADluFHfIPZGIOh6IFGKIpYjdHiRfG+FZt70Qji+I/1/uN4BiUgAaIx6nyzeTIMq+enobYPRrSGIh3GI1PCeNu4AQZiEn7AH0iDcclR/IK5GIfeqB3wW+9NMRgf4Hv8BeOiPViGaXgJXVHOozsIL+FHfI9F+DYEvsFKTMGFkFyohF6YgK+x2Y67XOoSwtfmGgzDx/g1xN9D0rEaU/AgzoZk4QYsw3f4NgQW4Tv8iLloDclGPfwXy0P8dbIIS7HSxX1Ga6MrXsB0/ICt9ufSQ/9gLT7FK+iBhlE+MjoB9shoE4ijG47DOE5hTDBPYdIYvRxbYTz0I0pCfCAZUzz+t9eEeKQm+mACVsCE0Wksxxu4CSUhAVAXPTER62DCaA8WYpDL24WVwNcwLtmGyyHZaI7nsRAZIQi8GyAFUAk3YzK2w3ggA7NwMwS2vjAuuh2SjdbYAOOihyAhkoqOeBNLsRvGpw5hFd5BOxSPshgtg+lWjO5GY4ijK/ZZJzS9hjKQYNEY7QvjoV24EuIjzbARxgN7cS/EZU3QF9/B+NARTEFbpPg4Qh/DYhgfWovn0caF+aWp+B+MS37NZneJZPTDehhLuGK0C2
bjRBj3Wn7Z/gUOvXDaxe9V1+Xie+cyl3+BvR9SQGfjdryLf2ACZi8m4W4kR0mM1sd3VoyuQR2I406kWTH6esBiVGPUmY8zB8YjR9ET4kNPwXjkZRfn/zXEc/gDJgAOYwI6+ug3//PRL0DH4R7B27gYCSGM0Y9djtGWEEsTfADjCHeMNsDrPjpIY6Y1vaG3izH6TwTEaCG0t0fXAywDH6FxFMToZVhjxehcVIY4bsIeGEcGpgXvFCaN0Xs9nB+TjolIhfjQdR6Oeixw4bfbyvgP1sME0GF8gPqQMCmHu7ACmTABsxnP44IAxmgMeuNPGB/EaALuwzYYn/kF7SDoiVMao/+qHl7HPzARZj3uQYkIjtHmWGHF6JdWaDbFUusxK9AQkjXlt4VLo2A88hMaQHyqNj7CHuzAZvwRMs71cAyfh3CFaCKuwxKcggm4zehux7oHmuErpEXAyMlG3IJYn8doCwhiMch67cMZo2fjDRyH8aktqI/OSNcYdTjQAb/DRLDDeANnRWiMXoLVVmjOt2K0Hr6yHvMj6kOypvwSonFoih9hPJCJkT7friIeZVEHtVHDBTVRH5VD9FqUwhs4CBNBjuINlPdoC6PO2AYTQQ5hGIr7OEYbQXA39sP4IEbPx3SYABiNV3FSY9QBtMY2mChwAsNRJAJj9Fbstm7BT7ZWyjfEIitGf0A9iP9pjBZCTxyB8cDPaAUJGXUBPoWJYFNxLsQlxTEYuyP4B9UMNPFhjK5EQ1yM5TA+iNEq+BAmIPZjDzI1RgG0wO8wUeQoekdgjD6IdBhHGl609hBthMVWjK7HxRD/0xgthRdgPDI5ZCfyqBh0wlKYKDAD57i0SGkyjsJEuBVo7bMYXYQO+AQZPojRszARJrA0RlthHUwU2otmERajD+AEjOMYnkZxiOMsTLSPDEUbiP9pjLb08FZUOgaEZPW4isW92AETRf4X4uNi62ABTBRZh5Y+itGt+AyHYcIco8kYg+MwgaQxWgPfw0SxGSgdQTHaE8etkdGXURLiiMXLVoyewNUQ/9MYvQfLYDywGo0gqsDuwF6YKPQRqkAKqCg+golCa3BRuGPU5oMYfQAm0DRGX4MJk4PYjA04DBNGt0RQjD5pReZBDEAyxFEYI+0jQXEFxP80Rl/Fbg/n/RWGqAK5DFtgPPQ3NuAXzMf/MA9LsT4M33jHFPBrKQZP4QhMlPoWNTVGAdTAVpjA0hitiZ883K1iLWZiOJ7Eg+iEduiFIXgJ07EUx2A88jaKRECIxmCUFZm7cD3kDEl2jOIEroP4n8boZzAeeRGiCqQelnp83OSruB4XoRpSICiMs3EhuuMtfO9R4J3C/fmc8hGPB8I8cpGBQzgBE0afoJjGaOWieN8n23Edytc0AY3RGDyPTA9OwFqIvqiThzUQZXA3pnq0sOo3tI6QGH3Tisy/0BZyhhg8YT3uFPoiFuJvGqPfe7jK7x5IvqlzPDrS8wg+QVuURGweNgc/G908mvqxA80hedQKezwOjH1Yj6/xPp7HAxiMKZiD5diFdBgPPYPCUR6jbZEB46FT+Bu/WV8XD2IwpmIefsSfOY6qaYwWx+ce/ILeHyULeBJUI0z24HPaK0Ju04+wR0bRCWLpbscoHkcCxN80RjfDeGAr2kM8UhJXoy3a+NTVuBYX5uI88WSM9uhEl+tRDFIA5fEaDsC46N08/mCohK9gPLITU3EzqqAUiiABgjikoBjOQQu8inU45eG2QJdEcYyWwzQYD23A27gG52bzdVEcZ6MhBmAujmqM/qur8BuMS3bjPsSE8JjdER78oilB5oTkOCsyd6ItxNLNjlE8gjiIv2mM/unhqUutIB5pji3YjU0+tQ17cjm3p7UHR9l9GeItQRLRCzthXJKGrpBcuh8ZHi60ao2z8/FDqi6e83A0bDbKBDRGTyPdhpPIsG/bIhOdIY7OSPPwYx2PukjNx/+nCrgNazVGHUAMBrg4RegEHkUCJIQa4QeXt1IsHPAYLY
XPrMjcgCoQy512jGJQAGJUY9TDCfs/4mKIR64O0L6RX+YwElkSX3rwMdSFuKAH9vtg/9EL8buHJ6GUDcH2XTdju0cfc4eAxGga1mIcXkA/dHLciOvREbejP57DGMzFXhzF9RAUwhsejpL3D9EZ4hdhkcYogDi8AuOStS4dupGEZ2Bc8jFKBjxGy2G+FZm/ohzE0hmnrZOa/ouyEIjSGF3u8Ua8V2F/gPbOTIHYEIN2Lo/cLENjl/dEfQrGJSfRJxfzWYfAuOwYnkJyiH+xWg/jss9wrk9j9CS+wXO4DU1xNuJzOaJYGjXRHnejCgQt8DOMy3bjLkgI1cQ8jVEn6tzdL/g/Lh5h3dfFRVfTUSwCRkbnWDH6G86FWC7DTuuxM1EDYlMao7mnMVoBX7k8WnMTxGXV8a3Lq8KzC8CmWOPBgpSXrEAKlXYejJBm4DYfxuh36I1akBDr71FIPw1xQTWwsltHRtEdIzAYT4TAYLyCQS5uIJ+a476oGqNV8LU9MppFjDbDZhhHJj5EFYhNaYzmnsZoZ5cXsoxFLMQD3Vwc4d2OtpAs9PJgy5d5KAtxySsezHedgKI+idGTeAeVIS6IxwQYl32O8hCXtMffuoApkJrhV5gsaYy2wi9WjC5CKYjlUuyyYnQaqkIgSmN0GZpCck1jtCjecnkLp7sgHqmLdS6O6r2cTXSM92A7rO4QF1XB1x6ca32jD2L0GPpDXFTJo6Ng+0FclIi3NUYDp1KuToDTGO2IrVaMTkURiOVibLAe+zUaQCBKY3RpnhcwaYxe5vJcwakoA/FIUQx1ccXrVygMsTTDz16NKLpsMIzLhoQ5Rg+hB8Rl13qwtd0S1Ie47HYc1hj1tUSUQW1cl+tFqRqj18AOzI+QArFUxizrsVvRBgJRGqO/4kpIrmmM3gvjoqcgiEWiy2IRg47YAeOCLbj03wLOg9vb90A8cIUHOwJMRnKYYvQkHoR4oD8yvAh7D1TGLI3RsEhAIgqjGEriLJyL+uiEHngaX2ALDsHkisZoe/xhBeZ0pEIs5+B967FbcCXE3zRGV3p4AlNPSK5ojMbhJZc3/p6CO9ELD7vsQfTGWBx08dauvao+FmNgXLTJw1+0Uj04/GAVLg5TjE7y6DztGAz1IKxvg3jkKWRqjIZUDOIQjwQkIR7noTmuwD14CE9gAj7GMvyB7diHwzgJk3sao84Rn71xzArMN1EIYimPj6zHbsIVEH/TGJ0H45FhkFzRGK2AGS7/oPwHf+FvD+xy3h5yeTTqHSRCHMUw24Pz3atDPNLPg/mBt4chRvejDcQDCR7sL/q3x6fO9UKGxmiBxCERJdAMN+MBvIjRmIkv8APWYB2242/sx2mYkNEYjcUQKy6P4xHEQiwVMMN6/H50hPibxuhYD28ZvA3JFY3R+vlarKLmozjEUc+Dc/KHIxXikZtdPpnpOB70OEZP4n2Ug3gg0YMR5oVoAPFIVxzRGM21GBTCebgRj2AUPsLsM0Y3d4c1MjVGn7Ti8ij6QGBLwrPW40/iDoi/aYz2xy8ehkJ5SI40Rtvna3GFWoIy1obxG2Fckok7IR66zIM9U19ArIcx+jduRoyHMfoOjIumoqrHv4Bv1hjNUXXciicxAQuwFekwvqO36R9FhjUyOgixENgewGnrSNDuEH/TGPVkVZ9jJ26A5Ehj9O58jX6pX6zN0W/Eny5vFN8e4qEGHmxJNAoJHsboNlwEiaAY/QTnQzzSDjs0Rv9VCbTDUMzHIRjf0xhNwAiYM+zDnZAs9EeGFaPdIP6mMVoXk2A8kJnreaMao91wACZP1Ho0hDiuw1aXY7QjxEPnexCHb3kco8tQIcJidIbHc4nbQ2PU4UhFZ/wXu2ECRWM0GVNhr45vnscYvRfibxqjRTEUxiOzPdqP8VoXQtrrGM37qnO1CS0jPEZrYmYExWgmJuMsSATNGZ2Oah7fpt8EjVGgMaYiEyaQNEYL4wPYq+PrQ7JwC05apzANQyzEvzRG43
Et/oDxwF6P5tjVxywscN7OzqM5mIJPsTsMMXpnvm4lqb3oEeExWgMfR2CMlomwkdF5uADikQ7YrTFauRSGYCdM4GmMvp/HkdHWOGo951UkQvxIY9ThrD6eCOORL1ES4qJ4lHQUQ/E8KoFCqI7VMBqjgXAIAxALwfXYFmkxGkkjo45pOBvikVi85ME82Gsg3iDUdGunOvgUJgBO4GiO+4/qbfr/WWG5GRdBsnAFDlvPeRHxEB/TGHWCdKCHtzP24WEUgvhcM+wJQ4x2wS6YPFGH0NfDGM1EV4iHGuPbCFvA9KGXMep4BMZlvbyP66iN0RqYDeNzO/A5+qAHvoL5VxqjlfCNFZZLURaShStxxHrOCxqjwYnRa7ENxiOrUB/iY8XxfpgWMF2KVS5HlIlAB9EHcRA09WAbpIchHroC6z04KjYmwmP0Vg9+0XwDSRCX1cWiKI7RVIyC8YEMHMZG/IAZGIFHcSsuQ20IymEqzL/SGK2F762w/AalIVlogX9gn2VfCeJ/GqOpeAPGI5mYgbIQn+qKI2GK0XpYCOOSNMzDyxiJUQE3EpMw3PolpywWeBAcJSAe6YYMlze9fwAS4THaFD/BuGgr2kFcdh/SozRGE9Ebf8N47DT2YTHG4xkMxO24Eo1RDSnZjOZ+AvOvNEarYpEVlt/iLEgWamGD9ZyvcSHEzzRGHc7ct0MeB+kbiIP4zFXYEMatncpjusvzle6BIBGFI0AR2KNQhTDVgznQdT28HfukB3Mdr4uCGC2Dz2C8mvLgkoqYBxOlMdoIa2A8chwrMRoDcTMaoDgkj2plO/9bY7Q+frTCclEOI6PnYZX1nC9QGxIMGqNn4W0Yjw1BDMQnrsYv4dxn1PEcjIv+A4lwMRjhwQ/qjhAPVPVgXtx8VIyCGI3FCI8O+7gJ4pLXYKI0RgtjGNJgXHYEn6I/GodoEKVatoMOGqPtYY9yzkQRSBZq4HfrOV+iDsTvNEYdzq2FI2G41fEyykDCKB6dsRbGBzF6rwcrpmMgEa4XTkRI2N/owe3I0YiJ9Bh1POTVMbUujZ7fhZ1RHKOVsBjGZXvQz4XpOBdgLsy/0hjthu1WWL6TwzZNVbHces5PaAEJDo3RwngDJgw+RwtIGFTCq0iD8UmMXoO/YFzyNWpCIlwtfA/jopW40INflkbCeLUgKwpitBV+hfHAcjSChEgPHICJ4hhtgkMe3Ja/zsUjXLfD/CuN0a7YaoXlWCRBslAcE63nHEBnSBBojDqcW3SbYMJgN55HFcR78MO9OgZihZ+OA3Wcgwmuzht1wiMKvAnjsucgLrreg1HRFWgURTEaj2EwHtmC21CoAB/veRiOdJgoj9HuOAbjknS8BHHJndnubKIxegv+tMJyXA4xWghvWM/ZrzEatBh1OP9JDsOEyRo8i/ou3EougfZ4G+uR7qOz6W09Xf5mNTNMc/USEO8CycJtHnw9b8e1EBeUxecwLnsMMdESo46rPP5edwzTcCsqWq+3LRaCZFyKN+yR3CiO0WQP7hSsRTmISx7TE5j+nXN85xM4BuPIwDDEQLJQBG9bMboX10OCR2M0CdNgwigNv2I6+uMylIfkQUlUwOUYhP9igb2nqo9j9AKsdnl09FGPz1WfgqX4HHNDYA5+xshsXscymArjsoWoAQmhYhiFUx7s0XolJMpi9BxMh/HYTizHuxiEu3ADbkQnx0CMw5fYAOdrABqjxTAOxkVfuTivvmK2/680RuNhj3AeRk9INlIw1npeGrpBAkZjFM7Rak4Ihd9B/ITPMAmjMAx90QcPO+7H4xiH9zADc7AKR2Esfo/RBA+28tmIGyAuK4+xLodgag6Lf054sF3ZZyE80CEFbyIDxmXjUCLaYtRxDY7BhMlRbMcGx2+OgzD/SmO0qAe7vyx0cevB2/I851djdB9uh2QjFv2QDuM4jQGQYNEYtfceXQ/jQ5k4gTQcdxzzbPTAgxh11MWfMC76FZ1d3tPxLaS7+MtKzxxGMUpjio
eLVboU4AdZDC7CJI9CdC+aQ6I0RotiPEygaIy+A+OixSgMCbFCeBcmSxqjcXgZGTCOQ+gBycENOGLd3n8UEkQaow7nh+pfMCosMRqLgTAu24WHXTi+8HxMgHHRTKRActDCw5Na9mE4rkaVPEwruRxD8BOMR55DYrTGqKMuNsMEgsZocQ9u0+924QStWDyR4zxljdFETIY5w050gOTgNthzTf8DCTCNUTh7Xnq40Edj1FYSS2E88A4uRklIAZyHHljlwdziOyC5kIhhYdghYiYeRlu0QB3URl00xeXojrH4IwyLBStCojxGY3AvMmB8T2O0kEfbEM5HKiQEEtE/V78Qa4wWxjdWjK7L5eb1N1sxmoknIcGnMVocr8N4T2PU0dHDOWS7MBVdUAelUATx2XyTTUUFXISeHo7sjcvjUXxVsBwmDE7jb6zBSvyMv8L4i95h3AiJ9hh1JGMcjL9pjDru9WAe+Cm8HIKvz/PxmsdrFz5ESoBjdKEVo6tRGZKDq3HAeu5IxEGCSmPU4SxueB77YLylMeoY4vGc2HRswxyMxXPoiEtwrfP2NryM97ACe3AaxgOb8rnZ/MX4FSaKncBTiIXGKBwlMQ3G9zRGW3u4Ldd8dEJ5JORyT9gyuAB98DuMx6YhNcC36edbQfkLakFy0BA7rOe+hxRIkGmMOpwFGV2wF8ZzGqOF8boVpF7KxH7swT7swWHf/oDMXmf8CROFTmEEnPnBGqOWc/AhTECsxCKcjrIYrYWNMB45glUYhwHojIvQyNEY3dEfw/EdtlqfFy8tRPWAxmhpLLKC8mdUg+Sgfhab5ReGqAiIUYfzg9zruW0ao44i+AAmimXgcUgB3YrtMFHmAxSDaIxmqQTehQmA7rgT6VEWoyl4BWlh2gd7D7acYSsO4BiMD6xBq4DGaBOssYJyDpIgOWgAe2R0LApBVKTEqMNZiPE1TBQ6jWPI8DpGHeWi+FbiCbwSwon592BXlP3ycy5EYzRHZfCCj39h2YUnkYBbcDKaYtTRFFthVETF6GVYZwXlDEgu1MYW67lfoAZERVKMOpzTZp6JslHSrzAEizweGbVVwHichIkS+/E0SoR4q5Ue2AgTwY7gLZSDaIzmWmG0xyxkwvjEcnRFPAS9ous2vb3fqP19EGobOgU0RpvBHhmdiVhIDipiqfXclWgBUZEWow7nh/m1mBsF26L8hEaIwfBwxqjjbAzGDpgItwP3uHgiSmvMhYlAP+MBlIRojObL2XgcS8K8zV06PkJDiCNqY9RRA5/BAMqRjj4BDNEYtMGvVlB+kssYLQt78dMSNIaoCItRmzPiMiyCVyl/jaYQx0BkhjNGHTHojB9hItT36OjRwpVROAoTIT63vm41RgumNp7AIuz3+EjkuXgQKRAPY/QAOvs1Rh2X4jeYAMn04HvN8wGM0ULoge1WUL4DyYVz8Z313KVoAlERGqM255vSiAg6zWQ3RqMa7MUvJ8MYo7ZKGI89MBFiJ15DZYiH7sZamADbjpdRAqIxGnJVcT9mYBMOuDgquQj34ixIFh50MUb/Qns/x6ijJ/bDBMR7GOXyFINJSApYjKZgIPZYG9cPhuTCOZhnxegGXAOJBhqjDme0ri7GYwsyYQJoMa5FYYilNY76IEZtHTEb+wK+r+H/0AqxkDCoglexJYAB/wEuRQxEY9RVSaiJbvgvVmIj/kZaHqcuZeAwdmAVxqN9LqdX9HYxRhejod9j1NExAHeJ9uFVpKKxy2sufsSFAdzwvg/+tmL0KUgupGCEda59GnpCoojGKOwo7Y9VOBmQeTbrMBBlIVmoF6JvItNdOCWjKDrifwGbT7oXH+IqJEPCLA51MQI7kOnzEfyJaI4ESIikYgaMi6ZaMRpUSTgbNdAS92AoRmAmfnAsPsNSzMc4DEUXNEDZPI5oPe7i1+c81IJkoyl+DHeMOmpgNA7B+MhxzEIHxEFwHr5weQuqAQGL0WQ8fGaMOoZBciERQ3DCitEekCijMW
pzRpo64HWswmkfbgT+Ma7P5UbBZfElTAHNcvGUjGJoiqewzMd7hq7HUDT26VnKsaiHh/Cjz6J0M15BKxSBhFhRzPZgu6lzIBEoDoVRCuUcZc9QDmWQUoDFefEuH9f8Kc6DZKMlVvohRh0JaINFMGF2DO/icthTLYrgGZd/Hn6GUgGK0ZJ4C2kwjgO4H5ILhfAsTsI4juBOjU+NUQDOiFdt3IBX8G0YbymfwHd4Du1QJo8/pKeE6Ji5ohAXJaAyrsNwfINjYQ7QVXgdXVEHsZAAqIy2GI7vkR6m1fETcA8ucHlOWDEsgHHRHJSF5IsqjUkwLhmZizsVl2I9jIseguRRRfTEe/gbxiOHMBtP4LIc7n61cfn7yFE8gISAxOjZ+BDmDJvQIQ8x+iJOwThOoy9EaYz+W5imoAE6og9ewXT8hIMu/Ga9Bd9jMh7DTWiApHyOeLTAQ45eedQHD+Maj79JJKMBrscAjMMCrMZBl25NbcJCvIvHcQuaIRUSUMloiK54HO9jiQvTItKwGtMwBLehpYe3tRNxDR5CH/QKoT7ojatRGJIvqgG+hXFJn1zuRNEZD7vwdfIQ+qIOJJ/K4ho8iklYGuLvd3uwFFMxGB1wPuJyObBxF/qjl0v/xy4LeIz+hqshuRCDu3DIusaLEJVjjCrnP0t5tEI39MMQvI6JmIrp+AarHavOsBpr8BWm4n2MwAt4CO3RCKXB3wdVDHVxCe7AY3gK72A63senWIWfscryM1ZgLqbhQ4zFixiEu3Et6qJUhI9OXYzr0BvP4i1MxhTMxUqssqzGWnxxxmNH42kMRA9c5lp8qjK4HnfhVtwSQt1wAypCXHSLi/MjT6I9JIKUxMW4HQPxPCbhQ3yKpfg5i58tc/A+pmAUhqE/uuJilIGoAsXoWZiI0zCOzegIyaXW2G3F6BsQpTFa0I31U1ECZVAflztan+FyXIm6KIFiSMjzqmIVg2SURlFUwGVog8us17wNWqI6SqMEkhCrr2HlOBRFcVTDpWhtuRxtcD6KOuL1a9YzjbAcp3AI/2BfiBzDQfR3+evsORiXbEI9SASLRTGUQEU0w5VZ/GyphqJI1f+nrsVoedib1q9CS0gutcU/1jVegyhfxahSSinnl6ivPViglQhxwRUuHzQyDkUhSnkUo1WxAfbZ8hUguXQN7Bh9GzEQ5ZsYVUop5cxVfRbGRX/iLkiIpbi87VYmboEo5XGMrrdCcjbOguTSxfjLusYMFIcoX8WoUkopZwFVBoyLtuJGSIiUw4cubzO2BlUhSnkYo5XwuxWS83AuJJcqYAPsa1SAKF/FqFJKKWel9bcwLjuAx1GxAHuKFsLVWArjsrshSnkco3Ww2QrJOXkcGT0ni9HVshDlxxhVSikN0h4eHsSxFhPQG5eiBqqgBOxttcqiLjrheSzAARiXrUQFiFIex2h72KcvvZXH+Z7lYI+MzsLZEOW7GFVKKeUcxLENxkMHsA5LsBhzMRkTnLcf4yssxx/I9PDj6olYiFIex2hX7LVC8gVIHpyD5dY1VqAFRPkuRpVSSjm3zfshDSaKHcObYTyWV2mMdoI9Mjockgel8LF1jT9wI0T5MkaVUko5+8FOQgZMlJqlt+dVmGP0NuwtYIwWx0RkwjjWowNE+TZGlVJKOUdOfgQThRajBUSpMIVoEgbjMIwjEwMgeVAC9pGiGzRGAxGjSimlnAVFM2GiyDJcAlEqjDFaDK/iOIzjMG6H5EExjLeOFN2NuyHK9zGqlFLKWcE+BybCZeITNIIoFeYYTcELSINxHMWdkDwojCewD8aRgWEQpTGqVNCipCbaoCmahYRq4SgP8bGqmIgTMBFoD/qhJCRAiqA+WqM5moXQJWiCMhDleYymwo7RNNwFyYME9MJO63b/0xClMapUkEI0Hm/jALbhzwJT2/GX46kAbB+UjK5YFWGjoV8E+LZ8I6zCXuwI8dfmLmzBfRDlLWd/0E+QAePYhvaQPEhCX2tV/mkMgSiNUa
WCFKMJ+ATGFeotxEECoCpewBaYAJuD7igNCaiWOAbjokchyvMYPQ8/wJxhKRrnI0YHYo91rTchSmNUqaDF6BQYV6hnEBvAEBqMBQHaAmoH3sNdEXLWfENshXFJGh6AKM9jtCoWWwG5CBdC8iAW12Ojda0ZEKUxqpRojCrHswE+5acGeuBVfIJffbZx/WpMxBNoh6KQCHERtrkcow9CVL6CsgiqoTHa43pUgeRCRSyAPTLaBJJHDfGTda3PIblUEzfgWjRBecRrjCqlNEY1Rv0mHqVxJQZjDD7CXKzCNhxApsOESCYOYQdWYDam4A30QXOkQDykMaohmoh2mIaV2IEjeCYPAbjOCsgvUBOSR82wxrrWZ5BcSMFoHMF2rMREtNQYVUppjGqM+l08CqEsWqMbBuJ5vI55+ApzMQeLsAqrYVuDZViIOZiPzzEaL2AQ7kZLlEKCD+bgaozqavjHcciKwJlIguSgCeyjQMchNUQxOj+XH0dVrLeeewr9NEaVUhqjGqNBE4NYxCEeSWeIwzlohKa4+AxN0Rx1UQJxSHLEI9YRA/ERjVHdJ3QQ9lkhtxLV8xmjbyERkkc1sMS61k+oCMlBS9j/huPopTGqlPJ3jCqNUaUxqjH6CP6xQu4gekJy0Ai7rOeORTIkj4rgC+ta61AbkoNeOGk9Nw0PaowqpTRGNUaV0hgNWIw6xiExFyOSe63nvYPCkDxKwjzrWmtRLRdHif4PRmNUKaUxqjGqlMZo5MToAqRAsnE7jlnPGwTJh0KwR0Z/zsXK/urYpjGqlNIY1RhVSmM0smJ0TS7mjd6P49YRng9A8iERM2DO8Bc6QrJxLY5rjCqlNEY1RpXSGI2sGN2D7oiBZOEOpFkx2geSDwl4E+YMR9Arh+cMhtEYVUppjGqMKqUxGlkxmomJKJzHkdGHIPkQj2dxCsZxAD0gWaiEGRqjSimNUY1RpTRGIyxGHUtRJps5ni8jwwrAWwswMvoKMq1V/dnF6GXYoTGqlNIY1RhVSmM0MmN0YzZHe5bABOvxu9CmACOjT+MEjCMdjyMGAls3GI1RpVT0xKh6WmNUaYxGVYzuRg8kQSzFMBr2PNNrIPkQix74y7rmeCRCLAkYrjGqlPJjjP4PxhVqJOIgSmmMRkWMnsQklM8iRsdYj/8HbSH5EIPO2Ghd890sYrgWPtcYVUr5MUanwuAUTnvglCMTxiOZSPfo35iBTMebGqNKYzR6YtSxERdALFXxDewTk5oVIEZvxh+wT3RKhFiuwx6NUaWU32I0Bk3QDR3QyQMdHC/gGIzL0vAi2nn4b7zeUR2ilMZoVMXoAVwNsVyA363HzkP1AsTo7dhqXfNTJEMs/4GBxqhSSjkh3BIHYFx2CK0gAZOKJBSCLRmichSPlCxexyQkOzRGC64QkmG/1kUQF2UxegT9kAA7Rtdbj52FypB8aobl1jV/QCnYR4eO1RhVSimHE6OdcRjGZUdwE8SHyqExOqAb+uARvIYJGI0xlrEYh1cxCH3RA1ejMSojARJFyqAJOuJ29MEAjMR7WbyOozHe8SQeRi/cjCtwAUprjFqIezTEteiGAXj+jK9L+7Uej5F4BH1xC9qgEUpEaIyexlRUhD1nc5X12Pk4H5JP58I+EnQpzrYeVxtfaYwqpZTDidGuOOJRjN4MCbMUVMe1uAdDMQ2rcQAnCjCP9iR2YxU+xQj0wtWojVRIhEhGNbTFg3gek7EGh3C8AK9jBo5hO77DJAzBHWiNWkiOwhitho7ohzFYjn04kc+/exdWYiL6oRNqISYSYtSxB60gZ2iCP63HTUd5SD6dhwXWNZfA3uv0ZuzSGFVKKYcTo108jNGuYQzQC9EFz+ML7MUJZMC46CR24Wu8hs6oFdAwLYSauBHPYx72I92DhXAZSMOfWIDn0Al1kRrBMVoMrfAwFuIITsGE2CkcxLd4FC1QMgJi9Diuh5yhHY5Ypy+9hLgCxuhC6+9egXLW44bBWDIdGqNKqQikMZ
qCRhiC1VY0hUs6lmAwGqIwxOeKoi4ewBKfvI6ZOI6leALNkRJBMRqHi/AqtuA0jEcy8A/exsUBitHDOGCF3ikMRizE0cGK0QwMhRRAScyy/u4/UBfiSMQk6zFH8TuOaowqpSKLxmhxXILx2Ip0GJ85gvUYh1ZIgvhMIhrjv9iIozA+lIa/8DYuQOGAx2hFDMNGjyPUdhq/ozcKBSBGN2M50q3g+wxVrJHRw1aMPocYSD7FY6L1925Hc2u+6NfWY3ZgDP7UGFVKRQaN0SS0xWzsRgZMAOzASNSF+EQtjMYOmADZjHG4NKAxegd+gPGRExiNcj6P0W2YjhXWufM7cbm1z2eadZv8WcQWMEbfhbE+nsYQx234C/a80sewRWNUKRV8GqON8T52wgTUWgxCPCRMimMA1sME2Ha8gjI+j9HeEMTgWeyH8anZuMDnt+lfwtPWyOdJ3A5x9LHmaKajB6QA4vGeFZpb0RDieNl6/zG8jEHYrTGqlAoujdE49MImmAgxEdUgHquPT3ASJkJ8ict9GqPH0QWCV2AC4GtU82mMZqA3roV9wtETEMdg2FHYFVIAcXjbuu4h3AlBEmbCnlN6PfrikMaoUiqYNEbPxUicgokw3+FqiEfa4jeYCPQbHkS8z2L0b/TEHTABMgWF/RajjttQDfbt8Jkok8XIaBpuhRTQUzBWHD8GQW0sh32LvhjuxSmNUaVU8GiMXopvYCLYJnRz+bZ9PO7BDpgIlo7HkeijGN2BxdgOEyCZ6OHTGL0dyVhj/fnfuBKCgbD39bwNUkADcNJayd8Lgluxw/p7Z0DQU/cZVUoFj8boXfgTJgrsw/2IcWnf0MdxCMYRDUEa74cYDfiI/h+o68MY7QXBWzCWHhCMsP78MNpDCugxZFgjoz0heNN630k8BkFv3WdUKRUcGqOx6IV/YKLIXtztws4DT+IETBRJxyDEFDhG1ROI81mMPgJBFxy33vc0ymKy9ecb0AJSQP2QZl17MIrgS+vPt6AVBI8gQ2PU/5RSGqNx6B6m1fKZOG0xHluPjpAQeQhHYKLQLtxc4BhVS1DDZzE6GIJGsN/3IS7HWGRaMdoSUkA3Yov1d05Ga6yw/vwrlIBgME5rjPqfUkpjtAs2w3jkKHZiMd7DG3gTI/AWFmIHDnq8krkqpICuwHaYKLYMdQsco+pWn8Xo4xBUwSrrfb9jEEbihLWq/RJIAV2JX6z9Sz/BQGyAOcN7EMeTuoDJ/5RSGqNXeLjv5Qa8gy5ohPNQBsVQ3FECldEQ7fACNsB44KMCntbUEMvDfKrPCaTDEbZTht4tcIyqt1HEhzFaHBOQbs3h/AafWrfTl6IBpICuwa/WUZ+z8AkOWvNFB0EcQzRG/U0ppTF6Pn6Ecdlu9C/AefEX4VkPXstMPAnJh1KYAuOxVRiFJ9ADndHlDHfiaXyKvTAe2Y8bCxSjaiVa+jBGY3A77BXsu7DGitRpqAQpoPZYb+0z+gvsj/EXtIEgRmPU35RSGqMJeAbGZStwFSQEbsQSD+Y8VoXk0b046eFCoQ9xF+rlcvSsIq7E89gF44EvkOTjGN2KeRiFPuiN/o5eeAYzsBRpMB7biQ5+i1FHdayz3r8Hm6z4m4hzIQVUC1/BOE4jHcbyEYpDEKcx6m9KKY3RK/A3jIs+R1UXTjL63MWtezLweh73zDwPX8J4YCl6ohQkn67y6OP9Aw18GKO/40Vci2o5rFo/B41wBz7y+AStU3jQpzFaEous9x/ELiv+pqMKpICK4kOYHLwA0RhVSin/x2gqxsC46DOcC3FBVXzl8v6jjSG59JgHczNPYjKqQEKgPN5z+eM+iVdR2Ccxug/vokEBtuy6F1thPDLMpzGaguHWFk+ncAIZMMjEeJSFFFBpzIDJxkncpzGqlFLBiNE7XP64fkAdiIvaurzw6vVcjj6WwywYl41FcUgIlcJol0f7VqG8D2J0J+4P0YKg67AWxgMv+jRGY9DWXsluOY1HEQcpoLPwMU
w21uJSjVGllPJ/jKZiBoxLDuMGiMtiMcLllf+5GR29B/s8mHtZE+KCcljm8rzMlmGO0YPoHuKTtu71aEHYRCT5LUYd52IxTBZO4SFICBTGezDZeB9lNEaVUsr/MXqbywtYpuMciAeucnF/1JO4DZKNGEyGcdFhdIW4JAYvunxrvBcKhzFG30UyJITKYgaMy6ajqE9jtAhm5hCjAxADKaA4vAyTjScgGqNKKeX/GH3f5c3sr4N4JBETYFwyAiUhWajnwer+8SgBcVEH7HFx5f9rKBOmGN3g4qjyYx7FaDGfxmg8XkNmNrfpByAWUkCxeAEmC+norjGqlFL+j9FK+MnFVejfogLEQz1cXISzJZttqWLwIPa4HPetIS4ri3fwAz4JkRn4Eh/hThQLQ4ymYzjEJTdiV7TGqKMt/oCB7QSug4RALF6EycLPuFRjVCml/B+j3Vw8YjMNb6MiElEMRV1SDIURi7tdfo0fgMAWj7EwLlqLqhAPlEEllA+hCiiHoogNQ4yuxSUQlzTFkiiP0dKYDwNbmhWIBTUUJguTcY7GqFJK+TtGY/Csy9v4/IwPMQFTMcUlUzEJY/ED0mFcMgQCW3F8B+OiCUiBRCAvYnSWy/Mtq2FWlMdoPN7PJkYvh4RIH5yGga0fRGNUKaX8HaMp+ARG5clElIJYKuFXl+P+YUi+aYyOh7goBaOjOkaRzVzOE7gCEiLdcQTGko6OGqNKKeX/GC2D72FUnixFM4jlKvwF45J/cIXGaIEMh7goCSM0Rotfj50wlm1oBgmRHkiDsfyEJhqjSinl/xgtix9hVJ78gfawpzwMQprL+3PW1hjNt6N4AOKiZLylMVq8JGbDWJbjwhDH6FHYpy69iTIao0op5f8YvQibYFSe7MctEMtAHINxyWacD8k3jdH7IO7QGLW8CWNZi4sgIXITdll/xzHcAdEYVUop/8do83ydqa2OowfE0gdHXI7RGhqj+ZaGBzVGPYvRgTCW1WgACZF2sLeR2o/m0BhVSqkAxGhT/AGj8iQD/4GcIQEv4aTLm7WfBwkAjVGN0fbYaj3+T1wMCZGbsdf6O35EdWiMKqVUAGK0Rb5GRlUmnoKcoagHOxP8gvKQANAY1Rg9C/YWTydwAyREhlnXz8BrKAqNUaWUCkCM1sYGGJUnmbD3Gk3CGzjl8shoFYjfaYxqjDr6w47FIUiAFFAKPrCufwAdINAYVUqpAMToOVgOo/LsaYjlQZc/t5t0AZPGaMBi9CYYyyxUgxRQA/xgXXsLzoNAY1QppQKyz+i3MC45jf34BwewHwcC7iAy8CjE0g9HYVyyDXUhAaExqjHaEL9az9mLa0K04b09X3QxzoJAY1QppQIQo8UxH8YlhzAOd+E29MDdAdcDfdEQYrkD+11+PW+ERJAYjdGIjtEkjLCecxrdIAUQg1et62biNSRAoDGqlFIBiNEEvAPjkgP4D5IhUaABNrs8V3U4kiAeqIOb0MV52zlEbsYNqKYjo5Edo45uVvxl4oUCzhsthpmwR1zbQRwao0op5fcYddzr4sdzHG+hNCQKnIUVMC5ag0oejVi+iKM4iH+wN0SOYQe6aIxGRYy2wWHreV+jBiSfLsYy65o/o6LGqMaoUip4MXqpy9s7fYvqkCiQhI9hXHQIl0BcloBPXT444BqN0aiI0XrYEMJ5o7HoAfvs+/lI0RjVGFVKBS9GU11exHQAzSFRIBZDcAzGRaOQCHFRU2yEccmvaKAxGhUxmoq3kGlt8XQfJB/i8Bbs+aIvIU5jVGNUKRWwGHWMwGkXV9S/gkKQKNAYP3lwHOmNLt+i/9Dlj38UztEYjfwYdXTBUeu5LyMekkdnYaF1rV1oB7FpjCqlVDBitAV+hXHJPjSDeKgk6jhqhlAdlMhhdPR9GJctQzWIC1piJ4xLdqML4jRGoyZGm2Gf9dw5OA+SR1fgN+tay1BSY1RjVCkV3BhNxDQYF73n4SrwCzAVizAfczCvAObiKyzAqFzs9d
nHo8/xBygOCaEq+ALGRetQXfcZjaoYPR92QO5CR0geJOBh7LGuNQ1i0xhVSqmAxKijD9I9OLUoAeKiUi6H9WSUhWSjNGbBeGASzoaEwIWYAeOyCUjWGI2qGC2JcWdGoGMoJA+KYwoyYBwn/v3j0BhVSqmgxWgxTIFx0Sm8hlSIC8pgAoxLMnA7JBfux1EYD8zFlSgFyYeiuNGjo2G341I9gSnqYjQGnWGPaH6AREguVYQ9wvoLrtAY1RhVSgU8Rh1XYgeMy2bgQkiIFEFzzIZx0ew87PGZ6nIY23bjfbRDSRR2JCLJkowUVEInTMdhGA9MRILGaHTFqKMWNlrPX4J6kFy6Gvbc0+lI0RjVGFVKRUaMxmI4jAfW4iGch3hIPsSjDl7FbzAuOohOkDzohL9hPHIaO7EEY9AXd+Me3Oe8vRfP4mP8hL9gPLIJzfVs+qiN0fJZrIK/A5IL8RiME7BX5QtsGqNKKRW0GHXUx2IYD+zHdxiD3miGUlmM6CWiMIqiKfpgDFYiHcZl0/IRAIl4CpkwHjuNPdiG7dhxxtujMGEwFDEao1EboyUxEmnWNUZBcqEUPraem44HNEY1RpVSERSjjquwEcZDh7ASszEJozEGYzHGMQmzsNLt19DyG2pA8uEcjISJcv9DSYjGaNTGaDw6Y7N1jYVIzuVt/q3Wc1ejhcaoxqhSKvJiNBHdsR8myu1HR0gBlMDHMFHqW9SCaIxGb4w6zsES6xrr0RCSg3Y4aD13PFI0RjVGlVIRFqOOQhgEE+X6QkKgMpbBRJktuAgCjVGN0XjMtq5xDD1zcR79EGRYz30U8u80RpVSKtAx6ojBUJyAiTKn8WyIz4C/EF/DRIltuAKiMaox6ojDM7AXIU3IYYunMvjces5xdNcY1RhVSkVojFruwh8wUWIf+iAOEmKlMREmwm1AG4jGqMao5Uqss66zNIfjPOtjp/Wcn9FCY1RjVCkVBTHquAILYSJYBjbhToiLUvAs9sBEmJOYi0YQaIxqjNpKYp51nc1oAslClzMD0jEGxTRGNUaVUlESo45z8ArSYCLMCUxDU4gHYtEVKyNssddzOBeiMaoxmoUEvGtd5yj6IAZiKYwXrcdnoBckexqjSikVUTHqSMZtWIiTMBHgdzyAChCPNcJI7A74iPJqdM7FHFuNUY3RWAyFsXyEIhBLVcyDHa83aIxqjCqloixGLXXwEL7HKZiAycRejMJFkDBKRVtMwlGYgDiFJRiAOpACughbNUYjO0YdV2ET7Dmg5SCW1rDPtF+JppA80hhVSiknEL2K0ZshLrsAD2MWDgRkFO8bDEV7JEN8ohhuwSRs9XmELsZA1ArxKPEOl6di9PYgRkd5cIBA8YDHaHHYWzz9iVYQy4Owb9GPRRlIHmmMQimlMXoHjEfugHjkLNyCV/A9jsP4yDrMQH+cB/GxorgST2ABjvroNXwXvVADEmLNPPi3PgJxURG8C+OiOSgZ8BiNxdvWtQ6hPwpDHKkYCztGe0DyQWNUKaWcyPgJm5y3q12wAavQBhIGtdAPE7AY25AB46G9WIe5eA5XoAgkYCqgD8bhG/wF45ED+BVz8TwuQSzEJbUxBxuxFquwOgTWYD2WogvERUkYjA0h/jeswjpsxMtICXKMOh6FHZkfojzE0QDfWo87iEsg+aAxqpRSThDVRB1Uw/kuqI2aSIGEWRlcjqfwMeZjBf7CQRzH6QLcdj2KP7ERK/AxnsOtqGHHU8AVQ1u8hJn4FuuwG0cLcDjBMezDRizHNxiP7qjj4VSGQqiK2jg/hKqjJqp5MNcyBmVQy4V/Qw3UQVnERkCMtsMW63pbcCHEcTMOWY/5HvUgeaIxqpRSyomNFJRDS9yGfhiKqZiDjzEzG7Oct2/gETyA5jgf5VAYMZAIFoNCKImquBr3YSDewEzMxsxsfIbpeBF90BkXoCyKIwGilIsxWhQfWNc7gk4Qx9PW+9PxMkqGLEY1RpVSSjknHaWiOIqiWC4k/W
t0qiQUQ3EUy0ZxpGp0qjDG6P/Zu9/YqO86gOPf20FX6q6rmZuTduXWnjjnHhiVWYUqU7c5l9nMP5sNYWPFONfUZqzbLKMiqVBbQCP+UazEqBjiIwhEEn3kE8U/wUThiZoYTTQBI6FQ0qYtvX59xx7JzyYHkd41vcv7wesu9+Dyy/fZO5/fN/kEDCAmTGIXUqjFUcSEC3gc4caM0WogSZJUzhjdhGnEgjyO4nbkcAYx4a/IlSJGjVFJkiRjtB1/Qkz4B+7DwxhHTPgF7ixFjBqjkiRJxuidOISYMIFH0Y+YMIth1CLcDGNUkiTJGF3oOcSEKezDccSEc/gQws0yRheSJEkyRh/BDGLBLH6LvyAmnEVjKWPUGJUkSTJG1+IsYsEcxjCBmPBz3F7KGDVGJUmSjNEa7EO8jjmMIF26GDVGJUmSjFHgY5hELOIcPoCwKMboQpIkScYoHsS/EIs4hdcbo0nGqCRJMkZfQyiBFpxGLOInSCMs0i34gjFa+SRJkjGax6sIJZDCAKYRF7iEXoQS6cOUMVrJJEmSMQocwyfQia3o+j9txTPoxEFMIS5wEQfxNDYt4jmdeAbHMGuMSpIkVX6MTuMyxnFlES7jCvKIC+QxifHFPCfx32lEY1SSJKly3IaXcQGxykzgBWNUkiRp+cpgAFcQq9A2Y1SSJGn5ymA7LiEij1gicwXxBvKIZTCDXmNUkiRp+apBO76Eb2MYe0tkBEP4Pn6J3+EUfoPT+BkOYAgj2FtC38AetBmjkiRJy1sat5ZBTUE9VqMx8d2Eu1CHlajBrWWQNkYlSZIkY1SSJElJrWuyKWSQwwPIYRVuQViOAh+SJElVgyhbgVqsQm2VWXGdEG3FFryErXgCm/EienCfMSpJklTeEK3BRgzhW9iPr1aBb2IY7y0Som34IkbQjXegHvfjcxjGIN5ljJaPJEkyRjN4DROI1ebiiYZehCTO/WaMYgfSWIG34p24F6GgBz/Gu43R8pEkScZoPy4iVpmZ8eMNvQjX5Obvgx7CTrwO9XgKfRjAID6M2xAwiFHkjFFJkqQSq9p1oCca4vjJhom5X9W/gIDwprtb0muz2faWNdkDhdfyAR/BNqzGHXgffoROBDShD582RstDkiQZo68sjFHkMYOrFSSPiDg2H6OT8Y+ZbgSE+1uzjUxGv8y5H0JACh3oQkjowHO4K/F72BhdOpIkyRj9IdqwHg8vY4+gDe04jFnEMYz/lBj9faYbAaFl/q7o9/B2BKTwGDYjFGTwCjqxCgHPYtQYXRqSJMkYzWMbQoV5EdPFYpTz5nAQ6xAK3o9d6MDzOIn9WINQsAXfNUaXhiRJMkbn0I9QQVLYjpliMcor+rWc+Qd4D0LBRuzEBvTgn/gUQsJT+IoxunQkSZKT0b4KnYxO/c+d0T9kuhEQ3tKSvTuX/e8U9EmkEfAYupDCSnwS+/AAAlLYhEFjdClIkiRjFDiCDfggOvDEMtWBh/AojiCPOIbLTEZnf13fjXDNvc3Zdbn5Kec6BDyJHoSCOzCK3UjjbRjBZmN0qUiSJGMUmEW+gswhJkyOHW/oRrimpTlbR4zuLUw634hWbERIaEYX1uPz2IMmY7Q8JEmSMdqHfyNWmQle138WIYlzr8Z38HXkUIeVCAV1aMJuDKHZDUzlI0mS3MA0gEuIVeYqXiqym74Ru3AM/XgcD2I9BnAYO3CPu+klSZLKF6P12INYpXYgFAnSenwUXXgZX8N2fAYfxxsQjFFJkqTyxWgdnscZnMff8PcqcB5/xrPXidEU7kEbnsar2IINaEb6P+3WsQAAAADAIH/rMezvyCEZBQAAGQUAYJFRAABkFAAAZBQAABkFAAAZBQBARgEAQEYBAJBRAACQUQAAZBQAAGQUAAAZBQBARgEAQEYBAJBRAACQUQAAZBQAAGQUAAAZBQAAGQUAQEYBAEBGAQCQUQAAkFEAAGQUAABkFAAAGQUAgACWNszV/cg43QAAAABJRU5ErkJggg
=="
camsyslogo = "iVBORw0KGgoAAAANSUhEUgAAAlgAAAD7CAMAAACBrxqHAAABU1BMVEUAAAAdi8wntntAQEFkeLqBvkEdi8wntntAQEFkeLqBvkEdi8wntntAQEFkeLqBvkEdi8wjwPEntntAQEFkeLqBvkEdi8wntntAQEFkeLqBvkEdi8wntntAQEFFuWhkeLqBvkEdi8wntntAQEFkeLqBvkEdi8wntntAQEFSv5lkeLqBvkEdi8xAQEFkeLqBvkEdi8wntntAQEFKldBkeLqBvkEdi8wntntAQEFkeLqBvkEdi8wntntAQEFkeLqBvkFAQEFOfsBkeLqBvkEdi8wntntAQEFkeLpnvFKBvkEdi8wntntAQEFkeLqBvkEdi8wdjs4eldMfnNgfn9ogotwhrOMituoiuewjve8jv+ojwPEkvcwkvtQkvtskv+Iluq8lu7YlvL0lvMUmuJEmuZkmuaAmuqcntnsnt4Int4pAQEFSfb9Tg8NkeLpqv21rvFCBvkELczlWAAAAT3RSTlMAEBAQEBAgICAgIDAwMDAwQEBAQEBAUFBQUFBgYGBgYGBwcHBwcICAgICAgI+Pj4+fn5+fn5+vr6+vr7+/v7+/z8/Pz9/f39/f3+/v7+/vVuDtZwAAEBFJREFUeAHs211LFV0YxvFb0f3wWJhutEgjK8ooI420SCOplNJwr1603JMvltrsqKn5/kcdelIkMetas2b9f5/hz30w1xqTw+BMp/Tl+TkDWVVuxlA5slobtyRhaKFT+jM7aGkiq9IDzhVZ+bTAuaoeWXUmDGRVuUXOVYrGl0vOFWLLqlyu/lyBrDo3DGRVueUhSw2ueMiKc0VWa6VvLM5kxeIcCbLiXJGV7FyBdzHhF2eQFQ9kyKr+5wpkxeKMoVlNVizOvIvhXEWCrDhXZMXiXDdkxeKM8cWScwXZcysPOFdkxeJcK2TF4owJZVacK97F+DBrICsW59ogKx7I4PLLn0o/bo36NuKJ4dT6b67mSl93Mu82nR8rdmpk9SqX2u9m3m05T6atnsiqt5NF3NV6n9URWeXHXX1XHCyx/+RZ9T5mUXflWvZXZHUvVzvqRt7VnNUPWfU+ZfquOFjqrOSO9qLvasnqhqzyg0zivfNoxP4M7Ye53uGuqKvXYQ4W2o/zRp4rQVdu0iDOSjDhCLpizRFkJZhwBF0JDhYu6LPSfRMVdLVuv4FLq3kQx119V6w5+qz050rlw0lXzM/6rOTfRFW237gTrDn6rPQTjr4r1hzh41C9wz19Vxws9XMrvYOsSV25MwZ1VoIJR9BVBGsOWeX7WfiumJ/1WeknHH1XrDnydzGCCUfprTvBmqPPSvhNVOqdS+BgkZVwwtF15a6aHlnpz5W6K9actj4r/YQj78pN89xKTz/hyLtyfWQV0uFuQ7uaI6uQDjK1TafRIquUzlW25TQe8C4mgW+i+q7cCFklMOHou1oiq1C+dBvclTtPVgl8ExV0xZrTf02fVeAJR9+Vm+RdTAi9z1mzu1ohq/ATjvAHQplpskpgwtF3td5HVgl8E5V35e7wLiaBCUfflWuRlfxcpdDVHFnJJ5wUunItsmrGhKPvijXnrDarU3wT1duuvivm53Z9jAZy//s3qaeGFEwVYlMGuqreC/snoCsOFoY3CrGNAQNdVe+2ga44WHGgq2LeQFce/G+gKw5WHOiqGDbQVfWeWLNh4FkRwpiBrlhzlOiKNUePrjhYdKV33UBXrDnh0RXzM4J1VTT5YGG+0GPNoSvm58jQFQeLrlhzIkFXzM90Ff5gga5Yc3C30GN+5sdU/cECXTE/69EV8zNdMT83A10xP9MVa054dMX8jLEitEcGfkxlzakPumJ+piu9iwa6Ys1J3i/2zljXeRVrw5ZoOAUVDQ36Czfol9xQuWFE5Yor8f1fwJyT7HUmyctrb6I4M+jwVp/2t5K9s/z4iRPANHDV5/CzsrdMI6c9sqp/rr4jLBPSLtmiV8fdDb
dMLDbc445/4z0eHvocT46iDgelWOmPHstOJ+hR2f9OZl3ygcT2ytX7ozl+259TliO0wr3I8/+/p6iJR0BO+FDIFpFRux+VQmU6e2xJwRz2KO+QqPkLw4R3F6YiV30MP7ttxxTPhVXuJRsHC1pJuGBgYbKhYGFpK1iSLajDHmEWdQFYsOC5T2HFXbKllPIuiafceFpwrqzEwSLxDCyMawZLUuajHkmTUtklWV0CFnDV3WiOSj8dje6nRWb5advChUWUBXQELqwDsLb0GHFF0ZVnSE8pCDQHq8DvuCUpfMVZXvBi5UdOupSrYOWE8T1z1Tz8vN6PRHhsp4r7LZafvWk7UFaQCqIs6X6sgYU46lBAoQLW9FK64DlRAwt/sZkzdVD+wco94xbqJ0/ireuWq3ZhhfoFTOBG0nfmPC0QsLx0nQkrBgYWKOOGcTkCS+LgLz8FS6JXcOgtC7OvRz1ysPrmqnX42Tyeo3iSmgkjmilcWT/AMGVJ8zUFi+CiKFhwOdQGlsTf1TgDweSlzvg/HKyeuWoefk6PFy94MgYqLLkaOgKLKUuENXGwIPJbASzCoGkDS2IqDtoYV/J/8QKwgKu+hp8tnKASRS43RVhyDT8fgMWUJcJqAascgoU/ty1gwTXADKfYOlXjwp+5zFj/ggPcy/BzJNaBgLDkGBTFwWLKEmE1G0v/HizdAhY4KIPUVeO7gO1owfMFozlUWBAQligrcLCYskRYDWAZevEO8VDaBNb8wqWGT5kXgwVcdTj87KSJ7cJCZSFYTFkirAawVvp1AyRBKQWLv0L38k6orwcLuepVWHD93SYsUBaAxZR1F1YDWCrSL0jJRZJuAgvfCwNeLXwFLOCqz9GcFfrLAxfGoCwAiyhLhMXBivYhLqw3gIuZzsFyCT/CcbA4FysK8CtgAVd9Dj+n1hGsJAeEKAvAQmWJsChYJElXIU+PyTiq2AyWSFlSWntExwptN1yBsK4FC4SFykKwUFkirCawVsNHG/lclnaw8I9CUnHGVQNYHXAF+eNisEBYqCw8NqAsEVarsaL5HVhZIPgkWPYYnU+D9cf//y/l/6ZrwQJhgbLw2ICyRFhYx2c3kJkS1Fib/ThY4Xqw+k8zWCAsUFbkx0aUhcL6/adCPW90dkN4SMx4CNvBWvEaqx2sfxJFnJR2YeGnew6WKAuFdQIWzuNx+OdU5ydsbWCRpuCHRMmSkmSARa1fmjAs4TkLuASAEWWBsDhY5JdnAhZMcTHtYLH3vnjcowEWBCYCnAuLR3OwRFkoLA7W4SABgIWj1UsbWPiL/Ms3726A1RLVMhU77QeJdbBAWUlIbgULP/YTsHBWahtYsTpWmAZYTVn5eIV+Wbzq98NoDpYoCw9oM1gBwCITyd4ESzjKcELZAVZDYDofTCF9nU4SKhFlAVigLOn5lWCZprFCSAYzOjLJdoAFgdmRhp256wuC9vzNA4ERZeHx5GARXhyCReaaNoCFr2VDUPY4wGq/fC9I1utg7nY20SRysERZ0PIGsFZy8U6cE9rBoquTjFxGDrDaV3/Z6plbFBcWFBsKjChLDucbYIUdShlYsXU+FtxoYKnO/duzHmD9PipXRm7tzw8dCosrK1FgRFnS8VawlM9oEgQLn7QBLGWXja4Aj/s9Qb8OCeQBFouWfq7e3M3vE3QYhYWNP7wo97ju+ddjhQUHC8/nvDeuhC64EL5G1r7OVkCc4yY/JSuhIeaf5yzJY4eXiQiLKQuBwWX59hwsllAFCKJ+t/yLhBx8Xx5qtnw2VjiGofnB3NzEhcWVxYFBYTWBleyRmeCT7vwWWCt9jTruJGUxAyzatfKKlQfdgLBQWQAMKMu+B1Ze2EQ/RnluBgtvpIZNgmzL/fwbYJEoH/PfHV5nU7k9op4OYm4lT9/aG15xUKctxhzctPF0zEAqDT72MWrCQGx4vH9RCk6RXpCof/L9NafjjCh+lo2MjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjI
yMjIyMjCgbQvorS/Bm+kSMDzH9mTXMdhqRKDvfG33aFjuHtfMGqjnvjynRVm4GFcmD1zuQz015mcSbXub8+nSW89Kl+ozzdJBwtm2gSrfggTTpLObpL1lIoxP0uR637k9Z/dRfQjlZqmIKruLEha24KJHvWxr2s5yXpuZNqDVZ7QPPspHp9Ty4jgSjQqn3GWK3HVJ8n2u/IB7b6uhNRYo5pyapS8FCaCCR1OASNX8BWA6wYn/Lslezqi65SmG2dg4x18iaEZ97DNZKW7YYrHUhJGloVt8AiytL04MJT7J9DCykJYVgrQ9RrBQZ/3m5FYYkj8wd3rkhKDS2gRebFex7Cm+SFh3v0v5SZsNj7nuKhKfARhUQg0xsQA0csERLRFilpixduW9TeIpGsCq0bF5NEpuqmDu8+bh7qewgtnazC7XiBn25enYlcEd900iP92Yjq/4gATvKyzZRFhNW4k8nSARUFkSkCmFgheoG+QHaIg0srnpXEjX1klh/98jwglVtS9uAHjMPZdCY5WqwPCuWl2rl6aiwrBJlfRAsSy4l7h6LKCzHt+fsI2QneovaMfj+6LBd9wZk8ovS1WBNoiwmLHw6/NO5sprAQgsZwnLBh5MGrlMvYYIt2H/YsF3fJeYoB3jGXQ6WKIsJ6+DpxFTyjw+C5aFR/F6IiZhpuWPfFVhskbiGl/bYMpXpXiCBrXq/HCxRFhHW0dMtIiqirFawuK6hLbDRkGeVfYHlmnZLW+D6AA4PNfblYImymLCkjpC3e9h46gNgmZb7gSToapfZ8FsEHrUJh6J3fGjAq7NvgiXKYsKSOkLexjeeagcLXUgDDTRT54lwJ6yjGNnoVM7Cosk3pkH9d8ASZTFhSR0XFlfW22DlFgk5+GKxy9iHr8nVb191ViIvy7evSMGZ74NFlCXCkjouLFDWB8Bqk/gmIzjQwE6C41IlLbNVvxB1lHt1M6tJ8hqc/gBYKUAsA0uUhcJCsFBYXFntYDVfYuHoUY7B6p7fDJ/wCt6clMP38ISsexLosBWsSgIDS5SFwmJggbBAWe1gASm/x8OTBvYWt+2YEjUdtMYxUdwyELP6b4ElykJhAVggLFDWx8Cafh+dyN1Re4tfCztyGC21m+KdWWqwJv0VsERZICyoQ2Ghsr4BFsbUG6im7mLmJb2+lsgwwJEcjHZhTa8aNNdfY6GyRFhYh8ICZX0OrDYoSAP11Ge0dTIh62DI0+NhoLH2YULWnr/xqRCUJcLiT5dEWKAs/f2Ld2jgIg2EJvUWF3HOHLa1yYc/NvTXg4XKEmFBHQ65g7Lil79uIDEBGthndMaxnnawyB5L14MlykJhSR0Ki0R/AKwNpgy1R0WYI9df4BKjHSzMvTFfAkuUBcKSOhQWSXwfLPxo8Im9cPuITKrFLO+B5cOf8VUJfhEsURYIC+pAWBD9Pljn+8WQBlr+oan7yQ3hPbBWzsgXwRJlgbCgDoQFie+DhRvGYNJfCS+UrwcN7COFvvu/aawgMyIh5ptgibJEWFAHHwltJSsoqwEsHDdzdPA1sJkQ0KWtsynvEFXaL95xyjv2+0tgibJAWFCHHwn5jsRvg6VlVggZsjevpHl2rNauNoReKXHNYOHsbiTuG2CJsrSc5rQuoQhw4nA7WPizWlsCfr0HDMIJ0EW26hpbFWFRTVNbcYdpV4DUS8ESZUU5GqSOGgKU1Q4WLnsqvjq1xJEG4sj01tt8rBL0A1ZzwcFA3lZoIe5a6RJ0sA2saCtRHCw5Z+BoQN0GFags+y5YOHafPO5sHp8r5aem0kA7dZN5f1zQbW1YMozsNYFlijx+Dd5aO4e1wE7AbWCR2GOwvNT5gzqPFais9CZY8JWzzEqy1v2nK1mRBm4xuFsDU4FX0kH8Xk82UxtYEpN3CLTlYrBERqIjWseFhcpqBwuXl2MWbOAGRV3eb8akHVPO16jQqLDzSTNfBMsLz1iHwmpTVjtYZOJbsmT4BhP11FfwNmn76tXUABZEzy
/W2hboytVggY6gDiq4ssw7YGH8ije4q0aHFwgzNLCTGB/WxG9h2B7twpJuWYKTpowoG2L6KzF48+sGjn2C/90eHAsAAAAADPK3HsP+6gMAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAIz7kPGBAfqCAAAAABJRU5ErkJggg=="
from .xhtml import Elem
georgiatechlogo_element = Elem('img', {
    'src': "data:image/png;base64,{}".format(georgiatechlogo),
    'style': 'float:right;max-height:40px;margin-top:0;margin-right:20px;margin-left:20px'
})
camsyslogo_element = Elem('img', {
    'src': "data:image/png;base64,{}".format(camsyslogo),
    'style': 'float:right;max-height:44px;margin-top:0'
})
_use_local_logo = False
def local_logo(filename=None):
    from .configure import cached, save
    cfg = cached()
    if filename is not None and os.path.exists(filename):
        png = base64.b64encode(open(filename, "rb").read())
        cfg['local_logo'] = png
        save(cfg)
    if 'local_logo' in cfg:
        return cfg['local_logo']
| 807.428571 | 24,003 | 0.934107 | 2,158 | 39,564 | 17.118165 | 0.57924 | 0.012831 | 0.012182 | 0.008771 | 0.008554 | 0.006686 | 0.005712 | 0.004575 | 0.003682 | 0 | 0 | 0.146956 | 0.005358 | 39,564 | 48 | 24,004 | 824.25 | 0.791777 | 0.000531 | 0 | 0.08 | 0 | 0.6 | 0.958445 | 0.954955 | 0 | 1 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0.12 | 0 | 0.2 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
da92f3d57fce6772846a74be2d6f6ee7d5a732c1 | 11,473 | py | Python | test/test_profiles.py | VDBWRAIR/pyjip | dc147afebbabd550828fa51cc052db4aa07c5d3b | [
"BSD-3-Clause"
] | 18 | 2015-05-08T06:39:09.000Z | 2020-11-30T10:51:36.000Z | test/test_profiles.py | VDBWRAIR/pyjip | dc147afebbabd550828fa51cc052db4aa07c5d3b | [
"BSD-3-Clause"
] | 9 | 2015-01-02T09:55:53.000Z | 2016-02-03T18:31:10.000Z | test/test_profiles.py | castlabs/pyjip | 947f615d591a940438316e6d21291f730bfcda66 | [
"BSD-3-Clause"
] | 5 | 2016-02-01T16:52:36.000Z | 2021-03-10T12:08:39.000Z | #!/usr/bin/env python
import jip
import pytest


def test_tool_name():
    @jip.tool()
    class MyTool():
        def validate(self):
            pass

        def get_command(self):
            return "echo"

    p = jip.Pipeline()
    p.run('MyTool')
    p.expand()
    jobs = jip.create_jobs(p)
    assert len(jobs) == 1
    assert jobs[0].name == "MyTool"
    assert jobs[0].pipeline is None


def test_tool_name_external_profile_on_pipeline():
    @jip.tool()
    class MyTool():
        def validate(self):
            pass

        def get_command(self):
            return "echo"

    p = jip.Pipeline()
    p.run('MyTool')
    p.expand()
    profile = jip.Profile(name="testname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 1
    assert jobs[0].name == "MyTool"
    assert jobs[0].pipeline == "testname"


def test_tool_name_set_in_validate():
    @jip.tool()
    class MyTool():
        def validate(self):
            self.name("testtool")

        def get_command(self):
            return "echo"

    p = jip.Pipeline()
    p.run('MyTool')
    p.expand()
    profile = jip.Profile(name="testname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 1
    assert jobs[0].name == "testtool"
    assert jobs[0].pipeline == "testname"


def test_tool_name_set_in_validate_with_job():
    @jip.tool()
    class MyTool():
        def validate(self):
            self.job.name = "testtool"

        def get_command(self):
            return "echo"

    p = jip.Pipeline()
    p.run('MyTool')
    p.expand()
    profile = jip.Profile(name="testname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 1
    assert jobs[0].name == "testtool"
    assert jobs[0].pipeline == "testname"


def test_tool_name_in_pipeline_context():
    @jip.tool()
    class MyTool():
        def validate(self):
            self.job.name = "testtool"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    jobs = jip.create_jobs(p)
    assert len(jobs) == 1
    assert jobs[0].name == "testtool"
    assert jobs[0].pipeline == "thepipeline"


def test_tool_name_in_pipeline_context_with_custom_profile():
    @jip.tool()
    class MyTool():
        def validate(self):
            self.job.name = "testtool"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(name="customname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 1
    assert jobs[0].name == "testtool"
    assert jobs[0].pipeline == "customname"


def test_tool_name_in_pipeline_context_with_custom_profile_and_custom_name():
    @jip.tool()
    class MyTool():
        def validate(self):
            self.job.name = "testtool"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.job('Tool1').run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(name="customname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 1
    assert jobs[0].name == "Tool1"
    assert jobs[0].pipeline == "customname"


def test_tool_name_in_pipelines_with_multiplexing():
    @jip.tool()
    class MyTool():
        """mytool
        usage:
            mytool <data>
        """
        def validate(self):
            self.job.name = "Tool"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.run('MyTool', data=["A", "B"])
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(name="customname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 2
    assert jobs[0].name == "Tool.0"
    assert jobs[0].pipeline == "customname"
    assert jobs[1].name == "Tool.1"
    assert jobs[1].pipeline == "customname"


def test_tool_name_in_pipelines_with_multiplexing_and_custom_name():
    @jip.tool()
    class MyTool():
        """mytool
        usage:
            mytool <data>
        """
        def validate(self):
            self.job.name = "somename"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.job("Tool").run('MyTool', data=["A", "B"])
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    profile = jip.Profile(name="customname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 2
    assert jobs[0].name == "Tool.0"
    assert jobs[0].pipeline == "customname"
    assert jobs[1].name == "Tool.1"
    assert jobs[1].pipeline == "customname"


def test_tool_name_in_pipelines_with_multiplexing_and_custom_template_name():
    @jip.tool()
    class MyTool():
        """mytool
        usage:
            mytool <data>
        """
        def validate(self):
            self.job.name = "${data}"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.run('MyTool', data=["A", "B"])
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(name="customname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 2
    assert jobs[0].name == "A"
    assert jobs[0].pipeline == "customname"
    assert jobs[1].name == "B"
    assert jobs[1].pipeline == "customname"


def test_tool_name_in_pipelines_with_multiplexing_and_custom_template_name_as_job():
    @jip.tool()
    class MyTool():
        """mytool
        usage:
            mytool <data>
        """
        def validate(self):
            self.job.name = "something"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def validate(self):
            self.name("thepipeline")

        def pipeline(self):
            p = jip.Pipeline()
            p.job("${data}").run('MyTool', data=["A", "B"])
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(name="customname")
    jobs = jip.create_jobs(p, profile=profile)
    assert len(jobs) == 2
    assert jobs[0].name == "A"
    assert jobs[0].configuration['data'].get() == "A"
    assert jobs[0].pipeline == "customname"
    assert jobs[1].name == "B"
    assert jobs[1].configuration['data'].get() == "B"
    assert jobs[1].pipeline == "customname"


@pytest.mark.parametrize('data', [1, 3])
def test_pipeline_overwrites_tool(data):
    @jip.tool()
    class MyTool():
        def setup(self):
            self.profile.threads = 2

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def pipeline(self):
            p = jip.Pipeline()
            p.job(threads=data).run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    jobs = jip.create_jobs(p)
    assert jobs[0].threads == data


def test_pipeline_overwrites_pipeline_from_spec():
    @jip.tool()
    class MyTool():
        def setup(self):
            self.profile.threads = 2
            self.profile.queue = "Org"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def pipeline(self):
            p = jip.Pipeline()
            p.job(threads=3, queue="Yeah").run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(threads=10, queue="Test")
    profile.specs['MyTool'] = jip.Profile(threads=5)
    profile.apply_to_pipeline(p)
    jobs = jip.create_jobs(p)
    assert jobs[0].threads == 5
    assert jobs[0].queue == "Yeah"


def test_pipeline_tool_defaults():
    @jip.tool()
    class MyTool():
        def setup(self):
            self.profile.threads = 2
            self.profile.queue = "Org"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def pipeline(self):
            p = jip.Pipeline()
            p.job().run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile()
    profile.specs['MyTool'] = jip.Profile()
    profile.apply_to_pipeline(p)
    jobs = jip.create_jobs(p)
    assert jobs[0].threads == 2
    assert jobs[0].queue == "Org"


def test_pipeline_tool_defaults_global():
    @jip.tool()
    class MyTool():
        def setup(self):
            self.profile.threads = 2
            self.profile.queue = "Org"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def pipeline(self):
            p = jip.Pipeline()
            p.job().run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(threads=5, queue="yeah")
    profile.specs['MyTool'] = jip.Profile()
    profile.apply_to_pipeline(p)
    jobs = jip.create_jobs(p)
    assert jobs[0].threads == 2
    assert jobs[0].queue == "Org"
def test_pipeline_tool_defaults_global_job():
    @jip.tool()
    class MyTool():
        def setup(self):
            self.profile.threads = 2
            self.profile.queue = "Org"

        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def pipeline(self):
            p = jip.Pipeline()
            p.job(threads=3, queue="Intern").run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(threads=5, queue="yeah")
    profile.specs['MyTool'] = jip.Profile()
    profile.apply_to_pipeline(p)
    jobs = jip.create_jobs(p)
    assert jobs[0].threads == 3
    assert jobs[0].queue == "Intern"
def test_pipeline_tool_spec_regexp():
    @jip.tool()
    class MyTool():
        def get_command(self):
            return "echo"

    @jip.pipeline()
    class MyPipeline():
        def pipeline(self):
            p = jip.Pipeline()
            p.job(threads=3, queue="Intern").run('MyTool')
            return p

    p = jip.Pipeline()
    p.run('MyPipeline')
    p.expand()
    profile = jip.Profile(threads=5, queue="yeah", priority="high")
    profile.specs['My*'] = jip.Profile(threads=10, queue="rock")
    profile.apply_to_pipeline(p)
    jobs = jip.create_jobs(p)
    assert jobs[0].threads == 10
    assert jobs[0].queue == "rock"
    assert jobs[0].priority == "high"
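The last test exercises glob-style spec keys (`'My*'`) that override per-tool settings while leaving unmatched fields alone. A minimal standalone sketch of that lookup, assuming `fnmatch`-style glob semantics; the `specs` and `job` dicts here are illustrative stand-ins, not JIP's internal representation:

```python
# Hypothetical sketch: apply every glob-matching spec to a job record,
# leaving fields the spec does not set untouched.
from fnmatch import fnmatch

specs = {"My*": {"threads": 10, "queue": "rock"}}
job = {"name": "MyTool", "threads": 5, "queue": "yeah", "priority": "high"}

for pattern, overrides in specs.items():
    if fnmatch(job["name"], pattern):
        job.update(overrides)
```

After the loop, `threads` and `queue` come from the matching spec while `priority` keeps its original value, mirroring the assertions in the test above.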

# File: deepaccess/interpret/ExpectedPatternEffect.py
# (from the jhammelman/DeepAccessTransfer repository, MIT license)
from scipy.stats import wilcoxon, norm
import numpy as np
from deepaccess.ensemble_utils import *
def motif2test(motiffile, backgroundfile, p=None):
    motifs = open(motiffile).read().split(">")[1:]
    motifmats = {}
    for motif in motifs:
        motifname = motif.strip().split("\n")[0]
        motiflines = motif.strip().split("\n")[1:]
        motif_mat = np.zeros((len(motiflines), 4))
        for i, line in enumerate(motiflines):
            counts = np.array([float(c) for c in line.split("\t")])
            motif_mat[i, :] = counts
        motifmats[motifname] = motif_mat
    names = []
    all_backgrounds = []
    for motifname in motifmats.keys():
        motif = motifmats[motifname]
        backgrounds = fa_to_onehot(backgroundfile)
        if p is None:
            start = int(backgrounds.shape[1] / 2 - motif.shape[0] / 2)
        else:
            start = p
        for bi in range(backgrounds.shape[0]):
            for pos in range(motif.shape[0]):
                consensus_char = np.argmax(motif[pos, :])
                backgrounds[bi, pos + start, :] = 0
                backgrounds[bi, pos + start, consensus_char] = 1.0
            names.append(motifname + "_EPEDistribution_" + str(bi))
        all_backgrounds.append(backgrounds)
    null_backgrounds = fa_to_onehot(backgroundfile)
    return (
        np.concatenate(all_backgrounds, axis=0),
        np.array(null_backgrounds),
        names,
    )
def fasta2test(fastafile, backgroundfile, p=None):
    fastanames = [f.split("\n")[0]
                  for f in open(fastafile).read().split(">")[1:]]
    fastas = fa_to_onehot(fastafile, make_uniform_length=False)
    names = []
    all_backgrounds = []
    for fi, fastaname in enumerate(fastanames):
        fasta = fastas[fi]
        backgrounds = fa_to_onehot(backgroundfile)
        if p is None:
            start = int(backgrounds.shape[1] / 2 - fasta.shape[0] / 2)
        else:
            start = p
        for bi in range(backgrounds.shape[0]):
            for pos in range(fasta.shape[0]):
                # if fasta is not N, replace the background base;
                # otherwise keep the background the same
                if np.max(fasta[pos, :]) != 0.25:
                    consensus_char = np.argmax(fasta[pos, :])
                    backgrounds[bi, pos + start, :] = 0
                    backgrounds[bi, pos + start, consensus_char] = 1.0
            names.append(fastaname + "_EPEDistribution_" + str(bi))
        all_backgrounds.append(backgrounds)
    null_backgrounds = fa_to_onehot(backgroundfile)
    return (
        np.concatenate(all_backgrounds, axis=0),
        np.array(null_backgrounds),
        names,
    )
def rank_ratio_statistic(num, denom):
    # Wilcoxon signed-rank statistic on log2(num / denom), with a
    # normal approximation for the two-sided p-value.
    sorted_by_ranks = np.argsort(np.abs(np.log2(num / denom)))
    Wp = 0
    Wn = 0
    n = num.shape[0]
    for rank, ri in enumerate(sorted_by_ranks):
        if np.sign(num[ri] - denom[ri]) == 1:
            Wp += rank + 1
        elif np.sign(num[ri] - denom[ri]) == -1:
            Wn += rank + 1
    W = min(Wp, Wn)
    num = (W - n * (n + 1) / 4)
    denom = np.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (
        norm.cdf(x=(num / denom)) * 2
    )
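The statistic above is the classic Wilcoxon signed-rank test applied to log2 fold changes, with a normal approximation for the two-sided p-value. A self-contained re-derivation on synthetic data (the arrays here are illustrative, not model output):

```python
# Re-derive the normal-approximation Wilcoxon signed-rank p-value on
# synthetic "pattern" vs "background" predictions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
num = np.exp(rng.normal(0.3, 0.2, size=30))    # e.g. predictions with pattern
denom = np.exp(rng.normal(0.0, 0.2, size=30))  # e.g. background predictions

d = np.log2(num / denom)
order = np.argsort(np.abs(d))                  # ranks of |log2 fold change|
Wp = sum(r + 1 for r, i in enumerate(order) if d[i] > 0)
Wn = sum(r + 1 for r, i in enumerate(order) if d[i] < 0)
n = d.size
W = min(Wp, Wn)
z = (W - n * (n + 1) / 4) / np.sqrt(n * (n + 1) * (2 * n + 1) / 24)
p = 2 * norm.cdf(z)                            # two-sided p-value
```

Because `W = min(Wp, Wn)` never exceeds its expectation `n(n+1)/4`, `z` is non-positive and `2 * norm.cdf(z)` stays in `[0, 1]`.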
def ExpectedPatternEffect(predict_function, class_ind, X_p, X, seqsets):
    fx_p = predict_function(X_p)
    fx = predict_function(X)
    patternseqs = np.array([s.split("_EPEDistribution")[0] for s in seqsets])
    patterns = list(
        sorted(set([s.split("_EPEDistribution")[0] for s in seqsets]))
    )
    stats = []
    pvals = []
    ratio_pvals = []
    adj_pvals = []
    EPEs = []
    for pattern in patterns:
        ind_pattern_seqs = np.where(pattern == patternseqs)[0]
        if len(class_ind) == 1:
            num = fx_p[ind_pattern_seqs, class_ind[0]].reshape((-1,))
            denom = fx[:, class_ind[0]].reshape((-1,))
        else:
            num = np.mean(
                [fx_p[ind_pattern_seqs, ind] for ind in class_ind], axis=0
            ).reshape((-1,))
            denom = np.mean(
                [fx[:, ind] for ind in class_ind], axis=0
            ).reshape((-1,))
        fc = num / denom
        ratio_pvals.append(rank_ratio_statistic(num, denom))
        stat, pval = wilcoxon(num, denom)
        EPEs.append(np.mean(np.log2(fc)))
        pvals.append(pval)
        adj_pvals.append(pval * len(patterns))
        stats.append(stat)
    return (
        fx_p,
        fx,
        {
            "Pattern": patterns,
            "ExpectedPatternEffect": EPEs,
            "Significance": ratio_pvals,
            "AdjustedSignificance": [r * len(patterns) for r in ratio_pvals],
            "WilcoxonTestStatistic": stats,
            "WilcoxonSignificance": pvals,
            "WilcoxonSignificanceAdj": adj_pvals,
        },
    )
def DifferentialExpectedPatternEffect(
    predict_function, class1_ind, class2_ind, X_p, X, seqsets
):
    fx_p, fx, c1_EPEdata = ExpectedPatternEffect(
        predict_function, class1_ind, X_p, X, seqsets
    )
    patternseqs = np.array([s.split("_EPEDistribution")[0] for s in seqsets])
    # sort after deduplicating, so the pattern order is deterministic and
    # matches ExpectedPatternEffect
    patterns = list(
        sorted(set([s.split("_EPEDistribution")[0] for s in seqsets]))
    )
    stats = []
    pvals = []
    adj_pvals = []
    ratio_pvals = []
    DiffEPEs = []
    for pattern in patterns:
        ind_pattern_seqs = np.where(pattern == patternseqs)[0]
        if len(class1_ind) == 1:
            num = fx_p[ind_pattern_seqs, class1_ind].reshape((-1,))
            denom = fx[:, class1_ind].reshape((-1,))
        else:
            num = np.mean(
                [fx_p[ind_pattern_seqs, ind] for ind in class1_ind], axis=0
            ).reshape((-1,))
            denom = np.mean(
                [fx[:, ind] for ind in class1_ind], axis=0
            ).reshape((-1,))
        fc1 = num / denom
        if len(class2_ind) == 1:
            num = fx_p[ind_pattern_seqs, class2_ind].reshape((-1,))
            denom = fx[:, class2_ind].reshape((-1,))
        else:
            num = np.mean(
                [fx_p[ind_pattern_seqs, ind] for ind in class2_ind], axis=0
            ).reshape((-1,))
            denom = np.mean(
                [fx[:, ind] for ind in class2_ind], axis=0
            ).reshape((-1,))
        fc2 = num / denom
        stat, pval = wilcoxon(fc1, fc2)
        DiffEPEs.append(np.mean(np.log2(fc1 / fc2)))
        ratio_pvals.append(rank_ratio_statistic(fc1, fc2))
        pvals.append(pval)
        adj_pvals.append(pval * len(patterns))
        stats.append(stat)
    return (
        fx_p,
        fx,
        {
            "Pattern": patterns,
            "DifferentialExpectedPatternEffect": DiffEPEs,
            "Significance": ratio_pvals,
            "AdjustedSignificance": [r * len(patterns) for r in ratio_pvals],
            "WilcoxonTestStatistic": stats,
            "WilcoxonSignificance": pvals,
            "WilcoxonSignificanceAdj": adj_pvals,
        },
    )
def SubtractDifferentialExpectedPatternEffect(
    predict_function, class1_ind, class2_ind, X_p, X, seqsets
):
    fx_p, fx, c1_EPEdata = ExpectedPatternEffect(
        predict_function, class1_ind, X_p, X, seqsets
    )
    patternseqs = np.array([s.split("_EPEDistribution")[0] for s in seqsets])
    # sort after deduplicating, so the pattern order is deterministic and
    # matches ExpectedPatternEffect
    patterns = list(
        sorted(set([s.split("_EPEDistribution")[0] for s in seqsets]))
    )
    stats = []
    pvals = []
    adj_pvals = []
    ratio_pvals = []
    DiffEPEs = []
    for pattern in patterns:
        ind_pattern_seqs = np.where(pattern == patternseqs)[0]
        if len(class1_ind) == 1:
            num = fx_p[ind_pattern_seqs, class1_ind].reshape((-1,))
            denom = fx[:, class1_ind].reshape((-1,))
        else:
            num = np.mean(
                [fx_p[ind_pattern_seqs, ind] for ind in class1_ind], axis=0
            ).reshape((-1,))
            denom = np.mean(
                [fx[:, ind] for ind in class1_ind], axis=0
            ).reshape((-1,))
        fc1 = num / denom
        if len(class2_ind) == 1:
            num = fx_p[ind_pattern_seqs, class2_ind].reshape((-1,))
            denom = fx[:, class2_ind].reshape((-1,))
        else:
            num = np.mean(
                [fx_p[ind_pattern_seqs, ind] for ind in class2_ind], axis=0
            ).reshape((-1,))
            denom = np.mean(
                [fx[:, ind] for ind in class2_ind], axis=0
            ).reshape((-1,))
        fc2 = num / denom
        stat, pval = wilcoxon(fc1, fc2)
        DiffEPEs.append(np.mean(fc1 - fc2))
        ratio_pvals.append(rank_ratio_statistic(fc1, fc2))
        pvals.append(pval)
        adj_pvals.append(pval * len(patterns))
        stats.append(stat)
    return (
        fx_p,
        fx,
        {
            "Pattern": patterns,
            "DifferentialExpectedPatternEffect": DiffEPEs,
            "Significance": ratio_pvals,
            "AdjustedSignificance": [r * len(patterns) for r in ratio_pvals],
            "WilcoxonTestStatistic": stats,
            "WilcoxonSignificance": pvals,
            "WilcoxonSignificanceAdj": adj_pvals,
        },
    )
def SubtractExpectedPatternEffect(
    predict_function, class_ind, X_p, X, seqsets
):
    fx_p = predict_function(X_p)
    fx = predict_function(X)
    patternseqs = np.array([s.split("_EPEDistribution")[0] for s in seqsets])
    patterns = list(
        sorted(set([s.split("_EPEDistribution")[0] for s in seqsets]))
    )
    stats = []
    pvals = []
    ratio_pvals = []
    adj_pvals = []
    EPEs = []
    for pattern in patterns:
        ind_pattern_seqs = np.where(pattern == patternseqs)[0]
        if len(class_ind) == 1:
            num = fx_p[ind_pattern_seqs, class_ind[0]].reshape((-1,))
            denom = fx[:, class_ind[0]].reshape((-1,))
        else:
            num = np.mean(
                [fx_p[ind_pattern_seqs, ind] for ind in class_ind], axis=0
            ).reshape((-1,))
            denom = np.mean(
                [fx[:, ind] for ind in class_ind], axis=0
            ).reshape((-1,))
        fc = num - denom
        ratio_pvals.append(rank_ratio_statistic(num, denom))
        stat, pval = wilcoxon(num, denom)
        EPEs.append(np.mean(fc))
        pvals.append(pval)
        adj_pvals.append(pval * len(patterns))
        stats.append(stat)
    return (
        fx_p,
        fx,
        {
            "Pattern": patterns,
            "ExpectedPatternEffect": EPEs,
            "Significance": ratio_pvals,
            "AdjustedSignificance": [r * len(patterns) for r in ratio_pvals],
            "WilcoxonTestStatistic": stats,
            "WilcoxonSignificance": pvals,
            "WilcoxonSignificanceAdj": adj_pvals,
        },
    )

# File: trac/Lib/site-packages/libsvn/wc.py
# (from the thinkbase/PortableTrac repository, BSD-3-Clause license)

# This file was automatically generated by SWIG (http://www.swig.org).
# Version 1.3.40
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
# This file is compatible with both classic and new-style classes.
from sys import version_info
if version_info >= (2,6,0):
    def swig_import_helper():
        from os.path import dirname
        import imp
        fp = None
        try:
            fp, pathname, description = imp.find_module('_wc', [dirname(__file__)])
        except ImportError:
            import _wc
            return _wc
        if fp is not None:
            try:
                _mod = imp.load_module('_wc', fp, pathname, description)
            finally:
                fp.close()
            return _mod
    _wc = swig_import_helper()
    del swig_import_helper
else:
    import _wc
del version_info
def _swig_setattr_nondynamic(self, class_type, name, value, static=1):
    if (name == "thisown"): return self.this.own(value)
    if (name == "this"):
        if type(value).__name__ == 'SwigPyObject':
            self.__dict__[name] = value
            return
    method = class_type.__swig_setmethods__.get(name, None)
    if method: return method(self, value)
    if (not static) or hasattr(self, name):
        self.__dict__[name] = value
    else:
        raise AttributeError("You cannot add attributes to %s" % self)

def _swig_setattr(self, class_type, name, value):
    return _swig_setattr_nondynamic(self, class_type, name, value, 0)

def _swig_getattr(self, class_type, name):
    if (name == "thisown"): return self.this.own()
    method = class_type.__swig_getmethods__.get(name, None)
    if method: return method(self)
    raise AttributeError(name)

def _swig_repr(self):
    try: strthis = "proxy of " + self.this.__repr__()
    except: strthis = ""
    return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)
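The `_swig_*` helpers above route attribute access through per-class method tables (`__swig_setmethods__` / `__swig_getmethods__`) that the generated proxy classes fill in. A toy standalone illustration of that dispatch pattern; `_DemoProxy` is illustrative and not part of the generated bindings:

```python
# Toy illustration of SWIG-style attribute dispatch via method tables.
class _DemoProxy(object):
    # Per-attribute setter/getter tables, as SWIG generates per class.
    __swig_setmethods__ = {"x": lambda self, v: self.__dict__.__setitem__("_x", v)}
    __swig_getmethods__ = {"x": lambda self: self.__dict__["_x"]}

    def __setattr__(self, name, value):
        method = self.__swig_setmethods__.get(name)
        if method:
            return method(self, value)
        self.__dict__[name] = value

    def __getattr__(self, name):
        # Only reached when normal lookup fails, e.g. for table-backed "x".
        method = self.__swig_getmethods__.get(name)
        if method:
            return method(self)
        raise AttributeError(name)

proxy = _DemoProxy()
proxy.x = 41          # routed through __swig_setmethods__["x"]
value = proxy.x       # routed through __swig_getmethods__["x"]
```

In the real bindings the table entries are C-backed accessors from the `_wc` extension module rather than lambdas, but the lookup-then-dispatch flow is the same.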
import core
import delta
import ra
def svn_wc_version():
    """svn_wc_version() -> svn_version_t"""
    return _wc.svn_wc_version()
SVN_WC_TRANSLATE_FROM_NF = _wc.SVN_WC_TRANSLATE_FROM_NF
SVN_WC_TRANSLATE_TO_NF = _wc.SVN_WC_TRANSLATE_TO_NF
SVN_WC_TRANSLATE_FORCE_EOL_REPAIR = _wc.SVN_WC_TRANSLATE_FORCE_EOL_REPAIR
SVN_WC_TRANSLATE_NO_OUTPUT_CLEANUP = _wc.SVN_WC_TRANSLATE_NO_OUTPUT_CLEANUP
SVN_WC_TRANSLATE_FORCE_COPY = _wc.SVN_WC_TRANSLATE_FORCE_COPY
SVN_WC_TRANSLATE_USE_GLOBAL_TMP = _wc.SVN_WC_TRANSLATE_USE_GLOBAL_TMP
def svn_wc_adm_open3(*args):
    """
    svn_wc_adm_open3(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        int levels_to_lock, svn_cancel_func_t cancel_func,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_open3(*args)

def svn_wc_adm_open2(*args):
    """
    svn_wc_adm_open2(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        int levels_to_lock, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_open2(*args)

def svn_wc_adm_open(*args):
    """
    svn_wc_adm_open(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        svn_boolean_t tree_lock, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_open(*args)

def svn_wc_adm_probe_open3(*args):
    """
    svn_wc_adm_probe_open3(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        int levels_to_lock, svn_cancel_func_t cancel_func,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_probe_open3(*args)

def svn_wc_adm_probe_open2(*args):
    """
    svn_wc_adm_probe_open2(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        int levels_to_lock, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_probe_open2(*args)

def svn_wc_adm_probe_open(*args):
    """
    svn_wc_adm_probe_open(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        svn_boolean_t tree_lock, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_probe_open(*args)

def svn_wc_adm_open_anchor(*args):
    """
    svn_wc_adm_open_anchor(char path, svn_boolean_t write_lock, int levels_to_lock,
        svn_cancel_func_t cancel_func, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_open_anchor(*args)

def svn_wc_adm_retrieve(*args):
    """svn_wc_adm_retrieve(svn_wc_adm_access_t associated, char path, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_adm_retrieve(*args)

def svn_wc_adm_probe_retrieve(*args):
    """svn_wc_adm_probe_retrieve(svn_wc_adm_access_t associated, char path, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_adm_probe_retrieve(*args)

def svn_wc_adm_probe_try3(*args):
    """
    svn_wc_adm_probe_try3(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        int levels_to_lock, svn_cancel_func_t cancel_func,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_probe_try3(*args)

def svn_wc_adm_probe_try2(*args):
    """
    svn_wc_adm_probe_try2(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        int levels_to_lock, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_probe_try2(*args)

def svn_wc_adm_probe_try(*args):
    """
    svn_wc_adm_probe_try(svn_wc_adm_access_t associated, char path, svn_boolean_t write_lock,
        svn_boolean_t tree_lock, apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_adm_probe_try(*args)

def svn_wc_adm_close2(*args):
    """svn_wc_adm_close2(svn_wc_adm_access_t adm_access, apr_pool_t scratch_pool) -> svn_error_t"""
    return _wc.svn_wc_adm_close2(*args)

def svn_wc_adm_close(*args):
    """svn_wc_adm_close(svn_wc_adm_access_t adm_access) -> svn_error_t"""
    return _wc.svn_wc_adm_close(*args)

def svn_wc_adm_access_path(*args):
    """svn_wc_adm_access_path(svn_wc_adm_access_t adm_access) -> char"""
    return _wc.svn_wc_adm_access_path(*args)

def svn_wc_adm_access_pool(*args):
    """svn_wc_adm_access_pool(svn_wc_adm_access_t adm_access) -> apr_pool_t"""
    return _wc.svn_wc_adm_access_pool(*args)

def svn_wc_adm_locked(*args):
    """svn_wc_adm_locked(svn_wc_adm_access_t adm_access) -> svn_boolean_t"""
    return _wc.svn_wc_adm_locked(*args)

def svn_wc_locked(*args):
    """svn_wc_locked(char path, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_locked(*args)

def svn_wc_is_adm_dir(*args):
    """svn_wc_is_adm_dir(char name, apr_pool_t pool) -> svn_boolean_t"""
    return _wc.svn_wc_is_adm_dir(*args)

def svn_wc_get_adm_dir(*args):
    """svn_wc_get_adm_dir(apr_pool_t pool) -> char"""
    return _wc.svn_wc_get_adm_dir(*args)

def svn_wc_set_adm_dir(*args):
    """svn_wc_set_adm_dir(char name, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_set_adm_dir(*args)

def svn_wc_init_traversal_info(*args):
    """svn_wc_init_traversal_info(apr_pool_t pool) -> svn_wc_traversal_info_t"""
    return _wc.svn_wc_init_traversal_info(*args)

def svn_wc_edited_externals(*args):
    """svn_wc_edited_externals(svn_wc_traversal_info_t traversal_info)"""
    return _wc.svn_wc_edited_externals(*args)

def svn_wc_traversed_depths(*args):
    """svn_wc_traversed_depths(svn_wc_traversal_info_t traversal_info)"""
    return _wc.svn_wc_traversed_depths(*args)
class svn_wc_external_item2_t:
    """Proxy of C svn_wc_external_item2_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_external_item2_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_external_item2_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["target_dir"] = _wc.svn_wc_external_item2_t_target_dir_set
    __swig_getmethods__["target_dir"] = _wc.svn_wc_external_item2_t_target_dir_get
    __swig_setmethods__["url"] = _wc.svn_wc_external_item2_t_url_set
    __swig_getmethods__["url"] = _wc.svn_wc_external_item2_t_url_get
    __swig_setmethods__["revision"] = _wc.svn_wc_external_item2_t_revision_set
    __swig_getmethods__["revision"] = _wc.svn_wc_external_item2_t_revision_get
    __swig_setmethods__["peg_revision"] = _wc.svn_wc_external_item2_t_peg_revision_set
    __swig_getmethods__["peg_revision"] = _wc.svn_wc_external_item2_t_peg_revision_get

    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_external_item2_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)

    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"

    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value

    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)

    def __init__(self):
        """__init__(self) -> svn_wc_external_item2_t"""
        this = _wc.new_svn_wc_external_item2_t()
        try: self.this.append(this)
        except: self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_external_item2_t
    __del__ = lambda self: None
svn_wc_external_item2_t_swigregister = _wc.svn_wc_external_item2_t_swigregister
svn_wc_external_item2_t_swigregister(svn_wc_external_item2_t)
def svn_wc_external_item_create(*args):
    """svn_wc_external_item_create(apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_external_item_create(*args)

def svn_wc_external_item2_dup(*args):
    """svn_wc_external_item2_dup(svn_wc_external_item2_t item, apr_pool_t pool) -> svn_wc_external_item2_t"""
    return _wc.svn_wc_external_item2_dup(*args)
class svn_wc_external_item_t:
    """Proxy of C svn_wc_external_item_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_external_item_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_external_item_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["target_dir"] = _wc.svn_wc_external_item_t_target_dir_set
    __swig_getmethods__["target_dir"] = _wc.svn_wc_external_item_t_target_dir_get
    __swig_setmethods__["url"] = _wc.svn_wc_external_item_t_url_set
    __swig_getmethods__["url"] = _wc.svn_wc_external_item_t_url_get
    __swig_setmethods__["revision"] = _wc.svn_wc_external_item_t_revision_set
    __swig_getmethods__["revision"] = _wc.svn_wc_external_item_t_revision_get

    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_external_item_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)

    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"

    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value

    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)

    def __init__(self):
        """__init__(self) -> svn_wc_external_item_t"""
        this = _wc.new_svn_wc_external_item_t()
        try: self.this.append(this)
        except: self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_external_item_t
    __del__ = lambda self: None
svn_wc_external_item_t_swigregister = _wc.svn_wc_external_item_t_swigregister
svn_wc_external_item_t_swigregister(svn_wc_external_item_t)
def svn_wc_external_item_dup(*args):
    """svn_wc_external_item_dup(svn_wc_external_item_t item, apr_pool_t pool) -> svn_wc_external_item_t"""
    return _wc.svn_wc_external_item_dup(*args)

def svn_wc_parse_externals_description3(*args):
    """
    svn_wc_parse_externals_description3(char parent_directory, char desc, svn_boolean_t canonicalize_url,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_parse_externals_description3(*args)

def svn_wc_parse_externals_description2(*args):
    """svn_wc_parse_externals_description2(char parent_directory, char desc, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_parse_externals_description2(*args)

def svn_wc_parse_externals_description(*args):
    """svn_wc_parse_externals_description(char parent_directory, char desc, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_parse_externals_description(*args)
svn_wc_notify_add = _wc.svn_wc_notify_add
svn_wc_notify_copy = _wc.svn_wc_notify_copy
svn_wc_notify_delete = _wc.svn_wc_notify_delete
svn_wc_notify_restore = _wc.svn_wc_notify_restore
svn_wc_notify_revert = _wc.svn_wc_notify_revert
svn_wc_notify_failed_revert = _wc.svn_wc_notify_failed_revert
svn_wc_notify_resolved = _wc.svn_wc_notify_resolved
svn_wc_notify_skip = _wc.svn_wc_notify_skip
svn_wc_notify_update_delete = _wc.svn_wc_notify_update_delete
svn_wc_notify_update_add = _wc.svn_wc_notify_update_add
svn_wc_notify_update_update = _wc.svn_wc_notify_update_update
svn_wc_notify_update_completed = _wc.svn_wc_notify_update_completed
svn_wc_notify_update_external = _wc.svn_wc_notify_update_external
svn_wc_notify_status_completed = _wc.svn_wc_notify_status_completed
svn_wc_notify_status_external = _wc.svn_wc_notify_status_external
svn_wc_notify_commit_modified = _wc.svn_wc_notify_commit_modified
svn_wc_notify_commit_added = _wc.svn_wc_notify_commit_added
svn_wc_notify_commit_deleted = _wc.svn_wc_notify_commit_deleted
svn_wc_notify_commit_replaced = _wc.svn_wc_notify_commit_replaced
svn_wc_notify_commit_postfix_txdelta = _wc.svn_wc_notify_commit_postfix_txdelta
svn_wc_notify_blame_revision = _wc.svn_wc_notify_blame_revision
svn_wc_notify_locked = _wc.svn_wc_notify_locked
svn_wc_notify_unlocked = _wc.svn_wc_notify_unlocked
svn_wc_notify_failed_lock = _wc.svn_wc_notify_failed_lock
svn_wc_notify_failed_unlock = _wc.svn_wc_notify_failed_unlock
svn_wc_notify_exists = _wc.svn_wc_notify_exists
svn_wc_notify_changelist_set = _wc.svn_wc_notify_changelist_set
svn_wc_notify_changelist_clear = _wc.svn_wc_notify_changelist_clear
svn_wc_notify_changelist_moved = _wc.svn_wc_notify_changelist_moved
svn_wc_notify_merge_begin = _wc.svn_wc_notify_merge_begin
svn_wc_notify_foreign_merge_begin = _wc.svn_wc_notify_foreign_merge_begin
svn_wc_notify_update_replace = _wc.svn_wc_notify_update_replace
svn_wc_notify_property_added = _wc.svn_wc_notify_property_added
svn_wc_notify_property_modified = _wc.svn_wc_notify_property_modified
svn_wc_notify_property_deleted = _wc.svn_wc_notify_property_deleted
svn_wc_notify_property_deleted_nonexistent = _wc.svn_wc_notify_property_deleted_nonexistent
svn_wc_notify_revprop_set = _wc.svn_wc_notify_revprop_set
svn_wc_notify_revprop_deleted = _wc.svn_wc_notify_revprop_deleted
svn_wc_notify_merge_completed = _wc.svn_wc_notify_merge_completed
svn_wc_notify_tree_conflict = _wc.svn_wc_notify_tree_conflict
svn_wc_notify_failed_external = _wc.svn_wc_notify_failed_external
svn_wc_notify_state_inapplicable = _wc.svn_wc_notify_state_inapplicable
svn_wc_notify_state_unknown = _wc.svn_wc_notify_state_unknown
svn_wc_notify_state_unchanged = _wc.svn_wc_notify_state_unchanged
svn_wc_notify_state_missing = _wc.svn_wc_notify_state_missing
svn_wc_notify_state_obstructed = _wc.svn_wc_notify_state_obstructed
svn_wc_notify_state_changed = _wc.svn_wc_notify_state_changed
svn_wc_notify_state_merged = _wc.svn_wc_notify_state_merged
svn_wc_notify_state_conflicted = _wc.svn_wc_notify_state_conflicted
svn_wc_notify_lock_state_inapplicable = _wc.svn_wc_notify_lock_state_inapplicable
svn_wc_notify_lock_state_unknown = _wc.svn_wc_notify_lock_state_unknown
svn_wc_notify_lock_state_unchanged = _wc.svn_wc_notify_lock_state_unchanged
svn_wc_notify_lock_state_locked = _wc.svn_wc_notify_lock_state_locked
svn_wc_notify_lock_state_unlocked = _wc.svn_wc_notify_lock_state_unlocked
class svn_wc_notify_t:
    """Proxy of C svn_wc_notify_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_notify_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_notify_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["path"] = _wc.svn_wc_notify_t_path_set
    __swig_getmethods__["path"] = _wc.svn_wc_notify_t_path_get
    __swig_setmethods__["action"] = _wc.svn_wc_notify_t_action_set
    __swig_getmethods__["action"] = _wc.svn_wc_notify_t_action_get
    __swig_setmethods__["kind"] = _wc.svn_wc_notify_t_kind_set
    __swig_getmethods__["kind"] = _wc.svn_wc_notify_t_kind_get
    __swig_setmethods__["mime_type"] = _wc.svn_wc_notify_t_mime_type_set
    __swig_getmethods__["mime_type"] = _wc.svn_wc_notify_t_mime_type_get
    __swig_setmethods__["lock"] = _wc.svn_wc_notify_t_lock_set
    __swig_getmethods__["lock"] = _wc.svn_wc_notify_t_lock_get
    __swig_setmethods__["err"] = _wc.svn_wc_notify_t_err_set
    __swig_getmethods__["err"] = _wc.svn_wc_notify_t_err_get
    __swig_setmethods__["content_state"] = _wc.svn_wc_notify_t_content_state_set
    __swig_getmethods__["content_state"] = _wc.svn_wc_notify_t_content_state_get
    __swig_setmethods__["prop_state"] = _wc.svn_wc_notify_t_prop_state_set
    __swig_getmethods__["prop_state"] = _wc.svn_wc_notify_t_prop_state_get
    __swig_setmethods__["lock_state"] = _wc.svn_wc_notify_t_lock_state_set
    __swig_getmethods__["lock_state"] = _wc.svn_wc_notify_t_lock_state_get
    __swig_setmethods__["revision"] = _wc.svn_wc_notify_t_revision_set
    __swig_getmethods__["revision"] = _wc.svn_wc_notify_t_revision_get
    __swig_setmethods__["changelist_name"] = _wc.svn_wc_notify_t_changelist_name_set
    __swig_getmethods__["changelist_name"] = _wc.svn_wc_notify_t_changelist_name_get
    __swig_setmethods__["merge_range"] = _wc.svn_wc_notify_t_merge_range_set
    __swig_getmethods__["merge_range"] = _wc.svn_wc_notify_t_merge_range_get
    __swig_setmethods__["url"] = _wc.svn_wc_notify_t_url_set
    __swig_getmethods__["url"] = _wc.svn_wc_notify_t_url_get
    __swig_setmethods__["path_prefix"] = _wc.svn_wc_notify_t_path_prefix_set
    __swig_getmethods__["path_prefix"] = _wc.svn_wc_notify_t_path_prefix_get
    __swig_setmethods__["prop_name"] = _wc.svn_wc_notify_t_prop_name_set
    __swig_getmethods__["prop_name"] = _wc.svn_wc_notify_t_prop_name_get
    __swig_setmethods__["rev_props"] = _wc.svn_wc_notify_t_rev_props_set
    __swig_getmethods__["rev_props"] = _wc.svn_wc_notify_t_rev_props_get

    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_notify_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)

    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"

    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value

    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)

    def __init__(self):
        """__init__(self) -> svn_wc_notify_t"""
        this = _wc.new_svn_wc_notify_t()
        try: self.this.append(this)
        except: self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_notify_t
    __del__ = lambda self: None
svn_wc_notify_t_swigregister = _wc.svn_wc_notify_t_swigregister
svn_wc_notify_t_swigregister(svn_wc_notify_t)
def svn_wc_create_notify(*args):
    """svn_wc_create_notify(char path, svn_wc_notify_action_t action, apr_pool_t pool) -> svn_wc_notify_t"""
    return _wc.svn_wc_create_notify(*args)

def svn_wc_create_notify_url(*args):
    """svn_wc_create_notify_url(char url, svn_wc_notify_action_t action, apr_pool_t pool) -> svn_wc_notify_t"""
    return _wc.svn_wc_create_notify_url(*args)

def svn_wc_dup_notify(*args):
    """svn_wc_dup_notify(svn_wc_notify_t notify, apr_pool_t pool) -> svn_wc_notify_t"""
    return _wc.svn_wc_dup_notify(*args)
svn_wc_conflict_action_edit = _wc.svn_wc_conflict_action_edit
svn_wc_conflict_action_add = _wc.svn_wc_conflict_action_add
svn_wc_conflict_action_delete = _wc.svn_wc_conflict_action_delete
svn_wc_conflict_reason_edited = _wc.svn_wc_conflict_reason_edited
svn_wc_conflict_reason_obstructed = _wc.svn_wc_conflict_reason_obstructed
svn_wc_conflict_reason_deleted = _wc.svn_wc_conflict_reason_deleted
svn_wc_conflict_reason_missing = _wc.svn_wc_conflict_reason_missing
svn_wc_conflict_reason_unversioned = _wc.svn_wc_conflict_reason_unversioned
svn_wc_conflict_reason_added = _wc.svn_wc_conflict_reason_added
svn_wc_conflict_kind_text = _wc.svn_wc_conflict_kind_text
svn_wc_conflict_kind_property = _wc.svn_wc_conflict_kind_property
svn_wc_conflict_kind_tree = _wc.svn_wc_conflict_kind_tree
svn_wc_operation_none = _wc.svn_wc_operation_none
svn_wc_operation_update = _wc.svn_wc_operation_update
svn_wc_operation_switch = _wc.svn_wc_operation_switch
svn_wc_operation_merge = _wc.svn_wc_operation_merge
class svn_wc_conflict_version_t:
    """Proxy of C svn_wc_conflict_version_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_conflict_version_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_conflict_version_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["repos_url"] = _wc.svn_wc_conflict_version_t_repos_url_set
    __swig_getmethods__["repos_url"] = _wc.svn_wc_conflict_version_t_repos_url_get
    __swig_setmethods__["peg_rev"] = _wc.svn_wc_conflict_version_t_peg_rev_set
    __swig_getmethods__["peg_rev"] = _wc.svn_wc_conflict_version_t_peg_rev_get
    __swig_setmethods__["path_in_repos"] = _wc.svn_wc_conflict_version_t_path_in_repos_set
    __swig_getmethods__["path_in_repos"] = _wc.svn_wc_conflict_version_t_path_in_repos_get
    __swig_setmethods__["node_kind"] = _wc.svn_wc_conflict_version_t_node_kind_set
    __swig_getmethods__["node_kind"] = _wc.svn_wc_conflict_version_t_node_kind_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_conflict_version_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def __init__(self):
        """__init__(self) -> svn_wc_conflict_version_t"""
        this = _wc.new_svn_wc_conflict_version_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_conflict_version_t
    __del__ = lambda self: None
svn_wc_conflict_version_t_swigregister = _wc.svn_wc_conflict_version_t_swigregister
svn_wc_conflict_version_t_swigregister(svn_wc_conflict_version_t)
def svn_wc_conflict_version_create(*args):
    """
    svn_wc_conflict_version_create(char repos_url, char path_in_repos, svn_revnum_t peg_rev,
        svn_node_kind_t node_kind, apr_pool_t pool) -> svn_wc_conflict_version_t
    """
    return _wc.svn_wc_conflict_version_create(*args)
def svn_wc_conflict_version_dup(*args):
    """svn_wc_conflict_version_dup(svn_wc_conflict_version_t version, apr_pool_t pool) -> svn_wc_conflict_version_t"""
    return _wc.svn_wc_conflict_version_dup(*args)
class svn_wc_conflict_description_t:
    """Proxy of C svn_wc_conflict_description_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_conflict_description_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_conflict_description_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["path"] = _wc.svn_wc_conflict_description_t_path_set
    __swig_getmethods__["path"] = _wc.svn_wc_conflict_description_t_path_get
    __swig_setmethods__["node_kind"] = _wc.svn_wc_conflict_description_t_node_kind_set
    __swig_getmethods__["node_kind"] = _wc.svn_wc_conflict_description_t_node_kind_get
    __swig_setmethods__["kind"] = _wc.svn_wc_conflict_description_t_kind_set
    __swig_getmethods__["kind"] = _wc.svn_wc_conflict_description_t_kind_get
    __swig_setmethods__["property_name"] = _wc.svn_wc_conflict_description_t_property_name_set
    __swig_getmethods__["property_name"] = _wc.svn_wc_conflict_description_t_property_name_get
    __swig_setmethods__["is_binary"] = _wc.svn_wc_conflict_description_t_is_binary_set
    __swig_getmethods__["is_binary"] = _wc.svn_wc_conflict_description_t_is_binary_get
    __swig_setmethods__["mime_type"] = _wc.svn_wc_conflict_description_t_mime_type_set
    __swig_getmethods__["mime_type"] = _wc.svn_wc_conflict_description_t_mime_type_get
    __swig_setmethods__["access"] = _wc.svn_wc_conflict_description_t_access_set
    __swig_getmethods__["access"] = _wc.svn_wc_conflict_description_t_access_get
    __swig_setmethods__["action"] = _wc.svn_wc_conflict_description_t_action_set
    __swig_getmethods__["action"] = _wc.svn_wc_conflict_description_t_action_get
    __swig_setmethods__["reason"] = _wc.svn_wc_conflict_description_t_reason_set
    __swig_getmethods__["reason"] = _wc.svn_wc_conflict_description_t_reason_get
    __swig_setmethods__["base_file"] = _wc.svn_wc_conflict_description_t_base_file_set
    __swig_getmethods__["base_file"] = _wc.svn_wc_conflict_description_t_base_file_get
    __swig_setmethods__["their_file"] = _wc.svn_wc_conflict_description_t_their_file_set
    __swig_getmethods__["their_file"] = _wc.svn_wc_conflict_description_t_their_file_get
    __swig_setmethods__["my_file"] = _wc.svn_wc_conflict_description_t_my_file_set
    __swig_getmethods__["my_file"] = _wc.svn_wc_conflict_description_t_my_file_get
    __swig_setmethods__["merged_file"] = _wc.svn_wc_conflict_description_t_merged_file_set
    __swig_getmethods__["merged_file"] = _wc.svn_wc_conflict_description_t_merged_file_get
    __swig_setmethods__["operation"] = _wc.svn_wc_conflict_description_t_operation_set
    __swig_getmethods__["operation"] = _wc.svn_wc_conflict_description_t_operation_get
    __swig_setmethods__["src_left_version"] = _wc.svn_wc_conflict_description_t_src_left_version_set
    __swig_getmethods__["src_left_version"] = _wc.svn_wc_conflict_description_t_src_left_version_get
    __swig_setmethods__["src_right_version"] = _wc.svn_wc_conflict_description_t_src_right_version_set
    __swig_getmethods__["src_right_version"] = _wc.svn_wc_conflict_description_t_src_right_version_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_conflict_description_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def __init__(self):
        """__init__(self) -> svn_wc_conflict_description_t"""
        this = _wc.new_svn_wc_conflict_description_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_conflict_description_t
    __del__ = lambda self: None
svn_wc_conflict_description_t_swigregister = _wc.svn_wc_conflict_description_t_swigregister
svn_wc_conflict_description_t_swigregister(svn_wc_conflict_description_t)
def svn_wc_conflict_description_create_text(*args):
    """svn_wc_conflict_description_create_text(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_wc_conflict_description_t"""
    return _wc.svn_wc_conflict_description_create_text(*args)
def svn_wc_conflict_description_create_prop(*args):
    """
    svn_wc_conflict_description_create_prop(char path, svn_wc_adm_access_t adm_access, svn_node_kind_t node_kind,
        char property_name, apr_pool_t pool) -> svn_wc_conflict_description_t
    """
    return _wc.svn_wc_conflict_description_create_prop(*args)
def svn_wc_conflict_description_create_tree(*args):
    """
    svn_wc_conflict_description_create_tree(char path, svn_wc_adm_access_t adm_access, svn_node_kind_t node_kind,
        svn_wc_operation_t operation,
        svn_wc_conflict_version_t src_left_version,
        svn_wc_conflict_version_t src_right_version,
        apr_pool_t pool) -> svn_wc_conflict_description_t
    """
    return _wc.svn_wc_conflict_description_create_tree(*args)
svn_wc_conflict_choose_postpone = _wc.svn_wc_conflict_choose_postpone
svn_wc_conflict_choose_base = _wc.svn_wc_conflict_choose_base
svn_wc_conflict_choose_theirs_full = _wc.svn_wc_conflict_choose_theirs_full
svn_wc_conflict_choose_mine_full = _wc.svn_wc_conflict_choose_mine_full
svn_wc_conflict_choose_theirs_conflict = _wc.svn_wc_conflict_choose_theirs_conflict
svn_wc_conflict_choose_mine_conflict = _wc.svn_wc_conflict_choose_mine_conflict
svn_wc_conflict_choose_merged = _wc.svn_wc_conflict_choose_merged
class svn_wc_conflict_result_t:
    """Proxy of C svn_wc_conflict_result_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_conflict_result_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_conflict_result_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["choice"] = _wc.svn_wc_conflict_result_t_choice_set
    __swig_getmethods__["choice"] = _wc.svn_wc_conflict_result_t_choice_get
    __swig_setmethods__["merged_file"] = _wc.svn_wc_conflict_result_t_merged_file_set
    __swig_getmethods__["merged_file"] = _wc.svn_wc_conflict_result_t_merged_file_get
    __swig_setmethods__["save_merged"] = _wc.svn_wc_conflict_result_t_save_merged_set
    __swig_getmethods__["save_merged"] = _wc.svn_wc_conflict_result_t_save_merged_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_conflict_result_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def __init__(self):
        """__init__(self) -> svn_wc_conflict_result_t"""
        this = _wc.new_svn_wc_conflict_result_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_conflict_result_t
    __del__ = lambda self: None
svn_wc_conflict_result_t_swigregister = _wc.svn_wc_conflict_result_t_swigregister
svn_wc_conflict_result_t_swigregister(svn_wc_conflict_result_t)
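# Each proxy's __setattr__/__getattr__ pair keeps a "_members" shadow dict:
# C-level getters can materialize a brand-new wrapper on every access, so the
# last Python-level wrapper is remembered and its __dict__ is copied onto the
# fresh one. A standalone sketch of that idea (hypothetical names, no SWIG):

```python
class Shadowed(object):
    """Illustrates the _members shadow-dict technique from the proxies above."""

    def _c_level_get(self, name):
        # Stand-in for _swig_getattr: pretend the C layer builds a
        # brand-new wrapper object on every attribute access.
        class Wrapper(object):
            pass
        return Wrapper()

    def __getattr__(self, name):
        value = self._c_level_get(name)
        old = self.__dict__.get("_members", {}).get(name)
        if old is not None and value is not old:
            try:
                # Carry Python-side state over to the fresh wrapper.
                value.__dict__.update(old.__dict__)
            except AttributeError:
                pass
        return value

    def __setattr__(self, name, value):
        # Remember the Python-level object so later reads can restore it.
        self.__dict__.setdefault("_members", {})[name] = value

s = Shadowed()
w = s.child            # a fresh wrapper from the "C" layer
w.note = "python-side data"
s.child = w            # remembered in _members
w2 = s.child           # a different wrapper object, but w2.note survives
```

# Without the shadow dict, any attribute a caller attached to a wrapper would
# silently vanish on the next access, because the next read returns a new object.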
def svn_wc_create_conflict_result(*args):
    """
    svn_wc_create_conflict_result(svn_wc_conflict_choice_t choice, char merged_file,
        apr_pool_t pool) -> svn_wc_conflict_result_t
    """
    return _wc.svn_wc_create_conflict_result(*args)
class svn_wc_diff_callbacks3_t:
    """Proxy of C svn_wc_diff_callbacks3_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_diff_callbacks3_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_diff_callbacks3_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["file_changed"] = _wc.svn_wc_diff_callbacks3_t_file_changed_set
    __swig_getmethods__["file_changed"] = _wc.svn_wc_diff_callbacks3_t_file_changed_get
    __swig_setmethods__["file_added"] = _wc.svn_wc_diff_callbacks3_t_file_added_set
    __swig_getmethods__["file_added"] = _wc.svn_wc_diff_callbacks3_t_file_added_get
    __swig_setmethods__["file_deleted"] = _wc.svn_wc_diff_callbacks3_t_file_deleted_set
    __swig_getmethods__["file_deleted"] = _wc.svn_wc_diff_callbacks3_t_file_deleted_get
    __swig_setmethods__["dir_added"] = _wc.svn_wc_diff_callbacks3_t_dir_added_set
    __swig_getmethods__["dir_added"] = _wc.svn_wc_diff_callbacks3_t_dir_added_get
    __swig_setmethods__["dir_deleted"] = _wc.svn_wc_diff_callbacks3_t_dir_deleted_set
    __swig_getmethods__["dir_deleted"] = _wc.svn_wc_diff_callbacks3_t_dir_deleted_get
    __swig_setmethods__["dir_props_changed"] = _wc.svn_wc_diff_callbacks3_t_dir_props_changed_set
    __swig_getmethods__["dir_props_changed"] = _wc.svn_wc_diff_callbacks3_t_dir_props_changed_get
    __swig_setmethods__["dir_opened"] = _wc.svn_wc_diff_callbacks3_t_dir_opened_set
    __swig_getmethods__["dir_opened"] = _wc.svn_wc_diff_callbacks3_t_dir_opened_get
    __swig_setmethods__["dir_closed"] = _wc.svn_wc_diff_callbacks3_t_dir_closed_set
    __swig_getmethods__["dir_closed"] = _wc.svn_wc_diff_callbacks3_t_dir_closed_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_diff_callbacks3_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def file_changed(self, *args):
        return svn_wc_diff_callbacks3_invoke_file_changed(self, *args)
    def file_added(self, *args):
        return svn_wc_diff_callbacks3_invoke_file_added(self, *args)
    def file_deleted(self, *args):
        return svn_wc_diff_callbacks3_invoke_file_deleted(self, *args)
    def dir_added(self, *args):
        return svn_wc_diff_callbacks3_invoke_dir_added(self, *args)
    def dir_deleted(self, *args):
        return svn_wc_diff_callbacks3_invoke_dir_deleted(self, *args)
    def dir_props_changed(self, *args):
        return svn_wc_diff_callbacks3_invoke_dir_props_changed(self, *args)
    def dir_opened(self, *args):
        return svn_wc_diff_callbacks3_invoke_dir_opened(self, *args)
    def dir_closed(self, *args):
        return svn_wc_diff_callbacks3_invoke_dir_closed(self, *args)
    def __init__(self):
        """__init__(self) -> svn_wc_diff_callbacks3_t"""
        this = _wc.new_svn_wc_diff_callbacks3_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_diff_callbacks3_t
    __del__ = lambda self: None
svn_wc_diff_callbacks3_t_swigregister = _wc.svn_wc_diff_callbacks3_t_swigregister
svn_wc_diff_callbacks3_t_swigregister(svn_wc_diff_callbacks3_t)
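# The diff-callbacks structs above are vtables of named hooks: each member
# (file_changed, dir_added, ...) forwards through an invoke helper to whatever
# handler the caller supplied. A minimal standalone sketch of that dispatch
# pattern, with hypothetical DiffCallbacks/Recorder names (no libsvn required):

```python
class DiffCallbacks(object):
    """Toy dispatcher mirroring the shape of svn_wc_diff_callbacks3_t."""
    HOOKS = ("file_changed", "file_added", "file_deleted",
             "dir_added", "dir_deleted", "dir_props_changed",
             "dir_opened", "dir_closed")

    def __init__(self, handler):
        self.handler = handler

    def dispatch(self, hook, *args):
        # Forward the named hook to the handler if it implements it;
        # unimplemented hooks are simply skipped, as a C caller would
        # skip a NULL function pointer.
        fn = getattr(self.handler, hook, None)
        if fn is None:
            return None
        return fn(*args)

class Recorder(object):
    """Handler implementing only the hooks it cares about."""
    def __init__(self):
        self.events = []
    def file_added(self, path):
        self.events.append(("A", path))
    def file_deleted(self, path):
        self.events.append(("D", path))

rec = Recorder()
cbs = DiffCallbacks(rec)
cbs.dispatch("file_added", "trunk/new.c")
cbs.dispatch("dir_opened", "trunk")   # no handler defined: ignored
```

# The real bindings do the same routing through C function pointers and the
# svn_wc_diff_callbacks3_invoke_* wrappers rather than getattr.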
class svn_wc_diff_callbacks2_t:
    """Proxy of C svn_wc_diff_callbacks2_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_diff_callbacks2_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_diff_callbacks2_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["file_changed"] = _wc.svn_wc_diff_callbacks2_t_file_changed_set
    __swig_getmethods__["file_changed"] = _wc.svn_wc_diff_callbacks2_t_file_changed_get
    __swig_setmethods__["file_added"] = _wc.svn_wc_diff_callbacks2_t_file_added_set
    __swig_getmethods__["file_added"] = _wc.svn_wc_diff_callbacks2_t_file_added_get
    __swig_setmethods__["file_deleted"] = _wc.svn_wc_diff_callbacks2_t_file_deleted_set
    __swig_getmethods__["file_deleted"] = _wc.svn_wc_diff_callbacks2_t_file_deleted_get
    __swig_setmethods__["dir_added"] = _wc.svn_wc_diff_callbacks2_t_dir_added_set
    __swig_getmethods__["dir_added"] = _wc.svn_wc_diff_callbacks2_t_dir_added_get
    __swig_setmethods__["dir_deleted"] = _wc.svn_wc_diff_callbacks2_t_dir_deleted_set
    __swig_getmethods__["dir_deleted"] = _wc.svn_wc_diff_callbacks2_t_dir_deleted_get
    __swig_setmethods__["dir_props_changed"] = _wc.svn_wc_diff_callbacks2_t_dir_props_changed_set
    __swig_getmethods__["dir_props_changed"] = _wc.svn_wc_diff_callbacks2_t_dir_props_changed_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_diff_callbacks2_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def file_changed(self, *args):
        return svn_wc_diff_callbacks2_invoke_file_changed(self, *args)
    def file_added(self, *args):
        return svn_wc_diff_callbacks2_invoke_file_added(self, *args)
    def file_deleted(self, *args):
        return svn_wc_diff_callbacks2_invoke_file_deleted(self, *args)
    def dir_added(self, *args):
        return svn_wc_diff_callbacks2_invoke_dir_added(self, *args)
    def dir_deleted(self, *args):
        return svn_wc_diff_callbacks2_invoke_dir_deleted(self, *args)
    def dir_props_changed(self, *args):
        return svn_wc_diff_callbacks2_invoke_dir_props_changed(self, *args)
    def __init__(self):
        """__init__(self) -> svn_wc_diff_callbacks2_t"""
        this = _wc.new_svn_wc_diff_callbacks2_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_diff_callbacks2_t
    __del__ = lambda self: None
svn_wc_diff_callbacks2_t_swigregister = _wc.svn_wc_diff_callbacks2_t_swigregister
svn_wc_diff_callbacks2_t_swigregister(svn_wc_diff_callbacks2_t)
class svn_wc_diff_callbacks_t:
    """Proxy of C svn_wc_diff_callbacks_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_diff_callbacks_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_diff_callbacks_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["file_changed"] = _wc.svn_wc_diff_callbacks_t_file_changed_set
    __swig_getmethods__["file_changed"] = _wc.svn_wc_diff_callbacks_t_file_changed_get
    __swig_setmethods__["file_added"] = _wc.svn_wc_diff_callbacks_t_file_added_set
    __swig_getmethods__["file_added"] = _wc.svn_wc_diff_callbacks_t_file_added_get
    __swig_setmethods__["file_deleted"] = _wc.svn_wc_diff_callbacks_t_file_deleted_set
    __swig_getmethods__["file_deleted"] = _wc.svn_wc_diff_callbacks_t_file_deleted_get
    __swig_setmethods__["dir_added"] = _wc.svn_wc_diff_callbacks_t_dir_added_set
    __swig_getmethods__["dir_added"] = _wc.svn_wc_diff_callbacks_t_dir_added_get
    __swig_setmethods__["dir_deleted"] = _wc.svn_wc_diff_callbacks_t_dir_deleted_set
    __swig_getmethods__["dir_deleted"] = _wc.svn_wc_diff_callbacks_t_dir_deleted_get
    __swig_setmethods__["props_changed"] = _wc.svn_wc_diff_callbacks_t_props_changed_set
    __swig_getmethods__["props_changed"] = _wc.svn_wc_diff_callbacks_t_props_changed_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_diff_callbacks_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def file_changed(self, *args):
        return svn_wc_diff_callbacks_invoke_file_changed(self, *args)
    def file_added(self, *args):
        return svn_wc_diff_callbacks_invoke_file_added(self, *args)
    def file_deleted(self, *args):
        return svn_wc_diff_callbacks_invoke_file_deleted(self, *args)
    def dir_added(self, *args):
        return svn_wc_diff_callbacks_invoke_dir_added(self, *args)
    def dir_deleted(self, *args):
        return svn_wc_diff_callbacks_invoke_dir_deleted(self, *args)
    def props_changed(self, *args):
        return svn_wc_diff_callbacks_invoke_props_changed(self, *args)
    def __init__(self):
        """__init__(self) -> svn_wc_diff_callbacks_t"""
        this = _wc.new_svn_wc_diff_callbacks_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_diff_callbacks_t
    __del__ = lambda self: None
svn_wc_diff_callbacks_t_swigregister = _wc.svn_wc_diff_callbacks_t_swigregister
svn_wc_diff_callbacks_t_swigregister(svn_wc_diff_callbacks_t)
def svn_wc_check_wc(*args):
    """svn_wc_check_wc(char path, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_check_wc(*args)
def svn_wc_has_binary_prop(*args):
    """svn_wc_has_binary_prop(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_has_binary_prop(*args)
def svn_wc_text_modified_p(*args):
    """
    svn_wc_text_modified_p(char filename, svn_boolean_t force_comparison, svn_wc_adm_access_t adm_access,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_text_modified_p(*args)
def svn_wc_props_modified_p(*args):
    """svn_wc_props_modified_p(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_props_modified_p(*args)
SVN_WC_ADM_DIR_NAME = _wc.SVN_WC_ADM_DIR_NAME
svn_wc_schedule_normal = _wc.svn_wc_schedule_normal
svn_wc_schedule_add = _wc.svn_wc_schedule_add
svn_wc_schedule_delete = _wc.svn_wc_schedule_delete
svn_wc_schedule_replace = _wc.svn_wc_schedule_replace
SVN_WC_ENTRY_WORKING_SIZE_UNKNOWN = _wc.SVN_WC_ENTRY_WORKING_SIZE_UNKNOWN
class svn_wc_entry_t:
    """Proxy of C svn_wc_entry_t struct"""
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_entry_t, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, svn_wc_entry_t, name)
    __repr__ = _swig_repr
    __swig_setmethods__["name"] = _wc.svn_wc_entry_t_name_set
    __swig_getmethods__["name"] = _wc.svn_wc_entry_t_name_get
    __swig_setmethods__["revision"] = _wc.svn_wc_entry_t_revision_set
    __swig_getmethods__["revision"] = _wc.svn_wc_entry_t_revision_get
    __swig_setmethods__["url"] = _wc.svn_wc_entry_t_url_set
    __swig_getmethods__["url"] = _wc.svn_wc_entry_t_url_get
    __swig_setmethods__["repos"] = _wc.svn_wc_entry_t_repos_set
    __swig_getmethods__["repos"] = _wc.svn_wc_entry_t_repos_get
    __swig_setmethods__["uuid"] = _wc.svn_wc_entry_t_uuid_set
    __swig_getmethods__["uuid"] = _wc.svn_wc_entry_t_uuid_get
    __swig_setmethods__["kind"] = _wc.svn_wc_entry_t_kind_set
    __swig_getmethods__["kind"] = _wc.svn_wc_entry_t_kind_get
    __swig_setmethods__["schedule"] = _wc.svn_wc_entry_t_schedule_set
    __swig_getmethods__["schedule"] = _wc.svn_wc_entry_t_schedule_get
    __swig_setmethods__["copied"] = _wc.svn_wc_entry_t_copied_set
    __swig_getmethods__["copied"] = _wc.svn_wc_entry_t_copied_get
    __swig_setmethods__["deleted"] = _wc.svn_wc_entry_t_deleted_set
    __swig_getmethods__["deleted"] = _wc.svn_wc_entry_t_deleted_get
    __swig_setmethods__["absent"] = _wc.svn_wc_entry_t_absent_set
    __swig_getmethods__["absent"] = _wc.svn_wc_entry_t_absent_get
    __swig_setmethods__["incomplete"] = _wc.svn_wc_entry_t_incomplete_set
    __swig_getmethods__["incomplete"] = _wc.svn_wc_entry_t_incomplete_get
    __swig_setmethods__["copyfrom_url"] = _wc.svn_wc_entry_t_copyfrom_url_set
    __swig_getmethods__["copyfrom_url"] = _wc.svn_wc_entry_t_copyfrom_url_get
    __swig_setmethods__["copyfrom_rev"] = _wc.svn_wc_entry_t_copyfrom_rev_set
    __swig_getmethods__["copyfrom_rev"] = _wc.svn_wc_entry_t_copyfrom_rev_get
    __swig_setmethods__["conflict_old"] = _wc.svn_wc_entry_t_conflict_old_set
    __swig_getmethods__["conflict_old"] = _wc.svn_wc_entry_t_conflict_old_get
    __swig_setmethods__["conflict_new"] = _wc.svn_wc_entry_t_conflict_new_set
    __swig_getmethods__["conflict_new"] = _wc.svn_wc_entry_t_conflict_new_get
    __swig_setmethods__["conflict_wrk"] = _wc.svn_wc_entry_t_conflict_wrk_set
    __swig_getmethods__["conflict_wrk"] = _wc.svn_wc_entry_t_conflict_wrk_get
    __swig_setmethods__["prejfile"] = _wc.svn_wc_entry_t_prejfile_set
    __swig_getmethods__["prejfile"] = _wc.svn_wc_entry_t_prejfile_get
    __swig_setmethods__["text_time"] = _wc.svn_wc_entry_t_text_time_set
    __swig_getmethods__["text_time"] = _wc.svn_wc_entry_t_text_time_get
    __swig_setmethods__["prop_time"] = _wc.svn_wc_entry_t_prop_time_set
    __swig_getmethods__["prop_time"] = _wc.svn_wc_entry_t_prop_time_get
    __swig_setmethods__["checksum"] = _wc.svn_wc_entry_t_checksum_set
    __swig_getmethods__["checksum"] = _wc.svn_wc_entry_t_checksum_get
    __swig_setmethods__["cmt_rev"] = _wc.svn_wc_entry_t_cmt_rev_set
    __swig_getmethods__["cmt_rev"] = _wc.svn_wc_entry_t_cmt_rev_get
    __swig_setmethods__["cmt_date"] = _wc.svn_wc_entry_t_cmt_date_set
    __swig_getmethods__["cmt_date"] = _wc.svn_wc_entry_t_cmt_date_get
    __swig_setmethods__["cmt_author"] = _wc.svn_wc_entry_t_cmt_author_set
    __swig_getmethods__["cmt_author"] = _wc.svn_wc_entry_t_cmt_author_get
    __swig_setmethods__["lock_token"] = _wc.svn_wc_entry_t_lock_token_set
    __swig_getmethods__["lock_token"] = _wc.svn_wc_entry_t_lock_token_get
    __swig_setmethods__["lock_owner"] = _wc.svn_wc_entry_t_lock_owner_set
    __swig_getmethods__["lock_owner"] = _wc.svn_wc_entry_t_lock_owner_get
    __swig_setmethods__["lock_comment"] = _wc.svn_wc_entry_t_lock_comment_set
    __swig_getmethods__["lock_comment"] = _wc.svn_wc_entry_t_lock_comment_get
    __swig_setmethods__["lock_creation_date"] = _wc.svn_wc_entry_t_lock_creation_date_set
    __swig_getmethods__["lock_creation_date"] = _wc.svn_wc_entry_t_lock_creation_date_get
    __swig_setmethods__["has_props"] = _wc.svn_wc_entry_t_has_props_set
    __swig_getmethods__["has_props"] = _wc.svn_wc_entry_t_has_props_get
    __swig_setmethods__["has_prop_mods"] = _wc.svn_wc_entry_t_has_prop_mods_set
    __swig_getmethods__["has_prop_mods"] = _wc.svn_wc_entry_t_has_prop_mods_get
    __swig_setmethods__["cachable_props"] = _wc.svn_wc_entry_t_cachable_props_set
    __swig_getmethods__["cachable_props"] = _wc.svn_wc_entry_t_cachable_props_get
    __swig_setmethods__["present_props"] = _wc.svn_wc_entry_t_present_props_set
    __swig_getmethods__["present_props"] = _wc.svn_wc_entry_t_present_props_get
    __swig_setmethods__["changelist"] = _wc.svn_wc_entry_t_changelist_set
    __swig_getmethods__["changelist"] = _wc.svn_wc_entry_t_changelist_get
    __swig_setmethods__["working_size"] = _wc.svn_wc_entry_t_working_size_set
    __swig_getmethods__["working_size"] = _wc.svn_wc_entry_t_working_size_get
    __swig_setmethods__["keep_local"] = _wc.svn_wc_entry_t_keep_local_set
    __swig_getmethods__["keep_local"] = _wc.svn_wc_entry_t_keep_local_get
    __swig_setmethods__["depth"] = _wc.svn_wc_entry_t_depth_set
    __swig_getmethods__["depth"] = _wc.svn_wc_entry_t_depth_get
    __swig_setmethods__["tree_conflict_data"] = _wc.svn_wc_entry_t_tree_conflict_data_set
    __swig_getmethods__["tree_conflict_data"] = _wc.svn_wc_entry_t_tree_conflict_data_get
    __swig_setmethods__["file_external_path"] = _wc.svn_wc_entry_t_file_external_path_set
    __swig_getmethods__["file_external_path"] = _wc.svn_wc_entry_t_file_external_path_get
    __swig_setmethods__["file_external_peg_rev"] = _wc.svn_wc_entry_t_file_external_peg_rev_set
    __swig_getmethods__["file_external_peg_rev"] = _wc.svn_wc_entry_t_file_external_peg_rev_get
    __swig_setmethods__["file_external_rev"] = _wc.svn_wc_entry_t_file_external_rev_set
    __swig_getmethods__["file_external_rev"] = _wc.svn_wc_entry_t_file_external_rev_get
    def set_parent_pool(self, parent_pool=None):
        """Create a new proxy object for svn_wc_entry_t"""
        import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
        if self.__dict__["_parent_pool"]:
            self.__dict__["_is_valid"] = weakref.ref(
                self.__dict__["_parent_pool"]._is_valid)
    def assert_valid(self):
        """Assert that this object is using valid pool memory"""
        if "_is_valid" in self.__dict__:
            assert self.__dict__["_is_valid"](), "Variable has already been deleted"
    def __getattr__(self, name):
        """Get an attribute from this object"""
        self.assert_valid()
        value = _swig_getattr(self, self.__class__, name)
        members = self.__dict__.get("_members")
        if members is not None:
            old_value = members.get(name)
            if (old_value is not None and value is not None and
                    value is not old_value):
                try:
                    value.__dict__.update(old_value.__dict__)
                except AttributeError:
                    pass
        if hasattr(value, "assert_valid"):
            value.assert_valid()
        return value
    def __setattr__(self, name, value):
        """Set an attribute on this object"""
        self.assert_valid()
        self.__dict__.setdefault("_members", {})[name] = value
        return _swig_setattr(self, self.__class__, name, value)
    def __init__(self):
        """__init__(self) -> svn_wc_entry_t"""
        this = _wc.new_svn_wc_entry_t()
        try:
            self.this.append(this)
        except:
            self.this = this
    __swig_destroy__ = _wc.delete_svn_wc_entry_t
    __del__ = lambda self: None
svn_wc_entry_t_swigregister = _wc.svn_wc_entry_t_swigregister
svn_wc_entry_t_swigregister(svn_wc_entry_t)
SVN_WC_ENTRY_THIS_DIR = _wc.SVN_WC_ENTRY_THIS_DIR
def svn_wc_entry(*args):
    """
    svn_wc_entry(char path, svn_wc_adm_access_t adm_access, svn_boolean_t show_hidden,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_entry(*args)
def svn_wc_entries_read(*args):
    """
    svn_wc_entries_read(svn_wc_adm_access_t adm_access, svn_boolean_t show_hidden,
        apr_pool_t pool) -> svn_error_t
    """
    return _wc.svn_wc_entries_read(*args)
def svn_wc_entry_dup(*args):
    """svn_wc_entry_dup(svn_wc_entry_t entry, apr_pool_t pool) -> svn_wc_entry_t"""
    return _wc.svn_wc_entry_dup(*args)
def svn_wc_conflicted_p2(*args):
    """svn_wc_conflicted_p2(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_conflicted_p2(*args)
def svn_wc_conflicted_p(*args):
    """svn_wc_conflicted_p(char dir_path, svn_wc_entry_t entry, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_conflicted_p(*args)
def svn_wc_get_ancestry(*args):
    """svn_wc_get_ancestry(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
    return _wc.svn_wc_get_ancestry(*args)
class svn_wc_entry_callbacks2_t:
"""Proxy of C svn_wc_entry_callbacks2_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_entry_callbacks2_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_entry_callbacks2_t, name)
__repr__ = _swig_repr
__swig_setmethods__["found_entry"] = _wc.svn_wc_entry_callbacks2_t_found_entry_set
__swig_getmethods__["found_entry"] = _wc.svn_wc_entry_callbacks2_t_found_entry_get
__swig_setmethods__["handle_error"] = _wc.svn_wc_entry_callbacks2_t_handle_error_set
__swig_getmethods__["handle_error"] = _wc.svn_wc_entry_callbacks2_t_handle_error_get
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_entry_callbacks2_t"""
import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def found_entry(self, *args):
return svn_wc_entry_callbacks2_invoke_found_entry(self, *args)
def handle_error(self, *args):
return svn_wc_entry_callbacks2_invoke_handle_error(self, *args)
def __init__(self):
"""__init__(self) -> svn_wc_entry_callbacks2_t"""
this = _wc.new_svn_wc_entry_callbacks2_t()
        try:
            self.this.append(this)
        except Exception:
            self.this = this
__swig_destroy__ = _wc.delete_svn_wc_entry_callbacks2_t
    __del__ = lambda self: None
svn_wc_entry_callbacks2_t_swigregister = _wc.svn_wc_entry_callbacks2_t_swigregister
svn_wc_entry_callbacks2_t_swigregister(svn_wc_entry_callbacks2_t)
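The svn_wc_entry_callbacks2_t proxy above suggests that the entry walkers only need an object exposing found_entry and handle_error. A duck-typed sketch of that shape (toy_walk is a hypothetical stand-in for the real svn_wc_walk_entries3 call, not part of the bindings):

```python
class CollectingCallbacks:
    """Any object with these two methods can play the callbacks role."""
    def __init__(self):
        self.paths = []
    def found_entry(self, path, entry):
        self.paths.append(path)
    def handle_error(self, path, err):
        return err   # propagate errors unchanged

def toy_walk(entries, callbacks):
    # stand-in for a walker: visit each (path, entry) pair in order
    for path, entry in entries:
        callbacks.found_entry(path, entry)

cb = CollectingCallbacks()
toy_walk([("wc", None), ("wc/a.c", None)], cb)
# cb.paths == ["wc", "wc/a.c"]
```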
class svn_wc_entry_callbacks_t:
"""Proxy of C svn_wc_entry_callbacks_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_entry_callbacks_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_entry_callbacks_t, name)
__repr__ = _swig_repr
__swig_setmethods__["found_entry"] = _wc.svn_wc_entry_callbacks_t_found_entry_set
__swig_getmethods__["found_entry"] = _wc.svn_wc_entry_callbacks_t_found_entry_get
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_entry_callbacks_t"""
import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def found_entry(self, *args):
return svn_wc_entry_callbacks_invoke_found_entry(self, *args)
def __init__(self):
"""__init__(self) -> svn_wc_entry_callbacks_t"""
this = _wc.new_svn_wc_entry_callbacks_t()
        try:
            self.this.append(this)
        except Exception:
            self.this = this
__swig_destroy__ = _wc.delete_svn_wc_entry_callbacks_t
    __del__ = lambda self: None
svn_wc_entry_callbacks_t_swigregister = _wc.svn_wc_entry_callbacks_t_swigregister
svn_wc_entry_callbacks_t_swigregister(svn_wc_entry_callbacks_t)
def svn_wc_walk_entries3(*args):
"""
svn_wc_walk_entries3(char path, svn_wc_adm_access_t adm_access, svn_wc_entry_callbacks2_t walk_callbacks,
void walk_baton,
svn_depth_t depth, svn_boolean_t show_hidden,
svn_cancel_func_t cancel_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_walk_entries3(*args)
def svn_wc_walk_entries2(*args):
"""
svn_wc_walk_entries2(char path, svn_wc_adm_access_t adm_access, svn_wc_entry_callbacks_t walk_callbacks,
void walk_baton,
svn_boolean_t show_hidden, svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_walk_entries2(*args)
def svn_wc_walk_entries(*args):
"""
svn_wc_walk_entries(char path, svn_wc_adm_access_t adm_access, svn_wc_entry_callbacks_t walk_callbacks,
void walk_baton,
svn_boolean_t show_hidden, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_walk_entries(*args)
def svn_wc_mark_missing_deleted(*args):
"""svn_wc_mark_missing_deleted(char path, svn_wc_adm_access_t parent, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_mark_missing_deleted(*args)
def svn_wc_ensure_adm3(*args):
"""
svn_wc_ensure_adm3(char path, char uuid, char url, char repos, svn_revnum_t revision,
svn_depth_t depth, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_ensure_adm3(*args)
def svn_wc_ensure_adm2(*args):
"""
svn_wc_ensure_adm2(char path, char uuid, char url, char repos, svn_revnum_t revision,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_ensure_adm2(*args)
def svn_wc_ensure_adm(*args):
"""
svn_wc_ensure_adm(char path, char uuid, char url, svn_revnum_t revision,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_ensure_adm(*args)
def svn_wc_maybe_set_repos_root(*args):
"""
svn_wc_maybe_set_repos_root(svn_wc_adm_access_t adm_access, char path, char repos,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_maybe_set_repos_root(*args)
svn_wc_status_none = _wc.svn_wc_status_none
svn_wc_status_unversioned = _wc.svn_wc_status_unversioned
svn_wc_status_normal = _wc.svn_wc_status_normal
svn_wc_status_added = _wc.svn_wc_status_added
svn_wc_status_missing = _wc.svn_wc_status_missing
svn_wc_status_deleted = _wc.svn_wc_status_deleted
svn_wc_status_replaced = _wc.svn_wc_status_replaced
svn_wc_status_modified = _wc.svn_wc_status_modified
svn_wc_status_merged = _wc.svn_wc_status_merged
svn_wc_status_conflicted = _wc.svn_wc_status_conflicted
svn_wc_status_ignored = _wc.svn_wc_status_ignored
svn_wc_status_obstructed = _wc.svn_wc_status_obstructed
svn_wc_status_external = _wc.svn_wc_status_external
svn_wc_status_incomplete = _wc.svn_wc_status_incomplete
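The status constants above are small integers from the C enum; a client typically maps them to the one-letter codes that `svn status` prints. A sketch keyed by name so it runs without libsvn (the letters follow svn's usual display conventions, stated here from memory rather than the source):

```python
# One-letter display codes for working-copy status values.
STATUS_CHAR = {
    "none": " ", "unversioned": "?", "normal": " ", "added": "A",
    "missing": "!", "deleted": "D", "replaced": "R", "modified": "M",
    "merged": "G", "conflicted": "C", "ignored": "I", "obstructed": "~",
    "external": "X", "incomplete": "!",
}

def status_char(name):
    """Return the display letter for a status name, '?' if unknown."""
    return STATUS_CHAR.get(name, "?")
```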
class svn_wc_status2_t:
"""Proxy of C svn_wc_status2_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_status2_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_status2_t, name)
__repr__ = _swig_repr
__swig_setmethods__["entry"] = _wc.svn_wc_status2_t_entry_set
__swig_getmethods__["entry"] = _wc.svn_wc_status2_t_entry_get
__swig_setmethods__["text_status"] = _wc.svn_wc_status2_t_text_status_set
__swig_getmethods__["text_status"] = _wc.svn_wc_status2_t_text_status_get
__swig_setmethods__["prop_status"] = _wc.svn_wc_status2_t_prop_status_set
__swig_getmethods__["prop_status"] = _wc.svn_wc_status2_t_prop_status_get
__swig_setmethods__["locked"] = _wc.svn_wc_status2_t_locked_set
__swig_getmethods__["locked"] = _wc.svn_wc_status2_t_locked_get
__swig_setmethods__["copied"] = _wc.svn_wc_status2_t_copied_set
__swig_getmethods__["copied"] = _wc.svn_wc_status2_t_copied_get
__swig_setmethods__["switched"] = _wc.svn_wc_status2_t_switched_set
__swig_getmethods__["switched"] = _wc.svn_wc_status2_t_switched_get
__swig_setmethods__["repos_text_status"] = _wc.svn_wc_status2_t_repos_text_status_set
__swig_getmethods__["repos_text_status"] = _wc.svn_wc_status2_t_repos_text_status_get
__swig_setmethods__["repos_prop_status"] = _wc.svn_wc_status2_t_repos_prop_status_set
__swig_getmethods__["repos_prop_status"] = _wc.svn_wc_status2_t_repos_prop_status_get
__swig_setmethods__["repos_lock"] = _wc.svn_wc_status2_t_repos_lock_set
__swig_getmethods__["repos_lock"] = _wc.svn_wc_status2_t_repos_lock_get
__swig_setmethods__["url"] = _wc.svn_wc_status2_t_url_set
__swig_getmethods__["url"] = _wc.svn_wc_status2_t_url_get
__swig_setmethods__["ood_last_cmt_rev"] = _wc.svn_wc_status2_t_ood_last_cmt_rev_set
__swig_getmethods__["ood_last_cmt_rev"] = _wc.svn_wc_status2_t_ood_last_cmt_rev_get
__swig_setmethods__["ood_last_cmt_date"] = _wc.svn_wc_status2_t_ood_last_cmt_date_set
__swig_getmethods__["ood_last_cmt_date"] = _wc.svn_wc_status2_t_ood_last_cmt_date_get
__swig_setmethods__["ood_kind"] = _wc.svn_wc_status2_t_ood_kind_set
__swig_getmethods__["ood_kind"] = _wc.svn_wc_status2_t_ood_kind_get
__swig_setmethods__["ood_last_cmt_author"] = _wc.svn_wc_status2_t_ood_last_cmt_author_set
__swig_getmethods__["ood_last_cmt_author"] = _wc.svn_wc_status2_t_ood_last_cmt_author_get
__swig_setmethods__["tree_conflict"] = _wc.svn_wc_status2_t_tree_conflict_set
__swig_getmethods__["tree_conflict"] = _wc.svn_wc_status2_t_tree_conflict_get
__swig_setmethods__["file_external"] = _wc.svn_wc_status2_t_file_external_set
__swig_getmethods__["file_external"] = _wc.svn_wc_status2_t_file_external_get
__swig_setmethods__["pristine_text_status"] = _wc.svn_wc_status2_t_pristine_text_status_set
__swig_getmethods__["pristine_text_status"] = _wc.svn_wc_status2_t_pristine_text_status_get
__swig_setmethods__["pristine_prop_status"] = _wc.svn_wc_status2_t_pristine_prop_status_set
__swig_getmethods__["pristine_prop_status"] = _wc.svn_wc_status2_t_pristine_prop_status_get
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_status2_t"""
import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __init__(self):
"""__init__(self) -> svn_wc_status2_t"""
this = _wc.new_svn_wc_status2_t()
        try:
            self.this.append(this)
        except Exception:
            self.this = this
__swig_destroy__ = _wc.delete_svn_wc_status2_t
    __del__ = lambda self: None
svn_wc_status2_t_swigregister = _wc.svn_wc_status2_t_swigregister
svn_wc_status2_t_swigregister(svn_wc_status2_t)
class svn_wc_status_t:
"""Proxy of C svn_wc_status_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_status_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_status_t, name)
__repr__ = _swig_repr
__swig_setmethods__["entry"] = _wc.svn_wc_status_t_entry_set
__swig_getmethods__["entry"] = _wc.svn_wc_status_t_entry_get
__swig_setmethods__["text_status"] = _wc.svn_wc_status_t_text_status_set
__swig_getmethods__["text_status"] = _wc.svn_wc_status_t_text_status_get
__swig_setmethods__["prop_status"] = _wc.svn_wc_status_t_prop_status_set
__swig_getmethods__["prop_status"] = _wc.svn_wc_status_t_prop_status_get
__swig_setmethods__["locked"] = _wc.svn_wc_status_t_locked_set
__swig_getmethods__["locked"] = _wc.svn_wc_status_t_locked_get
__swig_setmethods__["copied"] = _wc.svn_wc_status_t_copied_set
__swig_getmethods__["copied"] = _wc.svn_wc_status_t_copied_get
__swig_setmethods__["switched"] = _wc.svn_wc_status_t_switched_set
__swig_getmethods__["switched"] = _wc.svn_wc_status_t_switched_get
__swig_setmethods__["repos_text_status"] = _wc.svn_wc_status_t_repos_text_status_set
__swig_getmethods__["repos_text_status"] = _wc.svn_wc_status_t_repos_text_status_get
__swig_setmethods__["repos_prop_status"] = _wc.svn_wc_status_t_repos_prop_status_set
__swig_getmethods__["repos_prop_status"] = _wc.svn_wc_status_t_repos_prop_status_get
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_status_t"""
import libsvn.core, weakref
        self.__dict__["_parent_pool"] = \
            parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __init__(self):
"""__init__(self) -> svn_wc_status_t"""
this = _wc.new_svn_wc_status_t()
        try:
            self.this.append(this)
        except Exception:
            self.this = this
__swig_destroy__ = _wc.delete_svn_wc_status_t
    __del__ = lambda self: None
svn_wc_status_t_swigregister = _wc.svn_wc_status_t_swigregister
svn_wc_status_t_swigregister(svn_wc_status_t)
def svn_wc_dup_status2(*args):
"""svn_wc_dup_status2(svn_wc_status2_t orig_stat, apr_pool_t pool) -> svn_wc_status2_t"""
return _wc.svn_wc_dup_status2(*args)
def svn_wc_dup_status(*args):
"""svn_wc_dup_status(svn_wc_status_t orig_stat, apr_pool_t pool) -> svn_wc_status_t"""
return _wc.svn_wc_dup_status(*args)
def svn_wc_status2(*args):
"""svn_wc_status2(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_status2(*args)
def svn_wc_status(*args):
"""svn_wc_status(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_status(*args)
def svn_wc_get_status_editor4(*args):
"""
svn_wc_get_status_editor4(svn_wc_adm_access_t anchor, char target, svn_depth_t depth,
svn_boolean_t get_all, svn_boolean_t no_ignore,
apr_array_header_t ignore_patterns,
svn_wc_status_func3_t status_func, void status_baton,
svn_cancel_func_t cancel_func, svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_status_editor4(*args)
def svn_wc_get_status_editor3(*args):
"""
svn_wc_get_status_editor3(svn_wc_adm_access_t anchor, char target, svn_depth_t depth,
svn_boolean_t get_all, svn_boolean_t no_ignore,
apr_array_header_t ignore_patterns,
svn_wc_status_func2_t status_func, svn_cancel_func_t cancel_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_status_editor3(*args)
def svn_wc_get_status_editor2(*args):
"""
svn_wc_get_status_editor2(svn_wc_adm_access_t anchor, char target, apr_hash_t config,
svn_boolean_t recurse, svn_boolean_t get_all,
svn_boolean_t no_ignore, svn_wc_status_func2_t status_func,
svn_cancel_func_t cancel_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_status_editor2(*args)
def svn_wc_get_status_editor(*args):
"""
svn_wc_get_status_editor(svn_wc_adm_access_t anchor, char target, apr_hash_t config,
svn_boolean_t recurse, svn_boolean_t get_all,
svn_boolean_t no_ignore, svn_wc_status_func_t status_func,
svn_cancel_func_t cancel_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_status_editor(*args)
def svn_wc_status_set_repos_locks(*args):
"""
svn_wc_status_set_repos_locks(void set_locks_baton, apr_hash_t locks, char repos_root,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_status_set_repos_locks(*args)
def svn_wc_copy2(*args):
"""
svn_wc_copy2(char src, svn_wc_adm_access_t dst_parent, char dst_basename,
svn_cancel_func_t cancel_func, svn_wc_notify_func2_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_copy2(*args)
def svn_wc_copy(*args):
"""
svn_wc_copy(char src, svn_wc_adm_access_t dst_parent, char dst_basename,
svn_cancel_func_t cancel_func, svn_wc_notify_func_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_copy(*args)
def svn_wc_delete3(*args):
"""
svn_wc_delete3(char path, svn_wc_adm_access_t adm_access, svn_cancel_func_t cancel_func,
svn_wc_notify_func2_t notify_func,
svn_boolean_t keep_local, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_delete3(*args)
def svn_wc_delete2(*args):
"""
svn_wc_delete2(char path, svn_wc_adm_access_t adm_access, svn_cancel_func_t cancel_func,
svn_wc_notify_func2_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_delete2(*args)
def svn_wc_delete(*args):
"""
svn_wc_delete(char path, svn_wc_adm_access_t adm_access, svn_cancel_func_t cancel_func,
svn_wc_notify_func_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_delete(*args)
def svn_wc_add3(*args):
"""
svn_wc_add3(char path, svn_wc_adm_access_t parent_access, svn_depth_t depth,
char copyfrom_url, svn_revnum_t copyfrom_rev,
svn_cancel_func_t cancel_func,
svn_wc_notify_func2_t notify_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_add3(*args)
def svn_wc_add2(*args):
"""
svn_wc_add2(char path, svn_wc_adm_access_t parent_access, char copyfrom_url,
svn_revnum_t copyfrom_rev, svn_cancel_func_t cancel_func,
svn_wc_notify_func2_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_add2(*args)
def svn_wc_add(*args):
"""
svn_wc_add(char path, svn_wc_adm_access_t parent_access, char copyfrom_url,
svn_revnum_t copyfrom_rev, svn_cancel_func_t cancel_func,
svn_wc_notify_func_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_add(*args)
def svn_wc_add_repos_file3(*args):
"""
svn_wc_add_repos_file3(char dst_path, svn_wc_adm_access_t adm_access, svn_stream_t new_base_contents,
svn_stream_t new_contents,
apr_hash_t new_base_props, apr_hash_t new_props,
char copyfrom_url, svn_revnum_t copyfrom_rev,
svn_cancel_func_t cancel_func,
svn_wc_notify_func2_t notify_func, apr_pool_t scratch_pool) -> svn_error_t
"""
return _wc.svn_wc_add_repos_file3(*args)
def svn_wc_add_repos_file2(*args):
"""
svn_wc_add_repos_file2(char dst_path, svn_wc_adm_access_t adm_access, char new_text_base_path,
char new_text_path, apr_hash_t new_base_props,
apr_hash_t new_props,
char copyfrom_url, svn_revnum_t copyfrom_rev,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_add_repos_file2(*args)
def svn_wc_add_repos_file(*args):
"""
svn_wc_add_repos_file(char dst_path, svn_wc_adm_access_t adm_access, char new_text_path,
apr_hash_t new_props, char copyfrom_url,
svn_revnum_t copyfrom_rev, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_add_repos_file(*args)
def svn_wc_remove_from_revision_control(*args):
"""
svn_wc_remove_from_revision_control(svn_wc_adm_access_t adm_access, char name, svn_boolean_t destroy_wf,
svn_boolean_t instant_error,
svn_cancel_func_t cancel_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_remove_from_revision_control(*args)
def svn_wc_resolved_conflict4(*args):
"""
svn_wc_resolved_conflict4(char path, svn_wc_adm_access_t adm_access, svn_boolean_t resolve_text,
svn_boolean_t resolve_props,
svn_boolean_t resolve_tree, svn_depth_t depth,
svn_wc_conflict_choice_t conflict_choice,
svn_wc_notify_func2_t notify_func, svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_resolved_conflict4(*args)
def svn_wc_resolved_conflict3(*args):
"""
svn_wc_resolved_conflict3(char path, svn_wc_adm_access_t adm_access, svn_boolean_t resolve_text,
svn_boolean_t resolve_props,
svn_depth_t depth, svn_wc_conflict_choice_t conflict_choice,
svn_wc_notify_func2_t notify_func,
svn_cancel_func_t cancel_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_resolved_conflict3(*args)
def svn_wc_resolved_conflict2(*args):
"""
svn_wc_resolved_conflict2(char path, svn_wc_adm_access_t adm_access, svn_boolean_t resolve_text,
svn_boolean_t resolve_props,
svn_boolean_t recurse, svn_wc_notify_func2_t notify_func,
svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_resolved_conflict2(*args)
def svn_wc_resolved_conflict(*args):
"""
svn_wc_resolved_conflict(char path, svn_wc_adm_access_t adm_access, svn_boolean_t resolve_text,
svn_boolean_t resolve_props,
svn_boolean_t recurse, svn_wc_notify_func_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_resolved_conflict(*args)
def svn_wc_committed_queue_create(*args):
"""svn_wc_committed_queue_create(apr_pool_t pool) -> svn_wc_committed_queue_t"""
return _wc.svn_wc_committed_queue_create(*args)
def svn_wc_queue_committed2(*args):
"""
svn_wc_queue_committed2(svn_wc_committed_queue_t queue, char path, svn_wc_adm_access_t adm_access,
svn_boolean_t recurse,
apr_array_header_t wcprop_changes, svn_boolean_t remove_lock,
svn_boolean_t remove_changelist,
svn_checksum_t checksum, apr_pool_t scratch_pool) -> svn_error_t
"""
return _wc.svn_wc_queue_committed2(*args)
def svn_wc_queue_committed(*args):
"""
svn_wc_queue_committed(char path, svn_wc_adm_access_t adm_access, svn_boolean_t recurse,
apr_array_header_t wcprop_changes,
svn_boolean_t remove_lock, svn_boolean_t remove_changelist,
unsigned char digest, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_queue_committed(*args)
def svn_wc_process_committed_queue(*args):
"""
svn_wc_process_committed_queue(svn_wc_committed_queue_t queue, svn_wc_adm_access_t adm_access,
svn_revnum_t new_revnum, char rev_date,
char rev_author, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_process_committed_queue(*args)
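svn_wc_committed_queue_create, svn_wc_queue_committed2 and svn_wc_process_committed_queue form a batching pattern: create a queue, record post-commit bookkeeping per path, then process everything against the new revision in one pass. A minimal Python stand-in for that flow (not the real binding API):

```python
class CommittedQueue:
    """Toy model of the create/queue/process batching flow."""
    def __init__(self):
        self._items = []
    def queue(self, path, wcprop_changes=None):
        # record per-path work to be applied once the commit succeeds
        self._items.append((path, wcprop_changes or []))
    def process(self, new_revnum):
        # apply all queued work against the committed revision, then drain
        done = [(path, new_revnum) for path, _ in self._items]
        del self._items[:]
        return done

q = CommittedQueue()
q.queue("wc/a.c")
q.queue("wc/b.c")
# q.process(42) == [("wc/a.c", 42), ("wc/b.c", 42)]
```

Batching matters in the real API because each path's entry update can then share one pass over the administrative area instead of reopening it per file.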
def svn_wc_process_committed4(*args):
"""
svn_wc_process_committed4(char path, svn_wc_adm_access_t adm_access, svn_boolean_t recurse,
svn_revnum_t new_revnum, char rev_date,
char rev_author, apr_array_header_t wcprop_changes,
svn_boolean_t remove_lock, svn_boolean_t remove_changelist,
unsigned char digest,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_process_committed4(*args)
def svn_wc_process_committed3(*args):
"""
svn_wc_process_committed3(char path, svn_wc_adm_access_t adm_access, svn_boolean_t recurse,
svn_revnum_t new_revnum, char rev_date,
char rev_author, apr_array_header_t wcprop_changes,
svn_boolean_t remove_lock, unsigned char digest,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_process_committed3(*args)
def svn_wc_process_committed2(*args):
"""
svn_wc_process_committed2(char path, svn_wc_adm_access_t adm_access, svn_boolean_t recurse,
svn_revnum_t new_revnum, char rev_date,
char rev_author, apr_array_header_t wcprop_changes,
svn_boolean_t remove_lock, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_process_committed2(*args)
def svn_wc_process_committed(*args):
"""
svn_wc_process_committed(char path, svn_wc_adm_access_t adm_access, svn_boolean_t recurse,
svn_revnum_t new_revnum, char rev_date,
char rev_author, apr_array_header_t wcprop_changes,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_process_committed(*args)
def svn_wc_crawl_revisions4(*args):
"""
svn_wc_crawl_revisions4(char path, svn_wc_adm_access_t adm_access, svn_ra_reporter3_t reporter,
void report_baton, svn_boolean_t restore_files,
svn_depth_t depth, svn_boolean_t honor_depth_exclude,
svn_boolean_t depth_compatibility_trick,
svn_boolean_t use_commit_times,
svn_wc_notify_func2_t notify_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_crawl_revisions4(*args)
def svn_wc_crawl_revisions3(*args):
"""
svn_wc_crawl_revisions3(char path, svn_wc_adm_access_t adm_access, svn_ra_reporter3_t reporter,
void report_baton, svn_boolean_t restore_files,
svn_depth_t depth, svn_boolean_t depth_compatibility_trick,
svn_boolean_t use_commit_times,
svn_wc_notify_func2_t notify_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_crawl_revisions3(*args)
def svn_wc_crawl_revisions2(*args):
"""
svn_wc_crawl_revisions2(char path, svn_wc_adm_access_t adm_access, svn_ra_reporter2_t reporter,
svn_boolean_t restore_files,
svn_boolean_t recurse, svn_boolean_t use_commit_times,
svn_wc_notify_func2_t notify_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_crawl_revisions2(*args)
def svn_wc_crawl_revisions(*args):
"""
svn_wc_crawl_revisions(char path, svn_wc_adm_access_t adm_access, svn_ra_reporter_t reporter,
void report_baton, svn_boolean_t restore_files,
svn_boolean_t recurse,
svn_boolean_t use_commit_times, svn_wc_notify_func_t notify_func,
svn_wc_traversal_info_t traversal_info,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_crawl_revisions(*args)
def svn_wc_is_wc_root(*args):
"""svn_wc_is_wc_root(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_is_wc_root(*args)
def svn_wc_get_actual_target(*args):
"""svn_wc_get_actual_target(char path, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_get_actual_target(*args)
def svn_wc_get_update_editor3(*args):
"""
svn_wc_get_update_editor3(svn_wc_adm_access_t anchor, char target, svn_boolean_t use_commit_times,
svn_depth_t depth, svn_boolean_t depth_is_sticky,
svn_boolean_t allow_unver_obstructions,
svn_wc_notify_func2_t notify_func,
svn_cancel_func_t cancel_func, svn_wc_conflict_resolver_func_t conflict_func,
void conflict_baton,
svn_wc_get_file_t fetch_func,
void fetch_baton, char diff3_cmd, apr_array_header_t preserved_exts,
svn_wc_traversal_info_t ti,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_update_editor3(*args)
def svn_wc_get_update_editor2(*args):
"""
svn_wc_get_update_editor2(svn_wc_adm_access_t anchor, char target, svn_boolean_t use_commit_times,
svn_boolean_t recurse,
svn_wc_notify_func2_t notify_func, svn_cancel_func_t cancel_func,
char diff3_cmd, svn_wc_traversal_info_t ti,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_update_editor2(*args)
def svn_wc_get_update_editor(*args):
"""
svn_wc_get_update_editor(svn_wc_adm_access_t anchor, char target, svn_boolean_t use_commit_times,
svn_boolean_t recurse,
svn_wc_notify_func_t notify_func, svn_cancel_func_t cancel_func,
char diff3_cmd, svn_wc_traversal_info_t ti,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_update_editor(*args)
def svn_wc_get_switch_editor3(*args):
"""
svn_wc_get_switch_editor3(svn_wc_adm_access_t anchor, char target, char switch_url,
svn_boolean_t use_commit_times, svn_depth_t depth,
svn_boolean_t depth_is_sticky, svn_boolean_t allow_unver_obstructions,
svn_wc_notify_func2_t notify_func,
svn_cancel_func_t cancel_func,
svn_wc_conflict_resolver_func_t conflict_func,
void conflict_baton, char diff3_cmd,
apr_array_header_t preserved_exts, svn_wc_traversal_info_t ti,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_switch_editor3(*args)
def svn_wc_get_switch_editor2(*args):
"""
svn_wc_get_switch_editor2(svn_wc_adm_access_t anchor, char target, char switch_url,
svn_boolean_t use_commit_times, svn_boolean_t recurse,
svn_wc_notify_func2_t notify_func,
svn_cancel_func_t cancel_func, char diff3_cmd,
svn_wc_traversal_info_t ti, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_switch_editor2(*args)
def svn_wc_get_switch_editor(*args):
"""
svn_wc_get_switch_editor(svn_wc_adm_access_t anchor, char target, char switch_url,
svn_boolean_t use_commit_times, svn_boolean_t recurse,
svn_wc_notify_func_t notify_func,
svn_cancel_func_t cancel_func, char diff3_cmd,
svn_wc_traversal_info_t ti, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_switch_editor(*args)
def svn_wc_prop_list(*args):
"""svn_wc_prop_list(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_prop_list(*args)
def svn_wc_prop_get(*args):
"""
svn_wc_prop_get(char name, char path, svn_wc_adm_access_t adm_access,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_prop_get(*args)
def svn_wc_prop_set3(*args):
"""
svn_wc_prop_set3(char name, svn_string_t value, char path, svn_wc_adm_access_t adm_access,
svn_boolean_t skip_checks,
svn_wc_notify_func2_t notify_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_prop_set3(*args)
def svn_wc_prop_set2(*args):
"""
svn_wc_prop_set2(char name, svn_string_t value, char path, svn_wc_adm_access_t adm_access,
svn_boolean_t skip_checks,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_prop_set2(*args)
def svn_wc_prop_set(*args):
"""
svn_wc_prop_set(char name, svn_string_t value, char path, svn_wc_adm_access_t adm_access,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_prop_set(*args)
def svn_wc_is_normal_prop(*args):
"""svn_wc_is_normal_prop(char name) -> svn_boolean_t"""
return _wc.svn_wc_is_normal_prop(*args)
def svn_wc_is_wc_prop(*args):
"""svn_wc_is_wc_prop(char name) -> svn_boolean_t"""
return _wc.svn_wc_is_wc_prop(*args)
def svn_wc_is_entry_prop(*args):
"""svn_wc_is_entry_prop(char name) -> svn_boolean_t"""
return _wc.svn_wc_is_entry_prop(*args)
def svn_wc_canonicalize_svn_prop(*args):
"""
svn_wc_canonicalize_svn_prop(char propname, svn_string_t propval, char path, svn_node_kind_t kind,
svn_boolean_t skip_some_checks,
svn_wc_canonicalize_svn_prop_get_file_t prop_getter,
void getter_baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_canonicalize_svn_prop(*args)
def svn_wc_get_diff_editor5(*args):
"""
svn_wc_get_diff_editor5(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks3_t callbacks,
void callback_baton,
svn_depth_t depth, svn_boolean_t ignore_ancestry,
svn_boolean_t use_text_base, svn_boolean_t reverse_order,
svn_cancel_func_t cancel_func,
apr_array_header_t changelists, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_diff_editor5(*args)
def svn_wc_get_diff_editor4(*args):
"""
svn_wc_get_diff_editor4(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks2_t callbacks,
svn_depth_t depth,
svn_boolean_t ignore_ancestry, svn_boolean_t use_text_base,
svn_boolean_t reverse_order, svn_cancel_func_t cancel_func,
apr_array_header_t changelists,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_diff_editor4(*args)
def svn_wc_get_diff_editor3(*args):
"""
svn_wc_get_diff_editor3(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks2_t callbacks,
svn_boolean_t recurse,
svn_boolean_t ignore_ancestry, svn_boolean_t use_text_base,
svn_boolean_t reverse_order,
svn_cancel_func_t cancel_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_diff_editor3(*args)
def svn_wc_get_diff_editor2(*args):
"""
svn_wc_get_diff_editor2(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks_t callbacks,
void callback_baton,
svn_boolean_t recurse, svn_boolean_t ignore_ancestry,
svn_boolean_t use_text_base, svn_boolean_t reverse_order,
svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_diff_editor2(*args)
def svn_wc_get_diff_editor(*args):
"""
svn_wc_get_diff_editor(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks_t callbacks,
void callback_baton,
svn_boolean_t recurse, svn_boolean_t use_text_base,
svn_boolean_t reverse_order, svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_diff_editor(*args)
def svn_wc_diff5(*args):
"""
svn_wc_diff5(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks3_t callbacks,
void callback_baton,
svn_depth_t depth, svn_boolean_t ignore_ancestry,
apr_array_header_t changelists, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_diff5(*args)
def svn_wc_diff4(*args):
"""
svn_wc_diff4(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks2_t callbacks,
svn_depth_t depth,
svn_boolean_t ignore_ancestry, apr_array_header_t changelists,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_diff4(*args)
def svn_wc_diff3(*args):
"""
svn_wc_diff3(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks2_t callbacks,
svn_boolean_t recurse,
svn_boolean_t ignore_ancestry, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_diff3(*args)
def svn_wc_diff2(*args):
"""
svn_wc_diff2(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks_t callbacks,
void callback_baton,
svn_boolean_t recurse, svn_boolean_t ignore_ancestry,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_diff2(*args)
def svn_wc_diff(*args):
"""
svn_wc_diff(svn_wc_adm_access_t anchor, char target, svn_wc_diff_callbacks_t callbacks,
void callback_baton,
svn_boolean_t recurse, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_diff(*args)
def svn_wc_get_prop_diffs(*args):
"""svn_wc_get_prop_diffs(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_get_prop_diffs(*args)
svn_wc_merge_unchanged = _wc.svn_wc_merge_unchanged
svn_wc_merge_merged = _wc.svn_wc_merge_merged
svn_wc_merge_conflict = _wc.svn_wc_merge_conflict
svn_wc_merge_no_merge = _wc.svn_wc_merge_no_merge
def svn_wc_merge3(*args):
"""
svn_wc_merge3(char left, char right, char merge_target, svn_wc_adm_access_t adm_access,
char left_label, char right_label,
char target_label, svn_boolean_t dry_run,
char diff3_cmd, apr_array_header_t merge_options,
apr_array_header_t prop_diff,
svn_wc_conflict_resolver_func_t conflict_func,
void conflict_baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_merge3(*args)
def svn_wc_merge2(*args):
"""
svn_wc_merge2(char left, char right, char merge_target, svn_wc_adm_access_t adm_access,
char left_label, char right_label,
char target_label, svn_boolean_t dry_run,
char diff3_cmd, apr_array_header_t merge_options,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_merge2(*args)
def svn_wc_merge(*args):
"""
svn_wc_merge(char left, char right, char merge_target, svn_wc_adm_access_t adm_access,
char left_label, char right_label,
char target_label, svn_boolean_t dry_run,
char diff3_cmd, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_merge(*args)
def svn_wc_merge_props2(*args):
"""
svn_wc_merge_props2(svn_wc_notify_state_t state, char path, svn_wc_adm_access_t adm_access,
apr_hash_t baseprops, apr_array_header_t propchanges,
svn_boolean_t base_merge,
svn_boolean_t dry_run, svn_wc_conflict_resolver_func_t conflict_func,
void conflict_baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_merge_props2(*args)
def svn_wc_merge_props(*args):
"""
svn_wc_merge_props(svn_wc_notify_state_t state, char path, svn_wc_adm_access_t adm_access,
apr_hash_t baseprops, apr_array_header_t propchanges,
svn_boolean_t base_merge,
svn_boolean_t dry_run, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_merge_props(*args)
def svn_wc_merge_prop_diffs(*args):
"""
svn_wc_merge_prop_diffs(svn_wc_notify_state_t state, char path, svn_wc_adm_access_t adm_access,
apr_array_header_t propchanges,
svn_boolean_t base_merge, svn_boolean_t dry_run,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_merge_prop_diffs(*args)
def svn_wc_get_pristine_contents(*args):
"""svn_wc_get_pristine_contents(char path, apr_pool_t result_pool, apr_pool_t scratch_pool) -> svn_error_t"""
return _wc.svn_wc_get_pristine_contents(*args)
def svn_wc_get_pristine_copy_path(*args):
"""svn_wc_get_pristine_copy_path(char path, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_get_pristine_copy_path(*args)
def svn_wc_cleanup2(*args):
"""
svn_wc_cleanup2(char path, char diff3_cmd, svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_cleanup2(*args)
def svn_wc_cleanup(*args):
"""
svn_wc_cleanup(char path, svn_wc_adm_access_t optional_adm_access,
char diff3_cmd, svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_cleanup(*args)
def svn_wc_relocate3(*args):
"""
svn_wc_relocate3(char path, svn_wc_adm_access_t adm_access, char _from,
char to, svn_boolean_t recurse, svn_wc_relocation_validator3_t validator,
void validator_baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_relocate3(*args)
def svn_wc_relocate2(*args):
"""
svn_wc_relocate2(char path, svn_wc_adm_access_t adm_access, char _from,
char to, svn_boolean_t recurse, svn_wc_relocation_validator2_t validator,
void validator_baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_relocate2(*args)
def svn_wc_relocate(*args):
"""
svn_wc_relocate(char path, svn_wc_adm_access_t adm_access, char _from,
char to, svn_boolean_t recurse, svn_wc_relocation_validator_t validator,
void validator_baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_relocate(*args)
def svn_wc_revert3(*args):
"""
svn_wc_revert3(char path, svn_wc_adm_access_t parent_access, svn_depth_t depth,
svn_boolean_t use_commit_times,
apr_array_header_t changelists, svn_cancel_func_t cancel_func,
svn_wc_notify_func2_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_revert3(*args)
def svn_wc_revert2(*args):
"""
svn_wc_revert2(char path, svn_wc_adm_access_t parent_access, svn_boolean_t recursive,
svn_boolean_t use_commit_times,
svn_cancel_func_t cancel_func, svn_wc_notify_func2_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_revert2(*args)
def svn_wc_revert(*args):
"""
svn_wc_revert(char path, svn_wc_adm_access_t parent_access, svn_boolean_t recursive,
svn_boolean_t use_commit_times,
svn_cancel_func_t cancel_func, svn_wc_notify_func_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_revert(*args)
def svn_wc_create_tmp_file2(*args):
"""svn_wc_create_tmp_file2(char path, svn_io_file_del_t delete_when, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_create_tmp_file2(*args)
def svn_wc_create_tmp_file(*args):
"""svn_wc_create_tmp_file(char path, svn_boolean_t delete_on_close, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_create_tmp_file(*args)
def svn_wc_translated_file2(*args):
"""
svn_wc_translated_file2(char src, char versioned_file, svn_wc_adm_access_t adm_access,
apr_uint32_t flags, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_translated_file2(*args)
def svn_wc_translated_file(*args):
"""
svn_wc_translated_file(char vfile, svn_wc_adm_access_t adm_access, svn_boolean_t force_repair,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_translated_file(*args)
def svn_wc_translated_stream(*args):
"""
svn_wc_translated_stream(char path, char versioned_file, svn_wc_adm_access_t adm_access,
apr_uint32_t flags, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_translated_stream(*args)
def svn_wc_transmit_text_deltas2(*args):
"""
svn_wc_transmit_text_deltas2(char path, svn_wc_adm_access_t adm_access, svn_boolean_t fulltext,
svn_delta_editor_t editor, void file_baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_transmit_text_deltas2(*args)
def svn_wc_transmit_text_deltas(*args):
"""
svn_wc_transmit_text_deltas(char path, svn_wc_adm_access_t adm_access, svn_boolean_t fulltext,
svn_delta_editor_t editor, void file_baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_transmit_text_deltas(*args)
def svn_wc_transmit_prop_deltas(*args):
"""
svn_wc_transmit_prop_deltas(char path, svn_wc_adm_access_t adm_access, svn_wc_entry_t entry,
svn_delta_editor_t editor, void baton,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_transmit_prop_deltas(*args)
def svn_wc_get_default_ignores(*args):
"""svn_wc_get_default_ignores(apr_hash_t config, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_get_default_ignores(*args)
def svn_wc_get_ignores(*args):
"""
svn_wc_get_ignores(apr_hash_t config, svn_wc_adm_access_t adm_access,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_get_ignores(*args)
def svn_wc_match_ignore_list(*args):
"""svn_wc_match_ignore_list(char str, apr_array_header_t list, apr_pool_t pool) -> svn_boolean_t"""
return _wc.svn_wc_match_ignore_list(*args)
def svn_wc_add_lock(*args):
"""
svn_wc_add_lock(char path, svn_lock_t lock, svn_wc_adm_access_t adm_access,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_add_lock(*args)
def svn_wc_remove_lock(*args):
"""svn_wc_remove_lock(char path, svn_wc_adm_access_t adm_access, apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_remove_lock(*args)
class svn_wc_revision_status_t:
"""Proxy of C svn_wc_revision_status_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_revision_status_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_revision_status_t, name)
__repr__ = _swig_repr
__swig_setmethods__["min_rev"] = _wc.svn_wc_revision_status_t_min_rev_set
__swig_getmethods__["min_rev"] = _wc.svn_wc_revision_status_t_min_rev_get
__swig_setmethods__["max_rev"] = _wc.svn_wc_revision_status_t_max_rev_set
__swig_getmethods__["max_rev"] = _wc.svn_wc_revision_status_t_max_rev_get
__swig_setmethods__["switched"] = _wc.svn_wc_revision_status_t_switched_set
__swig_getmethods__["switched"] = _wc.svn_wc_revision_status_t_switched_get
__swig_setmethods__["modified"] = _wc.svn_wc_revision_status_t_modified_set
__swig_getmethods__["modified"] = _wc.svn_wc_revision_status_t_modified_get
__swig_setmethods__["sparse_checkout"] = _wc.svn_wc_revision_status_t_sparse_checkout_set
__swig_getmethods__["sparse_checkout"] = _wc.svn_wc_revision_status_t_sparse_checkout_get
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_revision_status_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __init__(self):
"""__init__(self) -> svn_wc_revision_status_t"""
this = _wc.new_svn_wc_revision_status_t()
try: self.this.append(this)
except Exception: self.this = this
__swig_destroy__ = _wc.delete_svn_wc_revision_status_t
__del__ = lambda self: None
svn_wc_revision_status_t_swigregister = _wc.svn_wc_revision_status_t_swigregister
svn_wc_revision_status_t_swigregister(svn_wc_revision_status_t)
def svn_wc_revision_status(*args):
"""
svn_wc_revision_status(char wc_path, char trail_url, svn_boolean_t committed,
svn_cancel_func_t cancel_func, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_revision_status(*args)
def svn_wc_set_changelist(*args):
"""
svn_wc_set_changelist(char path, char changelist, svn_wc_adm_access_t adm_access,
svn_cancel_func_t cancel_func, svn_wc_notify_func2_t notify_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_set_changelist(*args)
def svn_wc_crop_tree(*args):
"""
svn_wc_crop_tree(svn_wc_adm_access_t anchor, char target, svn_depth_t depth,
svn_wc_notify_func2_t notify_func, svn_cancel_func_t cancel_func,
apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_crop_tree(*args)
class svn_wc_adm_access_t:
"""Proxy of C svn_wc_adm_access_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_adm_access_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_adm_access_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_adm_access_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
svn_wc_adm_access_t_swigregister = _wc.svn_wc_adm_access_t_swigregister
svn_wc_adm_access_t_swigregister(svn_wc_adm_access_t)
class svn_wc_traversal_info_t:
"""Proxy of C svn_wc_traversal_info_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_traversal_info_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_traversal_info_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_traversal_info_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
svn_wc_traversal_info_t_swigregister = _wc.svn_wc_traversal_info_t_swigregister
svn_wc_traversal_info_t_swigregister(svn_wc_traversal_info_t)
class svn_wc_committed_queue_t:
"""Proxy of C svn_wc_committed_queue_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_committed_queue_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_committed_queue_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_committed_queue_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
svn_wc_committed_queue_t_swigregister = _wc.svn_wc_committed_queue_t_swigregister
svn_wc_committed_queue_t_swigregister(svn_wc_committed_queue_t)
def svn_wc_diff_callbacks3_invoke_file_changed(*args):
"""
svn_wc_diff_callbacks3_invoke_file_changed(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t contentstate,
svn_wc_notify_state_t propstate, char path,
char tmpfile1, char tmpfile2, svn_revnum_t rev1,
svn_revnum_t rev2, char mimetype1, char mimetype2,
apr_array_header_t propchanges, apr_hash_t originalprops,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_file_changed(*args)
def svn_wc_diff_callbacks3_invoke_file_added(*args):
"""
svn_wc_diff_callbacks3_invoke_file_added(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t contentstate,
svn_wc_notify_state_t propstate, char path,
char tmpfile1, char tmpfile2, svn_revnum_t rev1,
svn_revnum_t rev2, char mimetype1, char mimetype2,
apr_array_header_t propchanges, apr_hash_t originalprops,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_file_added(*args)
def svn_wc_diff_callbacks3_invoke_file_deleted(*args):
"""
svn_wc_diff_callbacks3_invoke_file_deleted(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
char tmpfile1, char tmpfile2, char mimetype1,
char mimetype2, apr_hash_t originalprops,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_file_deleted(*args)
def svn_wc_diff_callbacks3_invoke_dir_added(*args):
"""
svn_wc_diff_callbacks3_invoke_dir_added(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
svn_revnum_t rev, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_dir_added(*args)
def svn_wc_diff_callbacks3_invoke_dir_deleted(*args):
"""
svn_wc_diff_callbacks3_invoke_dir_deleted(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_dir_deleted(*args)
def svn_wc_diff_callbacks3_invoke_dir_props_changed(*args):
"""
svn_wc_diff_callbacks3_invoke_dir_props_changed(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t propstate,
char path, apr_array_header_t propchanges, apr_hash_t original_props,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_dir_props_changed(*args)
def svn_wc_diff_callbacks3_invoke_dir_opened(*args):
"""
svn_wc_diff_callbacks3_invoke_dir_opened(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
char path, svn_revnum_t rev, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_dir_opened(*args)
def svn_wc_diff_callbacks3_invoke_dir_closed(*args):
"""
svn_wc_diff_callbacks3_invoke_dir_closed(svn_wc_diff_callbacks3_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t contentstate,
svn_wc_notify_state_t propstate, char path,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks3_invoke_dir_closed(*args)
def svn_wc_diff_callbacks2_invoke_file_changed(*args):
"""
svn_wc_diff_callbacks2_invoke_file_changed(svn_wc_diff_callbacks2_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t contentstate,
svn_wc_notify_state_t propstate, char path,
char tmpfile1, char tmpfile2, svn_revnum_t rev1,
svn_revnum_t rev2, char mimetype1, char mimetype2,
apr_array_header_t propchanges, apr_hash_t originalprops,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks2_invoke_file_changed(*args)
def svn_wc_diff_callbacks2_invoke_file_added(*args):
"""
svn_wc_diff_callbacks2_invoke_file_added(svn_wc_diff_callbacks2_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t contentstate,
svn_wc_notify_state_t propstate, char path,
char tmpfile1, char tmpfile2, svn_revnum_t rev1,
svn_revnum_t rev2, char mimetype1, char mimetype2,
apr_array_header_t propchanges, apr_hash_t originalprops,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks2_invoke_file_added(*args)
def svn_wc_diff_callbacks2_invoke_file_deleted(*args):
"""
svn_wc_diff_callbacks2_invoke_file_deleted(svn_wc_diff_callbacks2_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
char tmpfile1, char tmpfile2, char mimetype1,
char mimetype2, apr_hash_t originalprops,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks2_invoke_file_deleted(*args)
def svn_wc_diff_callbacks2_invoke_dir_added(*args):
"""
svn_wc_diff_callbacks2_invoke_dir_added(svn_wc_diff_callbacks2_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
svn_revnum_t rev, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks2_invoke_dir_added(*args)
def svn_wc_diff_callbacks2_invoke_dir_deleted(*args):
"""
svn_wc_diff_callbacks2_invoke_dir_deleted(svn_wc_diff_callbacks2_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks2_invoke_dir_deleted(*args)
def svn_wc_diff_callbacks2_invoke_dir_props_changed(*args):
"""
svn_wc_diff_callbacks2_invoke_dir_props_changed(svn_wc_diff_callbacks2_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
apr_array_header_t propchanges, apr_hash_t original_props,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks2_invoke_dir_props_changed(*args)
def svn_wc_diff_callbacks_invoke_file_changed(*args):
"""
svn_wc_diff_callbacks_invoke_file_changed(svn_wc_diff_callbacks_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
char tmpfile1, char tmpfile2, svn_revnum_t rev1,
svn_revnum_t rev2, char mimetype1,
char mimetype2, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks_invoke_file_changed(*args)
def svn_wc_diff_callbacks_invoke_file_added(*args):
"""
svn_wc_diff_callbacks_invoke_file_added(svn_wc_diff_callbacks_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
char tmpfile1, char tmpfile2, svn_revnum_t rev1,
svn_revnum_t rev2, char mimetype1,
char mimetype2, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks_invoke_file_added(*args)
def svn_wc_diff_callbacks_invoke_file_deleted(*args):
"""
svn_wc_diff_callbacks_invoke_file_deleted(svn_wc_diff_callbacks_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
char tmpfile1, char tmpfile2, char mimetype1,
char mimetype2, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks_invoke_file_deleted(*args)
def svn_wc_diff_callbacks_invoke_dir_added(*args):
"""
svn_wc_diff_callbacks_invoke_dir_added(svn_wc_diff_callbacks_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
svn_revnum_t rev, void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks_invoke_dir_added(*args)
def svn_wc_diff_callbacks_invoke_dir_deleted(*args):
"""
svn_wc_diff_callbacks_invoke_dir_deleted(svn_wc_diff_callbacks_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks_invoke_dir_deleted(*args)
def svn_wc_diff_callbacks_invoke_props_changed(*args):
"""
svn_wc_diff_callbacks_invoke_props_changed(svn_wc_diff_callbacks_t _obj, svn_wc_adm_access_t adm_access,
svn_wc_notify_state_t state, char path,
apr_array_header_t propchanges, apr_hash_t original_props,
void diff_baton) -> svn_error_t
"""
return _wc.svn_wc_diff_callbacks_invoke_props_changed(*args)
def svn_wc_entry_callbacks2_invoke_found_entry(*args):
"""
svn_wc_entry_callbacks2_invoke_found_entry(svn_wc_entry_callbacks2_t _obj, char path, svn_wc_entry_t entry,
void walk_baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_entry_callbacks2_invoke_found_entry(*args)
def svn_wc_entry_callbacks2_invoke_handle_error(*args):
"""
svn_wc_entry_callbacks2_invoke_handle_error(svn_wc_entry_callbacks2_t _obj, char path, svn_error_t err,
void walk_baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_entry_callbacks2_invoke_handle_error(*args)
def svn_wc_entry_callbacks_invoke_found_entry(*args):
"""
svn_wc_entry_callbacks_invoke_found_entry(svn_wc_entry_callbacks_t _obj, char path, svn_wc_entry_t entry,
void walk_baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_entry_callbacks_invoke_found_entry(*args)
def svn_wc_invoke_notify_func2(*args):
"""
svn_wc_invoke_notify_func2(svn_wc_notify_func2_t _obj, void baton, svn_wc_notify_t notify,
apr_pool_t pool)
"""
return _wc.svn_wc_invoke_notify_func2(*args)
def svn_wc_invoke_notify_func(*args):
"""
svn_wc_invoke_notify_func(svn_wc_notify_func_t _obj, void baton, char path, svn_wc_notify_action_t action,
svn_node_kind_t kind,
char mime_type, svn_wc_notify_state_t content_state,
svn_wc_notify_state_t prop_state,
svn_revnum_t revision)
"""
return _wc.svn_wc_invoke_notify_func(*args)
def svn_wc_invoke_get_file(*args):
"""
svn_wc_invoke_get_file(svn_wc_get_file_t _obj, void baton, char path, svn_revnum_t revision,
svn_stream_t stream, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_invoke_get_file(*args)
def svn_wc_invoke_conflict_resolver_func(*args):
"""
svn_wc_invoke_conflict_resolver_func(svn_wc_conflict_resolver_func_t _obj, svn_wc_conflict_description_t description,
void baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_invoke_conflict_resolver_func(*args)
def svn_wc_invoke_status_func3(*args):
"""
svn_wc_invoke_status_func3(svn_wc_status_func3_t _obj, void baton, char path,
svn_wc_status2_t status, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_invoke_status_func3(*args)
def svn_wc_invoke_status_func2(*args):
"""
svn_wc_invoke_status_func2(svn_wc_status_func2_t _obj, void baton, char path,
svn_wc_status2_t status)
"""
return _wc.svn_wc_invoke_status_func2(*args)
def svn_wc_invoke_status_func(*args):
"""svn_wc_invoke_status_func(svn_wc_status_func_t _obj, void baton, char path, svn_wc_status_t status)"""
return _wc.svn_wc_invoke_status_func(*args)
def svn_wc_invoke_canonicalize_svn_prop_get_file(*args):
"""
svn_wc_invoke_canonicalize_svn_prop_get_file(svn_wc_canonicalize_svn_prop_get_file_t _obj, svn_stream_t stream,
void baton, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_invoke_canonicalize_svn_prop_get_file(*args)
def svn_wc_invoke_relocation_validator3(*args):
"""
svn_wc_invoke_relocation_validator3(svn_wc_relocation_validator3_t _obj, void baton, char uuid,
char url, char root_url, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_invoke_relocation_validator3(*args)
def svn_wc_invoke_relocation_validator2(*args):
"""
svn_wc_invoke_relocation_validator2(svn_wc_relocation_validator2_t _obj, void baton, char uuid,
char url, svn_boolean_t root, apr_pool_t pool) -> svn_error_t
"""
return _wc.svn_wc_invoke_relocation_validator2(*args)
def svn_wc_invoke_relocation_validator(*args):
"""
svn_wc_invoke_relocation_validator(svn_wc_relocation_validator_t _obj, void baton, char uuid,
char url) -> svn_error_t
"""
return _wc.svn_wc_invoke_relocation_validator(*args)
class svn_wc_notify_func2_t:
"""Proxy of C svn_wc_notify_func2_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_notify_func2_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_notify_func2_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_notify_func2_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_notify_func2(self, *args)
svn_wc_notify_func2_t_swigregister = _wc.svn_wc_notify_func2_t_swigregister
svn_wc_notify_func2_t_swigregister(svn_wc_notify_func2_t)
class svn_wc_notify_func_t:
"""Proxy of C svn_wc_notify_func_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_notify_func_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_notify_func_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_notify_func_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_notify_func(self, *args)
svn_wc_notify_func_t_swigregister = _wc.svn_wc_notify_func_t_swigregister
svn_wc_notify_func_t_swigregister(svn_wc_notify_func_t)
class svn_wc_get_file_t:
"""Proxy of C svn_wc_get_file_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_get_file_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_get_file_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_get_file_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_get_file(self, *args)
svn_wc_get_file_t_swigregister = _wc.svn_wc_get_file_t_swigregister
svn_wc_get_file_t_swigregister(svn_wc_get_file_t)
class svn_wc_conflict_resolver_func_t:
"""Proxy of C svn_wc_conflict_resolver_func_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_conflict_resolver_func_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_conflict_resolver_func_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_conflict_resolver_func_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_conflict_resolver_func(self, *args)
svn_wc_conflict_resolver_func_t_swigregister = _wc.svn_wc_conflict_resolver_func_t_swigregister
svn_wc_conflict_resolver_func_t_swigregister(svn_wc_conflict_resolver_func_t)
class svn_wc_status_func3_t:
"""Proxy of C svn_wc_status_func3_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_status_func3_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_status_func3_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_status_func3_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_status_func3(self, *args)
svn_wc_status_func3_t_swigregister = _wc.svn_wc_status_func3_t_swigregister
svn_wc_status_func3_t_swigregister(svn_wc_status_func3_t)
class svn_wc_status_func2_t:
"""Proxy of C svn_wc_status_func2_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_status_func2_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_status_func2_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_status_func2_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_status_func2(self, *args)
svn_wc_status_func2_t_swigregister = _wc.svn_wc_status_func2_t_swigregister
svn_wc_status_func2_t_swigregister(svn_wc_status_func2_t)
class svn_wc_status_func_t:
"""Proxy of C svn_wc_status_func_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_status_func_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_status_func_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_status_func_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_status_func(self, *args)
svn_wc_status_func_t_swigregister = _wc.svn_wc_status_func_t_swigregister
svn_wc_status_func_t_swigregister(svn_wc_status_func_t)
class svn_wc_canonicalize_svn_prop_get_file_t:
"""Proxy of C svn_wc_canonicalize_svn_prop_get_file_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_canonicalize_svn_prop_get_file_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_canonicalize_svn_prop_get_file_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_canonicalize_svn_prop_get_file_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_canonicalize_svn_prop_get_file(self, *args)
svn_wc_canonicalize_svn_prop_get_file_t_swigregister = _wc.svn_wc_canonicalize_svn_prop_get_file_t_swigregister
svn_wc_canonicalize_svn_prop_get_file_t_swigregister(svn_wc_canonicalize_svn_prop_get_file_t)
class svn_wc_relocation_validator3_t:
"""Proxy of C svn_wc_relocation_validator3_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_relocation_validator3_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_relocation_validator3_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_relocation_validator3_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_relocation_validator3(self, *args)
svn_wc_relocation_validator3_t_swigregister = _wc.svn_wc_relocation_validator3_t_swigregister
svn_wc_relocation_validator3_t_swigregister(svn_wc_relocation_validator3_t)
class svn_wc_relocation_validator2_t:
"""Proxy of C svn_wc_relocation_validator2_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_relocation_validator2_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_relocation_validator2_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_relocation_validator2_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_relocation_validator2(self, *args)
svn_wc_relocation_validator2_t_swigregister = _wc.svn_wc_relocation_validator2_t_swigregister
svn_wc_relocation_validator2_t_swigregister(svn_wc_relocation_validator2_t)
class svn_wc_relocation_validator_t:
"""Proxy of C svn_wc_relocation_validator_t struct"""
__swig_setmethods__ = {}
__setattr__ = lambda self, name, value: _swig_setattr(self, svn_wc_relocation_validator_t, name, value)
__swig_getmethods__ = {}
__getattr__ = lambda self, name: _swig_getattr(self, svn_wc_relocation_validator_t, name)
def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined")
__repr__ = _swig_repr
def set_parent_pool(self, parent_pool=None):
"""Create a new proxy object for svn_wc_relocation_validator_t"""
import libsvn.core, weakref
self.__dict__["_parent_pool"] = \
parent_pool or libsvn.core.application_pool;
if self.__dict__["_parent_pool"]:
self.__dict__["_is_valid"] = weakref.ref(
self.__dict__["_parent_pool"]._is_valid)
def assert_valid(self):
"""Assert that this object is using valid pool memory"""
if "_is_valid" in self.__dict__:
assert self.__dict__["_is_valid"](), "Variable has already been deleted"
def __getattr__(self, name):
"""Get an attribute from this object"""
self.assert_valid()
value = _swig_getattr(self, self.__class__, name)
members = self.__dict__.get("_members")
if members is not None:
old_value = members.get(name)
if (old_value is not None and value is not None and
value is not old_value):
try:
value.__dict__.update(old_value.__dict__)
except AttributeError:
pass
if hasattr(value, "assert_valid"):
value.assert_valid()
return value
def __setattr__(self, name, value):
"""Set an attribute on this object"""
self.assert_valid()
self.__dict__.setdefault("_members",{})[name] = value
return _swig_setattr(self, self.__class__, name, value)
def __call__(self, *args):
return svn_wc_invoke_relocation_validator(self, *args)
svn_wc_relocation_validator_t_swigregister = _wc.svn_wc_relocation_validator_t_swigregister
svn_wc_relocation_validator_t_swigregister(svn_wc_relocation_validator_t)
def svn_wc_swig_init_asp_dot_net_hack(*args):
"""svn_wc_swig_init_asp_dot_net_hack(apr_pool_t pool) -> svn_error_t"""
return _wc.svn_wc_swig_init_asp_dot_net_hack(*args)
svn_wc_swig_init_asp_dot_net_hack()
# src/field_schnet/nn/__init__.py (atomistic-machine-learning/field_schnet, MIT license)
from field_schnet.nn.cutoff import *
from field_schnet.nn.field_interactions import *
from field_schnet.nn.basic import *
from field_schnet.nn.field_generators import *
# coding: utf-8
# ProcessMaker_PMIO/apis/processmaker_api.py (ProcessMaker/pmio-sdk-python, Apache-2.0 license)
"""
ProcessMaker API
This ProcessMaker I/O API provides access to a BPMN 2.0 compliant workflow engine api that is designed to be used as a microservice to support enterprise cloud applications. The current Alpha 1.0 version supports most of the descriptive class of the BPMN 2.0 specification.
OpenAPI spec version: 1.0.0
Contact: support@processmaker.io
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
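# Typical usage of this generated client (illustrative sketch only: the user ID
# and request payload below are placeholders, and authentication is assumed to
# be configured on the default Configuration/ApiClient beforehand):
#
#     from ProcessMaker_PMIO.apis.processmaker_api import ProcessmakerApi
#     api = ProcessmakerApi()  # falls back to the shared default ApiClient
#     # Synchronous call: returns the deserialized response object.
#     client = api.add_client('USER_ID', client_create_item)
#     # Asynchronous call: passing `callback` returns the request thread
#     # instead, and the callback receives the deserialized response.
#     thread = api.add_client('USER_ID', client_create_item,
#                             callback=lambda response: None)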
class ProcessmakerApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def add_client(self, user_id, client_create_item, **kwargs):
"""
This method creates a new Oauth client for the user
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_client(user_id, client_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: ID of the user related to the Oauth client (required)
:param ClientCreateItem client_create_item: JSON API with the Oauth Client object to add (required)
:return: ClientItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_client_with_http_info(user_id, client_create_item, **kwargs)
else:
(data) = self.add_client_with_http_info(user_id, client_create_item, **kwargs)
return data
def add_client_with_http_info(self, user_id, client_create_item, **kwargs):
"""
This method creates a new Oauth client for the user
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_client_with_http_info(user_id, client_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: ID of the user related to the Oauth client (required)
:param ClientCreateItem client_create_item: JSON API with the Oauth Client object to add (required)
:return: ClientItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_id', 'client_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_client" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_id' is set
if ('user_id' not in params) or (params['user_id'] is None):
raise ValueError("Missing the required parameter `user_id` when calling `add_client`")
# verify the required parameter 'client_create_item' is set
if ('client_create_item' not in params) or (params['client_create_item'] is None):
raise ValueError("Missing the required parameter `client_create_item` when calling `add_client`")
resource_path = '/users/{user_id}/clients'.replace('{format}', 'json')
path_params = {}
if 'user_id' in params:
path_params['user_id'] = params['user_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'client_create_item' in params:
body_params = params['client_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_event(self, process_id, event_create_item, **kwargs):
"""
This method creates a new event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_event(process_id, event_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the event (required)
:param EventCreateItem event_create_item: JSON API response with the Event object to add (required)
:return: EventItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_event_with_http_info(process_id, event_create_item, **kwargs)
else:
(data) = self.add_event_with_http_info(process_id, event_create_item, **kwargs)
return data
def add_event_with_http_info(self, process_id, event_create_item, **kwargs):
"""
This method creates a new event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_event_with_http_info(process_id, event_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the event (required)
:param EventCreateItem event_create_item: JSON API response with the Event object to add (required)
:return: EventItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_event" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_event`")
# verify the required parameter 'event_create_item' is set
if ('event_create_item' not in params) or (params['event_create_item'] is None):
raise ValueError("Missing the required parameter `event_create_item` when calling `add_event`")
resource_path = '/processes/{process_id}/events'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'event_create_item' in params:
body_params = params['event_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_event_connector(self, process_id, event_id, event_connector_create_item, **kwargs):
"""
This method is intended for creating a new Event connector.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_event_connector(process_id, event_id, event_connector_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param EventConnectorCreateItem event_connector_create_item: JSON API with the EventConnector object to add (required)
:return: EventConnector1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_event_connector_with_http_info(process_id, event_id, event_connector_create_item, **kwargs)
else:
(data) = self.add_event_connector_with_http_info(process_id, event_id, event_connector_create_item, **kwargs)
return data
def add_event_connector_with_http_info(self, process_id, event_id, event_connector_create_item, **kwargs):
"""
This method is intended for creating a new Event connector.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_event_connector_with_http_info(process_id, event_id, event_connector_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param EventConnectorCreateItem event_connector_create_item: JSON API with the EventConnector object to add (required)
:return: EventConnector1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'event_connector_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_event_connector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_event_connector`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `add_event_connector`")
# verify the required parameter 'event_connector_create_item' is set
if ('event_connector_create_item' not in params) or (params['event_connector_create_item'] is None):
raise ValueError("Missing the required parameter `event_connector_create_item` when calling `add_event_connector`")
resource_path = '/processes/{process_id}/events/{event_id}/connectors'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'event_connector_create_item' in params:
body_params = params['event_connector_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventConnector1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_flow(self, process_id, flow_create_item, **kwargs):
"""
This method creates a new Sequence flow.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_flow(process_id, flow_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the flow (required)
:param FlowCreateItem flow_create_item: JSON API with the Flow object to add (required)
:return: FlowItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_flow_with_http_info(process_id, flow_create_item, **kwargs)
else:
(data) = self.add_flow_with_http_info(process_id, flow_create_item, **kwargs)
return data
def add_flow_with_http_info(self, process_id, flow_create_item, **kwargs):
"""
This method creates a new Sequence flow.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_flow_with_http_info(process_id, flow_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the flow (required)
:param FlowCreateItem flow_create_item: JSON API with the Flow object to add (required)
:return: FlowItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'flow_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_flow" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_flow`")
# verify the required parameter 'flow_create_item' is set
if ('flow_create_item' not in params) or (params['flow_create_item'] is None):
raise ValueError("Missing the required parameter `flow_create_item` when calling `add_flow`")
resource_path = '/processes/{process_id}/flows'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'flow_create_item' in params:
body_params = params['flow_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FlowItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_gateway(self, process_id, gateway_create_item, **kwargs):
"""
This method creates a new gateway.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_gateway(process_id, gateway_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the gateway (required)
:param GatewayCreateItem gateway_create_item: JSON API with the Gateway object to add (required)
:return: GatewayItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_gateway_with_http_info(process_id, gateway_create_item, **kwargs)
else:
(data) = self.add_gateway_with_http_info(process_id, gateway_create_item, **kwargs)
return data
def add_gateway_with_http_info(self, process_id, gateway_create_item, **kwargs):
"""
This method creates a new gateway.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_gateway_with_http_info(process_id, gateway_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the gateway (required)
:param GatewayCreateItem gateway_create_item: JSON API with the Gateway object to add (required)
:return: GatewayItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'gateway_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_gateway" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_gateway`")
# verify the required parameter 'gateway_create_item' is set
if ('gateway_create_item' not in params) or (params['gateway_create_item'] is None):
raise ValueError("Missing the required parameter `gateway_create_item` when calling `add_gateway`")
resource_path = '/processes/{process_id}/gateways'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'gateway_create_item' in params:
body_params = params['gateway_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GatewayItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_group(self, group_create_item, **kwargs):
"""
This method creates a new group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_group(group_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param GroupCreateItem group_create_item: JSON API with the Group object to add (required)
:return: GroupItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_group_with_http_info(group_create_item, **kwargs)
else:
(data) = self.add_group_with_http_info(group_create_item, **kwargs)
return data
def add_group_with_http_info(self, group_create_item, **kwargs):
"""
This method creates a new group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_group_with_http_info(group_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param GroupCreateItem group_create_item: JSON API with the Group object to add (required)
:return: GroupItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['group_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'group_create_item' is set
if ('group_create_item' not in params) or (params['group_create_item'] is None):
raise ValueError("Missing the required parameter `group_create_item` when calling `add_group`")
resource_path = '/groups'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'group_create_item' in params:
body_params = params['group_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GroupItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_groups_to_task(self, process_id, task_id, task_add_groups_item, **kwargs):
"""
This method assigns one or more groups to the chosen task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_groups_to_task(process_id, task_id, task_add_groups_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: ID of task to be modified (required)
:param TaskAddGroupsItem task_add_groups_item: JSON API with the Group IDs to add (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_groups_to_task_with_http_info(process_id, task_id, task_add_groups_item, **kwargs)
else:
(data) = self.add_groups_to_task_with_http_info(process_id, task_id, task_add_groups_item, **kwargs)
return data
def add_groups_to_task_with_http_info(self, process_id, task_id, task_add_groups_item, **kwargs):
"""
This method assigns one or more groups to the chosen task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_groups_to_task_with_http_info(process_id, task_id, task_add_groups_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: ID of task to be modified (required)
:param TaskAddGroupsItem task_add_groups_item: JSON API with the Group IDs to add (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'task_add_groups_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_groups_to_task" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_groups_to_task`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `add_groups_to_task`")
# verify the required parameter 'task_add_groups_item' is set
if ('task_add_groups_item' not in params) or (params['task_add_groups_item'] is None):
raise ValueError("Missing the required parameter `task_add_groups_item` when calling `add_groups_to_task`")
resource_path = '/processes/{process_id}/tasks/{task_id}/groups'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_add_groups_item' in params:
body_params = params['task_add_groups_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_input_output(self, process_id, task_id, input_output_create_item, **kwargs):
"""
This method creates a new Input/Output object.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_input_output(process_id, task_id, input_output_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to Input/Output object (required)
:param str task_id: Task instance ID related to Input/Output object (required)
:param InputOutputCreateItem input_output_create_item: Create and add a new Input/Output object with JSON API (required)
:return: InputOutputItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_input_output_with_http_info(process_id, task_id, input_output_create_item, **kwargs)
else:
(data) = self.add_input_output_with_http_info(process_id, task_id, input_output_create_item, **kwargs)
return data
def add_input_output_with_http_info(self, process_id, task_id, input_output_create_item, **kwargs):
"""
This method creates a new Input/Output object.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_input_output_with_http_info(process_id, task_id, input_output_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to Input/Output object (required)
:param str task_id: Task instance ID related to Input/Output object (required)
:param InputOutputCreateItem input_output_create_item: Create and add a new Input/Output object with JSON API (required)
:return: InputOutputItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'input_output_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_input_output" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_input_output`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `add_input_output`")
# verify the required parameter 'input_output_create_item' is set
if ('input_output_create_item' not in params) or (params['input_output_create_item'] is None):
raise ValueError("Missing the required parameter `input_output_create_item` when calling `add_input_output`")
resource_path = '/processes/{process_id}/tasks/{task_id}/inputoutput'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'input_output_create_item' in params:
body_params = params['input_output_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InputOutputItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_instance(self, process_id, instance_create_item, **kwargs):
"""
This method creates a new instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_instance(process_id, instance_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the instance (required)
:param InstanceCreateItem instance_create_item: JSON API with the Instance object to add (required)
:return: InstanceItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_instance_with_http_info(process_id, instance_create_item, **kwargs)
else:
(data) = self.add_instance_with_http_info(process_id, instance_create_item, **kwargs)
return data
def add_instance_with_http_info(self, process_id, instance_create_item, **kwargs):
"""
This method creates a new instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_instance_with_http_info(process_id, instance_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the instance (required)
:param InstanceCreateItem instance_create_item: JSON API with the Instance object to add (required)
:return: InstanceItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'instance_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_instance`")
# verify the required parameter 'instance_create_item' is set
if ('instance_create_item' not in params) or (params['instance_create_item'] is None):
raise ValueError("Missing the required parameter `instance_create_item` when calling `add_instance`")
resource_path = '/processes/{process_id}/instances'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'instance_create_item' in params:
body_params = params['instance_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InstanceItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_process(self, process_create_item, **kwargs):
"""
This method creates a new process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_process(process_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param ProcessCreateItem process_create_item: JSON API with the Process object to add (required)
:return: ProcessItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_process_with_http_info(process_create_item, **kwargs)
else:
(data) = self.add_process_with_http_info(process_create_item, **kwargs)
return data
def add_process_with_http_info(self, process_create_item, **kwargs):
"""
This method creates a new process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_process_with_http_info(process_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param ProcessCreateItem process_create_item: JSON API with the Process object to add (required)
:return: ProcessItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_process" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_create_item' is set
if ('process_create_item' not in params) or (params['process_create_item'] is None):
raise ValueError("Missing the required parameter `process_create_item` when calling `add_process`")
resource_path = '/processes'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'process_create_item' in params:
body_params = params['process_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProcessItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_task(self, process_id, task_create_item, **kwargs):
"""
This method creates a new task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_task(process_id, task_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the task (required)
:param TaskCreateItem task_create_item: JSON API with the Task object to add (required)
:return: TaskItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_task_with_http_info(process_id, task_create_item, **kwargs)
else:
(data) = self.add_task_with_http_info(process_id, task_create_item, **kwargs)
return data
def add_task_with_http_info(self, process_id, task_create_item, **kwargs):
"""
This method creates a new task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_task_with_http_info(process_id, task_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the task (required)
:param TaskCreateItem task_create_item: JSON API with the Task object to add (required)
:return: TaskItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_task" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_task`")
# verify the required parameter 'task_create_item' is set
if ('task_create_item' not in params) or (params['task_create_item'] is None):
raise ValueError("Missing the required parameter `task_create_item` when calling `add_task`")
resource_path = '/processes/{process_id}/tasks'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_create_item' in params:
body_params = params['task_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_task_connector(self, process_id, task_id, task_connector_create_item, **kwargs):
"""
This method is intended for creating a new task connector.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_task_connector(process_id, task_id, task_connector_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the Process that owns the Task (required)
:param str task_id: ID of the Task to attach the connector to (required)
:param TaskConnectorCreateItem task_connector_create_item: JSON API with the TaskConnector object to add (required)
:return: TaskConnector1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_task_connector_with_http_info(process_id, task_id, task_connector_create_item, **kwargs)
else:
(data) = self.add_task_connector_with_http_info(process_id, task_id, task_connector_create_item, **kwargs)
return data
def add_task_connector_with_http_info(self, process_id, task_id, task_connector_create_item, **kwargs):
"""
This method is intended for creating a new task connector.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_task_connector_with_http_info(process_id, task_id, task_connector_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str task_id: ID of Task to fetch (required)
:param TaskConnectorCreateItem task_connector_create_item: JSON API with the TaskConnector object to add (required)
:return: TaskConnector1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'task_connector_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_task_connector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `add_task_connector`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `add_task_connector`")
# verify the required parameter 'task_connector_create_item' is set
if ('task_connector_create_item' not in params) or (params['task_connector_create_item'] is None):
raise ValueError("Missing the required parameter `task_connector_create_item` when calling `add_task_connector`")
resource_path = '/processes/{process_id}/tasks/{task_id}/connectors'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_connector_create_item' in params:
body_params = params['task_connector_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskConnector1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def add_user(self, user_create_item, **kwargs):
"""
This method creates a new user in the system.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_user(user_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param UserCreateItem user_create_item: JSON API with the User object to add (required)
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_user_with_http_info(user_create_item, **kwargs)
else:
(data) = self.add_user_with_http_info(user_create_item, **kwargs)
return data
def add_user_with_http_info(self, user_create_item, **kwargs):
"""
This method creates a new user in the system.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_user_with_http_info(user_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param UserCreateItem user_create_item: JSON API with the User object to add (required)
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_user" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_create_item' is set
if ('user_create_item' not in params) or (params['user_create_item'] is None):
raise ValueError("Missing the required parameter `user_create_item` when calling `add_user`")
resource_path = '/users'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'user_create_item' in params:
body_params = params['user_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
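The `all_params` bookkeeping above implements a two-step guard: unknown keyword arguments raise `TypeError`, and required parameters that are absent or `None` raise `ValueError`. A condensed, self-contained sketch of that check (the `collect_params` name is illustrative; unlike the real methods, this sketch treats every listed name as required and omits the optional `callback`/`_return_http_data_only` entries):

```python
def collect_params(all_params, **kwargs):
    # Reject keywords the method does not declare.
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
    # Require every declared parameter to be present and non-None.
    for name in all_params:
        if kwargs.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s`" % name)
    return kwargs

params = collect_params(['user_create_item'],
                        user_create_item={'name': 'ada'})
```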
def add_users_to_group(self, id, group_add_users_item, **kwargs):
"""
This method adds one or more new users to a group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_users_to_group(id, group_add_users_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to be modified (required)
:param GroupAddUsersItem group_add_users_item: JSON API document with an array of user IDs (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_users_to_group_with_http_info(id, group_add_users_item, **kwargs)
else:
(data) = self.add_users_to_group_with_http_info(id, group_add_users_item, **kwargs)
return data
def add_users_to_group_with_http_info(self, id, group_add_users_item, **kwargs):
"""
This method adds one or more new users to a group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_users_to_group_with_http_info(id, group_add_users_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to be modified (required)
:param GroupAddUsersItem group_add_users_item: JSON API document with an array of user IDs (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'group_add_users_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_users_to_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `add_users_to_group`")
# verify the required parameter 'group_add_users_item' is set
if ('group_add_users_item' not in params) or (params['group_add_users_item'] is None):
raise ValueError("Missing the required parameter `group_add_users_item` when calling `add_users_to_group`")
resource_path = '/groups/{id}/users'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'group_add_users_item' in params:
body_params = params['group_add_users_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
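`select_header_accept` belongs to the generated `api_client` and its implementation is not shown in this file; the calling pattern above only relies on it returning a joined media-type string that is falsy when no types are acceptable (hence the `del header_params['Accept']` branch). A hypothetical stand-in that satisfies that assumed contract:

```python
def select_header_accept(accepts):
    # Assumed behavior: join acceptable media types into one header value;
    # an empty list yields an empty string so the caller can drop the header.
    return ', '.join(accepts) if accepts else ''

header_params = {}
header_params['Accept'] = select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
    del header_params['Accept']
```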
def delete_client(self, user_id, client_id, **kwargs):
"""
This method deletes an OAuth client using the client and user IDs.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_client(user_id, client_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: User ID (required)
:param str client_id: ID of client to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_client_with_http_info(user_id, client_id, **kwargs)
else:
(data) = self.delete_client_with_http_info(user_id, client_id, **kwargs)
return data
def delete_client_with_http_info(self, user_id, client_id, **kwargs):
"""
This method deletes an OAuth client using the client and user IDs.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_client_with_http_info(user_id, client_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: User ID (required)
:param str client_id: ID of client to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_id', 'client_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_client" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_id' is set
if ('user_id' not in params) or (params['user_id'] is None):
raise ValueError("Missing the required parameter `user_id` when calling `delete_client`")
# verify the required parameter 'client_id' is set
if ('client_id' not in params) or (params['client_id'] is None):
raise ValueError("Missing the required parameter `client_id` when calling `delete_client`")
resource_path = '/users/{user_id}/clients/{client_id}'.replace('{format}', 'json')
path_params = {}
if 'user_id' in params:
path_params['user_id'] = params['user_id']
if 'client_id' in params:
path_params['client_id'] = params['client_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
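The `resource_path` templates such as `/users/{user_id}/clients/{client_id}` are expanded with `path_params` inside `api_client.call_api`; the substitution itself is not visible in this file. A simplified sketch of what that expansion amounts to (the `expand_path` name is illustrative, and the real client also percent-encodes each value before substitution):

```python
def expand_path(template, path_params):
    # Replace each {name} placeholder with its corresponding value.
    for name, value in path_params.items():
        template = template.replace('{' + name + '}', str(value))
    return template

url_path = expand_path('/users/{user_id}/clients/{client_id}',
                       {'user_id': '42', 'client_id': 'abc'})
```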
def delete_event(self, process_id, event_id, **kwargs):
"""
This method deletes an event using the event ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_event(process_id, event_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str event_id: ID of event to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_event_with_http_info(process_id, event_id, **kwargs)
else:
(data) = self.delete_event_with_http_info(process_id, event_id, **kwargs)
return data
def delete_event_with_http_info(self, process_id, event_id, **kwargs):
"""
This method deletes an event using the event ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_event_with_http_info(process_id, event_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str event_id: ID of event to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_event" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_event`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `delete_event`")
resource_path = '/processes/{process_id}/events/{event_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_event_connector(self, process_id, event_id, connector_id, **kwargs):
"""
This method is intended for deleting a single Event connector based on Event ID, Process ID and Connector ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_event_connector(process_id, event_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process item (required)
:param str event_id: ID of Event to delete (required)
:param str connector_id: ID of EventConnector to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_event_connector_with_http_info(process_id, event_id, connector_id, **kwargs)
else:
(data) = self.delete_event_connector_with_http_info(process_id, event_id, connector_id, **kwargs)
return data
def delete_event_connector_with_http_info(self, process_id, event_id, connector_id, **kwargs):
"""
This method is intended for deleting a single Event connector based on Event ID, Process ID and Connector ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_event_connector_with_http_info(process_id, event_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process item (required)
:param str event_id: ID of Event to delete (required)
:param str connector_id: ID of EventConnector to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'connector_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_event_connector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_event_connector`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `delete_event_connector`")
# verify the required parameter 'connector_id' is set
if ('connector_id' not in params) or (params['connector_id'] is None):
raise ValueError("Missing the required parameter `connector_id` when calling `delete_event_connector`")
resource_path = '/processes/{process_id}/events/{event_id}/connectors/{connector_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
if 'connector_id' in params:
path_params['connector_id'] = params['connector_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_flow(self, process_id, flow_id, **kwargs):
"""
This method deletes a sequence flow using the flow ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_flow(process_id, flow_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str flow_id: ID of flow to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_flow_with_http_info(process_id, flow_id, **kwargs)
else:
(data) = self.delete_flow_with_http_info(process_id, flow_id, **kwargs)
return data
def delete_flow_with_http_info(self, process_id, flow_id, **kwargs):
"""
This method deletes a sequence flow using the flow ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_flow_with_http_info(process_id, flow_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str flow_id: ID of flow to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'flow_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_flow" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_flow`")
# verify the required parameter 'flow_id' is set
if ('flow_id' not in params) or (params['flow_id'] is None):
raise ValueError("Missing the required parameter `flow_id` when calling `delete_flow`")
resource_path = '/processes/{process_id}/flows/{flow_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'flow_id' in params:
path_params['flow_id'] = params['flow_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_gateway(self, process_id, gateway_id, **kwargs):
"""
This method deletes a single gateway using the gateway ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_gateway(process_id, gateway_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str gateway_id: ID of Gateway to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_gateway_with_http_info(process_id, gateway_id, **kwargs)
else:
(data) = self.delete_gateway_with_http_info(process_id, gateway_id, **kwargs)
return data
def delete_gateway_with_http_info(self, process_id, gateway_id, **kwargs):
"""
This method deletes a single gateway using the gateway ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_gateway_with_http_info(process_id, gateway_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str gateway_id: ID of Gateway to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'gateway_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_gateway" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_gateway`")
# verify the required parameter 'gateway_id' is set
if ('gateway_id' not in params) or (params['gateway_id'] is None):
raise ValueError("Missing the required parameter `gateway_id` when calling `delete_gateway`")
resource_path = '/processes/{process_id}/gateways/{gateway_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'gateway_id' in params:
path_params['gateway_id'] = params['gateway_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_group(self, id, **kwargs):
"""
This method deletes a group using the group ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_group(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_group_with_http_info(id, **kwargs)
else:
(data) = self.delete_group_with_http_info(id, **kwargs)
return data
def delete_group_with_http_info(self, id, **kwargs):
"""
This method deletes a group using the group ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_group_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_group`")
resource_path = '/groups/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_input_output(self, process_id, task_id, inputoutput_uid, **kwargs):
"""
This method deletes the Input/Output based on the Input/Output ID, process ID and task ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_input_output(process_id, task_id, inputoutput_uid, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the Input/Output object (required)
:param str task_id: Task instance ID related to Input/Output object (required)
:param str inputoutput_uid: Input/Output ID to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_input_output_with_http_info(process_id, task_id, inputoutput_uid, **kwargs)
else:
(data) = self.delete_input_output_with_http_info(process_id, task_id, inputoutput_uid, **kwargs)
return data
def delete_input_output_with_http_info(self, process_id, task_id, inputoutput_uid, **kwargs):
"""
This method deletes the Input/Output based on the Input/Output ID, process ID and task ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_input_output_with_http_info(process_id, task_id, inputoutput_uid, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the Input/Output object (required)
:param str task_id: Task instance ID related to Input/Output object (required)
:param str inputoutput_uid: Input/Output ID to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'inputoutput_uid']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_input_output" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_input_output`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `delete_input_output`")
# verify the required parameter 'inputoutput_uid' is set
if ('inputoutput_uid' not in params) or (params['inputoutput_uid'] is None):
raise ValueError("Missing the required parameter `inputoutput_uid` when calling `delete_input_output`")
resource_path = '/processes/{process_id}/tasks/{task_id}/inputoutput/{inputoutput_uid}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
if 'inputoutput_uid' in params:
path_params['inputoutput_uid'] = params['inputoutput_uid']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_instance(self, process_id, instance_id, **kwargs):
"""
This method deletes an instance using the instance ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_instance(process_id, instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str instance_id: ID of instance to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_instance_with_http_info(process_id, instance_id, **kwargs)
else:
(data) = self.delete_instance_with_http_info(process_id, instance_id, **kwargs)
return data
def delete_instance_with_http_info(self, process_id, instance_id, **kwargs):
"""
This method deletes an instance using the instance ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_instance_with_http_info(process_id, instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str instance_id: ID of instance to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'instance_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_instance`")
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params) or (params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `delete_instance`")
resource_path = '/processes/{process_id}/instances/{instance_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'instance_id' in params:
path_params['instance_id'] = params['instance_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_process(self, id, **kwargs):
"""
This method deletes a process using the process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_process(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Process ID to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_process_with_http_info(id, **kwargs)
else:
(data) = self.delete_process_with_http_info(id, **kwargs)
return data
def delete_process_with_http_info(self, id, **kwargs):
"""
This method deletes a process using the process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_process_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Process ID to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_process" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_process`")
resource_path = '/processes/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
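# Illustrative usage sketch (not part of the generated client; the `api`
# instance name and its construction are assumptions for illustration).
# Every operation in this class follows the same synchronous/asynchronous
# convention shown in the docstrings above:
#
#     api = ProcessesApi(api_client)          # hypothetical construction
#     result = api.delete_process('1234')     # synchronous; returns ResultSuccess
#
#     def on_done(response):                  # asynchronous variant
#         print(response)
#     thread = api.delete_process('1234', callback=on_done)
#     thread.join()                           # the request runs on this thread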
def delete_task(self, process_id, task_id, **kwargs):
"""
This method deletes a task using the task ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_task(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: ID of task to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_task_with_http_info(process_id, task_id, **kwargs)
else:
(data) = self.delete_task_with_http_info(process_id, task_id, **kwargs)
return data
def delete_task_with_http_info(self, process_id, task_id, **kwargs):
"""
This method deletes a task using the task ID and process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_task_with_http_info(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: ID of task to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_task" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_task`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `delete_task`")
resource_path = '/processes/{process_id}/tasks/{task_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_task_connector(self, process_id, task_id, connector_id, **kwargs):
"""
This method deletes a single Task connector based on the Task ID, Process ID and Connector ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_task_connector(process_id, task_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the Process containing the Task (required)
:param str task_id: ID of the Task whose connector will be deleted (required)
:param str connector_id: ID of the TaskConnector to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_task_connector_with_http_info(process_id, task_id, connector_id, **kwargs)
else:
(data) = self.delete_task_connector_with_http_info(process_id, task_id, connector_id, **kwargs)
return data
def delete_task_connector_with_http_info(self, process_id, task_id, connector_id, **kwargs):
"""
This method deletes a single Task connector based on the Task ID, Process ID and Connector ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_task_connector_with_http_info(process_id, task_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the Process containing the Task (required)
:param str task_id: ID of the Task whose connector will be deleted (required)
:param str connector_id: ID of the TaskConnector to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'connector_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_task_connector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `delete_task_connector`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `delete_task_connector`")
# verify the required parameter 'connector_id' is set
if ('connector_id' not in params) or (params['connector_id'] is None):
raise ValueError("Missing the required parameter `connector_id` when calling `delete_task_connector`")
resource_path = '/processes/{process_id}/tasks/{task_id}/connectors/{connector_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
if 'connector_id' in params:
path_params['connector_id'] = params['connector_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def delete_user(self, id, **kwargs):
"""
This method deletes a user from the system.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_user(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of user to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_user_with_http_info(id, **kwargs)
else:
(data) = self.delete_user_with_http_info(id, **kwargs)
return data
def delete_user_with_http_info(self, id, **kwargs):
"""
This method deletes a user from the system.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_user_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of user to delete (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_user" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_user`")
resource_path = '/users/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def event_trigger(self, process_id, event_id, trigger_event_create_item, **kwargs):
"""
This method starts/triggers an event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.event_trigger(process_id, event_id, trigger_event_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the event (required)
:param str event_id: ID of event to trigger (required)
:param TriggerEventCreateItem trigger_event_create_item: JSON payload with the event trigger parameters (required)
:return: DataModelItem1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.event_trigger_with_http_info(process_id, event_id, trigger_event_create_item, **kwargs)
else:
(data) = self.event_trigger_with_http_info(process_id, event_id, trigger_event_create_item, **kwargs)
return data
def event_trigger_with_http_info(self, process_id, event_id, trigger_event_create_item, **kwargs):
"""
This method starts/triggers an event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.event_trigger_with_http_info(process_id, event_id, trigger_event_create_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the event (required)
:param str event_id: ID of event to trigger (required)
:param TriggerEventCreateItem trigger_event_create_item: JSON payload with the event trigger parameters (required)
:return: DataModelItem1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'trigger_event_create_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method event_trigger" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `event_trigger`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `event_trigger`")
# verify the required parameter 'trigger_event_create_item' is set
if ('trigger_event_create_item' not in params) or (params['trigger_event_create_item'] is None):
raise ValueError("Missing the required parameter `trigger_event_create_item` when calling `event_trigger`")
resource_path = '/processes/{process_id}/events/{event_id}/trigger'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'trigger_event_create_item' in params:
body_params = params['trigger_event_create_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DataModelItem1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
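# Illustrative usage sketch (assumptions: `api` is a configured instance of
# this class and `TriggerEventCreateItem` is importable from the generated
# models package). Unlike the DELETE endpoints above, this operation sends
# `trigger_event_create_item` as the body of a POST and deserializes the
# response as DataModelItem1:
#
#     item = TriggerEventCreateItem()          # populate fields per the API spec
#     data_model = api.event_trigger('proc-1', 'evt-1', item)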
def find_client_by_id(self, user_id, client_id, **kwargs):
"""
This method retrieves an OAuth client based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_client_by_id(user_id, client_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: ID of user to retrieve (required)
:param str client_id: ID of client to retrieve (required)
:return: ClientItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_client_by_id_with_http_info(user_id, client_id, **kwargs)
else:
(data) = self.find_client_by_id_with_http_info(user_id, client_id, **kwargs)
return data
def find_client_by_id_with_http_info(self, user_id, client_id, **kwargs):
"""
This method retrieves an OAuth client based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_client_by_id_with_http_info(user_id, client_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: ID of user to retrieve (required)
:param str client_id: ID of client to retrieve (required)
:return: ClientItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_id', 'client_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_client_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_id' is set
if ('user_id' not in params) or (params['user_id'] is None):
raise ValueError("Missing the required parameter `user_id` when calling `find_client_by_id`")
# verify the required parameter 'client_id' is set
if ('client_id' not in params) or (params['client_id'] is None):
raise ValueError("Missing the required parameter `client_id` when calling `find_client_by_id`")
resource_path = '/users/{user_id}/clients/{client_id}'.replace('{format}', 'json')
path_params = {}
if 'user_id' in params:
path_params['user_id'] = params['user_id']
if 'client_id' in params:
path_params['client_id'] = params['client_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_clients(self, user_id, **kwargs):
"""
This method retrieves all existing clients belonging to a user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_clients(user_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: User ID related to the clients (required)
:param int pagefind_process_by_id: Page number to fetch
:param int per_page: Number of items per page
:return: ClientCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_clients_with_http_info(user_id, **kwargs)
else:
(data) = self.find_clients_with_http_info(user_id, **kwargs)
return data
def find_clients_with_http_info(self, user_id, **kwargs):
"""
This method retrieves all existing clients belonging to a user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_clients_with_http_info(user_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: User ID related to the clients (required)
:param int pagefind_process_by_id: Page number to fetch
:param int per_page: Number of items per page
:return: ClientCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_id', 'pagefind_process_by_id', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_clients" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_id' is set
if ('user_id' not in params) or (params['user_id'] is None):
raise ValueError("Missing the required parameter `user_id` when calling `find_clients`")
resource_path = '/users/{user_id}/clients'.replace('{format}', 'json')
path_params = {}
if 'user_id' in params:
path_params['user_id'] = params['user_id']
query_params = {}
if 'pagefind_process_by_id' in params:
query_params['pagefindProcessById'] = params['pagefind_process_by_id']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
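# Illustrative usage sketch (the `api` instance name is an assumption): the
# optional keyword arguments accepted by `find_clients` are forwarded as query
# parameters, so a paginated request looks like:
#
#     clients = api.find_clients('user-1', pagefind_process_by_id=2, per_page=50)
#
# `pagefind_process_by_id` is serialized as the `pagefindProcessById` query
# parameter and `per_page` is sent as `per_page`.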
def find_data_model(self, process_id, instance_id, **kwargs):
"""
This method returns the instance DataModel and lets the user work with it directly.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_data_model(process_id, instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str instance_id: ID of instance to return (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: DataModelItem1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_data_model_with_http_info(process_id, instance_id, **kwargs)
else:
(data) = self.find_data_model_with_http_info(process_id, instance_id, **kwargs)
return data
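The wrapper above shows the dispatch convention used throughout this generated client: pass a `callback` keyword to get an asynchronous request (the method returns the worker thread and delivers the response via the callback), or omit it for a synchronous call that returns the deserialized data. A minimal sketch of that pattern, with a hypothetical `fetch` callable standing in for the actual HTTP request:

```python
import threading

def call_api_sketch(fetch, callback=None):
    """Run `fetch` synchronously, or on a thread when a callback is given.

    `fetch` is a hypothetical zero-argument callable standing in for the
    real HTTP request; it is not part of the generated client.
    """
    if callback:
        # Asynchronous path: start a worker thread, hand the response to
        # the callback, and return the thread so the caller can join() it.
        thread = threading.Thread(target=lambda: callback(fetch()))
        thread.start()
        return thread
    # Synchronous path: return the deserialized response directly.
    return fetch()
```

Calling `call_api_sketch(fetch)` blocks and returns the data; `call_api_sketch(fetch, callback=handler)` returns immediately with the running thread, mirroring the "returns the request thread" note in the docstrings.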
def find_data_model_with_http_info(self, process_id, instance_id, **kwargs):
"""
This method returns the instance DataModel and lets the user work with it directly.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_data_model_with_http_info(process_id, instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str instance_id: ID of instance to return (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: DataModelItem1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'instance_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_data_model" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_data_model`")
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params) or (params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `find_data_model`")
resource_path = '/processes/{process_id}/instances/{instance_id}/datamodel'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'instance_id' in params:
path_params['instance_id'] = params['instance_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DataModelItem1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
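Each `*_with_http_info` method opens with the same two checks: reject keyword arguments outside `all_params` with a `TypeError`, then raise a `ValueError` if a required parameter is absent or `None`. A self-contained sketch of that validation idiom (the helper name `check_call_params` is hypothetical, not part of the client):

```python
def check_call_params(all_params, required, kwargs):
    """Mirror the argument checks at the top of each *_with_http_info method.

    Raises TypeError for unexpected keywords and ValueError for required
    parameters that are missing or None.
    """
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key
            )
    params = dict(kwargs)
    for name in required:
        if params.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s`" % name
            )
    return params
```

The generated code builds the same `params` dict from `locals()` and `kwargs`; this sketch just isolates the two failure modes.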
def find_event_by_id(self, process_id, event_id, **kwargs):
"""
This method retrieves an event using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_event_by_id(process_id, event_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str event_id: ID of event to return (required)
:return: EventItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_event_by_id_with_http_info(process_id, event_id, **kwargs)
else:
(data) = self.find_event_by_id_with_http_info(process_id, event_id, **kwargs)
return data
def find_event_by_id_with_http_info(self, process_id, event_id, **kwargs):
"""
This method retrieves an event using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_event_by_id_with_http_info(process_id, event_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str event_id: ID of event to return (required)
:return: EventItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_event_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_event_by_id`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `find_event_by_id`")
resource_path = '/processes/{process_id}/events/{event_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_event_connector_by_id(self, process_id, event_id, connector_id, **kwargs):
"""
This method returns a single Event connector for the given Process and Event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_event_connector_by_id(process_id, event_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param str connector_id: ID of EventConnector to fetch (required)
:return: EventConnector1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_event_connector_by_id_with_http_info(process_id, event_id, connector_id, **kwargs)
else:
(data) = self.find_event_connector_by_id_with_http_info(process_id, event_id, connector_id, **kwargs)
return data
def find_event_connector_by_id_with_http_info(self, process_id, event_id, connector_id, **kwargs):
"""
This method returns a single Event connector for the given Process and Event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_event_connector_by_id_with_http_info(process_id, event_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param str connector_id: ID of EventConnector to fetch (required)
:return: EventConnector1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'connector_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_event_connector_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_event_connector_by_id`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `find_event_connector_by_id`")
# verify the required parameter 'connector_id' is set
if ('connector_id' not in params) or (params['connector_id'] is None):
raise ValueError("Missing the required parameter `connector_id` when calling `find_event_connector_by_id`")
resource_path = '/processes/{process_id}/events/{event_id}/connectors/{connector_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
if 'connector_id' in params:
path_params['connector_id'] = params['connector_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventConnector1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_event_connectors(self, process_id, event_id, **kwargs):
"""
This method returns all Event connectors related to the running Process and Event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_event_connectors(process_id, event_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: EventConnectorsCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_event_connectors_with_http_info(process_id, event_id, **kwargs)
else:
(data) = self.find_event_connectors_with_http_info(process_id, event_id, **kwargs)
return data
def find_event_connectors_with_http_info(self, process_id, event_id, **kwargs):
"""
This method returns all Event connectors related to the running Process and Event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_event_connectors_with_http_info(process_id, event_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: EventConnectorsCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_event_connectors" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_event_connectors`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `find_event_connectors`")
resource_path = '/processes/{process_id}/events/{event_id}/connectors'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventConnectorsCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_events(self, process_id, **kwargs):
"""
This method returns all events related to the running process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_events(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process related to the event (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: EventCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_events_with_http_info(process_id, **kwargs)
else:
(data) = self.find_events_with_http_info(process_id, **kwargs)
return data
def find_events_with_http_info(self, process_id, **kwargs):
"""
This method returns all events related to the running process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_events_with_http_info(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process related to the event (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: EventCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_events" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_events`")
resource_path = '/processes/{process_id}/events'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
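The request-building section of each method follows one recipe: substitute the `{...}` placeholders in `resource_path` with the path parameters, then copy only the optional keys actually present in `params` (here `page` and `per_page`) into `query_params`. A minimal sketch of that assembly (the helper name `build_request_sketch` is hypothetical):

```python
def build_request_sketch(resource_path, path_params, query_keys, params):
    """Substitute path placeholders and collect optional query parameters,
    as the generated *_with_http_info methods do before call_api."""
    for name, value in path_params.items():
        # e.g. '/processes/{process_id}/events' -> '/processes/p1/events'
        resource_path = resource_path.replace('{%s}' % name, str(value))
    # Only parameters the caller actually supplied become query parameters.
    query_params = {k: params[k] for k in query_keys if k in params}
    return resource_path, query_params
```

Omitted optional parameters simply never appear in the query string, which matches the `if 'page' in params:` guards in the generated code.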
def find_flow_by_id(self, process_id, flow_id, **kwargs):
"""
This method retrieves a flow based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_flow_by_id(process_id, flow_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str flow_id: ID of flow to return (required)
:return: FlowItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_flow_by_id_with_http_info(process_id, flow_id, **kwargs)
else:
(data) = self.find_flow_by_id_with_http_info(process_id, flow_id, **kwargs)
return data
def find_flow_by_id_with_http_info(self, process_id, flow_id, **kwargs):
"""
This method retrieves a flow based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_flow_by_id_with_http_info(process_id, flow_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str flow_id: ID of flow to return (required)
:return: FlowItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'flow_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_flow_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_flow_by_id`")
# verify the required parameter 'flow_id' is set
if ('flow_id' not in params) or (params['flow_id'] is None):
raise ValueError("Missing the required parameter `flow_id` when calling `find_flow_by_id`")
resource_path = '/processes/{process_id}/flows/{flow_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'flow_id' in params:
path_params['flow_id'] = params['flow_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FlowItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_flows(self, process_id, **kwargs):
"""
This method retrieves all existing flows.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_flows(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process related to the flow (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: FlowCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_flows_with_http_info(process_id, **kwargs)
else:
(data) = self.find_flows_with_http_info(process_id, **kwargs)
return data
def find_flows_with_http_info(self, process_id, **kwargs):
"""
This method retrieves all existing flows.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_flows_with_http_info(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process related to the flow (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: FlowCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_flows" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_flows`")
resource_path = '/processes/{process_id}/flows'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FlowCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_gateway_by_id(self, process_id, gateway_id, **kwargs):
"""
This method retrieves a gateway based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_gateway_by_id(process_id, gateway_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str gateway_id: ID of gateway to return (required)
:return: GatewayItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_gateway_by_id_with_http_info(process_id, gateway_id, **kwargs)
else:
(data) = self.find_gateway_by_id_with_http_info(process_id, gateway_id, **kwargs)
return data
def find_gateway_by_id_with_http_info(self, process_id, gateway_id, **kwargs):
"""
This method retrieves a gateway based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_gateway_by_id_with_http_info(process_id, gateway_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str gateway_id: ID of gateway to return (required)
:return: GatewayItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'gateway_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_gateway_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_gateway_by_id`")
# verify the required parameter 'gateway_id' is set
if ('gateway_id' not in params) or (params['gateway_id'] is None):
raise ValueError("Missing the required parameter `gateway_id` when calling `find_gateway_by_id`")
resource_path = '/processes/{process_id}/gateways/{gateway_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'gateway_id' in params:
path_params['gateway_id'] = params['gateway_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GatewayItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_gateways(self, process_id, **kwargs):
"""
This method retrieves all existing gateways.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_gateways(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process related to the gateway (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: GatewayCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_gateways_with_http_info(process_id, **kwargs)
else:
(data) = self.find_gateways_with_http_info(process_id, **kwargs)
return data
def find_gateways_with_http_info(self, process_id, **kwargs):
"""
This method retrieves all existing gateways.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_gateways_with_http_info(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process related to the gateway (required)
:param int page: Page number to fetch
:param int per_page: Number of items per page
:return: GatewayCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_gateways" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_gateways`")
resource_path = '/processes/{process_id}/gateways'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GatewayCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_group_by_id(self, id, **kwargs):
"""
This method retrieves a group using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_group_by_id(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to return (required)
:return: GroupItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_group_by_id_with_http_info(id, **kwargs)
else:
(data) = self.find_group_by_id_with_http_info(id, **kwargs)
return data
def find_group_by_id_with_http_info(self, id, **kwargs):
"""
This method retrieves a group using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_group_by_id_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to return (required)
:return: GroupItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `find_group_by_id`")
resource_path = '/groups/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GroupItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_groups(self, **kwargs):
"""
This method retrieves all existing groups.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_groups(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: GroupCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_groups_with_http_info(**kwargs)
else:
(data) = self.find_groups_with_http_info(**kwargs)
return data
def find_groups_with_http_info(self, **kwargs):
"""
This method retrieves all existing groups.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_groups_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: GroupCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_groups" % key
)
params[key] = val
del params['kwargs']
resource_path = '/groups'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GroupCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_input_output_by_id(self, process_id, task_id, inputoutput_uid, **kwargs):
"""
This method retrieves an Input/Output object using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_input_output_by_id(process_id, task_id, inputoutput_uid, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the Input/Output object (required)
:param str task_id: Task instance ID related to the Input/Output object (required)
:param str inputoutput_uid: ID of Input/Output to return (required)
:return: InputOutputItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_input_output_by_id_with_http_info(process_id, task_id, inputoutput_uid, **kwargs)
else:
(data) = self.find_input_output_by_id_with_http_info(process_id, task_id, inputoutput_uid, **kwargs)
return data
def find_input_output_by_id_with_http_info(self, process_id, task_id, inputoutput_uid, **kwargs):
"""
This method retrieves an Input/Output object using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_input_output_by_id_with_http_info(process_id, task_id, inputoutput_uid, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the Input/Output object (required)
:param str task_id: Task instance ID related to the Input/Output object (required)
:param str inputoutput_uid: ID of Input/Output to return (required)
:return: InputOutputItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'inputoutput_uid']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_input_output_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_input_output_by_id`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `find_input_output_by_id`")
# verify the required parameter 'inputoutput_uid' is set
if ('inputoutput_uid' not in params) or (params['inputoutput_uid'] is None):
raise ValueError("Missing the required parameter `inputoutput_uid` when calling `find_input_output_by_id`")
resource_path = '/processes/{process_id}/tasks/{task_id}/inputoutput/{inputoutput_uid}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
if 'inputoutput_uid' in params:
path_params['inputoutput_uid'] = params['inputoutput_uid']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InputOutputItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_input_outputs(self, process_id, task_id, **kwargs):
"""
This method retrieves all existing Input/Output objects in the related task instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_input_outputs(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to Input/Output object (required)
:param str task_id: Task instance ID related to Input/Output object (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: InputOutputCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_input_outputs_with_http_info(process_id, task_id, **kwargs)
else:
(data) = self.find_input_outputs_with_http_info(process_id, task_id, **kwargs)
return data
def find_input_outputs_with_http_info(self, process_id, task_id, **kwargs):
"""
This method retrieves all existing Input/Output objects in the related task instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_input_outputs_with_http_info(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to Input/Output object (required)
:param str task_id: Task instance ID related to Input/Output object (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: InputOutputCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_input_outputs" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_input_outputs`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `find_input_outputs`")
resource_path = '/processes/{process_id}/tasks/{task_id}/inputoutput'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InputOutputCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_instance_by_id(self, process_id, instance_id, **kwargs):
"""
This method retrieves an instance using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_instance_by_id(process_id, instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str instance_id: ID of instance to return (required)
:return: InstanceItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_instance_by_id_with_http_info(process_id, instance_id, **kwargs)
else:
(data) = self.find_instance_by_id_with_http_info(process_id, instance_id, **kwargs)
return data
def find_instance_by_id_with_http_info(self, process_id, instance_id, **kwargs):
"""
This method retrieves an instance using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_instance_by_id_with_http_info(process_id, instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str instance_id: ID of instance to return (required)
:return: InstanceItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'instance_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_instance_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_instance_by_id`")
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params) or (params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `find_instance_by_id`")
resource_path = '/processes/{process_id}/instances/{instance_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'instance_id' in params:
path_params['instance_id'] = params['instance_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InstanceItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_instances(self, process_id, **kwargs):
"""
This method retrieves the instances related to the process using the Process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_instances(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the instances (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: InstanceCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_instances_with_http_info(process_id, **kwargs)
else:
(data) = self.find_instances_with_http_info(process_id, **kwargs)
return data
def find_instances_with_http_info(self, process_id, **kwargs):
"""
This method retrieves the instances related to the process using the Process ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_instances_with_http_info(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID related to the instances (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: InstanceCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_instances" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_instances`")
resource_path = '/processes/{process_id}/instances'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InstanceCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_process_by_id(self, id, **kwargs):
"""
This method retrieves a process using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_process_by_id(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of process to return (required)
:return: ProcessItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_process_by_id_with_http_info(id, **kwargs)
else:
(data) = self.find_process_by_id_with_http_info(id, **kwargs)
return data
def find_process_by_id_with_http_info(self, id, **kwargs):
"""
This method retrieves a process using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_process_by_id_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of process to return (required)
:return: ProcessItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_process_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `find_process_by_id`")
resource_path = '/processes/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProcessItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_processes(self, **kwargs):
"""
This method retrieves all existing processes.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_processes(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: ProcessCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_processes_with_http_info(**kwargs)
else:
(data) = self.find_processes_with_http_info(**kwargs)
return data
def find_processes_with_http_info(self, **kwargs):
"""
This method retrieves all existing processes.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_processes_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: ProcessCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_processes" % key
)
params[key] = val
del params['kwargs']
resource_path = '/processes'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProcessCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_task_by_id(self, process_id, task_id, **kwargs):
"""
This method retrieves a task using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_by_id(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str task_id: ID of task to return (required)
:return: TaskItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_task_by_id_with_http_info(process_id, task_id, **kwargs)
else:
(data) = self.find_task_by_id_with_http_info(process_id, task_id, **kwargs)
return data
def find_task_by_id_with_http_info(self, process_id, task_id, **kwargs):
"""
This method retrieves a task using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_by_id_with_http_info(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to return (required)
:param str task_id: ID of task to return (required)
:return: TaskItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_task_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_task_by_id`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `find_task_by_id`")
resource_path = '/processes/{process_id}/tasks/{task_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_task_connector_by_id(self, process_id, task_id, connector_id, **kwargs):
"""
This method retrieves a Task connector using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_connector_by_id(process_id, task_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str task_id: ID of Task to fetch (required)
:param str connector_id: ID of TaskConnector to fetch (required)
:return: TaskConnector1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_task_connector_by_id_with_http_info(process_id, task_id, connector_id, **kwargs)
else:
(data) = self.find_task_connector_by_id_with_http_info(process_id, task_id, connector_id, **kwargs)
return data
def find_task_connector_by_id_with_http_info(self, process_id, task_id, connector_id, **kwargs):
"""
This method retrieves a Task connector using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_connector_by_id_with_http_info(process_id, task_id, connector_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str task_id: ID of Task to fetch (required)
:param str connector_id: ID of TaskConnector to fetch (required)
:return: TaskConnector1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'connector_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_task_connector_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_task_connector_by_id`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `find_task_connector_by_id`")
# verify the required parameter 'connector_id' is set
if ('connector_id' not in params) or (params['connector_id'] is None):
raise ValueError("Missing the required parameter `connector_id` when calling `find_task_connector_by_id`")
resource_path = '/processes/{process_id}/tasks/{task_id}/connectors/{connector_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
if 'connector_id' in params:
path_params['connector_id'] = params['connector_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskConnector1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_task_connectors(self, process_id, task_id, **kwargs):
"""
This method returns all Task connectors related to the given Process and Task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_connectors(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str task_id: ID of Task to fetch (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: TaskConnectorsCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_task_connectors_with_http_info(process_id, task_id, **kwargs)
else:
(data) = self.find_task_connectors_with_http_info(process_id, task_id, **kwargs)
return data
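        # Pagination sketch (comment-only; IDs are hypothetical and the
        # attributes exposed by TaskConnectorsCollection depend on the generated
        # model, so inspect it before relying on a field name):
        #
        #   page = 1
        #   while True:
        #       batch = api.find_task_connectors('proc-1', 'task-1',
        #                                        page=page, per_page=50)
        #       # process `batch` here; stop once the collection comes back empty
        #       page += 1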
def find_task_connectors_with_http_info(self, process_id, task_id, **kwargs):
"""
This method returns all Task connectors related to the given Process and Task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_connectors_with_http_info(process_id, task_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str task_id: ID of Task to fetch (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: TaskConnectorsCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_task_connectors" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_task_connectors`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `find_task_connectors`")
resource_path = '/processes/{process_id}/tasks/{task_id}/connectors'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskConnectorsCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_task_instance_by_id(self, task_instance_id, **kwargs):
"""
This method retrieves a task instance based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_instance_by_id(task_instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str task_instance_id: ID of task instance to return (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: InlineResponse200
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_task_instance_by_id_with_http_info(task_instance_id, **kwargs)
else:
(data) = self.find_task_instance_by_id_with_http_info(task_instance_id, **kwargs)
return data
def find_task_instance_by_id_with_http_info(self, task_instance_id, **kwargs):
"""
This method retrieves a task instance based on its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_instance_by_id_with_http_info(task_instance_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str task_instance_id: ID of task instance to return (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: InlineResponse200
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['task_instance_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_task_instance_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'task_instance_id' is set
if ('task_instance_id' not in params) or (params['task_instance_id'] is None):
raise ValueError("Missing the required parameter `task_instance_id` when calling `find_task_instance_by_id`")
resource_path = '/task_instances/{task_instance_id}'.replace('{format}', 'json')
path_params = {}
if 'task_instance_id' in params:
path_params['task_instance_id'] = params['task_instance_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse200',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_task_instances(self, **kwargs):
"""
This method retrieves all existing task instances.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_instances(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: TaskInstanceCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_task_instances_with_http_info(**kwargs)
else:
(data) = self.find_task_instances_with_http_info(**kwargs)
return data
def find_task_instances_with_http_info(self, **kwargs):
"""
This method retrieves all existing task instances.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_task_instances_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: TaskInstanceCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_task_instances" % key
)
params[key] = val
del params['kwargs']
resource_path = '/task_instances'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskInstanceCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_tasks(self, process_id, **kwargs):
"""
This method returns a list of all Tasks related to the Process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_tasks(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the Process the Tasks belong to (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: TaskCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_tasks_with_http_info(process_id, **kwargs)
else:
(data) = self.find_tasks_with_http_info(process_id, **kwargs)
return data
def find_tasks_with_http_info(self, process_id, **kwargs):
"""
This method returns a list of all Tasks related to the Process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_tasks_with_http_info(process_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the Process the Tasks belong to (required)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: TaskCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_tasks" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `find_tasks`")
resource_path = '/processes/{process_id}/tasks'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_user_by_id(self, id, **kwargs):
"""
This method returns a user using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_user_by_id(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of the user to return (required)
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_user_by_id_with_http_info(id, **kwargs)
else:
(data) = self.find_user_by_id_with_http_info(id, **kwargs)
return data
def find_user_by_id_with_http_info(self, id, **kwargs):
"""
This method returns a user using its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_user_by_id_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of the user to return (required)
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_user_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `find_user_by_id`")
resource_path = '/users/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def find_users(self, **kwargs):
"""
This method returns all existing users in the system.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_users(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: UserCollection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.find_users_with_http_info(**kwargs)
else:
(data) = self.find_users_with_http_info(**kwargs)
return data
def find_users_with_http_info(self, **kwargs):
"""
This method returns all existing users in the system.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.find_users_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: UserCollection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_users" % key
)
params[key] = val
del params['kwargs']
resource_path = '/users'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserCollection',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def import_bpmn_file(self, bpmn_import_item, **kwargs):
"""
This method imports BPMN files. A new process is created when the import is successful.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.import_bpmn_file(bpmn_import_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BpmnImportItem bpmn_import_item: JSON API with the BPMN file to import (required)
:return: ProcessCollection1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.import_bpmn_file_with_http_info(bpmn_import_item, **kwargs)
else:
(data) = self.import_bpmn_file_with_http_info(bpmn_import_item, **kwargs)
return data
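        # Import sketch (comments only; the BpmnImportItem constructor fields
        # are an assumption — consult the generated model for the exact payload
        # shape):
        #
        #   item = BpmnImportItem(...)            # wraps the BPMN file to import
        #   processes = api.import_bpmn_file(item)  # POST /processes/import
        #   # `processes` is a ProcessCollection1 describing the created process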
def import_bpmn_file_with_http_info(self, bpmn_import_item, **kwargs):
"""
This method imports BPMN files. A new process is created when the import is successful.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.import_bpmn_file_with_http_info(bpmn_import_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BpmnImportItem bpmn_import_item: JSON API with the BPMN file to import (required)
:return: ProcessCollection1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['bpmn_import_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method import_bpmn_file" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'bpmn_import_item' is set
if ('bpmn_import_item' not in params) or (params['bpmn_import_item'] is None):
raise ValueError("Missing the required parameter `bpmn_import_item` when calling `import_bpmn_file`")
resource_path = '/processes/import'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'bpmn_import_item' in params:
body_params = params['bpmn_import_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProcessCollection1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def myself_user(self, **kwargs):
"""
This method returns the user information associated with the current access token.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.myself_user(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.myself_user_with_http_info(**kwargs)
else:
(data) = self.myself_user_with_http_info(**kwargs)
return data
def myself_user_with_http_info(self, **kwargs):
"""
This method returns the user information associated with the current access token.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.myself_user_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int page: Page number to fetch
:param int per_page: Amount of items per page
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'per_page']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method myself_user" % key
)
params[key] = val
del params['kwargs']
resource_path = '/users/myself'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'page' in params:
query_params['page'] = params['page']
if 'per_page' in params:
query_params['per_page'] = params['per_page']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def remove_groups_from_task(self, process_id, task_id, task_remove_groups_item, **kwargs):
"""
This method removes groups from a task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.remove_groups_from_task(process_id, task_id, task_remove_groups_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: Task ID (required)
:param TaskRemoveGroupsItem task_remove_groups_item: JSON API request body with the Group IDs to remove (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.remove_groups_from_task_with_http_info(process_id, task_id, task_remove_groups_item, **kwargs)
else:
(data) = self.remove_groups_from_task_with_http_info(process_id, task_id, task_remove_groups_item, **kwargs)
return data
def remove_groups_from_task_with_http_info(self, process_id, task_id, task_remove_groups_item, **kwargs):
"""
This method removes groups from a task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.remove_groups_from_task_with_http_info(process_id, task_id, task_remove_groups_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: Task ID (required)
:param TaskRemoveGroupsItem task_remove_groups_item: JSON API request body with the Group IDs to remove (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'task_remove_groups_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method remove_groups_from_task" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `remove_groups_from_task`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `remove_groups_from_task`")
# verify the required parameter 'task_remove_groups_item' is set
if ('task_remove_groups_item' not in params) or (params['task_remove_groups_item'] is None):
raise ValueError("Missing the required parameter `task_remove_groups_item` when calling `remove_groups_from_task`")
resource_path = '/processes/{process_id}/tasks/{task_id}/groups'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_remove_groups_item' in params:
body_params = params['task_remove_groups_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def remove_users_from_group(self, id, group_remove_users_item, **kwargs):
"""
This method removes one or more users from a group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.remove_users_from_group(id, group_remove_users_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to be modified (required)
:param GroupRemoveUsersItem group_remove_users_item: JSON API object with user IDs to remove (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.remove_users_from_group_with_http_info(id, group_remove_users_item, **kwargs)
else:
(data) = self.remove_users_from_group_with_http_info(id, group_remove_users_item, **kwargs)
return data
def remove_users_from_group_with_http_info(self, id, group_remove_users_item, **kwargs):
"""
This method removes one or more users from a group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.remove_users_from_group_with_http_info(id, group_remove_users_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to be modified (required)
:param GroupRemoveUsersItem group_remove_users_item: JSON API object with user IDs to remove (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'group_remove_users_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method remove_users_from_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `remove_users_from_group`")
# verify the required parameter 'group_remove_users_item' is set
if ('group_remove_users_item' not in params) or (params['group_remove_users_item'] is None):
raise ValueError("Missing the required parameter `group_remove_users_item` when calling `remove_users_from_group`")
resource_path = '/groups/{id}/users'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'group_remove_users_item' in params:
body_params = params['group_remove_users_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def sync_groups_to_task(self, process_id, task_id, task_sync_groups_item, **kwargs):
"""
This method synchronizes one or more groups with a task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.sync_groups_to_task(process_id, task_id, task_sync_groups_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: ID of task to modify (required)
:param TaskSyncGroupsItem task_sync_groups_item: JSON API object with group IDs to sync (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.sync_groups_to_task_with_http_info(process_id, task_id, task_sync_groups_item, **kwargs)
else:
(data) = self.sync_groups_to_task_with_http_info(process_id, task_id, task_sync_groups_item, **kwargs)
return data
def sync_groups_to_task_with_http_info(self, process_id, task_id, task_sync_groups_item, **kwargs):
"""
This method synchronizes one or more groups with a task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.sync_groups_to_task_with_http_info(process_id, task_id, task_sync_groups_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: Process ID (required)
:param str task_id: ID of task to modify (required)
:param TaskSyncGroupsItem task_sync_groups_item: JSON API object with group IDs to sync (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'task_sync_groups_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method sync_groups_to_task" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `sync_groups_to_task`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `sync_groups_to_task`")
# verify the required parameter 'task_sync_groups_item' is set
if ('task_sync_groups_item' not in params) or (params['task_sync_groups_item'] is None):
raise ValueError("Missing the required parameter `task_sync_groups_item` when calling `sync_groups_to_task`")
resource_path = '/processes/{process_id}/tasks/{task_id}/groups'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_sync_groups_item' in params:
body_params = params['task_sync_groups_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def sync_users_to_group(self, id, group_sync_users_item, **kwargs):
"""
This method synchronizes one or more users with a group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.sync_users_to_group(id, group_sync_users_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to be modified (required)
:param GroupSyncUsersItem group_sync_users_item: JSON API object with an array of user IDs to sync (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.sync_users_to_group_with_http_info(id, group_sync_users_item, **kwargs)
else:
(data) = self.sync_users_to_group_with_http_info(id, group_sync_users_item, **kwargs)
return data
def sync_users_to_group_with_http_info(self, id, group_sync_users_item, **kwargs):
"""
This method synchronizes one or more users with a group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.sync_users_to_group_with_http_info(id, group_sync_users_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: ID of group to be modified (required)
:param GroupSyncUsersItem group_sync_users_item: JSON API object with an array of user IDs to sync (required)
:return: ResultSuccess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'group_sync_users_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method sync_users_to_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `sync_users_to_group`")
# verify the required parameter 'group_sync_users_item' is set
if ('group_sync_users_item' not in params) or (params['group_sync_users_item'] is None):
raise ValueError("Missing the required parameter `group_sync_users_item` when calling `sync_users_to_group`")
resource_path = '/groups/{id}/users'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'group_sync_users_item' in params:
body_params = params['group_sync_users_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResultSuccess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_client(self, user_id, client_id, client_update_item, **kwargs):
"""
This method updates an existing OAuth client.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_client(user_id, client_id, client_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: ID of user to retrieve (required)
:param str client_id: ID of client to retrieve (required)
:param ClientUpdateItem client_update_item: Client object to edit (required)
:return: ClientItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_client_with_http_info(user_id, client_id, client_update_item, **kwargs)
else:
(data) = self.update_client_with_http_info(user_id, client_id, client_update_item, **kwargs)
return data
def update_client_with_http_info(self, user_id, client_id, client_update_item, **kwargs):
"""
This method updates an existing OAuth client.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_client_with_http_info(user_id, client_id, client_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str user_id: ID of user to retrieve (required)
:param str client_id: ID of client to retrieve (required)
:param ClientUpdateItem client_update_item: Client object to edit (required)
:return: ClientItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_id', 'client_id', 'client_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_client" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_id' is set
if ('user_id' not in params) or (params['user_id'] is None):
raise ValueError("Missing the required parameter `user_id` when calling `update_client`")
# verify the required parameter 'client_id' is set
if ('client_id' not in params) or (params['client_id'] is None):
raise ValueError("Missing the required parameter `client_id` when calling `update_client`")
# verify the required parameter 'client_update_item' is set
if ('client_update_item' not in params) or (params['client_update_item'] is None):
raise ValueError("Missing the required parameter `client_update_item` when calling `update_client`")
resource_path = '/users/{user_id}/clients/{client_id}'.replace('{format}', 'json')
path_params = {}
if 'user_id' in params:
path_params['user_id'] = params['user_id']
if 'client_id' in params:
path_params['client_id'] = params['client_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'client_update_item' in params:
body_params = params['client_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_event(self, process_id, event_id, event_update_item, **kwargs):
"""
This method updates an existing event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_event(process_id, event_id, event_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to retrieve (required)
:param str event_id: ID of event to retrieve (required)
:param EventUpdateItem event_update_item: Event object to edit (required)
:return: EventItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_event_with_http_info(process_id, event_id, event_update_item, **kwargs)
else:
(data) = self.update_event_with_http_info(process_id, event_id, event_update_item, **kwargs)
return data
def update_event_with_http_info(self, process_id, event_id, event_update_item, **kwargs):
"""
This method updates an existing event.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_event_with_http_info(process_id, event_id, event_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to retrieve (required)
:param str event_id: ID of event to retrieve (required)
:param EventUpdateItem event_update_item: Event object to edit (required)
:return: EventItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'event_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_event" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_event`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `update_event`")
# verify the required parameter 'event_update_item' is set
if ('event_update_item' not in params) or (params['event_update_item'] is None):
raise ValueError("Missing the required parameter `event_update_item` when calling `update_event`")
resource_path = '/processes/{process_id}/events/{event_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'event_update_item' in params:
body_params = params['event_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_event_connector(self, process_id, event_id, connector_id, event_connector_update_item, **kwargs):
"""
This method updates an existing Event connector with new parameter values.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_event_connector(process_id, event_id, connector_id, event_connector_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param str connector_id: ID of Event Connector to fetch (required)
:param EventConnectorUpdateItem event_connector_update_item: EventConnector object to edit (required)
:return: EventConnector1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_event_connector_with_http_info(process_id, event_id, connector_id, event_connector_update_item, **kwargs)
else:
(data) = self.update_event_connector_with_http_info(process_id, event_id, connector_id, event_connector_update_item, **kwargs)
return data
def update_event_connector_with_http_info(self, process_id, event_id, connector_id, event_connector_update_item, **kwargs):
"""
This method updates an existing Event connector with new parameter values.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_event_connector_with_http_info(process_id, event_id, connector_id, event_connector_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of Process to fetch (required)
:param str event_id: ID of Event to fetch (required)
:param str connector_id: ID of Event Connector to fetch (required)
:param EventConnectorUpdateItem event_connector_update_item: EventConnector object to edit (required)
:return: EventConnector1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'event_id', 'connector_id', 'event_connector_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_event_connector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_event_connector`")
# verify the required parameter 'event_id' is set
if ('event_id' not in params) or (params['event_id'] is None):
raise ValueError("Missing the required parameter `event_id` when calling `update_event_connector`")
# verify the required parameter 'connector_id' is set
if ('connector_id' not in params) or (params['connector_id'] is None):
raise ValueError("Missing the required parameter `connector_id` when calling `update_event_connector`")
# verify the required parameter 'event_connector_update_item' is set
if ('event_connector_update_item' not in params) or (params['event_connector_update_item'] is None):
raise ValueError("Missing the required parameter `event_connector_update_item` when calling `update_event_connector`")
resource_path = '/processes/{process_id}/events/{event_id}/connectors/{connector_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'event_id' in params:
path_params['event_id'] = params['event_id']
if 'connector_id' in params:
path_params['connector_id'] = params['connector_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'event_connector_update_item' in params:
body_params = params['event_connector_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EventConnector1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_flow(self, process_id, flow_id, flow_update_item, **kwargs):
"""
This method updates an existing flow.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_flow(process_id, flow_id, flow_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to retrieve (required)
:param str flow_id: ID of flow to retrieve (required)
:param FlowUpdateItem flow_update_item: Flow object to edit (required)
:return: FlowItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_flow_with_http_info(process_id, flow_id, flow_update_item, **kwargs)
else:
(data) = self.update_flow_with_http_info(process_id, flow_id, flow_update_item, **kwargs)
return data
def update_flow_with_http_info(self, process_id, flow_id, flow_update_item, **kwargs):
"""
This method updates an existing flow.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_flow_with_http_info(process_id, flow_id, flow_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to retrieve (required)
:param str flow_id: ID of flow to retrieve (required)
:param FlowUpdateItem flow_update_item: Flow object to edit (required)
:return: FlowItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'flow_id', 'flow_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_flow" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_flow`")
# verify the required parameter 'flow_id' is set
if ('flow_id' not in params) or (params['flow_id'] is None):
raise ValueError("Missing the required parameter `flow_id` when calling `update_flow`")
# verify the required parameter 'flow_update_item' is set
if ('flow_update_item' not in params) or (params['flow_update_item'] is None):
raise ValueError("Missing the required parameter `flow_update_item` when calling `update_flow`")
resource_path = '/processes/{process_id}/flows/{flow_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'flow_id' in params:
path_params['flow_id'] = params['flow_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'flow_update_item' in params:
body_params = params['flow_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FlowItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_gateway(self, process_id, gateway_id, gateway_update_item, **kwargs):
"""
This method updates an existing gateway.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_gateway(process_id, gateway_id, gateway_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to retrieve (required)
:param str gateway_id: ID of gateway to retrieve (required)
:param GatewayUpdateItem gateway_update_item: Gateway object to edit (required)
:return: GatewayItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_gateway_with_http_info(process_id, gateway_id, gateway_update_item, **kwargs)
else:
(data) = self.update_gateway_with_http_info(process_id, gateway_id, gateway_update_item, **kwargs)
return data
def update_gateway_with_http_info(self, process_id, gateway_id, gateway_update_item, **kwargs):
"""
This method updates an existing gateway.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_gateway_with_http_info(process_id, gateway_id, gateway_update_item, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str process_id: ID of process to retrieve (required)
:param str gateway_id: ID of gateway to retrieve (required)
:param GatewayUpdateItem gateway_update_item: Gateway object to edit (required)
:return: GatewayItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'gateway_id', 'gateway_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_gateway" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_gateway`")
# verify the required parameter 'gateway_id' is set
if ('gateway_id' not in params) or (params['gateway_id'] is None):
raise ValueError("Missing the required parameter `gateway_id` when calling `update_gateway`")
# verify the required parameter 'gateway_update_item' is set
if ('gateway_update_item' not in params) or (params['gateway_update_item'] is None):
raise ValueError("Missing the required parameter `gateway_update_item` when calling `update_gateway`")
resource_path = '/processes/{process_id}/gateways/{gateway_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'gateway_id' in params:
path_params['gateway_id'] = params['gateway_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'gateway_update_item' in params:
body_params = params['gateway_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GatewayItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
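# NOTE (editor's illustrative sketch, not generated code): the typical call
# pattern shared by the update_* methods in this class, assuming an
# initialized `api` instance and existing identifiers. A synchronous call
# returns the deserialized item; passing `callback` returns the request
# thread instead:
#
#     gateway = api.update_gateway(process_id, gateway_id, gateway_update_item)
#     thread = api.update_gateway(process_id, gateway_id, gateway_update_item,
#                                 callback=lambda resp: pprint(resp))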
def update_group(self, id, group_update_item, **kwargs):
"""
This method updates an existing group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_group(id, group_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: ID of the group to update (required)
:param GroupUpdateItem group_update_item: Group object containing the updated values (required)
:return: GroupItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_group_with_http_info(id, group_update_item, **kwargs)
else:
data = self.update_group_with_http_info(id, group_update_item, **kwargs)
return data
def update_group_with_http_info(self, id, group_update_item, **kwargs):
"""
This method updates an existing group.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_group_with_http_info(id, group_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: ID of the group to update (required)
:param GroupUpdateItem group_update_item: Group object containing the updated values (required)
:return: GroupItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'group_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_group`")
# verify the required parameter 'group_update_item' is set
if ('group_update_item' not in params) or (params['group_update_item'] is None):
raise ValueError("Missing the required parameter `group_update_item` when calling `update_group`")
resource_path = '/groups/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'group_update_item' in params:
body_params = params['group_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GroupItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_input_output(self, process_id, task_id, inputoutput_uid, input_output_update_item, **kwargs):
"""
This method updates an existing Input/Output object.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_input_output(process_id, task_id, inputoutput_uid, input_output_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the Input/Output object (required)
:param str task_id: ID of the task instance related to the Input/Output object (required)
:param str inputoutput_uid: ID of the Input/Output object to update (required)
:param InputOutputUpdateItem input_output_update_item: Input/Output object containing the updated values (required)
:return: InputOutputItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_input_output_with_http_info(process_id, task_id, inputoutput_uid, input_output_update_item, **kwargs)
else:
data = self.update_input_output_with_http_info(process_id, task_id, inputoutput_uid, input_output_update_item, **kwargs)
return data
def update_input_output_with_http_info(self, process_id, task_id, inputoutput_uid, input_output_update_item, **kwargs):
"""
This method updates an existing Input/Output object.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_input_output_with_http_info(process_id, task_id, inputoutput_uid, input_output_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process related to the Input/Output object (required)
:param str task_id: ID of the task instance related to the Input/Output object (required)
:param str inputoutput_uid: ID of the Input/Output object to update (required)
:param InputOutputUpdateItem input_output_update_item: Input/Output object containing the updated values (required)
:return: InputOutputItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'inputoutput_uid', 'input_output_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_input_output" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_input_output`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `update_input_output`")
# verify the required parameter 'inputoutput_uid' is set
if ('inputoutput_uid' not in params) or (params['inputoutput_uid'] is None):
raise ValueError("Missing the required parameter `inputoutput_uid` when calling `update_input_output`")
# verify the required parameter 'input_output_update_item' is set
if ('input_output_update_item' not in params) or (params['input_output_update_item'] is None):
raise ValueError("Missing the required parameter `input_output_update_item` when calling `update_input_output`")
resource_path = '/processes/{process_id}/tasks/{task_id}/inputoutput/{inputoutput_uid}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
if 'inputoutput_uid' in params:
path_params['inputoutput_uid'] = params['inputoutput_uid']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'input_output_update_item' in params:
body_params = params['input_output_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InputOutputItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_instance(self, process_id, instance_id, instance_update_item, **kwargs):
"""
This method updates an existing instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_instance(process_id, instance_id, instance_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process containing the instance (required)
:param str instance_id: ID of the instance to update (required)
:param InstanceUpdateItem instance_update_item: Instance object containing the updated values (required)
:return: InstanceItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_instance_with_http_info(process_id, instance_id, instance_update_item, **kwargs)
else:
data = self.update_instance_with_http_info(process_id, instance_id, instance_update_item, **kwargs)
return data
def update_instance_with_http_info(self, process_id, instance_id, instance_update_item, **kwargs):
"""
This method updates an existing instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_instance_with_http_info(process_id, instance_id, instance_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process containing the instance (required)
:param str instance_id: ID of the instance to update (required)
:param InstanceUpdateItem instance_update_item: Instance object containing the updated values (required)
:return: InstanceItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'instance_id', 'instance_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_instance`")
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params) or (params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `update_instance`")
# verify the required parameter 'instance_update_item' is set
if ('instance_update_item' not in params) or (params['instance_update_item'] is None):
raise ValueError("Missing the required parameter `instance_update_item` when calling `update_instance`")
resource_path = '/processes/{process_id}/instances/{instance_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'instance_id' in params:
path_params['instance_id'] = params['instance_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'instance_update_item' in params:
body_params = params['instance_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InstanceItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_process(self, id, process_update_item, **kwargs):
"""
This method updates an existing process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_process(id, process_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: ID of the process to update (required)
:param ProcessUpdateItem process_update_item: Process object containing the updated values (required)
:return: ProcessItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_process_with_http_info(id, process_update_item, **kwargs)
else:
data = self.update_process_with_http_info(id, process_update_item, **kwargs)
return data
def update_process_with_http_info(self, id, process_update_item, **kwargs):
"""
This method updates an existing process.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_process_with_http_info(id, process_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: ID of the process to update (required)
:param ProcessUpdateItem process_update_item: Process object containing the updated values (required)
:return: ProcessItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'process_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_process" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_process`")
# verify the required parameter 'process_update_item' is set
if ('process_update_item' not in params) or (params['process_update_item'] is None):
raise ValueError("Missing the required parameter `process_update_item` when calling `update_process`")
resource_path = '/processes/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'process_update_item' in params:
body_params = params['process_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProcessItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_task(self, process_id, task_id, task_update_item, **kwargs):
"""
This method updates an existing task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_task(process_id, task_id, task_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process containing the task (required)
:param str task_id: ID of the task to update (required)
:param TaskUpdateItem task_update_item: Task object containing the updated values (required)
:return: TaskItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_task_with_http_info(process_id, task_id, task_update_item, **kwargs)
else:
data = self.update_task_with_http_info(process_id, task_id, task_update_item, **kwargs)
return data
def update_task_with_http_info(self, process_id, task_id, task_update_item, **kwargs):
"""
This method updates an existing task.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_task_with_http_info(process_id, task_id, task_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process containing the task (required)
:param str task_id: ID of the task to update (required)
:param TaskUpdateItem task_update_item: Task object containing the updated values (required)
:return: TaskItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'task_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_task" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_task`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `update_task`")
# verify the required parameter 'task_update_item' is set
if ('task_update_item' not in params) or (params['task_update_item'] is None):
raise ValueError("Missing the required parameter `task_update_item` when calling `update_task`")
resource_path = '/processes/{process_id}/tasks/{task_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_update_item' in params:
body_params = params['task_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_task_connector(self, process_id, task_id, connector_id, task_connector_update_item, **kwargs):
"""
This method updates an existing Task connector with new parameter values.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_task_connector(process_id, task_id, connector_id, task_connector_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process containing the task (required)
:param str task_id: ID of the task containing the connector (required)
:param str connector_id: ID of the Task Connector to update (required)
:param TaskConnectorUpdateItem task_connector_update_item: TaskConnector object containing the updated values (required)
:return: TaskConnector1
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_task_connector_with_http_info(process_id, task_id, connector_id, task_connector_update_item, **kwargs)
else:
data = self.update_task_connector_with_http_info(process_id, task_id, connector_id, task_connector_update_item, **kwargs)
return data
def update_task_connector_with_http_info(self, process_id, task_id, connector_id, task_connector_update_item, **kwargs):
"""
This method updates an existing Task connector with new parameter values.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_task_connector_with_http_info(process_id, task_id, connector_id, task_connector_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str process_id: ID of the process containing the task (required)
:param str task_id: ID of the task containing the connector (required)
:param str connector_id: ID of the Task Connector to update (required)
:param TaskConnectorUpdateItem task_connector_update_item: TaskConnector object containing the updated values (required)
:return: TaskConnector1
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['process_id', 'task_id', 'connector_id', 'task_connector_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_task_connector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'process_id' is set
if ('process_id' not in params) or (params['process_id'] is None):
raise ValueError("Missing the required parameter `process_id` when calling `update_task_connector`")
# verify the required parameter 'task_id' is set
if ('task_id' not in params) or (params['task_id'] is None):
raise ValueError("Missing the required parameter `task_id` when calling `update_task_connector`")
# verify the required parameter 'connector_id' is set
if ('connector_id' not in params) or (params['connector_id'] is None):
raise ValueError("Missing the required parameter `connector_id` when calling `update_task_connector`")
# verify the required parameter 'task_connector_update_item' is set
if ('task_connector_update_item' not in params) or (params['task_connector_update_item'] is None):
raise ValueError("Missing the required parameter `task_connector_update_item` when calling `update_task_connector`")
resource_path = '/processes/{process_id}/tasks/{task_id}/connectors/{connector_id}'.replace('{format}', 'json')
path_params = {}
if 'process_id' in params:
path_params['process_id'] = params['process_id']
if 'task_id' in params:
path_params['task_id'] = params['task_id']
if 'connector_id' in params:
path_params['connector_id'] = params['connector_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_connector_update_item' in params:
body_params = params['task_connector_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskConnector1',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def update_task_instance(self, task_instance_id, task_instance_update_item, **kwargs):
"""
This method updates an existing task instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_task_instance(task_instance_id, task_instance_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str task_instance_id: ID of the task instance to update (required)
:param TaskInstanceUpdateItem task_instance_update_item: Task Instance object containing the updated values (required)
:return: InlineResponse200
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_task_instance_with_http_info(task_instance_id, task_instance_update_item, **kwargs)
else:
data = self.update_task_instance_with_http_info(task_instance_id, task_instance_update_item, **kwargs)
return data
def update_task_instance_with_http_info(self, task_instance_id, task_instance_update_item, **kwargs):
"""
This method updates an existing task instance.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_task_instance_with_http_info(task_instance_id, task_instance_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str task_instance_id: ID of the task instance to update (required)
:param TaskInstanceUpdateItem task_instance_update_item: Task Instance object containing the updated values (required)
:return: InlineResponse200
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['task_instance_id', 'task_instance_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_task_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'task_instance_id' is set
if ('task_instance_id' not in params) or (params['task_instance_id'] is None):
raise ValueError("Missing the required parameter `task_instance_id` when calling `update_task_instance`")
# verify the required parameter 'task_instance_update_item' is set
if ('task_instance_update_item' not in params) or (params['task_instance_update_item'] is None):
raise ValueError("Missing the required parameter `task_instance_update_item` when calling `update_task_instance`")
resource_path = '/task_instances/{task_instance_id}'.replace('{format}', 'json')
path_params = {}
if 'task_instance_id' in params:
path_params['task_instance_id'] = params['task_instance_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'task_instance_update_item' in params:
body_params = params['task_instance_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse200',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
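# NOTE (editor's sketch, hypothetical usage): unlike the other update_*
# methods above, which issue PUT requests, update_task_instance issues a
# PATCH, which conventionally implies a partial update -- only the fields
# set on the TaskInstanceUpdateItem payload are expected to change, e.g.:
#
#     item = TaskInstanceUpdateItem(...)  # set only the fields to change
#     result = api.update_task_instance(task_instance_id, item)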
def update_user(self, id, user_update_item, **kwargs):
"""
This method updates an existing user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_user(id, user_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: ID of the user to update (required)
:param UserUpdateItem user_update_item: User object containing the updated values (required)
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_user_with_http_info(id, user_update_item, **kwargs)
else:
data = self.update_user_with_http_info(id, user_update_item, **kwargs)
return data
def update_user_with_http_info(self, id, user_update_item, **kwargs):
"""
This method updates an existing user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
...
>>> thread = api.update_user_with_http_info(id, user_update_item, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: ID of the user to update (required)
:param UserUpdateItem user_update_item: User object containing the updated values (required)
:return: UserItem
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'user_update_item']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_user" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_user`")
# verify the required parameter 'user_update_item' is set
if ('user_update_item' not in params) or (params['user_update_item'] is None):
raise ValueError("Missing the required parameter `user_update_item` when calling `update_user`")
resource_path = '/users/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'user_update_item' in params:
body_params = params['user_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/vnd.api+json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/vnd.api+json'])
# Authentication setting
auth_settings = ['PasswordGrant']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UserItem',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
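The `locals()`-plus-`kwargs` validation idiom used by `update_user_with_http_info` can be reduced to a standalone sketch (using Python 3's `dict.items()` in place of six's `iteritems`; the method and parameter names here are invented for illustration):

```python
def example_method(user_id, **kwargs):
    # Same idiom as the generated code: snapshot locals(), fold the
    # keyword arguments into the snapshot, and reject anything unexpected.
    all_params = ['user_id', 'callback', '_return_http_data_only']
    params = locals()
    for key, val in params['kwargs'].items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method example_method" % key
            )
        params[key] = val
    del params['kwargs']
    return params

validated = example_method('7', callback=None)
print(validated['user_id'])         # '7'

try:
    example_method('7', bogus=True)
except TypeError as exc:
    print(exc)                      # unexpected keyword rejected
```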
#
# PySNMP MIB module DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB
# Produced by pysmi-0.3.4 at Wed May 1 12:52:47 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, ConstraintsUnion, ValueSizeConstraint, SingleValueConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ConstraintsUnion", "ValueSizeConstraint", "SingleValueConstraint", "ConstraintsIntersection")
docsDevEvLevel, docsDevSwServer, docsDevEvId, docsDevServerDhcp, docsDevServerTime, docsDevSwFilename, docsDevEvText = mibBuilder.importSymbols("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel", "docsDevSwServer", "docsDevEvId", "docsDevServerDhcp", "docsDevServerTime", "docsDevSwFilename", "docsDevEvText")
docsIfCmtsCmStatusDocsisRegMode, docsIfCmtsCmStatusModulationType, docsIfCmStatusModulationType, docsIfCmCmtsAddress, docsIfCmtsCmStatusMacAddress, docsIfDocsisBaseCapability, docsIfCmStatusDocsisOperMode = mibBuilder.importSymbols("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode", "docsIfCmtsCmStatusModulationType", "docsIfCmStatusModulationType", "docsIfCmCmtsAddress", "docsIfCmtsCmStatusMacAddress", "docsIfDocsisBaseCapability", "docsIfCmStatusDocsisOperMode")
ifPhysAddress, = mibBuilder.importSymbols("IF-MIB", "ifPhysAddress")
NotificationGroup, ObjectGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ObjectGroup", "ModuleCompliance")
Counter64, Counter32, MibIdentifier, ObjectIdentity, iso, MibScalar, MibTable, MibTableRow, MibTableColumn, Bits, Gauge32, Unsigned32, TimeTicks, IpAddress, mib_2, Integer32, ModuleIdentity, NotificationType = mibBuilder.importSymbols("SNMPv2-SMI", "Counter64", "Counter32", "MibIdentifier", "ObjectIdentity", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Bits", "Gauge32", "Unsigned32", "TimeTicks", "IpAddress", "mib-2", "Integer32", "ModuleIdentity", "NotificationType")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
docsDevNotifMIB = ModuleIdentity((1, 3, 6, 1, 2, 1, 132))
docsDevNotifMIB.setRevisions(('2006-05-24 00:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    if mibBuilder.loadTexts: docsDevNotifMIB.setRevisionsDescriptions(('Initial version, published as RFC 4547.',))
if mibBuilder.loadTexts: docsDevNotifMIB.setLastUpdated('200605240000Z')
if mibBuilder.loadTexts: docsDevNotifMIB.setOrganization('IETF IP over Cable Data Network Working Group')
if mibBuilder.loadTexts: docsDevNotifMIB.setContactInfo(' Azlina Ahmad Postal: Cisco Systems, Inc. 170 West Tasman Drive San Jose, CA 95134, U.S.A. Phone: 408 853 7927 E-mail: azlina@cisco.com Greg Nakanishi Postal: Motorola 6450 Sequence Drive San Diego, CA 92121, U.S.A. Phone: 858 404 2366 E-mail: gnakanishi@motorola.com IETF IPCDN Working Group General Discussion: ipcdn@ietf.org Subscribe: http://www.ietf.org/mailman/listinfo/ipcdn Archive: ftp://ftp.ietf.org/ietf-mail-archive/ipcdn Co-chairs: Richard Woundy, richard_woundy@cable.comcast.com Jean-Francois Mule, jf.mule@cablelabs.com')
if mibBuilder.loadTexts: docsDevNotifMIB.setDescription('The Event Notification MIB is an extension of the CABLE DEVICE MIB. It defines various notification objects for both cable modem and cable modem termination systems. Two groups of SNMP notification objects are defined. One group is for notifying cable modem events, and one group is for notifying cable modem termination system events. DOCSIS defines numerous events, and each event is assigned to a functional category. This MIB defines a notification object for each functional category. The varbinding list of each notification includes information about the event that occurred on the device. Copyright (C) The Internet Society (2006). This version of this MIB module is part of RFC 4547; see the RFC itself for full legal notices.')
docsDevNotifControl = MibIdentifier((1, 3, 6, 1, 2, 1, 132, 1))
docsDevCmNotifs = MibIdentifier((1, 3, 6, 1, 2, 1, 132, 2, 0))
docsDevCmtsNotifs = MibIdentifier((1, 3, 6, 1, 2, 1, 132, 3, 0))
docsDevCmNotifControl = MibScalar((1, 3, 6, 1, 2, 1, 132, 1, 1), Bits().clone(namedValues=NamedValues(("cmInitTLVUnknownNotif", 0), ("cmDynServReqFailNotif", 1), ("cmDynServRspFailNotif", 2), ("cmDynServAckFailNotif", 3), ("cmBpiInitNotif", 4), ("cmBPKMNotif", 5), ("cmDynamicSANotif", 6), ("cmDHCPFailNotif", 7), ("cmSwUpgradeInitNotif", 8), ("cmSwUpgradeFailNotif", 9), ("cmSwUpgradeSuccessNotif", 10), ("cmSwUpgradeCVCNotif", 11), ("cmTODFailNotif", 12), ("cmDCCReqFailNotif", 13), ("cmDCCRspFailNotif", 14), ("cmDCCAckFailNotif", 15)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsDevCmNotifControl.setStatus('current')
if mibBuilder.loadTexts: docsDevCmNotifControl.setDescription('The object is used to enable specific CM notifications. For example, if the first bit is set, then docsDevCmInitTLVUnknownNotif is enabled. If it is not set, the notification is disabled. Note that notifications are also under the control of the MIB modules defined in RFC3413. If the device is rebooted, the value of this object SHOULD revert to the default value. ')
docsDevCmtsNotifControl = MibScalar((1, 3, 6, 1, 2, 1, 132, 1, 2), Bits().clone(namedValues=NamedValues(("cmtsInitRegReqFailNotif", 0), ("cmtsInitRegRspFailNotif", 1), ("cmtsInitRegAckFailNotif", 2), ("cmtsDynServReqFailNotif", 3), ("cmtsDynServRspFailNotif", 4), ("cmtsDynServAckFailNotif", 5), ("cmtsBpiInitNotif", 6), ("cmtsBPKMNotif", 7), ("cmtsDynamicSANotif", 8), ("cmtsDCCReqFailNotif", 9), ("cmtsDCCRspFailNotif", 10), ("cmtsDCCAckFailNotif", 11)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsDevCmtsNotifControl.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsNotifControl.setDescription('The object is used to enable specific CMTS notifications. For example, if the first bit is set, then docsDevCmtsInitRegReqFailNotif is enabled. If it is not set, the notification is disabled. Note that notifications are also under the control of the MIB modules defined in RFC3413. If the device is rebooted, the value of this object SHOULD revert to the default value. ')
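The BITS syntax used by docsDevCmNotifControl and docsDevCmtsNotifControl follows SMIv2 named-bit numbering: named bit 0 is the most significant bit of the first octet. A minimal standalone sketch of that encoding (the helper name is ours, not part of the MIB or of pysnmp):

```python
def bits_to_octets(enabled_bits):
    """Encode a set of named-bit positions as an SNMP BITS octet string.

    Per SMIv2, bit 0 is the MOST significant bit of the first octet.
    """
    if not enabled_bits:
        return b'\x00'
    octets = bytearray(max(enabled_bits) // 8 + 1)
    for bit in enabled_bits:
        octets[bit // 8] |= 0x80 >> (bit % 8)
    return bytes(octets)

# cmInitTLVUnknownNotif(0) and cmDHCPFailNotif(7) share the first octet.
print(bits_to_octets({0, 7}).hex())   # 81
# cmDCCAckFailNotif(15) alone needs a second octet.
print(bits_to_octets({15}).hex())     # 0001
```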
docsDevCmInitTLVUnknownNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 1)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmInitTLVUnknownNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmInitTLVUnknownNotif.setDescription('Notification to indicate that an unknown TLV was encountered during the TLV parsing process. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDynServReqFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 2)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDynServReqFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDynServReqFailNotif.setDescription('A notification to report the failure of a dynamic service request during the dynamic services process. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDynServRspFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 3)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDynServRspFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDynServRspFailNotif.setDescription(' A notification to report the failure of a dynamic service response during the dynamic services process. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDynServAckFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 4)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDynServAckFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDynServAckFailNotif.setDescription('A notification to report the failure of a dynamic service acknowledgement during the dynamic services process. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmBpiInitNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 5)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmBpiInitNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmBpiInitNotif.setDescription('A notification to report the failure of a BPI initialization attempt during the registration process. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmBPKMNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 6)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmBPKMNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmBPKMNotif.setDescription('A notification to report the failure of a Baseline Privacy Key Management (BPKM) operation. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDynamicSANotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 7)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDynamicSANotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDynamicSANotif.setDescription('A notification to report the failure of a dynamic security association operation. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDHCPFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 8)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-CABLE-DEVICE-MIB", "docsDevServerDhcp"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDHCPFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDHCPFailNotif.setDescription('A notification to report the failure of a DHCP operation. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsDevServerDhcp: the IP address of the DHCP server. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmSwUpgradeInitNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 9)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-CABLE-DEVICE-MIB", "docsDevSwFilename"), ("DOCS-CABLE-DEVICE-MIB", "docsDevSwServer"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmSwUpgradeInitNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmSwUpgradeInitNotif.setDescription('A notification to indicate that a software upgrade has been initiated on the device. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmSwUpgradeFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 10)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-CABLE-DEVICE-MIB", "docsDevSwFilename"), ("DOCS-CABLE-DEVICE-MIB", "docsDevSwServer"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmSwUpgradeFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmSwUpgradeFailNotif.setDescription('A notification to report the failure of a software upgrade attempt. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsDevSwFilename: the software image file name. - docsDevSwServer: the IP address of the server that the image is retrieved from. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmSwUpgradeSuccessNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 11)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-CABLE-DEVICE-MIB", "docsDevSwFilename"), ("DOCS-CABLE-DEVICE-MIB", "docsDevSwServer"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmSwUpgradeSuccessNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmSwUpgradeSuccessNotif.setDescription('A notification to report the software upgrade success status. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsDevSwFilename: the software image file name. - docsDevSwServer: the IP address of the server that the image is retrieved from. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmSwUpgradeCVCFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 12)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmSwUpgradeCVCFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmSwUpgradeCVCFailNotif.setDescription('A notification to report that the verification of the code file has failed during a secure software upgrade attempt. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmTODFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 13)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-CABLE-DEVICE-MIB", "docsDevServerTime"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmTODFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmTODFailNotif.setDescription('A notification to report the failure of a time of day operation. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsDevServerTime: the IP address of the time server. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDCCReqFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 14)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDCCReqFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDCCReqFailNotif.setDescription(' A notification to report the failure of a dynamic channel change request during the dynamic channel change process on the CM. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDCCRspFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 15)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDCCRspFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDCCRspFailNotif.setDescription('A notification to report the failure of a dynamic channel change response during the dynamic channel change process on the CM. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmDCCAckFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 2, 0, 16)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmCmtsAddress"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmStatusDocsisOperMode"), ("DOCS-IF-MIB", "docsIfCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmDCCAckFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmDCCAckFailNotif.setDescription('A notification to report the failure of a dynamic channel change acknowledgement during the dynamic channel change process on the CM. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - ifPhysAddress: the MAC address of the cable interface of this cable modem. - docsIfCmCmtsAddress: the MAC address of the CMTS to which the CM is connected (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface to which it is connected). - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmStatusDocsisOperMode: the QOS level (1.0, 1.1) that the CM is operating in. - docsIfCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsInitRegReqFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 1)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsInitRegReqFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsInitRegReqFailNotif.setDescription('A notification to report the failure of a registration request from a CM during the CM initialization process that was detected on the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsInitRegRspFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 2)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsInitRegRspFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsInitRegRspFailNotif.setDescription('A notification to report the failure of a registration response during the CM initialization process that was detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsInitRegAckFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 3)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsInitRegAckFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsInitRegAckFailNotif.setDescription('A notification to report the failure of a registration acknowledgement from the CM during the CM initialization process that was detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDynServReqFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 4)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDynServReqFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDynServReqFailNotif.setDescription('A notification to report the failure of a dynamic service request during the dynamic services process that was detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDynServRspFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 5)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDynServRspFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDynServRspFailNotif.setDescription('A notification to report the failure of a dynamic service response during the dynamic services process that was detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDynServAckFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 6)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDynServAckFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDynServAckFailNotif.setDescription('A notification to report the failure of a dynamic service acknowledgement during the dynamic services process that was detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsBpiInitNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 7)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsBpiInitNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsBpiInitNotif.setDescription('A notification to report the failure of a BPI initialization attempt during the CM registration process that was detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsBPKMNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 8)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsBPKMNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsBPKMNotif.setDescription('A notification to report the failure of a BPKM operation that is detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDynamicSANotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 9)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDynamicSANotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDynamicSANotif.setDescription('A notification to report the failure of a dynamic security association operation that is detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDCCReqFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 10)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDCCReqFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDCCReqFailNotif.setDescription('A notification to report the failure of a dynamic channel change request during the dynamic channel change process and is detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDCCRspFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 11)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDCCRspFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDCCRspFailNotif.setDescription('A notification to report the failure of a dynamic channel change response during the dynamic channel change process and is detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
docsDevCmtsDCCAckFailNotif = NotificationType((1, 3, 6, 1, 2, 1, 132, 3, 0, 12)).setObjects(("DOCS-CABLE-DEVICE-MIB", "docsDevEvLevel"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvId"), ("DOCS-CABLE-DEVICE-MIB", "docsDevEvText"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusMacAddress"), ("IF-MIB", "ifPhysAddress"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusDocsisRegMode"), ("DOCS-IF-MIB", "docsIfDocsisBaseCapability"), ("DOCS-IF-MIB", "docsIfCmtsCmStatusModulationType"))
if mibBuilder.loadTexts: docsDevCmtsDCCAckFailNotif.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsDCCAckFailNotif.setDescription('A notification to report the failure of a dynamic channel change acknowledgement during the dynamic channel change process and is detected by the CMTS. This notification sends additional information about the event by including the following objects in its varbinding list. - docsDevEvLevel: the priority level associated with the event. - docsDevEvId: the unique identifier of the event that occurred. - docsDevEvText: a textual description of the event. - docsIfCmtsCmStatusMacAddress: the MAC address of the CM with which this notification is associated. - ifPhysAddress: the MAC address of the CMTS (if there is a cable card/interface in the CMTS, then it is actually the MAC address of the cable interface that connected to the CM) cable interface connected to the CM. - docsIfCmtsCmStatusDocsisRegMode: the QOS level (1.0, 1.1) that the reporting CM is operating in. - docsIfDocsisBaseCapability: the highest version of the DOCSIS specification (1.0, 1.1, 2.0) that the device is capable of supporting. - docsIfCmtsCmStatusModulationType: the upstream modulation methodology used by the CM. ')
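Each NotificationType above binds a symbolic name to a numeric OID tuple. For quick reference when matching incoming traps, such a tuple can be rendered in the dotted notation that trap receivers display. The sketch below is illustrative and not part of the generated module; the table merely copies a few OIDs from the definitions above:

```python
# Illustrative lookup for a few of the notification OIDs defined above.
NOTIF_OIDS = {
    'docsDevCmDCCAckFailNotif':       (1, 3, 6, 1, 2, 1, 132, 2, 0, 16),
    'docsDevCmtsInitRegReqFailNotif': (1, 3, 6, 1, 2, 1, 132, 3, 0, 1),
    'docsDevCmtsDCCAckFailNotif':     (1, 3, 6, 1, 2, 1, 132, 3, 0, 12),
}

def dotted(oid):
    """Render an OID tuple in dotted notation, e.g. '1.3.6.1.2.1.132.2.0.16'."""
    return '.'.join(str(arc) for arc in oid)
```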
docsDevNotifConformance = MibIdentifier((1, 3, 6, 1, 2, 1, 132, 4))
docsDevNotifGroups = MibIdentifier((1, 3, 6, 1, 2, 1, 132, 4, 1))
docsDevNotifCompliances = MibIdentifier((1, 3, 6, 1, 2, 1, 132, 4, 2))
docsDevCmNotifCompliance = ModuleCompliance((1, 3, 6, 1, 2, 1, 132, 4, 2, 1)).setObjects(("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmNotifControlGroup"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmNotificationGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsDevCmNotifCompliance = docsDevCmNotifCompliance.setStatus('current')
if mibBuilder.loadTexts: docsDevCmNotifCompliance.setDescription('The compliance statement for CM Notifications and Control.')
docsDevCmNotifControlGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 132, 4, 1, 1)).setObjects(("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmNotifControl"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsDevCmNotifControlGroup = docsDevCmNotifControlGroup.setStatus('current')
if mibBuilder.loadTexts: docsDevCmNotifControlGroup.setDescription('This group represents objects that allow control over CM Notifications.')
docsDevCmNotificationGroup = NotificationGroup((1, 3, 6, 1, 2, 1, 132, 4, 1, 2)).setObjects(("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmInitTLVUnknownNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDynServReqFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDynServRspFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDynServAckFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmBpiInitNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmBPKMNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDynamicSANotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDHCPFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmSwUpgradeInitNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmSwUpgradeFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmSwUpgradeSuccessNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmSwUpgradeCVCFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmTODFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDCCReqFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDCCRspFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmDCCAckFailNotif"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsDevCmNotificationGroup = docsDevCmNotificationGroup.setStatus('current')
if mibBuilder.loadTexts: docsDevCmNotificationGroup.setDescription('A collection of CM notifications providing device status and control.')
docsDevCmtsNotifCompliance = ModuleCompliance((1, 3, 6, 1, 2, 1, 132, 4, 2, 2)).setObjects(("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsNotifControlGroup"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsNotificationGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsDevCmtsNotifCompliance = docsDevCmtsNotifCompliance.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsNotifCompliance.setDescription('The compliance statement for DOCSIS CMTS Notification and Control.')
docsDevCmtsNotifControlGroup = ObjectGroup((1, 3, 6, 1, 2, 1, 132, 4, 1, 3)).setObjects(("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsNotifControl"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsDevCmtsNotifControlGroup = docsDevCmtsNotifControlGroup.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsNotifControlGroup.setDescription('This group represents objects that allow control over CMTS Notifications.')
docsDevCmtsNotificationGroup = NotificationGroup((1, 3, 6, 1, 2, 1, 132, 4, 1, 4)).setObjects(("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsInitRegReqFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsInitRegRspFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsInitRegAckFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDynServReqFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDynServRspFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDynServAckFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsBpiInitNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsBPKMNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDynamicSANotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDCCReqFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDCCRspFailNotif"), ("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", "docsDevCmtsDCCAckFailNotif"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsDevCmtsNotificationGroup = docsDevCmtsNotificationGroup.setStatus('current')
if mibBuilder.loadTexts: docsDevCmtsNotificationGroup.setDescription('A collection of CMTS notifications providing device status and control.')
mibBuilder.exportSymbols("DOCS-IETF-CABLE-DEVICE-NOTIFICATION-MIB", docsDevCmtsNotifCompliance=docsDevCmtsNotifCompliance, docsDevCmSwUpgradeInitNotif=docsDevCmSwUpgradeInitNotif, docsDevCmDynServRspFailNotif=docsDevCmDynServRspFailNotif, docsDevNotifCompliances=docsDevNotifCompliances, docsDevCmtsDynServRspFailNotif=docsDevCmtsDynServRspFailNotif, docsDevCmtsDCCReqFailNotif=docsDevCmtsDCCReqFailNotif, docsDevNotifControl=docsDevNotifControl, docsDevCmtsDynServReqFailNotif=docsDevCmtsDynServReqFailNotif, PYSNMP_MODULE_ID=docsDevNotifMIB, docsDevCmtsNotifs=docsDevCmtsNotifs, docsDevCmtsNotifControl=docsDevCmtsNotifControl, docsDevCmtsNotificationGroup=docsDevCmtsNotificationGroup, docsDevCmBPKMNotif=docsDevCmBPKMNotif, docsDevNotifMIB=docsDevNotifMIB, docsDevCmtsDynamicSANotif=docsDevCmtsDynamicSANotif, docsDevCmtsInitRegRspFailNotif=docsDevCmtsInitRegRspFailNotif, docsDevNotifGroups=docsDevNotifGroups, docsDevCmDynServReqFailNotif=docsDevCmDynServReqFailNotif, docsDevCmNotificationGroup=docsDevCmNotificationGroup, docsDevCmInitTLVUnknownNotif=docsDevCmInitTLVUnknownNotif, docsDevCmtsNotifControlGroup=docsDevCmtsNotifControlGroup, docsDevCmDynServAckFailNotif=docsDevCmDynServAckFailNotif, docsDevCmtsInitRegReqFailNotif=docsDevCmtsInitRegReqFailNotif, docsDevCmDCCRspFailNotif=docsDevCmDCCRspFailNotif, docsDevCmNotifCompliance=docsDevCmNotifCompliance, docsDevCmDCCReqFailNotif=docsDevCmDCCReqFailNotif, docsDevCmDynamicSANotif=docsDevCmDynamicSANotif, docsDevCmtsDynServAckFailNotif=docsDevCmtsDynServAckFailNotif, docsDevNotifConformance=docsDevNotifConformance, docsDevCmDCCAckFailNotif=docsDevCmDCCAckFailNotif, docsDevCmtsBpiInitNotif=docsDevCmtsBpiInitNotif, docsDevCmNotifControl=docsDevCmNotifControl, docsDevCmNotifs=docsDevCmNotifs, docsDevCmDHCPFailNotif=docsDevCmDHCPFailNotif, docsDevCmSwUpgradeCVCFailNotif=docsDevCmSwUpgradeCVCFailNotif, docsDevCmtsBPKMNotif=docsDevCmtsBPKMNotif, docsDevCmtsDCCRspFailNotif=docsDevCmtsDCCRspFailNotif, 
docsDevCmtsDCCAckFailNotif=docsDevCmtsDCCAckFailNotif, docsDevCmNotifControlGroup=docsDevCmNotifControlGroup, docsDevCmBpiInitNotif=docsDevCmBpiInitNotif, docsDevCmSwUpgradeFailNotif=docsDevCmSwUpgradeFailNotif, docsDevCmTODFailNotif=docsDevCmTODFailNotif, docsDevCmSwUpgradeSuccessNotif=docsDevCmSwUpgradeSuccessNotif, docsDevCmtsInitRegAckFailNotif=docsDevCmtsInitRegAckFailNotif)
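The exportSymbols call above makes the module's symbols importable by name. Note that the CM-side notifications are registered under the 1.3.6.1.2.1.132.2.0 arc and the CMTS-side ones under 1.3.6.1.2.1.132.3.0; a trap receiver can use that split to route events. A minimal sketch, using only the arcs visible in the definitions above (the helper name is made up for illustration):

```python
# Classify a trap OID from this MIB as CM-side or CMTS-side by its branch.
CM_NOTIFS = (1, 3, 6, 1, 2, 1, 132, 2, 0)
CMTS_NOTIFS = (1, 3, 6, 1, 2, 1, 132, 3, 0)

def notif_side(oid):
    """Return 'CM', 'CMTS', or None for an OID tuple from this module."""
    if oid[:len(CM_NOTIFS)] == CM_NOTIFS:
        return 'CM'
    if oid[:len(CMTS_NOTIFS)] == CMTS_NOTIFS:
        return 'CMTS'
    return None
```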
# main.py from ReflexTheLegend/RPC-Tool (MIT license)
from pypresence import *
import time
import psutil
from colorama import Fore
import colorama
import os

colorama.init()  # enable ANSI color handling on Windows consoles
# Simple spinner-style loading animation (cosmetic only)
for frame in ['|', '/', '-', '\\'] * 20:  # 80 frames at 0.1 s each, roughly 8 seconds
    os.system('cls')
    print(f'{Fore.LIGHTYELLOW_EX}Loading... {frame}')
    time.sleep(0.1)
os.system('cls')
print(f'{Fore.LIGHTBLUE_EX}Ready!')
time.sleep(2)
os.system('cls')
colorama.init()
print(f'''{Fore.LIGHTWHITE_EX}
/$$$$$$$ /$$$$$$$ /$$$$$$ /$$$$$$$$ /$$
| $$__ $$| $$__ $$ /$$__ $$ |__ $$__/ | $$
| $$ \ $$| $$ \ $$| $$ \__/ | $$ /$$$$$$ /$$$$$$ | $$
| $$$$$$$/| $$$$$$$/| $$ | $$ /$$__ $$ /$$__ $$| $$
| $$__ $$| $$____/ | $$ | $$| $$ \ $$| $$ \ $$| $$
| $$ \ $$| $$ | $$ $$ | $$| $$ | $$| $$ | $$| $$
| $$ | $$| $$ | $$$$$$/ | $$| $$$$$$/| $$$$$$/| $$
|__/ |__/|__/ \______/ |__/ \______/ \______/ |__/
''') # ASCII art banner
time.sleep(0.5)
print(f'{Fore.LIGHTBLACK_EX}Made by: {Fore.WHITE}R3FL3X#1337 {Fore.LIGHTBLACK_EX}| Licensed under {Fore.WHITE}MIT License')
time.sleep(1)
client_id = input(f"{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input Application ID: \n>>>{Fore.WHITE} ")
details = input(f"{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want RAM usage as RPC details? (yes/no) \n>>>{Fore.WHITE} ")
if details == 'yes':
    # RAM usage is sampled once at startup; it is not refreshed while the RPC runs
    details = round(psutil.virtual_memory().percent, 1)
    details1 = "RAM: " + str(details) + "%"
elif(details=='no'):
details1 = input(f"{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Input details part text: \n>>>{Fore.WHITE} ")
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please choose between yes and no!")
    raise SystemExit(1)  # details1 would otherwise be undefined later
state = input(f"{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want CPU usage as RPC state? (yes/no) \n>>>{Fore.WHITE} ")
if state == "yes":
    # interval=0.1 blocks briefly so the reading is meaningful (a first
    # no-interval cpu_percent() call always returns 0.0); like the RAM
    # figure, this is sampled once and not refreshed
    state = round(psutil.cpu_percent(interval=0.1), 1)
    state1 = "CPU: " + str(state) + "%"
elif(state=="no"):
state1 = input(f"{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Input state part text: \n>>>{Fore.WHITE} ")
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please choose between yes and no!")
    raise SystemExit(1)  # state1 would otherwise be undefined later
l_image = input(f"{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want Large Image on your RPC? (yes/no) \n>>>{Fore.WHITE} ")
if(l_image=='yes'):
    l_image = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the large image asset name from the developer portal application: \n>>>{Fore.WHITE} ')
elif(l_image=='no'):
l_image = ''
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please choose between yes and no!")
    raise SystemExit(1)  # otherwise the raw answer would be used as the image name
s_image = input(f'{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want Small Image on your RPC? (yes/no) \n>>>{Fore.WHITE} ')
if(s_image=='yes'):
    s_image = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the small image asset name from the developer portal application: \n>>>{Fore.WHITE} ')
elif(s_image=='no'):
s_image = ''
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please choose between yes and no!")
    raise SystemExit(1)  # otherwise the raw answer would be used as the image name
l_text = input(f'{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want Large Text on your RPC? (yes/no) \n>>>{Fore.WHITE} ')
if(l_text=='yes'):
    l_text = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the large text (tooltip shown over the large image): \n>>>{Fore.WHITE} ')
elif(l_text=='no'):
l_text = ''
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please choose between yes and no!")
    raise SystemExit(1)  # otherwise the raw answer would be used as the tooltip text
s_text = input(f'{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want Small Text on your RPC? (yes/no) \n>>>{Fore.WHITE} ')
if(s_text=='yes'):
    s_text = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the small text (tooltip shown over the small image): \n>>>{Fore.WHITE} ')
elif(s_text=='no'):
s_text = ''
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please choose between yes and no!")
    raise SystemExit(1)  # otherwise the raw answer would be used as the tooltip text
buts = input(f"{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] Do you want buttons in your RPC? (yes/no) \n>>>{Fore.WHITE} ")
if buts == 'no':
    buttons = None
elif buts == 'yes':
    buts2 = int(input(f'{Fore.LIGHTBLACK_EX}[{Fore.YELLOW}?{Fore.LIGHTBLACK_EX}] 1 or 2 buttons?\n>>>{Fore.WHITE} '))
    if buts2 == 1:
        label1 = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the button`s label: \n>>>{Fore.WHITE} ')
        url1 = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the button`s URL link: \n>>>{Fore.WHITE} ')
        buttons = [{"label": label1, "url": url1}]
    elif buts2 == 2:
        label1 = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the 1st button`s label: \n>>>{Fore.WHITE} ')
        url1 = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the 1st button`s URL link: \n>>>{Fore.WHITE} ')
        label2 = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the 2nd button`s label: \n>>>{Fore.WHITE} ')
        url2 = input(f'{Fore.LIGHTBLACK_EX}[{Fore.GREEN}+{Fore.LIGHTBLACK_EX}] Please input the 2nd button`s URL link: \n>>>{Fore.WHITE} ')
        buttons = [{"label": label1, "url": url1}, {"label": label2, "url": url2}]
    else:
        print(f'{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please select only between 1 and 2!')
        raise SystemExit(1)
else:
    print(f"{Fore.LIGHTBLACK_EX}[{Fore.RED}X{Fore.LIGHTBLACK_EX}] Please select between yes or no only!")
    raise SystemExit(1)
# Finished collecting RPC components
print(f'{Fore.BLUE}RPC is running!')
RPC = Presence(client_id, pipe=0)  # Initialize the client
RPC.connect()  # Perform the handshake with the local Discord client

def rpc():
    start_time = time.time()
    while True:
        # Set the presence; buttons is None when no buttons were requested
        RPC.update(start=start_time, details=details1, state=state1,
                   large_image=l_image, small_image=s_image,
                   large_text=l_text, small_text=s_text, buttons=buttons)
        time.sleep(1)

rpc()  # Start the RPC update loop
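The long run of repeated "Loading..." frames above is just a four-frame spinner cycled many times. A minimal, self-contained sketch of the same idea (the `spinner_frames` helper is illustrative, not part of the original tool):

```python
import itertools

def spinner_frames(n, frames=('|', '/', '-', '\\')):
    """Return the first n frames of an endlessly repeating spinner cycle."""
    return list(itertools.islice(itertools.cycle(frames), n))

# One full revolution plus two extra frames
print(spinner_frames(6))  # → ['|', '/', '-', '\\', '|', '/']
```

Printing each frame with a short sleep and a screen clear reproduces the animation without duplicating the print/sleep/clear triplet dozens of times.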
| 36.091429 | 244 | 0.640674 | 1,912 | 12,632 | 4.11454 | 0.070084 | 0.071819 | 0.116944 | 0.164739 | 0.855726 | 0.85293 | 0.846701 | 0.84149 | 0.837931 | 0.837931 | 0 | 0.019155 | 0.132125 | 12,632 | 349 | 245 | 36.194842 | 0.69844 | 0.015595 | 0 | 0.761062 | 0 | 0.091445 | 0.57584 | 0.346498 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00295 | false | 0 | 0.017699 | 0 | 0.020649 | 0.271386 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
00821f9bed1ee73aac2d80dd91fc5bc2b87fd27d | 201 | py | Python | src/apps/about/admin.py | rko619619/Skidon | fe09d0d87edb973c0cb1f20478e398bc69899d1b | [
"Apache-2.0"
] | null | null | null | src/apps/about/admin.py | rko619619/Skidon | fe09d0d87edb973c0cb1f20478e398bc69899d1b | [
"Apache-2.0"
] | 1 | 2020-04-11T18:55:09.000Z | 2020-04-11T18:55:21.000Z | src/apps/about/admin.py | rko619619/Skidon | fe09d0d87edb973c0cb1f20478e398bc69899d1b | [
"Apache-2.0"
] | null | null | null | from django.contrib import admin
from . import models
admin.site.register(models.Katalog)
admin.site.register(models.Discount)
admin.site.register(models.Post_kateg)
admin.site.register(models.Post)
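For reference, the same registrations can be written with Django's `@admin.register` decorator when a model needs custom admin options; a sketch (the `list_display` value here is illustrative, not taken from the original app):

```python
from django.contrib import admin

from . import models


@admin.register(models.Katalog)
class KatalogAdmin(admin.ModelAdmin):
    list_display = ('id',)  # illustrative column list for the change view


# Models without custom options can still be registered directly
admin.site.register(models.Discount)
admin.site.register(models.Post_kateg)
admin.site.register(models.Post)
```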
| 22.333333 | 38 | 0.820896 | 29 | 201 | 5.655172 | 0.413793 | 0.219512 | 0.414634 | 0.560976 | 0.329268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069652 | 201 | 8 | 39 | 25.125 | 0.877005 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
dac8a2d19c50cc7311a374a5d815dcc35c703122 | 8,925 | py | Python | my_exp_pose_grps.py | pulkitag/egomotion | fad2ab94b0c2f5533c79a01d1b0546b8d0c64f19 | [
"BSD-3-Clause"
] | 9 | 2017-11-25T14:24:23.000Z | 2022-03-25T07:08:28.000Z | my_exp_pose_grps.py | pulkitag/egomotion | fad2ab94b0c2f5533c79a01d1b0546b8d0c64f19 | [
"BSD-3-Clause"
] | null | null | null | my_exp_pose_grps.py | pulkitag/egomotion | fad2ab94b0c2f5533c79a01d1b0546b8d0c64f19 | [
"BSD-3-Clause"
] | 3 | 2017-10-13T02:30:28.000Z | 2021-06-30T05:55:42.000Z | import street_exp_v2 as sev2
import street_label_utils as slu
import my_exp_config as mec
import street_config as cfg
REAL_PATH = cfg.REAL_PATH
DEF_DB = cfg.DEF_DB % ('default', '%s')
def simple_euler_dof2_dcv2_smallnetv5(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=2)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': 0, 'baseNetDefProto': 'smallnet-v5_window_siamese_fc5'}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof2_dcv2_amirnetv5(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=2)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': 0, 'baseNetDefProto': 'pose_amir_window_siamese'}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof2_dcv2_doublefcv1(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=2)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': 0, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6'}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof5_dcv2_doublefcv1(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0, readSingleGrp=False, ncpu=0, resumeIter=None):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=5, nrmlz=True)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': ncpu, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6',
'readSingleGrp': readSingleGrp}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs, resumeIter=resumeIter)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof6_dcv2_doublefcv1_rolljitter(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0, readSingleGrp=False, ncpu=0, resumeIter=None,
maxRollJitter=15):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=6, nrmlz=True)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': ncpu, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6',
'readSingleGrp': readSingleGrp,
'maxRollJitter': maxRollJitter}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs, resumeIter=resumeIter)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof5_dcv2_doublefcv1_l2loss(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0, readSingleGrp=False, ncpu=0, resumeIter=None):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=5, nrmlz=True)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': ncpu, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6',
'lossNetDefProto': 'pose_loss_layers',
'readSingleGrp': readSingleGrp}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs, resumeIter=resumeIter)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_quat_dof5_dcv2_doublefcv1_l2loss(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0, readSingleGrp=False, ncpu=0, resumeIter=None):
posePrms = slu.PosePrms(maxRot=90, simpleRot=False, dof=5, nrmlz=True, angleType='quat')
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': ncpu, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6',
'lossNetDefProto': 'pose_loss_layers',
'readSingleGrp': readSingleGrp}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs, resumeIter=resumeIter)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def euler_dof3_dcv2_doublefcv1(isRun=False,
gradClip=10, stepsize=20000, base_lr=0.001,
gamma=0.5, deviceId=0):
posePrms = slu.PosePrms(maxRot=90, simpleRot=False, dof=3)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': 0, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6'}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof2_dcv2_doublefcv1_diff(isRun=False,
gradClip=30, stepsize=60000, base_lr=0.001,
gamma=0.1, deviceId=0):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=2)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': 0, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6_diff'}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
def simple_euler_dof2_dcv2_doublefcv1_diff_no_common_fc(isRun=False,
gradClip=30, stepsize=60000, base_lr=0.001,
gamma=0.1, deviceId=0, resumeIter=None):
posePrms = slu.PosePrms(maxRot=90, simpleRot=True, dof=2)
dPrms = sev2.get_data_prms(lbPrms=posePrms)
nwFn = sev2.process_net_prms
nwArgs = {'ncpu': 0, 'baseNetDefProto': 'doublefc-v1_window_siamese_fc6_diff_no-common-fc'}
solFn = mec.get_default_solver_prms
solArgs = {'dbFile': DEF_DB % 'sol', 'clip_gradients': gradClip,
'stepsize': stepsize, 'base_lr':base_lr, 'gamma':gamma}
cPrms = mec.get_caffe_prms(nwFn=nwFn, nwPrms=nwArgs,
solFn=solFn, solPrms=solArgs, resumeIter=resumeIter)
exp = mec.CaffeSolverExperiment(dPrms, cPrms,
netDefFn=sev2.make_net_def, isLog=True)
if isRun:
exp.make(deviceId=deviceId)
exp.run()
return exp
| 41.705607 | 93 | 0.699496 | 1,192 | 8,925 | 5.02349 | 0.092282 | 0.03006 | 0.03006 | 0.0167 | 0.945224 | 0.94155 | 0.94155 | 0.94155 | 0.931697 | 0.931697 | 0 | 0.035923 | 0.176583 | 8,925 | 213 | 94 | 41.901408 | 0.778881 | 0 | 0 | 0.850515 | 0 | 0 | 0.120686 | 0.035522 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051546 | false | 0 | 0.020619 | 0 | 0.123711 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
97162abedff07e309383a8adbff8b7bbe9c75591 | 147 | py | Python | wedc/service/local/__init__.py | usc-isi-i2/WEDC | cf48355d8a5c6616fb34be9932520875e218d2c4 | [
"Apache-2.0"
] | null | null | null | wedc/service/local/__init__.py | usc-isi-i2/WEDC | cf48355d8a5c6616fb34be9932520875e218d2c4 | [
"Apache-2.0"
] | null | null | null | wedc/service/local/__init__.py | usc-isi-i2/WEDC | cf48355d8a5c6616fb34be9932520875e218d2c4 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# @Author: ZwEin
# @Date: 2016-06-20 11:42:41
# @Last Modified by: ZwEin
# @Last Modified time: 2016-06-20 11:46:47
| 16.333333 | 42 | 0.598639 | 25 | 147 | 3.52 | 0.72 | 0.136364 | 0.181818 | 0.227273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.247863 | 0.204082 | 147 | 8 | 43 | 18.375 | 0.504274 | 0.904762 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
971aef4b5a967853ada0f8d2b27ed8a3c79cf30b | 154 | py | Python | UCDPA_gavinhoran_Pycharm/nicedate.py | gwavin/UCDPA_gavinhoran | eb21c55098ceae37952344aa80143ab3da42e443 | [
"MIT"
] | null | null | null | UCDPA_gavinhoran_Pycharm/nicedate.py | gwavin/UCDPA_gavinhoran | eb21c55098ceae37952344aa80143ab3da42e443 | [
"MIT"
] | null | null | null | UCDPA_gavinhoran_Pycharm/nicedate.py | gwavin/UCDPA_gavinhoran | eb21c55098ceae37952344aa80143ab3da42e443 | [
"MIT"
] | null | null | null | import datetime
# Parse a "YYYY-Www" string (e.g. "2021-W05") into the datetime of that
# week's Monday, using Python's %W (Monday-based) week numbering.
def nicedate(year_week):
return datetime.datetime.strptime('{0}-1'.format(year_week), "%Y-W%W-%w")
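A quick usage check for `nicedate`: appending `-1` selects weekday 1 (Monday) of the given `%W` week, so week 05 of 2021 resolves to 1 February.

```python
import datetime

def nicedate(year_week):
    return datetime.datetime.strptime('{0}-1'.format(year_week), "%Y-W%W-%w")

print(nicedate('2021-W05'))  # → 2021-02-01 00:00:00
```

Note that `%W` is not ISO-8601 week numbering (`%V`): days before the year's first Monday fall in week 00.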
| 25.666667 | 77 | 0.733766 | 23 | 154 | 4.826087 | 0.565217 | 0.252252 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014599 | 0.11039 | 154 | 5 | 78 | 30.8 | 0.79562 | 0.201299 | 0 | 0 | 0 | 0 | 0.115702 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
977ebca7c07652f5648dd58a3c82d71183678375 | 166 | py | Python | base/handler/wrappers/__init__.py | vralex/RumbleRunner | eb9889daf90846176af292d4e7411c41dac885c8 | [
"MIT"
] | 2 | 2022-01-26T15:06:02.000Z | 2022-02-03T05:14:52.000Z | base/handler/wrappers/__init__.py | vralex/RumbleRunner | eb9889daf90846176af292d4e7411c41dac885c8 | [
"MIT"
] | 1 | 2022-02-07T23:50:26.000Z | 2022-02-07T23:50:26.000Z | base/handler/wrappers/__init__.py | vralex/RumbleRunner | eb9889daf90846176af292d4e7411c41dac885c8 | [
"MIT"
] | 1 | 2022-02-07T23:19:16.000Z | 2022-02-07T23:19:16.000Z | from base.handler.wrappers.context import Context
from base.handler.wrappers.message import Message, CallbackData
from base.handler.wrappers.requests import Requests
| 41.5 | 63 | 0.861446 | 22 | 166 | 6.5 | 0.409091 | 0.167832 | 0.314685 | 0.482517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078313 | 166 | 3 | 64 | 55.333333 | 0.934641 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
979609c6383e40afea244bf44e43b4ba382df20a | 20,186 | py | Python | model-optimizer/extensions/middle/L2NormFusing_test.py | Andruxin52rus/openvino | d824e371fe7dffb90e6d3d58e4e34adecfce4606 | [
"Apache-2.0"
] | null | null | null | model-optimizer/extensions/middle/L2NormFusing_test.py | Andruxin52rus/openvino | d824e371fe7dffb90e6d3d58e4e34adecfce4606 | [
"Apache-2.0"
] | 21 | 2021-02-16T13:02:05.000Z | 2022-02-21T13:05:06.000Z | model-optimizer/extensions/middle/L2NormFusing_test.py | mmakridi/openvino | 769bb7709597c14debdaa356dd60c5a78bdfa97e | [
"Apache-2.0"
] | null | null | null | """
Copyright (C) 2018-2021 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import unittest
import numpy as np
from extensions.middle.L2NormFusing import L2NormToNorm
from mo.front.common.partial_infer.utils import int64_array
from mo.utils.ir_engine.compare_graphs import compare_graphs
from mo.utils.unittest.graph import build_graph_with_attrs
# A list with nodes attributes used to build various graphs.
nodes = [
('l2_normalize_mul', dict(kind='op', op='Mul', name='l2_norm_name')),
('l2_normalize_mul_data', dict(kind='data')),
('maximum', dict(kind='op', op='Maximum')),
('maximum_data', dict(kind='data')),
('maximum_y_const', dict(kind='op', op='Const', value=np.array(12.e-13, dtype=np.float32))),
('maximum_y_data', dict(kind='data', value=np.array(12.e-13, dtype=np.float32))),
('rsqrt_pow', dict(kind='data', value=-0.5)),
('rsqrt', dict(kind='op', op='Pow')),
('rsqrt_data', dict(kind='data')),
('square_pow', dict(kind='op', op='Const', value=2.)),
('square_pow_data', dict(kind='data', value=2.)),
('square', dict(kind='op', op='Pow')),
('sum', dict(kind='op', op='ReduceSum')),
('sum_data', dict(kind='data')),
('sum_axes', dict(kind='op', op='Const')),
# nodes added after replacement
('normalize_node', dict(kind='op', op='NormalizeL2')),
('weights_node', dict(kind='op', op='Const')),
('result', dict(kind='op', op='Result'))
]
edges = [
('input', 'input_data', {'out': 0}),
('input_data', 'square', {'in': 0}),
('square_pow', 'square_pow_data', {'out': 0}),
('square_pow_data', 'square', {'in': 1}),
('square', 'square_data'),
('square_data', 'sum'),
('sum_axes', 'sum_axes_data'),
('sum_axes_data', 'sum'),
('sum', 'sum_data'),
('maximum_y_const', 'maximum_y_data'),
('maximum_y_data', 'maximum'),
('sum_data', 'maximum'),
('maximum', 'maximum_data'),
('maximum_data', 'rsqrt', {'in': 0}),
('rsqrt_pow', 'rsqrt', {'in': 1}),
('rsqrt', 'rsqrt_data'),
('rsqrt_data', 'l2_normalize_mul'),
('input_data', 'l2_normalize_mul'),
('l2_normalize_mul', 'l2_normalize_mul_data'),
('l2_normalize_mul_data', 'result'),
]
edges_after_replacement = [
('input', 'input_data', {'out': 0}),
('input_data', 'normalize_node'),
('weights_node', 'weights_node_data'),
('weights_node_data', 'normalize_node'),
('normalize_node', 'l2_normalize_mul_data'),
('l2_normalize_mul_data', 'result'),
]
class L2NormToNormTest(unittest.TestCase):
def test_2D(self):
input_shape = int64_array([1, 300])
axes = int64_array([1])
graph = build_graph_with_attrs(nodes + [
('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
('square_data', dict(kind='data', shape=input_shape)),
('sum_axes_data', dict(kind='data', value=axes, shape=axes.shape)),
], edges, nodes_with_edges_only=True)
graph.stage = 'middle'
L2NormToNorm().find_and_replace_pattern(graph)
graph_ref = build_graph_with_attrs(nodes + [
('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
('weights_node_data', dict(kind='data', value=axes.sort())),
], edges_after_replacement, nodes_with_edges_only=True)
(flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
self.assertTrue(graph.node[graph.get_nodes_with_attributes(type='NormalizeL2')[0]]['name'] == 'l2_norm_name')
        self.assertTrue(flag, resp)

    def test_2D_scalar_axis(self):
        input_shape = int64_array([1, 300])
        axes = int64_array(1)
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            # np.sort returns a sorted copy; ndarray.sort() sorts in place and returns None
            ('weights_node_data', dict(kind='data', value=np.sort(int64_array([axes])))),
        ], edges_after_replacement, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(graph.node[graph.get_nodes_with_attributes(type='NormalizeL2')[0]]['name'] == 'l2_norm_name')
        self.assertTrue(flag, resp)

    def test_3D(self):
        input_shape = int64_array([1, 300, 300])
        axes = int64_array([1, 2])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('weights_node_data', dict(kind='data', value=np.sort(axes))),
        ], edges_after_replacement, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(graph.node[graph.get_nodes_with_attributes(type='NormalizeL2')[0]]['name'] == 'l2_norm_name')
        self.assertTrue(flag, resp)

    def test_4D(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([1, 2, 3])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('weights_node_data', dict(kind='data', value=np.sort(axes))),
        ], edges_after_replacement, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(graph.node[graph.get_nodes_with_attributes(type='NormalizeL2')[0]]['name'] == 'l2_norm_name')
        self.assertTrue(flag, resp)

    def test_4D_mixed_axes(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([3, 1, 2])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('weights_node_data', dict(kind='data', value=np.sort(axes))),
        ], edges_after_replacement, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(graph.node[graph.get_nodes_with_attributes(type='NormalizeL2')[0]]['name'] == 'l2_norm_name')
        self.assertTrue(flag, resp)

    def test_4D_multiple_consumers(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([1, 2, 3])
        weights_value = np.ones(shape=int64_array([input_shape[-1]]), dtype=np.float32)
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
            ('result_2', dict(kind='op', op='Result'))
        ], edges + [('input_data', 'result_2')], nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('weights_node_data', dict(kind='data', value=np.sort(axes))),
            ('result_2', dict(kind='op', op='Result'))
        ], edges_after_replacement + [('input_data', 'result_2')], nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(graph.node[graph.get_nodes_with_attributes(type='NormalizeL2')[0]]['name'] == 'l2_norm_name')
        self.assertTrue(flag, resp)
    def test_1D_negative(self):
        input_shape = int64_array([300])
        axes = int64_array([0])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_2D_negative(self):
        input_shape = int64_array([1, 300])
        axes = int64_array([0])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_3D_negative(self):
        input_shape = int64_array([1, 300, 300])
        axes = int64_array([2])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_4D_negative_1(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([0, 1, 2])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_4D_negative_2(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([2])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_4D_negative_3(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([2, 1])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_4D_negative_4(self):
        input_shape = int64_array([1, 300, 300, 3])
        axes = int64_array([2, 0])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)

    def test_5D_negative(self):
        input_shape = int64_array([1, 300, 300, 300, 3])
        axes = int64_array([1, 2, 3, 4])
        graph = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)
        graph.stage = 'middle'
        L2NormToNorm().find_and_replace_pattern(graph)

        graph_ref = build_graph_with_attrs(nodes + [
            ('input', dict(kind='op', shape=input_shape, op='Parameter', data_type=np.float32)),
            ('input_data', dict(kind='data', shape=input_shape, data_type=np.float32)),
            ('square_data', dict(kind='data', shape=input_shape)),
            ('sum_axes_data', dict(kind='data', value=axes, shape=None)),
        ], edges, nodes_with_edges_only=True)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result', check_op_attrs=True)
        self.assertTrue(flag, resp)


# topocalc/tests/test_gradient.py (cgallinger/topocalc, CC0-1.0)
import unittest
import numpy as np
from topocalc import gradient


class TestGradient(unittest.TestCase):

    # with self.dx and self.dy equal to 1, the cardinal direction
    # slope values will be np.pi/4 as one of the differences
    # will be zero
    dx = 1
    dy = 1

    # with self.dx and self.dy equal to 1, the slope of the 45 degree
    # areas will be arctan(sqrt(2))
    slope_val = np.arctan(np.sqrt(2))

    def gen_dem_nw(self, dem_size=10):
        # use dem_size throughout; the original hard-coded 10 ignored the parameter
        dem = np.tile(range(dem_size), (dem_size, 1)).transpose()
        for i in range(dem_size):
            dem[i, :] = np.arange(i, i + dem_size)
        return dem

    def gen_dem_sw(self, dem_size=10):
        dem = np.tile(range(dem_size), (dem_size, 1)).transpose()
        for i in range(1, 1 + dem_size):
            dem[-i, :] = np.arange(i, i + dem_size)
        return dem

    def gen_dem_se(self, dem_size=10):
        dem = np.tile(range(dem_size), (dem_size, 1)).transpose()
        for i in range(1, 1 + dem_size):
            dem[-i, :] = np.arange(i + dem_size, i, -1)
        return dem

    def gen_dem_ne(self, dem_size=10):
        dem = np.tile(range(dem_size), (dem_size, 1)).transpose()
        for i in range(dem_size):
            dem[i, :] = np.arange(i + dem_size, i, -1)
        return dem


class TestGradientD4(TestGradient):

    def test_gradient_d4_west(self):
        """ Test for the gradient_d4 for west """
        # test west slope and aspect
        dem = np.tile(range(10), (10, 1))
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 270))
        self.assertTrue(np.all(ipw_a == -np.pi/2))

    def test_gradient_d4_north(self):
        """ Test for the gradient_d4 for north """
        # test north slope and aspect
        dem = np.tile(range(10), (10, 1)).transpose()
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 0))
        self.assertTrue(np.all(np.abs(ipw_a) == np.pi))

    def test_gradient_d4_east(self):
        """ Test for the gradient_d4 for east """
        # test east slope and aspect
        dem = np.fliplr(np.tile(range(10), (10, 1)))
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 90))
        self.assertTrue(np.all(ipw_a == np.pi/2))

    def test_gradient_d4_south(self):
        """ Test for the gradient_d4 for south """
        # test south slope and aspect
        dem = np.flipud(np.tile(range(10), (10, 1)).transpose())
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 180))
        self.assertTrue(np.all(ipw_a == 0))

    def test_gradient_d4_nw(self):
        """ Test for the gradient_d4 for nw """
        # test northwest slope and aspect
        dem = self.gen_dem_nw()
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 315))
        self.assertTrue(np.all(ipw_a == (-np.pi/2 - np.pi/4)))

    def test_gradient_d4_sw(self):
        """ Test for the gradient_d4 for sw """
        # test southwest slope and aspect
        dem = self.gen_dem_sw()
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 225))
        self.assertTrue(np.all(ipw_a == -np.pi/4))

    def test_gradient_d4_se(self):
        """ Test for the gradient_d4 for se """
        # test southeast slope and aspect
        dem = self.gen_dem_se()
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 135))
        self.assertTrue(np.all(ipw_a == np.pi/4))

    def test_gradient_d4_ne(self):
        """ Test for the gradient_d4 for ne """
        # test northeast slope and aspect
        dem = self.gen_dem_ne()
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 45))
        self.assertTrue(np.all(ipw_a == (np.pi/2 + np.pi/4)))

    def test_gradient_d4_flat(self):
        """ Test for the gradient_d4 for flat """
        # test flat slope and aspect
        dem = np.ones((10, 10))
        py_slope, asp = gradient.gradient_d4(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == 0))
        self.assertTrue(np.all(asp == 180))
        self.assertTrue(np.all(ipw_a == 0))


class TestGradientD8(TestGradient):

    def test_gradient_d8_west(self):
        """ Test for the gradient_d8 for west """
        # test west slope and aspect
        dem = np.tile(range(10), (10, 1))
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 270))
        self.assertTrue(np.all(ipw_a == -np.pi/2))

    def test_gradient_d8_north(self):
        """ Test for the gradient_d8 for north """
        # test north slope and aspect
        dem = np.tile(range(10), (10, 1)).transpose()
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 0))
        self.assertTrue(np.all(np.abs(ipw_a) == np.pi))

    def test_gradient_d8_east(self):
        """ Test for the gradient_d8 for east """
        # test east slope and aspect
        dem = np.fliplr(np.tile(range(10), (10, 1)))
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 90))
        self.assertTrue(np.all(ipw_a == np.pi/2))

    def test_gradient_d8_south(self):
        """ Test for the gradient_d8 for south """
        # test south slope and aspect
        dem = np.flipud(np.tile(range(10), (10, 1)).transpose())
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == np.pi/4))
        self.assertTrue(np.all(asp == 180))
        self.assertTrue(np.all(ipw_a == 0))

    def test_gradient_d8_nw(self):
        """ Test for the gradient_d8 for nw """
        # test northwest slope and aspect
        dem = self.gen_dem_nw()
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 315))
        self.assertTrue(np.all(ipw_a == (-np.pi/2 - np.pi/4)))

    def test_gradient_d8_sw(self):
        """ Test for the gradient_d8 for sw """
        # test southwest slope and aspect
        dem = self.gen_dem_sw()
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 225))
        self.assertTrue(np.all(ipw_a == -np.pi/4))

    def test_gradient_d8_se(self):
        """ Test for the gradient_d8 for se """
        # test southeast slope and aspect
        dem = self.gen_dem_se()
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 135))
        self.assertTrue(np.all(ipw_a == np.pi/4))

    def test_gradient_d8_ne(self):
        """ Test for the gradient_d8 for ne """
        # test northeast slope and aspect
        dem = self.gen_dem_ne()
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == self.slope_val))
        self.assertTrue(np.all(asp == 45))
        self.assertTrue(np.all(ipw_a == (np.pi/2 + np.pi/4)))

    def test_gradient_d8_flat(self):
        """ Test for the gradient_d8 for flat """
        # test flat slope and aspect
        dem = np.ones((10, 10))
        py_slope, asp = gradient.gradient_d8(dem, self.dx, self.dy)
        ipw_a = gradient.aspect_to_ipw_radians(asp)
        self.assertTrue(np.all(py_slope == 0))
        self.assertTrue(np.all(asp == 180))
        self.assertTrue(np.all(ipw_a == 0))


# NorthNet/plotting/reactions/drawing.py (Will-Robin/NorthNet, BSD-3-Clause)
# NOTE: imports below are inferred from the RDKit calls used in this file.
from rdkit import Chem
from rdkit.Chem import AllChem, Draw
from rdkit.Chem.Draw import DrawingOptions


def draw_reaction(reaction_smiles, name):
    '''
    For drawing reactions.

    Parameters
    ----------
    reaction_smiles: str
        Reaction SMILES for the reaction to be plotted.
    name: str
        Name for the output file.

    Returns
    -------
    fname: str
        Name of the file output.
    '''
    name = name.replace(".", "_").replace(">>", "__")

    DrawingOptions.atomLabelFontSize = 110
    DrawingOptions.dotsPerAngstrom = 100
    DrawingOptions.bondLineWidth = 6.0

    reacs = [Chem.MolFromSmiles(x) for x in reaction_smiles.split(">>")[0].split(".")]
    prods = [Chem.MolFromSmiles(x) for x in reaction_smiles.split(">>")[1].split(".")]
    # reacs = [x for x in reacs if x.GetNumAtoms() > 1]
    # prods = [x for x in prods if x.GetNumAtoms() > 1]
    print(reacs)
    print(prods)

    # clear atom map numbers so they are not rendered
    for r in reacs:
        for atom in r.GetAtoms():
            atom.SetAtomMapNum(0)
    for p in prods:
        for atom in p.GetAtoms():
            atom.SetAtomMapNum(0)

    rxn = AllChem.ChemicalReaction()  # create a chemical reaction
    for r in reacs:
        rxn.AddReactantTemplate(r)
    for p in prods:
        rxn.AddProductTemplate(p)
    # rxn = AllChem.ReactionFromSmarts(reaction_smiles)

    img = Draw.ReactionToImage(rxn)
    img.save('{}.png'.format(name))

    return '{}.png'.format(name)


def draw_reaction_SMARTS(reaction_smiles, name):
    '''
    For drawing reactions given as SMARTS patterns.

    Parameters
    ----------
    reaction_smiles: str
        Reaction SMILES for the reaction to be plotted.
    name: str
        Name for the output file.

    Returns
    -------
    fname: str
        Name of the file output.
    '''
    name = name.replace(".", "_").replace(">>", "__")

    DrawingOptions.atomLabelFontSize = 110
    DrawingOptions.dotsPerAngstrom = 100
    DrawingOptions.bondLineWidth = 6.0

    reacs = [Chem.MolFromSmarts(x) for x in reaction_smiles.split(">>")[0].split(".")]
    prods = [Chem.MolFromSmarts(x) for x in reaction_smiles.split(">>")[1].split(".")]
    # reacs = [x for x in reacs if x.GetNumAtoms() > 1]
    # prods = [x for x in prods if x.GetNumAtoms() > 1]

    # clear atom map numbers so they are not rendered
    for r in reacs:
        for atom in r.GetAtoms():
            atom.SetAtomMapNum(0)
    for p in prods:
        for atom in p.GetAtoms():
            atom.SetAtomMapNum(0)

    rxn = AllChem.ChemicalReaction()  # create a chemical reaction
    for r in reacs:
        rxn.AddReactantTemplate(r)
    for p in prods:
        rxn.AddProductTemplate(p)
    # rxn = AllChem.ReactionFromSmarts(reaction_smiles)

    img = Draw.ReactionToImage(rxn)
    img.save('{}.png'.format(name))

    return '{}.png'.format(name)


# savoten/handler/error.py (JunYamaguchi/savoten, MIT)
from flask import jsonify, make_response


def not_found(error):
    return make_response(jsonify({'error': 'Not found'}), 404)


def internal_error(error):
    return make_response(jsonify({'error': 'Internal Server Error'}), 500)


# tests/test_poisson.py (wsmorgan/py_dft, MIT)
"""Tests the evaluation and generation of the values for the poisson equation.
"""
import pytest
import numpy as np
import sys


def test_generate_M():
    """Tests that the generate M subroutine works.
    """
    from pydft.poisson import _generate_M

    assert np.alltrue(_generate_M([3, 2, 1]) == np.array(
        [[0, 0, 0], [1, 0, 0], [2, 0, 0],
         [0, 1, 0], [1, 1, 0], [2, 1, 0]]))
    assert np.alltrue(_generate_M([1, 2, 3]) == np.array(
        [[0, 0, 0], [0, 1, 0], [0, 0, 1],
         [0, 1, 1], [0, 0, 2], [0, 1, 2]]))
    assert np.alltrue(_generate_M([3, 3, 3]) == np.array(
        [[0, 0, 0], [1, 0, 0], [2, 0, 0],
         [0, 1, 0], [1, 1, 0], [2, 1, 0],
         [0, 2, 0], [1, 2, 0], [2, 2, 0],
         [0, 0, 1], [1, 0, 1], [2, 0, 1],
         [0, 1, 1], [1, 1, 1], [2, 1, 1],
         [0, 2, 1], [1, 2, 1], [2, 2, 1],
         [0, 0, 2], [1, 0, 2], [2, 0, 2],
         [0, 1, 2], [1, 1, 2], [2, 1, 2],
         [0, 2, 2], [1, 2, 2], [2, 2, 2]]))


def test_generate_N():
    """Tests that the generate_N subroutine constructs the correct grid.
    """
    from pydft.poisson import _generate_N

    assert np.alltrue(_generate_N([3, 2, 1]) == np.array(
        [[0, 0, 0], [1, 0, 0], [-1, 0, 0], [0, 1, 0],
         [1, 1, 0], [-1, 1, 0]]))
    assert np.alltrue(_generate_N([1, 2, 3]) == np.array(
        [[0, 0, 0], [0, 1, 0], [0, 0, 1],
         [0, 1, 1], [0, 0, -1], [0, 1, -1]]))
    assert np.alltrue(_generate_N([3, 3, 3]) == np.array(
        [[0, 0, 0], [1, 0, 0], [-1, 0, 0], [0, 1, 0], [1, 1, 0],
         [-1, 1, 0], [0, -1, 0], [1, -1, 0], [-1, -1, 0], [0, 0, 1],
         [1, 0, 1], [-1, 0, 1], [0, 1, 1], [1, 1, 1], [-1, 1, 1],
         [0, -1, 1], [1, -1, 1], [-1, -1, 1], [0, 0, -1], [1, 0, -1],
         [-1, 0, -1], [0, 1, -1], [1, 1, -1], [-1, 1, -1], [0, -1, -1],
         [1, -1, -1], [-1, -1, -1]]))


def test_generate_r():
    """Tests that the generate_r subroutine constructs the correct matrix.
    """
    from pydft.poisson import _generate_r

    R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    S = [3, 3, 3]
    assert np.allclose(_generate_r(R, S), np.array([[0.0, 0.0, 0.0], [0.3333333333333333, 0.0, 0.0], [0.6666666666666666, 0.0, 0.0], [0.0, 0.3333333333333333, 0.0], [0.3333333333333333, 0.3333333333333333, 0.0], [0.6666666666666666, 0.3333333333333333, 0.0], [0.0, 0.6666666666666666, 0.0], [0.3333333333333333, 0.6666666666666666, 0.0], [0.6666666666666666, 0.6666666666666666, 0.0], [0.0, 0.0, 0.3333333333333333], [0.3333333333333333, 0.0, 0.3333333333333333], [0.6666666666666666, 0.0, 0.3333333333333333], [0.0, 0.3333333333333333, 0.3333333333333333], [0.3333333333333333, 0.3333333333333333, 0.3333333333333333], [0.6666666666666666, 0.3333333333333333, 0.3333333333333333], [0.0, 0.6666666666666666, 0.3333333333333333], [0.3333333333333333, 0.6666666666666666, 0.3333333333333333], [0.6666666666666666, 0.6666666666666666, 0.3333333333333333], [0.0, 0.0, 0.6666666666666666], [0.3333333333333333, 0.0, 0.6666666666666666], [0.6666666666666666, 0.0, 0.6666666666666666], [0.0, 0.3333333333333333, 0.6666666666666666], [0.3333333333333333, 0.3333333333333333, 0.6666666666666666], [0.6666666666666666, 0.3333333333333333, 0.6666666666666666], [0.0, 0.6666666666666666, 0.6666666666666666], [0.3333333333333333, 0.6666666666666666, 0.6666666666666666], [0.6666666666666666, 0.6666666666666666, 0.6666666666666666]]))

    R = [[0.5, 0.5, -0.5], [0.5, -0.5, 0.5], [-0.5, 0.5, 0.5]]
    S = [3, 2, 1]
    assert np.allclose(_generate_r(R, S), np.array([[0.0, 0.0, 0.0], [0.16666666666666666, 0.0, 0.0], [0.3333333333333333, 0.0, 0.0], [0.0, -0.25, 0.0], [0.16666666666666666, -0.25, 0.0], [0.3333333333333333, -0.25, 0.0]]))

    R = [[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]]
    S = [1, 2, 3]
    assert np.allclose(_generate_r(R, S), np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.16666666666666666], [0.0, 0.0, 0.16666666666666666], [0.0, 0.0, 0.3333333333333333], [0.0, 0.0, 0.3333333333333333]]))


def test_generate_G():
    """Tests the construction of the G vectors for the system.
    """
    from pydft.poisson import _generate_G

    R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    s = [1, 2, 3]
    assert np.allclose(_generate_G(R, s), np.array([[0.0, 0.0, 0.0], [0.0, 6.283185307179586, 0.0], [0.0, 0.0, 6.283185307179586], [0.0, 6.283185307179586, 6.283185307179586], [0.0, 0.0, -6.283185307179586], [0.0, 6.283185307179586, -6.283185307179586]]))

    R = [[0.5, 0.5, -0.5], [0.5, -0.5, 0.5], [-0.5, 0.5, 0.5]]
    s = [3, 2, 1]
    assert np.allclose(_generate_G(R, s), np.array([[0.0, 0.0, 0.0], [6.283185307179586, 6.283185307179586, 0.0], [-6.283185307179586, -6.283185307179586, 0.0], [6.283185307179586, 0.0, 6.283185307179586], [12.566370614359172, 6.283185307179586, 6.283185307179586], [0.0, -6.283185307179586, 6.283185307179586]]))

    R = [[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]]
    s = [3, 3, 3]
    assert np.allclose(_generate_G(R, s), np.array([[0.0, 0.0, 0.0], [6.283185307179586, 6.283185307179586, -6.283185307179586], [-6.283185307179586, -6.283185307179586, 6.283185307179586], [6.283185307179586, -6.283185307179586, 6.283185307179586], [12.566370614359172, 0.0, 0.0], [0.0, -12.566370614359172, 12.566370614359172], [-6.283185307179586, 6.283185307179586, -6.283185307179586], [0.0, 12.566370614359172, -12.566370614359172], [-12.566370614359172, 0.0, 0.0], [-6.283185307179586, 6.283185307179586, 6.283185307179586], [0.0, 12.566370614359172, 0.0], [-12.566370614359172, 0.0, 12.566370614359172], [0.0, 0.0, 12.566370614359172], [6.283185307179586, 6.283185307179586, 6.283185307179586], [-6.283185307179586, -6.283185307179586, 18.84955592153876], [-12.566370614359172, 12.566370614359172, 0.0], [-6.283185307179586, 18.84955592153876, -6.283185307179586], [-18.84955592153876, 6.283185307179586, 6.283185307179586], [6.283185307179586, -6.283185307179586, -6.283185307179586], [12.566370614359172, 0.0, -12.566370614359172], [0.0, -12.566370614359172, 0.0], [12.566370614359172, -12.566370614359172, 0.0], [18.84955592153876, -6.283185307179586, -6.283185307179586], [6.283185307179586, -18.84955592153876, 6.283185307179586], [0.0, 0.0, -12.566370614359172], [6.283185307179586, 6.283185307179586, -18.84955592153876], [-6.283185307179586, -6.283185307179586, -6.283185307179586]]))


def test_find_Gsqu():
    """Tests the codes finding of the squared norm of the G vector.
    """
    from pydft.poisson import _find_Gsqu, _generate_G

    R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    s = [3, 2, 1]
    assert np.allclose(_find_Gsqu(_generate_G(R, s)), np.array([0.0, 39.478417604357432, 39.478417604357432, 39.478417604357432, 78.956835208714864, 78.956835208714864]))

    R = [[0.5, 0.5, -0.5], [0.5, -0.5, 0.5], [-0.5, 0.5, 0.5]]
    s = [1, 2, 3]
    assert np.allclose(_find_Gsqu(_generate_G(R, s)), np.array([0.0, 78.956835208714864, 78.956835208714864, 236.87050562614459, 78.956835208714864, 78.956835208714864]))

    R = [[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]]
    s = [3, 3, 3]
    assert np.allclose(_find_Gsqu(_generate_G(R, s)), np.array([0.0, 118.43525281307228, 118.43525281307228, 118.43525281307228, 157.91367041742973, 315.82734083485946, 118.43525281307228, 315.82734083485946, 157.91367041742973, 118.43525281307228, 157.91367041742973, 315.82734083485946, 157.91367041742973, 118.43525281307228, 434.26259364793174, 315.82734083485946, 434.26259364793174, 434.26259364793174, 118.43525281307228, 315.82734083485946, 157.91367041742973, 315.82734083485946, 434.26259364793174, 434.26259364793174, 157.91367041742973, 434.26259364793174, 118.43525281307228]))
def test_find_dr():
    """Tests that the _find_dr subroutine finds the correct distances.
    """
    from pydft.poisson import _find_dr, _generate_r
    R = [[0.5,0.5,0.0],[0.5,0.0,0.5],[0.0,0.5,0.5]]
    s = [3,3,3]
    r = _generate_r(R,s)
    assert np.allclose(_find_dr(r,R),np.array([0.8660254037844386, 0.78173595997057166, 0.72648315725677892, 0.8660254037844386, 0.78173595997057166, 0.72648315725677892, 0.8660254037844386, 0.78173595997057166, 0.72648315725677892, 0.78173595997057166, 0.68718427093627688, 0.62360956446232363, 0.78173595997057166, 0.68718427093627688, 0.62360956446232363, 0.78173595997057166, 0.68718427093627688, 0.62360956446232363, 0.72648315725677892, 0.62360956446232363, 0.55277079839256671, 0.72648315725677892, 0.62360956446232363, 0.55277079839256671, 0.72648315725677892, 0.62360956446232363, 0.55277079839256671]))
    R = [[1,0,0],[0,1,0],[0,0,1]]
    s = [1,2,3]
    r = _generate_r(R,s)
    assert np.allclose(_find_dr(r,R),np.array([0.8660254037844386, 0.70710678118654757, 0.72648315725677892, 0.52704627669472992, 0.72648315725677892, 0.52704627669472992]))
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    s = [3,2,1]
    r = _generate_r(R,s)
    assert np.allclose(_find_dr(r,R),np.array([0.4330127018922193, 0.36324157862838946, 0.36324157862838946, 0.61237243569579447, 0.56519416526043897, 0.56519416526043897]))
def test_gaussian():
    """Tests the evaluation of the Gaussian function.
    """
    from pydft.poisson import _gaussian, _generate_r, _find_dr
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    s = [1,2,3]
    sigma = 0.5
    r = _generate_r(R,s)
    dr = _find_dr(r,R)
    assert np.allclose(_gaussian(dr,sigma),np.array([0.34910796, 0.23993816, 0.3901348, 0.26813547, 0.3901348, 0.26813547]))
    R = [[1,0,0],[0,1,0],[0,0,1]]
    s = [3,2,1]
    sigma = 0.25
    r = _generate_r(R,s)
    dr = _find_dr(r,R)
    assert np.allclose(_gaussian(dr,sigma),np.array([0.010072639249693521, 0.059596720089735565, 0.059596720089735565, 0.074427296480276031, 0.44036350805532332, 0.44036350805532332]))
    R = [[0.5,0.5,0.0],[0.5,0.0,0.5],[0.0,0.5,0.5]]
    s = [3,3,3]
    sigma = 0.75
    r = _generate_r(R,s)
    dr = _find_dr(r,R)
    assert np.allclose(_gaussian(dr,sigma),np.array([0.077271039142547598, 0.087424539903614443, 0.094146313078274091, 0.077271039142547598, 0.087424539903614443, 0.094146313078274091, 0.077271039142547598, 0.087424539903614443, 0.094146313078274091, 0.087424539903614443, 0.098912221993792224, 0.1065172436636445, 0.087424539903614443, 0.098912221993792224, 0.1065172436636445, 0.087424539903614443, 0.098912221993792224, 0.1065172436636445, 0.094146313078274091, 0.1065172436636445, 0.11470698937905052, 0.094146313078274091, 0.1065172436636445, 0.11470698937905052, 0.094146313078274091, 0.1065172436636445, 0.11470698937905052]))
    R = [[6,0,0],[0,6,0],[0,0,6]]
    s = [20,25,30]
    sigma = 0.75
    r = _generate_r(R,s)
    dr = _find_dr(r,R)
    g = _gaussian(dr,sigma)
    intg = sum(g*np.linalg.det(R)/float(np.prod(s)))
    assert np.allclose([1],[intg],atol=1E-3)
    R = [[6,0,0],[0,6,0],[0,0,6]]
    s = [20,25,30]
    sigma = 0.25
    r = _generate_r(R,s)
    dr = _find_dr(r,R)
    g = _gaussian(dr,sigma)
    intg = sum(g*np.linalg.det(R)/float(np.prod(s)))
    assert np.allclose([1],[intg],atol=1E-3)
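The normalization check above (summing `g * det(R)/prod(s)`) is a Riemann-sum approximation of the Gaussian's integral over the cell. A minimal standalone sketch of the same idea, using plain numpy and a hypothetical `gaussian_3d` that mirrors the normalized 3-D Gaussian `_gaussian` is expected to evaluate:

```python
import numpy as np

def gaussian_3d(dr, sigma):
    # Normalized 3-D Gaussian evaluated at distances dr (assumed form,
    # mirroring what pydft's _gaussian should compute).
    return np.exp(-dr**2 / (2 * sigma**2)) / (2 * np.pi * sigma**2) ** 1.5

# Midpoint-rule sample points over a cubic cell of side L, centered on 0.
L, n = 6.0, 40
axes = (np.arange(n) + 0.5) * L / n - L / 2
X, Y, Z = np.meshgrid(axes, axes, axes, indexing="ij")
dr = np.sqrt(X**2 + Y**2 + Z**2)

g = gaussian_3d(dr, sigma=0.5)
# Each grid point carries a volume element det(R)/prod(s) = (L/n)**3.
integral = g.sum() * (L / n) ** 3
```

With the cell much wider than `sigma`, the sum recovers the unit normalization to well within the test's `1E-3` tolerance.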
def test_charge_dist():
    """Tests the construction of the charge distribution function.
    """
    from pydft.poisson import charge_dist
    R = [[0.5,0.5,0.0],[0.5,0.0,0.5],[0.0,0.5,0.5]]
    s = [3,3,3]
    sigmas = [0.75,0.25]
    coeffs = [1,-1]
    assert np.allclose(charge_dist(s,R,coeffs,sigmas),np.array([0.067198399892854074, 0.056826563571350616, 0.034549592988538526, 0.067198399892854074, 0.056826563571350616, 0.034549592988538526, 0.067198399892854074, 0.056826563571350616, 0.034549592988538526, 0.056826563571350616, 0.0059637769615466657, -0.074521606788759701, 0.056826563571350616, 0.0059637769615466657, -0.074521606788759701, 0.056826563571350616, 0.0059637769615466657, -0.074521606788759701, 0.034549592988538526, -0.074521606788759701, -0.23790854240050377, 0.034549592988538526, -0.074521606788759701, -0.23790854240050377, 0.034549592988538526, -0.074521606788759701, -0.23790854240050377]))
    R = [[1,0,0],[0,1,0],[0,0,1]]
    s = [2,2,2]
    sigmas = [0.5,0.025,0.3]
    coeffs = [1,-1,0.5]
    assert np.allclose(charge_dist(s,R,sigmas,coeffs),np.array([0.056911838376348944, 0.08201987696049548, 0.08201987696049548, 0.12184330048684332, 0.08201987696049548, 0.12184330048684332, 0.12184330048684332, 0.18571888510765483]))
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    s = [1,3,1]
    sigmas = [0.75,0.25]
    coeffs = [0,0]
    assert np.allclose(charge_dist(s,R,coeffs,sigmas),np.array([0.0, 0.0, 0.0]))
    R = [[6,0,0],[0,6,0],[0,0,6]]
    s = [20,25,30]
    sigmas = [0.75,0.25]
    coeffs = [1,-1]
    n = charge_dist(s,R,coeffs,sigmas)
    sumn = sum(n*np.linalg.det(R)/float(np.prod(s)))
    assert np.allclose([0],[sumn],atol=1E-3)
    R = [[6,0,0],[0,6,0],[0,0,6]]
    s = [20,25,30]
    sigmas = [0.75,0.25]
    coeffs = [1]
    with pytest.raises(ValueError):
        charge_dist(s,R,coeffs,sigmas)
def test_O():
    """Tests the generation of the O operator.
    """
    from pydft.poisson import _O_operator
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    s = [2,4,5]
    out = np.identity(np.prod(s))*np.linalg.det(R)
    v = np.random.normal(0,0.1,40)
    assert np.allclose(_O_operator(s,R,v),np.dot(out,v))
    R = [[1,0,0],[0,2,0],[0,0,4]]
    s = [1,5,6]
    out = np.identity(np.prod(s))*np.linalg.det(R)
    v = np.random.normal(0.1,0.5,30)
    assert np.allclose(_O_operator(s,R,v),np.dot(out,v))
    assert np.allclose(_O_operator(s,R,v)/v,np.linalg.det(R))
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    s = [10,2,6]
    out = np.identity(np.prod(s))*np.linalg.det(R)
    v = np.random.normal(0,0.25,120)
    assert np.allclose(_O_operator(s,R,v),np.dot(out,v))
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    v = []
    for i in range(3):
        v.append(np.random.normal(0,0.5,6))
    v = np.array(v)
    out = []
    for Ns in v:
        out.append(_O_operator(s,R,Ns))
    out = np.transpose(out)
    assert np.allclose(_O_operator(s,R,v.T), out)
def test_L():
    """Tests the L operator.
    """
    from pydft.poisson import _L_operator, _generate_G, _find_Gsqu
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    s = [2,4,5]
    G = _generate_G(R,s)
    G2 = _find_Gsqu(G)
    L = -np.linalg.det(R)*np.diag(G2)
    v = np.random.normal(0,0.1,40)
    assert np.allclose(_L_operator(s,R,v),np.dot(L,v))
    R = [[1,0,0],[0,2,0],[0,0,4]]
    s = [1,5,6]
    G = _generate_G(R,s)
    G2 = _find_Gsqu(G)
    L = -np.linalg.det(R)*np.diag(G2)
    v = np.random.normal(0.1,0.5,30)
    assert np.allclose(_L_operator(s,R,v),np.dot(L,v))
    assert np.allclose(_L_operator(s,R,v)/v,-np.linalg.det(R)*G2)
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    s = [10,2,6]
    G = _generate_G(R,s)
    G2 = _find_Gsqu(G)
    L = -np.linalg.det(R)*np.diag(G2)
    v = np.random.normal(0,0.25,120)
    assert np.allclose(_L_operator(s,R,v),np.dot(L,v))
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    v = []
    for i in range(3):
        v.append(np.random.normal(0,0.5,6))
    v = np.array(v)
    out = []
    for Ns in v:
        out.append(_L_operator(s,R,Ns))
    out = np.transpose(out)
    assert np.allclose(_L_operator(s,R,v.T), out)
def test_Linv():
    """Tests the Linv operator.
    """
    from pydft.poisson import _Linv_operator, _generate_G, _find_Gsqu, _L_operator
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    s = [2,4,5]
    G = _generate_G(R,s)
    G2 = _find_Gsqu(G)
    G2[0] = 1.0
    Linv = -np.diag(1/(G2*np.linalg.det(R)))
    Linv[0][0] = -0.0
    v = np.random.normal(0,0.1,40)
    assert np.allclose(_Linv_operator(s,R,v),np.dot(Linv,v))
    R = [[1,0,0],[0,2,0],[0,0,4]]
    s = [1,5,6]
    G = _generate_G(R,s)
    G2 = _find_Gsqu(G)
    G2[0] = 1.0
    Linv = -np.diag(1/(G2*np.linalg.det(R)))
    Linv[0][0] = -0.0
    v = np.random.normal(0.1,0.5,30)
    assert np.allclose(_Linv_operator(s,R,v),np.dot(Linv,v))
    assert np.allclose(_L_operator(s,R,_Linv_operator(s,R,v))[1:],v[1:])
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    s = [10,2,6]
    G = _generate_G(R,s)
    G2 = _find_Gsqu(G)
    G2[0] = 1.0
    Linv = -np.diag(1/(G2*np.linalg.det(R)))
    Linv[0][0] = -0.0
    v = np.random.normal(0,0.25,120)
    assert np.allclose(_Linv_operator(s,R,v),np.dot(Linv,v))
def test_B():
    """Tests the B operator.
    """
    from pydft.poisson import _B_operator
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    v = np.random.normal(0,0.5,6)
    out = np.fft.fftn(v.reshape(s,order="F")).reshape(np.prod(s),order="F")
    assert np.allclose(_B_operator(s,R,v), out)
    s = [10,2,3]
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    v = np.random.normal(0,0.5,60)
    out = np.fft.fftn(v.reshape(s,order="F")).reshape(np.prod(s),order="F")
    assert np.allclose(_B_operator(s,R,v),out)
    s = [5,5,2]
    R = [[6.0,0.0,0.0],[0.0,6.0,0.0],[0.0,0.0,6.0]]
    v = np.random.normal(0,0.25,50)
    out = np.fft.fftn(v.reshape(s,order="F")).reshape(np.prod(s),order="F")
    assert np.allclose(_B_operator(s,R,v),out)
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    v = []
    for i in range(3):
        v.append(np.random.normal(0,0.5,6))
    v = np.array(v)
    out = []
    for Ns in v:
        out.append(_B_operator(s,R,Ns))
    out = np.transpose(out)
    assert np.allclose(_B_operator(s,R,v.T), out)
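The reference computation in this test shows the whole structure of the B operator: flatten-to-grid in Fortran order, 3-D FFT, flatten back. A self-contained numpy sketch (with hypothetical standalone `B`/`Bj` functions playing the role of `_B_operator`/`_Bj_operator`, which ignore `R` for this step) makes the round-trip property explicit:

```python
import numpy as np

def B(s, v):
    # Reshape the flat sample vector onto the s-grid in Fortran (column-major)
    # order, apply the 3-D FFT, and flatten back in the same order.
    return np.fft.fftn(v.reshape(s, order="F")).reshape(np.prod(s), order="F")

def Bj(s, v):
    # Inverse transform: ifftn with the same Fortran-order reshapes.
    return np.fft.ifftn(v.reshape(s, order="F")).reshape(np.prod(s), order="F")

rng = np.random.default_rng(0)
s = [4, 3, 2]
v = rng.normal(size=np.prod(s))
roundtrip = B(s, Bj(s, v))  # ifftn followed by fftn recovers the input
```

This is exactly the identity `test_Bj` checks below with `_B_operator(s,R,_Bj_operator(s,R,v))`.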
def test_Bj():
    """Tests the B conjugate transpose operator.
    """
    from pydft.poisson import _Bj_operator, _B_operator
    s = [5,5,2]
    R = [[6.0,0.0,0.0],[0.0,6.0,0.0],[0.0,0.0,6.0]]
    v = np.random.normal(0,0.25,50)
    out = np.fft.ifftn(v.reshape(s,order="F")).reshape(np.prod(s),order="F")
    assert np.allclose(_Bj_operator(s,R,v),out)
    assert np.allclose(_B_operator(s,R,_Bj_operator(s,R,v)),v)
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    v = np.random.normal(0,0.5,6)
    out = np.fft.ifftn(v.reshape(s,order="F")).reshape(np.prod(s),order="F")
    assert np.allclose(_Bj_operator(s,R,v),out)
    s = [10,2,3]
    R = [[0.5,0.5,-0.5],[0.5,-0.5,0.5],[-0.5,0.5,0.5]]
    v = np.random.normal(0,0.5,60)
    out = np.fft.ifftn(v.reshape(s,order="F")).reshape(np.prod(s),order="F")
    assert np.allclose(_Bj_operator(s,R,v),out)
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    v = []
    for i in range(3):
        v.append(np.random.normal(0,0.5,6))
    v = np.array(v)
    out = []
    for Ns in v:
        out.append(_Bj_operator(s,R,Ns))
    out = np.transpose(out)
    assert np.allclose(_Bj_operator(s,R,v.T), out)
def test_poisson():
    """Tests the solution to Poisson's equation.
    """
    from pydft.poisson import poisson, charge_dist, _Bj_operator, _O_operator, _B_operator, _Linv_operator
    R = [[6,0,0],[0,6,0],[0,0,6]]
    s = [20,15,15]
    coefs = [-1,1]
    sigmas = [0.75,0.5]
    n = charge_dist(s,R,coefs,sigmas)
    phi = _B_operator(s,R,_Linv_operator(s,R,-4*np.pi*_O_operator(s,R,_Bj_operator(s,R,n))))
    assert np.allclose(phi,poisson(s,R,n))
    phi = np.real(phi)
    Unum = 0.5*np.real(np.dot(_Bj_operator(s,R,phi),np.transpose(_O_operator(s,R,_Bj_operator(s,R,n)))))
    Uanal = ((1/sigmas[0]+1/sigmas[1])/2-np.sqrt(2)/np.sqrt(sigmas[0]**2+sigmas[1]**2))/np.sqrt(np.pi)
    assert np.allclose(Unum,Uanal,atol=1e-4)
def test_B_dag():
    """Tests the _B_dag_operator.
    """
    from pydft.poisson import _B_dag_operator, _B_operator
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    a = np.random.normal(0,0.5,6)
    b = np.random.normal(0,0.5,6)
    out1 = np.conj(np.dot(a.conjugate(),_B_operator(s,R,np.transpose(b))))
    out2 = np.dot(np.conj(b),_B_dag_operator(s,R,np.transpose(a)))
    assert np.allclose(out1, out2)
def test_Bj_dag():
    """Tests the _Bj_dag_operator.
    """
    from pydft.poisson import _Bj_dag_operator, _Bj_operator
    s = [1,2,3]
    R = [[2.5,2.5,-2.5],[0.5,-0.5,0.5],[-1.5,1.5,1.5]]
    a = np.random.normal(0,0.5,6)
    b = np.random.normal(0,0.5,6)
    out1 = np.conj(np.dot(a.conjugate(),_Bj_operator(s,R,np.transpose(b))))
    out2 = np.dot(np.conj(b),_Bj_dag_operator(s,R,np.transpose(a)))
    assert np.allclose(out1, out2)
# models/roberta_iter_models.py
import torch
from torch import nn
from copy import deepcopy
from transformers.modeling_roberta import RobertaPreTrainedModel, RobertaConfig, RobertaModel
from general_util.logger import get_child_logger
from general_util.mixin import LogMixin, PredictionMixin
from modules import layers
logger = get_child_logger(__name__)
class IterRobertaPreTrainedConfig(RobertaConfig):
    added_configs = [
        'query_dropout', 'cls_type', 'sr_query_dropout', 'lm_query_dropout',
        'z_step', 'pos_emb_size', 'weight_typing', 'share_ssp_sum'
    ]

    def __init__(self, query_dropout=0.1, cls_type=0,
                 sr_query_dropout=0.1, lm_query_dropout=0.1,
                 pos_emb_size=200, z_step=0, weight_typing=True,
                 share_ssp_sum=False, **kwargs):
        super().__init__(**kwargs)
        self.query_dropout = query_dropout
        self.cls_type = cls_type
        self.sr_query_dropout = sr_query_dropout
        self.lm_query_dropout = lm_query_dropout
        self.pos_emb_size = pos_emb_size
        self.z_step = z_step
        self.weight_typing = weight_typing
        self.share_ssp_sum = share_ssp_sum

    def expand_configs(self, *args):
        self.added_configs.extend(list(args))
class IterRobertaModel(RobertaPreTrainedModel):
    config_class = IterRobertaPreTrainedConfig
    model_prefix = 'iter_roberta'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)

        self.config = config
        self.roberta = RobertaModel(config)
        self.query = layers.MultiHeadAlignedTokenAttention(
            config,
            attn_dropout_p=config.query_dropout,
            dropout_p=config.query_dropout
        )
        self.z_step = config.z_step

        self.init_weights()

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                **kwargs):
        seq_output = self.roberta(input_ids=input_ids,
                                  attention_mask=attention_mask,
                                  token_type_ids=token_type_ids)[0]

        batch, sent_num, seq_len = sent_word_mask.size()
        sentence_index = sentence_index.unsqueeze(-1).expand(
            -1, -1, -1, self.config.hidden_size
        ).reshape(batch, sent_num * seq_len, self.config.hidden_size)
        sent_word_hidden = seq_output.gather(dim=1, index=sentence_index).reshape(
            batch, sent_num, seq_len, -1)

        q_vec = seq_output[:, :1]  # <s>
        for _step in range(self.z_step):
            _aligned = (_step != 0)
            q_vec = self.query(q_vec, sent_word_hidden, sent_word_mask, aligned=_aligned, residual=False)
            if _step == 0:
                q_vec = q_vec.squeeze(1)

        hidden_sent = q_vec
        assert hidden_sent.size() == (batch, sent_num, seq_output.size(-1))

        return hidden_sent, seq_output, sent_word_hidden
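The `gather` in `forward` above pulls each sentence's token hidden states out of the flat sequence: `sentence_index` maps every (sentence, token) slot to a position in the sequence, gets expanded over the hidden dimension, and the result is reshaped to `[batch, sent_num, seq_len, hidden]`. A small numpy sketch of the same indexing pattern (all shapes and names here are hypothetical; numpy's `take_along_axis` plays the role of torch's `gather(dim=1)`):

```python
import numpy as np

# Hypothetical shapes mirroring IterRobertaModel.forward.
batch, total_len, hidden = 2, 10, 4
sent_num, seq_len = 3, 5  # padded tokens per sentence

rng = np.random.default_rng(0)
seq_output = rng.normal(size=(batch, total_len, hidden))
# Each entry points at a token position in the flat sequence.
sentence_index = rng.integers(0, total_len, size=(batch, sent_num, seq_len))

# Flatten the (sentence, token) slots, add a trailing axis so the index
# broadcasts over the hidden dimension, then gather along the sequence axis.
flat_index = sentence_index.reshape(batch, sent_num * seq_len, 1)
gathered = np.take_along_axis(seq_output, flat_index, axis=1)
sent_word_hidden = gathered.reshape(batch, sent_num, seq_len, hidden)
```

Slot `(b, i, j)` of the result is the hidden vector of the token that `sentence_index[b, i, j]` points at, which is what the downstream sentence-level attention consumes.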
class IterRobertaModelForSRAndMLM(IterRobertaModel, LogMixin):
    model_prefix = 'iter_roberta_sr_mlm'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)

        self.sr_sent_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.sr_sent_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        self.sr_sent_sum_v = nn.Linear(config.hidden_size, config.hidden_size)

        self.lm_sent_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.lm_sent_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        self.lm_sent_sum_v = nn.Linear(config.hidden_size, config.hidden_size)

        self.lm_sum_dropout = nn.Dropout(config.lm_query_dropout)
        self.sr_sum_dropout = nn.Dropout(config.sr_query_dropout)

        self.sr_pooler = layers.Pooler(config.hidden_size)
        self.sr_prediction_head = nn.Linear(config.hidden_size, 1)

        if not self.config.weight_typing:
            word_embedding_weight = deepcopy(self.roberta.get_input_embeddings().weight)
        else:
            word_embedding_weight = self.roberta.get_input_embeddings().weight
        self.vocab_size = word_embedding_weight.size(0)
        self.lm_prediction_head = layers.MaskedLMPredictionHead(config, word_embedding_weight)

        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)

        self.init_weights()

        # metric
        self.init_metric("sr_acc", "sr_loss", "mlm_loss", "mlm_acc")

        logger.info(self.config.to_dict())

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                mlm_ids: torch.Tensor = None, reverse_sentence_index: torch.Tensor = None,
                answers: torch.Tensor = None, pre_answers: torch.Tensor = None, **kwargs):
        hidden_sent, seq_output, sent_word_hidden = super().forward(
            input_ids=input_ids,
            attention_mask=attention_mask,
            sentence_index=sentence_index,
            sentence_mask=sentence_mask,
            sent_word_mask=sent_word_mask
        )

        batch, sent_num, word_num = sent_word_mask.size()
        query_num = answers.size(1)
        query_h = hidden_sent[:, :query_num]
        attention_mask = attention_mask.to(query_h.dtype)

        # SR
        q_rel_d_sent_h = layers.multi_head_sent_sum(
            q=self.sr_sent_sum_q(query_h),
            k=self.sr_sent_sum_k(sent_word_hidden),
            v=self.sr_sent_sum_v(sent_word_hidden),
            mask=sent_word_mask,
            head_num=self.config.num_attention_heads,
            attn_dropout=self.sr_sum_dropout
        )
        sr_scores = self.sr_prediction_head(self.sr_sum_dropout(self.sr_pooler(q_rel_d_sent_h))).squeeze(-1)

        # MLM
        q_rel_d_h = layers.multi_head_sum(
            q=self.lm_sent_sum_q(query_h),
            k=self.lm_sent_sum_k(seq_output),
            v=self.lm_sent_sum_v(seq_output),
            mask=(1 - attention_mask),
            head_num=self.config.num_attention_heads,
            attn_dropout=self.lm_sum_dropout
        )

        query_token_num = mlm_ids.size(1)
        aligned_sent_hidden = q_rel_d_h.gather(
            dim=1,
            index=reverse_sentence_index.unsqueeze(-1).expand(-1, -1, seq_output.size(-1))
        )
        concat_word_hidden = torch.cat([seq_output[:, :query_token_num], aligned_sent_hidden], dim=-1)
        mlm_scores = self.lm_prediction_head(concat_word_hidden)

        output_dict = {}
        if mlm_ids is not None and answers is not None and pre_answers is not None:
            sent_mask = sentence_mask.unsqueeze(1).expand(-1, query_num, -1)
            sr_scores = sr_scores + sent_mask * -10000.0

            fol_masked_scores = layers.mask_scores_with_labels(sr_scores, answers).contiguous()
            sr_loss1 = self.loss_fct(fol_masked_scores.view(batch * query_num, -1),
                                     pre_answers.view(-1))
            pre_masked_scores = layers.mask_scores_with_labels(sr_scores, pre_answers).contiguous()
            sr_loss2 = self.loss_fct(pre_masked_scores.view(batch * query_num, -1),
                                     answers.view(-1))

            mlm_loss = self.loss_fct(mlm_scores.view(-1, self.config.vocab_size),
                                     mlm_ids.view(-1))

            loss = sr_loss1 + sr_loss2 + mlm_loss
            output_dict["loss"] = loss

            if not self.training:
                _, mlm_pred = mlm_scores.max(dim=-1)
                mlm_valid_num = (mlm_ids != -1).sum().item()
                mlm_acc = (mlm_pred == mlm_ids).sum().to(loss.dtype).item() / mlm_valid_num
                self.eval_metrics.update("mlm_loss", mlm_loss.item(), mlm_valid_num)
                self.eval_metrics.update("mlm_acc", mlm_acc, mlm_valid_num)

                valid_num1 = (answers != -1).sum().item()
                valid_num2 = (pre_answers != -1).sum().item()
                valid_num = valid_num1 + valid_num2

                _, pred = torch.topk(sr_scores, k=2, dim=-1, largest=True)
                acc1 = (pred == answers.unsqueeze(-1)).sum()
                acc2 = (pred == pre_answers.unsqueeze(-1)).sum()
                acc = (acc1 + acc2).to(dtype=sr_scores.dtype) / (valid_num * 1.0)
                output_dict["acc"] = acc
                output_dict["valid_num"] = valid_num
                self.eval_metrics.update("sr_acc", acc.item(), valid_num)
                self.eval_metrics.update("sr_loss", loss.item(), valid_num)

        return output_dict
class IterRobertaModelForSRAndMLMSimple(IterRobertaModel, LogMixin):
    model_prefix = 'iter_roberta_sr_mlm_s'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)

        self.sr_sent_sum = nn.Linear(config.hidden_size, config.hidden_size)
        self.lm_sent_sum = nn.Linear(config.hidden_size, config.hidden_size)

        self.lm_dropout = nn.Dropout(config.lm_query_dropout)
        self.sr_dropout = nn.Dropout(config.sr_query_dropout)

        self.sr_pooler = layers.Pooler(config.hidden_size)
        self.sr_prediction_head = nn.Linear(config.hidden_size, 1)

        if not self.config.weight_typing:
            word_embedding_weight = deepcopy(self.roberta.get_input_embeddings().weight)
        else:
            word_embedding_weight = self.roberta.get_input_embeddings().weight
        self.vocab_size = word_embedding_weight.size(0)
        self.lm_prediction_head = layers.MaskedLMPredictionHead(config, word_embedding_weight)

        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)

        self.init_weights()

        # metric
        self.init_metric("sr_acc", "sr_loss", "mlm_loss", "mlm_acc")

        logger.info(self.config.to_dict())

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                mlm_ids: torch.Tensor = None, reverse_sentence_index: torch.Tensor = None,
                answers: torch.Tensor = None, pre_answers: torch.Tensor = None, **kwargs):
        hidden_sent, seq_output, sent_word_hidden = super().forward(
            input_ids=input_ids,
            attention_mask=attention_mask,
            sentence_index=sentence_index,
            sentence_mask=sentence_mask,
            sent_word_mask=sent_word_mask
        )

        batch, sent_num, word_num = sent_word_mask.size()
        query_num = answers.size(1)
        query_h = hidden_sent[:, :query_num]
        attention_mask = attention_mask.to(query_h.dtype)

        # SR
        sr_query_h = self.sr_sent_sum(query_h)
        q_rel_d_sent_h, _ = layers.mul_sentence_sum(
            sr_query_h, sent_word_hidden, sent_word_mask
        )
        sr_scores = self.sr_prediction_head(
            self.sr_dropout(self.sr_pooler(q_rel_d_sent_h))
        ).squeeze(-1)

        # MLM
        lm_query_h = self.lm_sent_sum(query_h)
        query_token_num = mlm_ids.size(1)
        q_rel_d_h, _ = layers.mul_weighted_sum(
            lm_query_h, seq_output, 1 - attention_mask
        )
        q_rel_d_h = self.lm_dropout(q_rel_d_h)
        aligned_sent_hidden = q_rel_d_h.gather(
            dim=1,
            index=reverse_sentence_index.unsqueeze(-1).expand(-1, -1, seq_output.size(-1))
        )
        concat_word_hidden = torch.cat([seq_output[:, :query_token_num], aligned_sent_hidden], dim=-1)
        mlm_scores = self.lm_prediction_head(concat_word_hidden)

        output_dict = {}
        if mlm_ids is not None and answers is not None and pre_answers is not None:
            sent_mask = sentence_mask.unsqueeze(1).expand(-1, query_num, -1)
            sr_scores = sr_scores + sent_mask * -10000.0

            fol_masked_scores = layers.mask_scores_with_labels(sr_scores, answers).contiguous()
            sr_loss1 = self.loss_fct(fol_masked_scores.view(batch * query_num, -1),
                                     pre_answers.view(-1))
            pre_masked_scores = layers.mask_scores_with_labels(sr_scores, pre_answers).contiguous()
            sr_loss2 = self.loss_fct(pre_masked_scores.view(batch * query_num, -1),
                                     answers.view(-1))

            mlm_loss = self.loss_fct(mlm_scores.view(-1, self.config.vocab_size),
                                     mlm_ids.view(-1))

            loss = sr_loss1 + sr_loss2 + mlm_loss
            output_dict["loss"] = loss

            if not self.training:
                _, mlm_pred = mlm_scores.max(dim=-1)
                mlm_valid_num = (mlm_ids != -1).sum().item()
                mlm_acc = (mlm_pred == mlm_ids).sum().to(loss.dtype).item() / mlm_valid_num
                self.eval_metrics.update("mlm_loss", mlm_loss.item(), mlm_valid_num)
                self.eval_metrics.update("mlm_acc", mlm_acc, mlm_valid_num)

                valid_num1 = (answers != -1).sum().item()
                valid_num2 = (pre_answers != -1).sum().item()
                valid_num = valid_num1 + valid_num2

                _, pred = torch.topk(sr_scores, k=2, dim=-1, largest=True)
                acc1 = (pred == answers.unsqueeze(-1)).sum()
                acc2 = (pred == pre_answers.unsqueeze(-1)).sum()
                acc = (acc1 + acc2).to(dtype=sr_scores.dtype) / (valid_num * 1.0)
                output_dict["acc"] = acc
                output_dict["valid_num"] = valid_num
                self.eval_metrics.update("sr_acc", acc.item(), valid_num)
                self.eval_metrics.update("sr_loss", loss.item(), valid_num)

        return output_dict
class IterRobertaModelForSRAndMLMWithPosBias(IterRobertaModel, LogMixin):
    model_prefix = 'iter_roberta_sr_mlm_pb'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)

        self.sr_sent_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.sr_sent_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        self.sr_sent_sum_v = nn.Linear(config.hidden_size, config.hidden_size)

        self.lm_sent_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.lm_sent_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        self.lm_sent_sum_v = nn.Linear(config.hidden_size, config.hidden_size)

        self.lm_sum_dropout = nn.Dropout(config.lm_query_dropout)
        self.sr_sum_dropout = nn.Dropout(config.sr_query_dropout)

        self.sr_pooler = layers.Pooler(config.hidden_size)
        self.sr_prediction_head = nn.Linear(config.hidden_size, 1)

        if not self.config.weight_typing:
            word_embedding_weight = deepcopy(self.roberta.get_input_embeddings().weight)
        else:
            word_embedding_weight = self.roberta.get_input_embeddings().weight
        self.vocab_size = word_embedding_weight.size(0)
        self.lm_prediction_head = layers.MultiHeadPositionBiasBasedForMLM(config,
                                                                          word_embedding_weight,
                                                                          config.pos_emb_size)

        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)

        self.init_weights()

        # metric
        self.init_metric("sr_acc", "sr_loss", "mlm_loss", "mlm_acc")

        logger.info(self.config.to_dict())

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                mlm_ids: torch.Tensor = None, true_sent_ids: torch.Tensor = None, reverse_sentence_index=None,
                answers: torch.Tensor = None, pre_answers: torch.Tensor = None, **kwargs):
        hidden_sent, seq_output, sent_word_hidden = super().forward(
            input_ids=input_ids,
            attention_mask=attention_mask,
            sentence_index=sentence_index,
            sentence_mask=sentence_mask,
            sent_word_mask=sent_word_mask
        )

        batch, sent_num, word_num = sent_word_mask.size()
        query_num = answers.size(1)
        query_h = hidden_sent[:, :query_num]
        attention_mask = attention_mask.to(query_h.dtype)

        # SR
        q_rel_d_sent_h = layers.multi_head_sent_sum(
            q=self.sr_sent_sum_q(query_h),
            k=self.sr_sent_sum_k(sent_word_hidden),
            v=self.sr_sent_sum_v(sent_word_hidden),
            mask=sent_word_mask,
            head_num=self.config.num_attention_heads,
            attn_dropout=self.sr_sum_dropout
        )
        sr_scores = self.sr_prediction_head(self.sr_sum_dropout(self.sr_pooler(q_rel_d_sent_h))).squeeze(-1)

        # MLM
        lm_q = self.lm_sent_sum_q(query_h)
        lm_k = self.lm_sent_sum_k(seq_output)
        lm_v = self.lm_sent_sum_v(seq_output)
        q_rel_mh_scores = layers.multi_head_sum(
            q=lm_q,
            k=lm_k,
            v=lm_v,
            mask=(1 - attention_mask),
            head_num=self.config.num_attention_heads,
            attn_dropout=self.lm_sum_dropout,
            return_scores=True
        )  # [batch, head_num, query_num, seq_len]

        query_token_num = mlm_ids.size(1)
        aligned_q_rel_scores = q_rel_mh_scores.gather(
            dim=2,
            index=reverse_sentence_index.view(batch, 1, query_token_num, 1).expand(
                -1, self.config.num_attention_heads, -1, q_rel_mh_scores.size(-1))
        )  # [batch, head_num, query_token_num, seq_len]
        mlm_scores = self.lm_prediction_head(aligned_q_rel_scores,
                                             seq_hidden_k=lm_k,
                                             seq_hidden_v=lm_v,
                                             seq_mask=(1 - attention_mask),
                                             dropout=self.lm_sum_dropout)

        output_dict = {}
        if mlm_ids is not None and answers is not None and pre_answers is not None:
            sent_mask = sentence_mask.unsqueeze(1).expand(-1, query_num, -1)
            sr_scores = sr_scores + sent_mask * -10000.0

            fol_masked_scores = layers.mask_scores_with_labels(sr_scores, answers).contiguous()
            sr_loss1 = self.loss_fct(fol_masked_scores.view(batch * query_num, -1),
                                     pre_answers.view(-1))
            pre_masked_scores = layers.mask_scores_with_labels(sr_scores, pre_answers).contiguous()
            sr_loss2 = self.loss_fct(pre_masked_scores.view(batch * query_num, -1),
                                     answers.view(-1))

            mlm_loss = self.loss_fct(mlm_scores.view(-1, self.config.vocab_size),
                                     mlm_ids.view(-1))

            loss = sr_loss1 + sr_loss2 + mlm_loss
            output_dict["loss"] = loss

            if not self.training:
                _, mlm_pred = mlm_scores.max(dim=-1)
                mlm_valid_num = (mlm_ids != -1).sum().item()
                mlm_acc = (mlm_pred == mlm_ids).sum().to(loss.dtype).item() / mlm_valid_num
                self.eval_metrics.update("mlm_loss", mlm_loss.item(), mlm_valid_num)
                self.eval_metrics.update("mlm_acc", mlm_acc, mlm_valid_num)

                valid_num1 = (answers != -1).sum().item()
                valid_num2 = (pre_answers != -1).sum().item()
                valid_num = valid_num1 + valid_num2

                _, pred = torch.topk(sr_scores, k=2, dim=-1, largest=True)
                acc1 = (pred == answers.unsqueeze(-1)).sum()
                acc2 = (pred == pre_answers.unsqueeze(-1)).sum()
                acc = (acc1 + acc2).to(dtype=sr_scores.dtype) / (valid_num * 1.0)
                output_dict["acc"] = acc
                output_dict["valid_num"] = valid_num
                self.eval_metrics.update("sr_acc", acc.item(), valid_num)
                self.eval_metrics.update("sr_loss", loss.item(), valid_num)

        return output_dict
class IterRobertaModelForMCRC(IterRobertaModel):
    model_prefix = 'iter_roberta_mcrc'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)

        self.sent_sum = nn.Linear(config.hidden_size, config.hidden_size)
        if config.share_ssp_sum:
            self.sr_sent_sum = nn.Linear(config.hidden_size, config.hidden_size)
            self.sent_sum = self.sr_sent_sum

        if config.cls_type == 1:
            self.pooler = nn.Sequential(
                nn.Linear(config.hidden_size * 2, config.hidden_size),
                nn.Tanh()
            )
        else:
            self.pooler = nn.Linear(config.hidden_size * 2, config.hidden_size)
        self.classifier = nn.Linear(config.hidden_size, 1)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)

        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)

        self.init_weights()

    @staticmethod
    def fold_tensor(x):
        if x is None:
            return None
        return x.reshape(x.size(0) * x.size(1), *x.size()[2:])

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                labels=None, **kwargs):
        batch, num_choice, _ = input_ids.size()

        input_ids = self.fold_tensor(input_ids)
        token_type_ids = self.fold_tensor(token_type_ids)
        attention_mask = self.fold_tensor(attention_mask)
        sentence_index = self.fold_tensor(sentence_index)
        sent_word_mask = self.fold_tensor(sent_word_mask)
        sentence_mask = self.fold_tensor(sentence_mask)

        seq_output = self.roberta(input_ids=input_ids,
                                  attention_mask=attention_mask,
                                  token_type_ids=token_type_ids)[0]

        fb, sent_num, seq_len = sent_word_mask.size()
        sentence_index = sentence_index.unsqueeze(-1).expand(
            -1, -1, -1, self.config.hidden_size
        ).reshape(fb, sent_num * seq_len, self.config.hidden_size)

        cls_h = seq_output[:, :1]
        sent_word_hidden = seq_output.gather(dim=1, index=sentence_index).reshape(
            fb, sent_num, seq_len, -1)
        sent_word_hidden = sent_word_hidden * (1 - sent_word_mask.unsqueeze(-1))

        q_op_word_hidden = sent_word_hidden[:, :2].reshape(fb, 1, 2 * seq_len, -1)
        q_op_word_mask = sent_word_mask[:, :2].reshape(fb, 1, 2 * seq_len)
        q_op_hidden_sent = self.query(cls_h, q_op_word_hidden, q_op_word_mask,
                                      aligned=False, residual=False).view(fb, seq_output.size(-1))

        # =====================================

        q_op_query = self.sent_sum(q_op_hidden_sent)
        p_hidden_sent, _ = layers.sentence_sum(q_op_query, sent_word_hidden[:, 2:], sent_word_mask[:, 2:])
        p_hidden_sent = p_hidden_sent * (1 - sentence_mask[:, 2:].unsqueeze(-1))

        attended_h, _ = layers.weighted_sum(q_op_query, p_hidden_sent, sentence_mask[:, 2:])

        cls_input = torch.cat([q_op_hidden_sent, attended_h], dim=-1)
        logits = self.classifier(self.dropout(self.pooler(cls_input))).view(batch, num_choice)

        outputs = (logits,)
        if labels is not None:
            loss = self.loss_fct(logits, labels)
            outputs = (loss,) + outputs

            _, pred = logits.max(dim=-1)
            acc = torch.sum(pred == labels) / (1.0 * batch)
            outputs = outputs + (acc,)

        return outputs
class IterRobertaModelForMCRC3(IterRobertaModel):
    model_prefix = 'iter_roberta_mcrc3'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)
        self.sen_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.sen_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        self.doc_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.doc_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        if config.cls_type == 1:
            self.pooler = nn.Sequential(
                nn.Linear(config.hidden_size * 2, config.hidden_size),
                nn.Tanh()
            )
        else:
            self.pooler = nn.Linear(config.hidden_size * 2, config.hidden_size)
        self.classifier = nn.Linear(config.hidden_size, 1)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)
        self.init_weights()

    @staticmethod
    def fold_tensor(x):
        if x is None:
            return None
        return x.reshape(x.size(0) * x.size(1), *x.size()[2:])

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                labels=None, **kwargs):
        batch, num_choice, _ = input_ids.size()
        input_ids = self.fold_tensor(input_ids)
        token_type_ids = self.fold_tensor(token_type_ids)
        attention_mask = self.fold_tensor(attention_mask)
        sentence_index = self.fold_tensor(sentence_index)
        sent_word_mask = self.fold_tensor(sent_word_mask)
        sentence_mask = self.fold_tensor(sentence_mask)
        seq_output = self.roberta(input_ids=input_ids,
                                  attention_mask=attention_mask,
                                  token_type_ids=token_type_ids)[0]
        fb, sent_num, seq_len = sent_word_mask.size()
        sentence_index = sentence_index.unsqueeze(-1).expand(
            -1, -1, -1, self.config.hidden_size
        ).reshape(fb, sent_num * seq_len, self.config.hidden_size)
        cls_h = seq_output[:, :1]
        sent_word_hidden = seq_output.gather(dim=1, index=sentence_index).reshape(
            fb, sent_num, seq_len, -1)
        sent_word_hidden = sent_word_hidden * (1 - sent_word_mask.unsqueeze(-1))
        q_op_word_hidden = sent_word_hidden[:, :2].reshape(fb, 1, 2 * seq_len, -1)
        q_op_word_mask = sent_word_mask[:, :2].reshape(fb, 1, 2 * seq_len)
        q_op_hidden_sent = self.query(cls_h, q_op_word_hidden, q_op_word_mask,
                                      aligned=False, residual=False).view(fb, seq_output.size(-1))
        # =====================================
        p_hidden_sent, _ = layers.sentence_sum(
            q=self.sen_sum_q(q_op_hidden_sent),
            kv=self.sen_sum_k(sent_word_hidden[:, 2:]),
            mask=sent_word_mask[:, 2:]
        )
        p_hidden_sent = p_hidden_sent * (1 - sentence_mask[:, 2:].unsqueeze(-1))
        attended_h, _scores = layers.weighted_sum(
            q=self.doc_sum_q(q_op_hidden_sent),
            kv=self.doc_sum_k(p_hidden_sent),
            mask=sentence_mask[:, 2:]
        )
        cls_input = torch.cat([q_op_hidden_sent, attended_h], dim=-1)
        logits = self.classifier(self.dropout(self.pooler(cls_input))).view(batch, num_choice)
        outputs = (logits,)
        if labels is not None:
            loss = self.loss_fct(logits, labels)
            outputs = (loss,) + outputs
            _, pred = logits.max(dim=-1)
            acc = torch.sum(pred == labels) / (1.0 * batch)
            outputs = outputs + (acc,)
        return outputs
class IterRobertaModelForSequenceClassificationV3(IterRobertaModel, PredictionMixin):
    model_prefix = 'iter_roberta_sc_v3'

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)
        self.sen_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.sen_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        self.doc_sum_q = nn.Linear(config.hidden_size, config.hidden_size)
        self.doc_sum_k = nn.Linear(config.hidden_size, config.hidden_size)
        if config.cls_type == 1:
            self.pooler = nn.Sequential(
                nn.Linear(config.hidden_size * 2, config.hidden_size),
                nn.Tanh()
            )
        else:
            self.pooler = nn.Linear(config.hidden_size * 2, config.hidden_size)
        self.classifier = nn.Linear(config.hidden_size, config.num_labels)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)
        self.init_weights()

    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                sentence_index=None, sentence_mask=None, sent_word_mask=None,
                labels=None, **kwargs):
        seq_output = self.roberta(input_ids=input_ids,
                                  attention_mask=attention_mask,
                                  token_type_ids=token_type_ids)[0]
        batch, sent_num, seq_len = sent_word_mask.size()
        sentence_index = sentence_index.unsqueeze(-1).expand(
            -1, -1, -1, self.config.hidden_size
        ).reshape(batch, sent_num * seq_len, self.config.hidden_size)
        cls_h = seq_output[:, :1]
        sent_word_hidden = seq_output.gather(dim=1, index=sentence_index).reshape(
            batch, sent_num, seq_len, -1)
        sent_word_hidden = sent_word_hidden * (1 - sent_word_mask.unsqueeze(-1))
        q_op_word_hidden = sent_word_hidden[:, :2].reshape(batch, 1, 2 * seq_len, -1)
        q_op_word_mask = sent_word_mask[:, :2].reshape(batch, 1, 2 * seq_len)
        q_op_hidden_sent = self.query(cls_h, q_op_word_hidden, q_op_word_mask,
                                      aligned=False, residual=False).view(batch, seq_output.size(-1))
        # =====================================
        p_hidden_sent, _ = layers.sentence_sum(
            q=self.sen_sum_q(q_op_hidden_sent),
            kv=self.sen_sum_k(sent_word_hidden[:, 2:]),
            mask=sent_word_mask[:, 2:]
        )
        p_hidden_sent = p_hidden_sent * (1 - sentence_mask[:, 2:].unsqueeze(-1))
        attended_h, _scores = layers.weighted_sum(
            q=self.doc_sum_q(q_op_hidden_sent),
            kv=self.doc_sum_k(p_hidden_sent),
            mask=sentence_mask[:, 2:]
        )
        cls_input = torch.cat([q_op_hidden_sent, attended_h], dim=-1)
        logits = self.classifier(self.dropout(self.pooler(cls_input)))
        outputs = (logits,)
        if labels is not None:
            loss = self.loss_fct(logits, labels)
            outputs = (loss,) + outputs
            _, pred = logits.max(dim=-1)
            acc = torch.sum(pred == labels) / (1.0 * batch)
            outputs = outputs + (acc,)
        # prediction utils
        if not self.training:
            self.concat_predict_tensors(
                sentence_logits=_scores,
                sent_word_ids=input_ids.gather(dim=1, index=sentence_index[:, :, 0]).reshape(
                    batch, sent_num, seq_len))
        return outputs
# Note: the imported RobertaForMultipleChoice is immediately shadowed by the
# local class of the same name defined below.
from transformers.modeling_roberta import RobertaForMultipleChoice


class RobertaForMultipleChoice(RobertaPreTrainedModel):
    model_prefix = 'roberta_mcrc'
    config_class = IterRobertaPreTrainedConfig

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)
        self.roberta = RobertaModel(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        if config.cls_type == 1:
            self.pooler = nn.Sequential(
                nn.Linear(config.hidden_size, config.hidden_size),
                nn.Tanh()
            )
        else:
            self.pooler = nn.Linear(config.hidden_size, config.hidden_size)
        self.classifier = nn.Linear(config.hidden_size, 1)
        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)
        self.init_weights()

    def forward(self, input_ids, attention_mask=None, token_type_ids=None, labels=None, **kwargs):
        batch, num_choices = input_ids.size()[:2]
        input_ids = input_ids.view(-1, input_ids.size(-1)) if input_ids is not None else None
        attention_mask = attention_mask.view(-1, attention_mask.size(-1)) if attention_mask is not None else None
        token_type_ids = token_type_ids.view(-1, token_type_ids.size(-1)) if token_type_ids is not None else None
        seq_output = self.roberta(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids
        )[0]
        logits = self.classifier(self.dropout(self.pooler(seq_output[:, 0])))
        logits = logits.view(-1, num_choices)
        outputs = (logits,)
        if labels is not None:
            loss = self.loss_fct(logits, labels)
            outputs = (loss,) + outputs
            _, pred = logits.max(dim=-1)
            acc = torch.sum(pred == labels) / (1.0 * batch)
            outputs = outputs + (acc,)
        return outputs
class RobertaForSequenceClassification(RobertaPreTrainedModel):
    model_prefix = 'roberta_sc'
    config_class = IterRobertaPreTrainedConfig

    def __init__(self, config: IterRobertaPreTrainedConfig):
        super().__init__(config)
        self.roberta = RobertaModel(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        if config.cls_type == 1:
            self.pooler = nn.Sequential(
                nn.Linear(config.hidden_size, config.hidden_size),
                nn.Tanh()
            )
        else:
            self.pooler = nn.Linear(config.hidden_size, config.hidden_size)
        self.classifier = nn.Linear(config.hidden_size, config.num_labels)
        self.loss_fct = nn.CrossEntropyLoss(ignore_index=-1)
        self.init_weights()

    def forward(self, input_ids, attention_mask=None, token_type_ids=None, labels=None, **kwargs):
        batch, num_choices = input_ids.size()[:2]
        seq_output = self.roberta(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids
        )[0]
        logits = self.classifier(self.dropout(self.pooler(seq_output[:, 0])))
        outputs = (logits,)
        if labels is not None:
            loss = self.loss_fct(logits, labels)
            outputs = (loss,) + outputs
            _, pred = logits.max(dim=-1)
            acc = torch.sum(pred == labels) / (1.0 * batch)
            outputs = outputs + (acc,)
        return outputs
iter_roberta_models_map = {
    IterRobertaModelForSRAndMLM.model_prefix: IterRobertaModelForSRAndMLM,
    IterRobertaModelForSRAndMLMWithPosBias.model_prefix: IterRobertaModelForSRAndMLMWithPosBias,
    IterRobertaModelForSRAndMLMSimple.model_prefix: IterRobertaModelForSRAndMLMSimple,
    IterRobertaModelForMCRC.model_prefix: IterRobertaModelForMCRC,
    IterRobertaModelForMCRC3.model_prefix: IterRobertaModelForMCRC3,
    IterRobertaModelForSequenceClassificationV3.model_prefix: IterRobertaModelForSequenceClassificationV3,
    RobertaForMultipleChoice.model_prefix: RobertaForMultipleChoice,
    RobertaForSequenceClassification.model_prefix: RobertaForSequenceClassification
}
# ===== src/models/triplet_net.py (Real2CAD/Real2CAD-3DV, MIT) =====
from typing import Tuple
import torch
import torch.nn as nn


class TripletNet(nn.Module):
    def __init__(self, network: nn.Module) -> None:
        super().__init__()
        self.network = network

    def forward(self, anchor: torch.Tensor, positive: torch.Tensor, negative: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]:
        anchor = self.network(anchor)
        positive = self.network(positive)
        negative = self.network(negative)
        return anchor, positive, negative

    def embed(self, data: torch.Tensor) -> torch.Tensor:
        return self.network(data)


class TripletNetBatch(nn.Module):
    def __init__(self, network: nn.Module) -> None:
        super().__init__()
        self.network = network

    def forward(self, anchor: torch.Tensor, positive: torch.Tensor) -> torch.Tensor:
        anchor = self.network(anchor)
        positive = self.network(positive)
        return torch.cat((anchor, positive), dim=0)

    def embed(self, data: torch.Tensor) -> torch.Tensor:
        return self.network(data)


class TripletNetBatchMix(nn.Module):
    def __init__(self, network: nn.Module) -> None:
        super().__init__()
        self.network = network

    def forward(self, anchor: torch.Tensor, positive: torch.Tensor) -> torch.Tensor:
        half_shape = int(anchor.shape[0] / 2)
        anchor = self.network(anchor)[0:half_shape - 1, -1]
        positive = self.network(positive)[0:half_shape - 1, -1]
        return torch.cat((anchor, positive), dim=0)

    def embed(self, data: torch.Tensor) -> torch.Tensor:
        return self.network(data)
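All three classes above route every branch through the same `network` module, so the anchor, positive, and negative embeddings share one set of weights. A framework-free sketch of that sharing idea, using a plain function as a stand-in encoder (the names here are illustrative, not from the repository):

```python
def make_triplet_forward(encoder):
    """Apply one shared encoder to anchor, positive and negative inputs."""
    def forward(anchor, positive, negative):
        # The same encoder object handles all three branches, which is
        # what makes the Siamese/triplet setup weight-shared.
        return encoder(anchor), encoder(positive), encoder(negative)
    return forward


def toy_encoder(xs):
    # Stand-in "network": scales every feature by 2.
    return [2.0 * x for x in xs]


triplet_forward = make_triplet_forward(toy_encoder)
a, p, n = triplet_forward([1.0, 2.0], [3.0], [4.0])
```

In the PyTorch version, sharing the single `nn.Module` means one optimizer step updates the encoder seen by all three branches at once.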
# ===== neuron_ml/core/data/__init__.py (fossabot/Neuron, MIT) =====
import neuron_ml.core.data.createml
import neuron_ml.core.data.tensorflow
# ===== libcc/__init__.py (MICLab-Unicamp/inCCsight, BSD-3-Clause) =====
# -*- encoding: utf-8 -*-
from libcc.points import *
from libcc.preprocess import *
from libcc.parcellation import *
from libcc.segmentation import *
from libcc.gets import *
from libcc.shape_signature import *
# ===== python/testData/resolve/multiFile/resolveQualifiedSuperClassInPackage/bar/__init__.py (truthiswill/intellij-community, Apache-2.0) =====
import foo.baz
class Super(foo.baz.SuperDuper):
    pass
# ===== tests/cmds/search/test_cursor_store.py (unparalleled-js/code42cli, MIT) =====
from os import path
import pytest

from code42cli import PRODUCT_NAME
from code42cli.cmds.search.cursor_store import AlertCursorStore
from code42cli.cmds.search.cursor_store import Cursor
from code42cli.cmds.search.cursor_store import FileEventCursorStore
from code42cli.errors import Code42CLIError

PROFILE_NAME = "testprofile"
CURSOR_NAME = "testcursor"
_NAMESPACE = "{}.cmds.search.cursor_store".format(PRODUCT_NAME)


@pytest.fixture
def mock_open(mocker):
    mock = mocker.patch("builtins.open", mocker.mock_open(read_data="123456789"))
    return mock


@pytest.fixture
def mock_isfile(mocker):
    mock = mocker.patch("{}.os.path.isfile".format(_NAMESPACE))
    mock.return_value = True
    return mock
class TestCursor:
    def test_name_returns_expected_name(self):
        cursor = Cursor("bogus/path")
        assert cursor.name == "path"

    def test_value_returns_expected_value(self, mock_open):
        cursor = Cursor("bogus/path")
        assert cursor.value == "123456789"

    def test_value_reads_expected_file(self, mock_open):
        cursor = Cursor("bogus/path")
        _ = cursor.value
        mock_open.assert_called_once_with("bogus/path")
class TestAlertCursorStore:
    def test_get_returns_expected_timestamp(self, mock_open):
        store = AlertCursorStore(PROFILE_NAME)
        checkpoint = store.get(CURSOR_NAME)
        assert checkpoint == 123456789

    def test_get_when_profile_does_not_exist_returns_none(self, mocker):
        store = AlertCursorStore(PROFILE_NAME)
        checkpoint = store.get(CURSOR_NAME)
        mock_open = mocker.patch("{}.open".format(_NAMESPACE))
        mock_open.side_effect = FileNotFoundError
        assert checkpoint is None

    def test_get_reads_expected_file(self, mock_open):
        store = AlertCursorStore(PROFILE_NAME)
        store.get(CURSOR_NAME)
        user_path = path.join(path.expanduser("~"), ".code42cli")
        expected_path = path.join(
            user_path, "alert_checkpoints", PROFILE_NAME, CURSOR_NAME
        )
        mock_open.assert_called_once_with(expected_path)

    def test_replace_writes_to_expected_file(self, mock_open):
        store = AlertCursorStore(PROFILE_NAME)
        store.replace("checkpointname", 123)
        user_path = path.join(path.expanduser("~"), ".code42cli")
        expected_path = path.join(
            user_path, "alert_checkpoints", PROFILE_NAME, "checkpointname"
        )
        mock_open.assert_called_once_with(expected_path, "w")

    def test_replace_writes_expected_content(self, mock_open):
        store = AlertCursorStore(PROFILE_NAME)
        store.replace("checkpointname", 123)
        user_path = path.join(path.expanduser("~"), ".code42cli")
        path.join(user_path, "alert_checkpoints", PROFILE_NAME, "checkpointname")
        mock_open.return_value.write.assert_called_once_with("123")

    def test_delete_calls_remove_on_expected_file(self, mock_open, mock_remove):
        store = AlertCursorStore(PROFILE_NAME)
        store.delete("deleteme")
        user_path = path.join(path.expanduser("~"), ".code42cli")
        expected_path = path.join(
            user_path, "alert_checkpoints", PROFILE_NAME, "deleteme"
        )
        mock_remove.assert_called_once_with(expected_path)

    def test_delete_when_checkpoint_does_not_exist_raises_cli_error(
        self, mock_open, mock_remove
    ):
        store = AlertCursorStore(PROFILE_NAME)
        mock_remove.side_effect = FileNotFoundError
        with pytest.raises(Code42CLIError):
            store.delete("deleteme")

    def test_clean_calls_remove_on_each_checkpoint(
        self, mock_open, mock_remove, mock_listdir, mock_isfile
    ):
        mock_listdir.return_value = ["fileone", "filetwo", "filethree"]
        store = AlertCursorStore(PROFILE_NAME)
        store.clean()
        assert mock_remove.call_count == 3

    def test_get_all_cursors_returns_all_checkpoints(
        self, mock_open, mock_listdir, mock_isfile
    ):
        mock_listdir.return_value = ["fileone", "filetwo", "filethree"]
        store = AlertCursorStore(PROFILE_NAME)
        cursors = store.get_all_cursors()
        assert len(cursors) == 3
        assert cursors[0].name == "fileone"
        assert cursors[1].name == "filetwo"
        assert cursors[2].name == "filethree"
class TestFileEventCursorStore:
    def test_get_returns_expected_timestamp(self, mock_open):
        store = FileEventCursorStore(PROFILE_NAME)
        checkpoint = store.get(CURSOR_NAME)
        assert checkpoint == 123456789

    def test_get_reads_expected_file(self, mock_open):
        store = FileEventCursorStore(PROFILE_NAME)
        store.get(CURSOR_NAME)
        user_path = path.join(path.expanduser("~"), ".code42cli")
        expected_path = path.join(
            user_path, "file_event_checkpoints", PROFILE_NAME, CURSOR_NAME
        )
        mock_open.assert_called_once_with(expected_path)

    def test_get_when_profile_does_not_exist_returns_none(self, mocker):
        store = FileEventCursorStore(PROFILE_NAME)
        checkpoint = store.get(CURSOR_NAME)
        mock_open = mocker.patch("{}.open".format(_NAMESPACE))
        mock_open.side_effect = FileNotFoundError
        assert checkpoint is None

    def test_replace_writes_to_expected_file(self, mock_open):
        store = FileEventCursorStore(PROFILE_NAME)
        store.replace("checkpointname", 123)
        user_path = path.join(path.expanduser("~"), ".code42cli")
        expected_path = path.join(
            user_path, "file_event_checkpoints", PROFILE_NAME, "checkpointname"
        )
        mock_open.assert_called_once_with(expected_path, "w")

    def test_replace_writes_expected_content(self, mock_open):
        store = FileEventCursorStore(PROFILE_NAME)
        store.replace("checkpointname", 123)
        user_path = path.join(path.expanduser("~"), ".code42cli")
        path.join(user_path, "file_event_checkpoints", PROFILE_NAME, "checkpointname")
        mock_open.return_value.write.assert_called_once_with("123")

    def test_delete_calls_remove_on_expected_file(self, mock_open, mock_remove):
        store = FileEventCursorStore(PROFILE_NAME)
        store.delete("deleteme")
        user_path = path.join(path.expanduser("~"), ".code42cli")
        expected_path = path.join(
            user_path, "file_event_checkpoints", PROFILE_NAME, "deleteme"
        )
        mock_remove.assert_called_once_with(expected_path)

    def test_delete_when_checkpoint_does_not_exist_raises_cli_error(
        self, mock_open, mock_remove
    ):
        store = FileEventCursorStore(PROFILE_NAME)
        mock_remove.side_effect = FileNotFoundError
        with pytest.raises(Code42CLIError):
            store.delete("deleteme")

    def test_clean_calls_remove_on_each_checkpoint(
        self, mock_open, mock_remove, mock_listdir, mock_isfile
    ):
        mock_listdir.return_value = ["fileone", "filetwo", "filethree"]
        store = FileEventCursorStore(PROFILE_NAME)
        store.clean()
        assert mock_remove.call_count == 3

    def test_get_all_cursors_returns_all_checkpoints(self, mock_listdir, mock_isfile):
        mock_listdir.return_value = ["fileone", "filetwo", "filethree"]
        store = FileEventCursorStore(PROFILE_NAME)
        cursors = store.get_all_cursors()
        assert len(cursors) == 3
        assert cursors[0].name == "fileone"
        assert cursors[1].name == "filetwo"
        assert cursors[2].name == "filethree"
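The `mock_open` fixture in this file relies on pytest-mock's `mocker`, which wraps the standard `unittest.mock` machinery. The same open-patching trick can be reproduced with the stdlib alone, as a sketch of what the fixture does under the hood:

```python
from unittest import mock

# Patch builtins.open so reading any path yields the canned checkpoint
# value, mirroring mocker.patch("builtins.open", mocker.mock_open(...)).
with mock.patch("builtins.open", mock.mock_open(read_data="123456789")):
    with open("bogus/path") as f:  # no real file access happens here
        checkpoint = f.read()

assert checkpoint == "123456789"
```

Because the patch is scoped to the `with` block, `open` behaves normally again afterwards — pytest-mock gives the same scoping per test automatically.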
# ===== tests/api_resources/identity/test_verification_session.py (bhch/async-stripe, MIT/BSD-3-Clause) =====
from __future__ import absolute_import, division, print_function
import stripe
import pytest

pytestmark = pytest.mark.asyncio

TEST_RESOURCE_ID = "vs_123"


class TestVerificationSession(object):
    async def test_is_creatable(self, request_mock):
        resource = await stripe.identity.VerificationSession.create(type="id_number")
        request_mock.assert_requested(
            "post", "/v1/identity/verification_sessions"
        )
        assert isinstance(resource, stripe.identity.VerificationSession)

    async def test_is_listable(self, request_mock):
        resources = await stripe.identity.VerificationSession.list()
        request_mock.assert_requested(
            "get", "/v1/identity/verification_sessions"
        )
        assert isinstance(resources.data, list)
        assert isinstance(
            resources.data[0], stripe.identity.VerificationSession
        )

    async def test_is_modifiable(self, request_mock):
        resource = await stripe.identity.VerificationSession.modify(
            TEST_RESOURCE_ID, metadata={"key": "value"}
        )
        request_mock.assert_requested(
            "post", "/v1/identity/verification_sessions/%s" % TEST_RESOURCE_ID
        )
        assert isinstance(resource, stripe.identity.VerificationSession)

    async def test_is_retrievable(self, request_mock):
        resource = await stripe.identity.VerificationSession.retrieve(
            TEST_RESOURCE_ID
        )
        request_mock.assert_requested(
            "get", "/v1/identity/verification_sessions/%s" % TEST_RESOURCE_ID
        )
        assert isinstance(resource, stripe.identity.VerificationSession)

    async def test_is_saveable(self, request_mock):
        resource = await stripe.identity.VerificationSession.retrieve(
            TEST_RESOURCE_ID
        )
        resource.metadata["key"] = "value"
        verification_session = await resource.save()
        request_mock.assert_requested(
            "post", "/v1/identity/verification_sessions/%s" % TEST_RESOURCE_ID
        )
        assert isinstance(resource, stripe.identity.VerificationSession)
        assert resource is verification_session

    async def test_can_cancel(self, request_mock):
        resource = await stripe.identity.VerificationSession.retrieve(
            TEST_RESOURCE_ID
        )
        verification_session = await resource.cancel()
        request_mock.assert_requested(
            "post",
            "/v1/identity/verification_sessions/%s/cancel" % TEST_RESOURCE_ID,
        )
        assert isinstance(resource, stripe.identity.VerificationSession)
        assert resource is verification_session

    async def test_can_cancel_classmethod(self, request_mock):
        resource = await stripe.identity.VerificationSession.cancel(TEST_RESOURCE_ID)
        request_mock.assert_requested(
            "post",
            "/v1/identity/verification_sessions/%s/cancel" % TEST_RESOURCE_ID,
        )
        assert isinstance(resource, stripe.identity.VerificationSession)

    async def test_can_redact(self, request_mock):
        resource = await stripe.identity.VerificationSession.retrieve(
            TEST_RESOURCE_ID
        )
        verification_session = await resource.redact()
        request_mock.assert_requested(
            "post",
            "/v1/identity/verification_sessions/%s/redact" % TEST_RESOURCE_ID,
        )
        assert isinstance(resource, stripe.identity.VerificationSession)
        assert resource is verification_session

    async def test_can_redact_classmethod(self, request_mock):
        resource = await stripe.identity.VerificationSession.redact(TEST_RESOURCE_ID)
        request_mock.assert_requested(
            "post",
            "/v1/identity/verification_sessions/%s/redact" % TEST_RESOURCE_ID,
        )
        assert isinstance(resource, stripe.identity.VerificationSession)
# ===== pyuwphysret/common/pyfiles/atmos/testing123.py (graziano-giuliani/pythoncode, MIT) =====
#!/usr/bin/env python
import numpy as num
from e2rh import e2rh
from e2mr import e2mr
from e2dp import e2dp
from rh2mr import rh2mr
from rh2dp import rh2dp
from rh2e import rh2e
from mr2dp import mr2dp
from mr2e import mr2e
from mr2rh import mr2rh
from dp2e import dp2e
from dp2rh import dp2rh
from dp2mr import dp2mr
# Round-trip checks starting from vapour pressure e
p = num.array((1000.0,))
t = num.array((260.0,))
e = num.array((1.0,))
print(p[0])
print(t[0])
print(e[0])
Tconvert = 273.15
rh = e2rh(p,t,e,Tconvert)
print(rh[0][0])
mr = e2mr(p,e)
print(mr[0])
dp = e2dp(e,t,Tconvert)
print(dp[0])
mr = rh2mr(p,t,rh[0],Tconvert)
print(mr[0][0])
dp = rh2dp(p,t,rh[0],Tconvert)
print(dp[0][0])
e = rh2e(p,t,rh[0],Tconvert)
print(e[0][0])
dp = mr2dp(p,t,mr[0],Tconvert)
print(dp[0])
e = mr2e(p,mr[0])
print(e[0])
rh = mr2rh(p,t,mr[0],Tconvert)
print(rh[0][0])
e = dp2e(t,dp,Tconvert)
print(e[0])
rh = dp2rh(p,t,dp,Tconvert)
print(rh[0][0])
mr = dp2mr(p,t,dp,Tconvert)
print(mr[0])
print('OK')
# Round-trip checks starting from relative humidity rh
p = num.array((1000.0,))
t = num.array((260.0,))
rh = num.array((50.0,))
print(p[0])
print(t[0])
print(rh[0])
Tconvert = 273.15
e = rh2e(p,t,rh,Tconvert)
print(e[0][0])
mr = rh2mr(p,t,rh,Tconvert)
print(mr[0][0])
dp = rh2dp(p,t,rh,Tconvert)
print(dp[0][0])
rh = e2rh(p,t,e[0],Tconvert)
print(rh[0][0])
mr = e2mr(p,e[0])
print(mr[0])
dp = e2dp(e[0],t,Tconvert)
print(dp[0])
dp = mr2dp(p,t,mr[0],Tconvert)
print(dp[0])
e = mr2e(p,mr[0])
print(e[0])
rh = mr2rh(p,t,mr[0],Tconvert)
print(rh[0][0])
e = dp2e(t,dp,Tconvert)
print(e[0])
rh = dp2rh(p,t,dp,Tconvert)
print(rh[0][0])
mr = dp2mr(p,t,dp,Tconvert)
print(mr[0])
print('OK')
# Round-trip checks starting from mixing ratio mr
p = num.array((1000.0,))
t = num.array((260.0,))
mr = num.array((1.0,))
print(p[0])
print(t[0])
print(mr[0])
Tconvert = 273.15
dp = mr2dp(p,t,mr,Tconvert)
print(dp[0])
e = mr2e(p,mr)
print(e[0])
rh = mr2rh(p,t,mr,Tconvert)
print(rh[0][0])
e = rh2e(p,t,rh[0],Tconvert)
print(e[0][0])
mr = rh2mr(p,t,rh[0],Tconvert)
print(mr[0][0])
dp = rh2dp(p,t,rh[0],Tconvert)
print(dp[0][0])
rh = e2rh(p,t,e[0],Tconvert)
print(rh[0][0])
mr = e2mr(p,e[0])
print(mr[0])
dp = e2dp(e[0],t,Tconvert)
print(dp[0])
e = dp2e(t,dp,Tconvert)
print(e[0])
rh = dp2rh(p,t,dp,Tconvert)
print(rh[0][0])
mr = dp2mr(p,t,dp,Tconvert)
print(mr[0])
print('OK')
# Round-trip checks starting from dew point dp
p = num.array((1000.0,))
t = num.array((260.0,))
dp = num.array((255.0,))
print(p[0])
print(t[0])
print(dp[0])
Tconvert = 273.15
e = dp2e(t,dp,Tconvert)
print(e[0])
rh = dp2rh(p,t,dp,Tconvert)
print(rh[0][0])
mr = dp2mr(p,t,dp,Tconvert)
print(mr[0])
dp = mr2dp(p,t,mr[0],Tconvert)
print(dp[0])
e = mr2e(p,mr[0])
print(e[0])
rh = mr2rh(p,t,mr[0],Tconvert)
print(rh[0][0])
e = rh2e(p,t,rh[0],Tconvert)
print(e[0][0])
mr = rh2mr(p,t,rh[0],Tconvert)
print(mr[0][0])
dp = rh2dp(p,t,rh[0],Tconvert)
print(dp[0][0])
rh = e2rh(p,t,e[0],Tconvert)
print(rh[0][0])
mr = e2mr(p,e[0])
print(mr[0])
dp = e2dp(e[0],t,Tconvert)
print(dp[0])
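The script above exercises every humidity-variable conversion in both directions and eyeballs the printed values. The same round-trip idea can be stated as assertions instead of prints; shown here on a pair of simple inverse temperature conversions rather than the actual humidity routines, purely to illustrate the pattern:

```python
def c_to_f(c):
    """Celsius to Fahrenheit."""
    return c * 9.0 / 5.0 + 32.0


def f_to_c(f):
    """Fahrenheit to Celsius (inverse of c_to_f)."""
    return (f - 32.0) * 5.0 / 9.0


# Round-tripping through a conversion and its inverse should recover
# the input to within floating-point tolerance.
for c in (-40.0, 0.0, 25.0, 100.0):
    assert abs(f_to_c(c_to_f(c)) - c) < 1e-9
```

Expressed this way the check is silent on success and fails loudly on a regression, which is easier to automate than inspecting printed output.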
# ===== examples/distance_calculation.py (saikat107/OpenNMT-py, MIT) =====
from matplotlib import pyplot as plt
import sys
from examples.util import bigram_edit_distance, bigram_jaccard_distance, calculate_edit_distance, \
create_tree_from_string, read_patch_category, count_number_of_nodes
INDEX = 'INDEX'
EDIT_DISTANCE = 'EDIT_DISTANCE'
TREE_DISTANCE = 'TREE_DISTANCE'
PREV_CODE = 'PREV_CODE'
NEXT_CODE = 'NEXT_CODE'
PREV_TREE_STR = 'PREV_TREE_STR'
NEXT_TREE_STR = "NEXT_TREE_STR"
PREV_TREE = 'PREV_TREE'
NEXT_TREE = 'NEXT_TREE'
BIGRAM_JACCARD = 'BIGRAM_JACCARD'
BIGRAM_EDIT = 'BIGRAM_EDIT_DISTANCE'
TYPE_OF_CHANGE = 'TYPE_OF_CHANGE'
NORMALIZED_TREE_DISTANCE = 'NORMALIZED_TREE_DIST'
eds = []
tds = []
njds = []
neds = []
all_keys = []
ntds = []
dataname = sys.argv[1]  # e.g. 'codit'
intended_class = sys.argv[2]  # e.g. 'only-t2t'
if dataname == 'codit':
complete_split_data_path = \
'/home/saikatc/Research/codit_data/complete_split_data/10_20_original/test'
data_raw_path = '/home/saikatc/Research/OpenNMT-py/c_data/raw/all/concrete/test_new'
correct_indices_path = 'whole-data-stats/codit-state.csv'
else:
complete_split_data_path = \
'/home/saikatc/Research/icse_data_concrete/all/small/test'
data_raw_path = '/home/saikatc/Research/OpenNMT-py/icse_data/raw/all/concrete_small/test_new'
correct_indices_path = 'whole-data-stats/icse-state.csv'
correct_indices_path = 'difference_analysis/' + dataname + '/' + intended_class + '.csv'
patch_classification_file = 'patch-classify/Correctly-predicted-patch-' + dataname + '.txt'
patch_category_dictionary, patch_to_category_dict = read_patch_category(patch_classification_file)
correct_data = {}
with open(correct_indices_path) as cip:
for line in cip:
parts = line.split(',')
idx = int(parts[0].strip())
dist = int(parts[1].strip())
data = {
INDEX: idx,
EDIT_DISTANCE: dist
}
correct_data[idx] = data
prev_code_file = complete_split_data_path + '/parent.code'
next_code_file = complete_split_data_path + '/child.code'
previous_tree_file = complete_split_data_path + '/parent.org.tree'
next_tree_file = complete_split_data_path + '/child.tree'
tree_dictionary = {}
with open(previous_tree_file) as ptfile, open(next_tree_file) as ntfile, \
        open(prev_code_file) as pcfile, open(next_code_file) as ncfile:
    for pc, cc, pt, nt in zip(pcfile, ncfile, ptfile, ntfile):
        pcs = ' '.join(token.strip() for token in pc.strip().split())
        ccs = ' '.join(token.strip() for token in cc.strip().split())
        pts = pt.strip()
        nts = nt.strip()
        key = pcs + ' -> ' + ccs
        tree_dictionary[key] = [pts, nts]
draw_prev_token_file = data_raw_path + '/prev.token'
draw_next_token_file = data_raw_path + '/next.token'
count = 0
# correct_classification_file = open('difference_analysis/' +
# dataname + '/' + intended_class + '-category.txt', 'w')
change_type_map = {}
with open(draw_prev_token_file) as pcfile:
with open(draw_next_token_file) as ncfile:
for idx, (pc, nc) in enumerate(zip(pcfile, ncfile)):
            if idx not in correct_data:
continue
pcs = ' '.join([token.strip() for token in pc.strip().split()])
ccs = ' '.join([token.strip() for token in nc.strip().split()])
correct_data[idx][PREV_CODE] = pcs
correct_data[idx][NEXT_CODE] = ccs
key = pcs + ' -> ' + ccs
            if key not in tree_dictionary:
continue
pts, nts = tree_dictionary[key]
correct_data[idx][PREV_TREE_STR] = pts
correct_data[idx][NEXT_TREE_STR] = nts
correct_data[idx][PREV_TREE] = create_tree_from_string(pts)
correct_data[idx][NEXT_TREE] = create_tree_from_string(nts)
correct_data[idx][TREE_DISTANCE] = calculate_edit_distance(
correct_data[idx][PREV_TREE], correct_data[idx][NEXT_TREE])
correct_data[idx][BIGRAM_EDIT] = bigram_edit_distance(pcs, ccs)
correct_data[idx][BIGRAM_JACCARD] = bigram_jaccard_distance(pcs, ccs)
correct_data[idx][NORMALIZED_TREE_DISTANCE] = \
float(correct_data[idx][TREE_DISTANCE]) / count_number_of_nodes(correct_data[idx][PREV_TREE])
# if correct_data[idx][BIGRAM_EDIT] == 0:
# print(pcs, ' ----> ', ccs)
# print(idx)
# print(correct_data[idx][EDIT_DISTANCE], correct_data[idx][TREE_DISTANCE],
# correct_data[idx][BIGRAM_EDIT], correct_data[idx][BIGRAM_JACCARD])
            # Key-based deduplication is disabled; record every key.
            all_keys.append(key)
# correct_classification_file.write('Example : ' + str(idx) + '\n')
# correct_classification_file.write(pcs + '\n')
# correct_classification_file.write(ccs + '\n')
# correct_classification_file.write(str(correct_data[idx][EDIT_DISTANCE]) +
# ' , ' +
# str(correct_data[idx][TREE_DISTANCE]) +
# ' , ' +
# str(correct_data[idx][BIGRAM_EDIT]) +
# ' , ' +
# str(correct_data[idx][BIGRAM_JACCARD]) +
# '\n'
# )
# print('Previous Version:\t', pcs)
# print('Next Version: \t', ccs)
# print(',\t'.join(change_type_map.keys()))
patch_key = pcs + ccs
# if idx in patch_category_dictionary.keys():
# change_type = patch_category_dictionary[idx]
# else:
# if patch_key in patch_category_dictionary.keys():
# change_type = patch_category_dictionary[patch_key]
# else:
# change_type = input('Enter Change Type: ')
# patch_category_dictionary[patch_key] = change_type
# correct_data[idx][TYPE_OF_CHANGE] = change_type
change_type = ''
            change_type_map[change_type] = change_type_map.get(change_type, 0) + 1
# correct_classification_file.write('Change Type : ' + change_type + '\n')
# correct_classification_file.write('=====================================\n\n')
eds.append(correct_data[idx][EDIT_DISTANCE])
tds.append(correct_data[idx][TREE_DISTANCE])
neds.append(correct_data[idx][BIGRAM_EDIT])
njds.append(correct_data[idx][BIGRAM_JACCARD])
ntds.append(correct_data[idx][NORMALIZED_TREE_DISTANCE])
count += 1
print(count)
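The `bigram_jaccard_distance` imported from `examples.util` and tallied into `njds` above is defined elsewhere in the repository. A minimal sketch of what such a token-bigram Jaccard distance might look like follows; the exact semantics (whitespace tokens, set-based bigrams) are an assumption, not the repository's implementation:

```python
def bigram_jaccard_sketch(prev_code, next_code):
    """1 - Jaccard similarity over whitespace-token bigrams (assumed semantics)."""
    def bigrams(s):
        toks = s.split()
        return set(zip(toks, toks[1:]))
    a, b = bigrams(prev_code), bigrams(next_code)
    if not a and not b:
        return 0.0  # strings of at most one token each: treat as identical
    return 1.0 - len(a & b) / len(a | b)

print(bigram_jaccard_sketch('x = y + 1', 'x = y + 2'))
```

Identical token sequences yield 0.0 and sequences sharing no bigram yield 1.0, matching the usual distance convention.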
intended_class = 'common'
if dataname == 'codit':
complete_split_data_path = \
'/home/saikatc/Research/codit_data/complete_split_data/10_20_original/test'
data_raw_path = '/home/saikatc/Research/OpenNMT-py/c_data/raw/all/concrete/test_new'
correct_indices_path = 'whole-data-stats/codit-state.csv'
else:
complete_split_data_path = \
'/home/saikatc/Research/icse_data_concrete/all/small/test'
data_raw_path = '/home/saikatc/Research/OpenNMT-py/icse_data/raw/all/concrete_small/test_new'
correct_indices_path = 'whole-data-stats/icse-state.csv'
correct_indices_path = 'difference_analysis/' + dataname + '/' + intended_class + '.csv'
patch_classification_file = 'patch-classify/Correctly-predicted-patch-' + dataname + '.txt'
patch_category_dictionary, patch_to_category_dict = read_patch_category(patch_classification_file)
correct_data = {}
with open(correct_indices_path) as cip:
for line in cip:
parts = line.split(',')
idx = int(parts[0].strip())
dist = int(parts[1].strip())
data = {
INDEX: idx,
EDIT_DISTANCE: dist
}
correct_data[idx] = data
prev_code_file = complete_split_data_path + '/parent.code'
next_code_file = complete_split_data_path + '/child.code'
previous_tree_file = complete_split_data_path + '/parent.org.tree'
next_tree_file = complete_split_data_path + '/child.tree'
tree_dictionary = {}
with open(previous_tree_file) as ptfile, open(next_tree_file) as ntfile, \
        open(prev_code_file) as pcfile, open(next_code_file) as ncfile:
    for pc, cc, pt, nt in zip(pcfile, ncfile, ptfile, ntfile):
        pcs = ' '.join(token.strip() for token in pc.strip().split())
        ccs = ' '.join(token.strip() for token in cc.strip().split())
        pts = pt.strip()
        nts = nt.strip()
        key = pcs + ' -> ' + ccs
        tree_dictionary[key] = [pts, nts]
draw_prev_token_file = data_raw_path + '/prev.token'
draw_next_token_file = data_raw_path + '/next.token'
count = 0
# correct_classification_file = open('difference_analysis/' +
# dataname + '/' + intended_class + '-category.txt', 'w')
change_type_map = {}
with open(draw_prev_token_file) as pcfile:
with open(draw_next_token_file) as ncfile:
for idx, (pc, nc) in enumerate(zip(pcfile, ncfile)):
            if idx not in correct_data:
continue
pcs = ' '.join([token.strip() for token in pc.strip().split()])
ccs = ' '.join([token.strip() for token in nc.strip().split()])
correct_data[idx][PREV_CODE] = pcs
correct_data[idx][NEXT_CODE] = ccs
key = pcs + ' -> ' + ccs
            if key not in tree_dictionary:
continue
pts, nts = tree_dictionary[key]
correct_data[idx][PREV_TREE_STR] = pts
correct_data[idx][NEXT_TREE_STR] = nts
correct_data[idx][PREV_TREE] = create_tree_from_string(pts)
correct_data[idx][NEXT_TREE] = create_tree_from_string(nts)
correct_data[idx][TREE_DISTANCE] = calculate_edit_distance(
correct_data[idx][PREV_TREE], correct_data[idx][NEXT_TREE])
correct_data[idx][BIGRAM_EDIT] = bigram_edit_distance(pcs, ccs)
correct_data[idx][BIGRAM_JACCARD] = bigram_jaccard_distance(pcs, ccs)
correct_data[idx][NORMALIZED_TREE_DISTANCE] = \
float(correct_data[idx][TREE_DISTANCE]) / count_number_of_nodes(correct_data[idx][PREV_TREE])
# if correct_data[idx][BIGRAM_EDIT] == 0:
# print(pcs, ' ----> ', ccs)
# print(idx)
# print(correct_data[idx][EDIT_DISTANCE], correct_data[idx][TREE_DISTANCE],
# correct_data[idx][BIGRAM_EDIT], correct_data[idx][BIGRAM_JACCARD])
            # Key-based deduplication is disabled; record every key.
            all_keys.append(key)
# correct_classification_file.write('Example : ' + str(idx) + '\n')
# correct_classification_file.write(pcs + '\n')
# correct_classification_file.write(ccs + '\n')
# correct_classification_file.write(str(correct_data[idx][EDIT_DISTANCE]) +
# ' , ' +
# str(correct_data[idx][TREE_DISTANCE]) +
# ' , ' +
# str(correct_data[idx][BIGRAM_EDIT]) +
# ' , ' +
# str(correct_data[idx][BIGRAM_JACCARD]) +
# '\n'
# )
# print('Previous Version:\t', pcs)
# print('Next Version: \t', ccs)
# print(',\t'.join(change_type_map.keys()))
patch_key = pcs + ccs
# if idx in patch_category_dictionary.keys():
# change_type = patch_category_dictionary[idx]
# else:
# if patch_key in patch_category_dictionary.keys():
# change_type = patch_category_dictionary[patch_key]
# else:
# change_type = input('Enter Change Type: ')
# patch_category_dictionary[patch_key] = change_type
# correct_data[idx][TYPE_OF_CHANGE] = change_type
change_type = ''
            change_type_map[change_type] = change_type_map.get(change_type, 0) + 1
# correct_classification_file.write('Change Type : ' + change_type + '\n')
# correct_classification_file.write('=====================================\n\n')
eds.append(correct_data[idx][EDIT_DISTANCE])
tds.append(correct_data[idx][TREE_DISTANCE])
neds.append(correct_data[idx][BIGRAM_EDIT])
njds.append(correct_data[idx][BIGRAM_JACCARD])
ntds.append(correct_data[idx][NORMALIZED_TREE_DISTANCE])
count += 1
print(count)
# correct_classification_file.close()
f1 = 'difference_analysis/' + dataname + '/final.tree-dist-' + sys.argv[2] + '.pdf'
f2 = 'difference_analysis/' + dataname + '/final.token-dist-' + sys.argv[2] + '.pdf'
f3 = 'difference_analysis/' + dataname + '/pddfs/normal-td-complete' + sys.argv[2] + '.pdf'
plt.rcParams.update({'font.size': 16})
plt.figure()
# plt.hist(eds, label='Edit Distance', alpha=0.5, ls='dashed', edgecolor='b', lw=3, color='b')
# plt.hist(neds, label='Bigram Edit Distance', alpha=0.5, ls='dotted', edgecolor='b', lw=3, color='r')
plt.hist(tds, label='Tree Distance', alpha=0.5, ls='solid', lw=3, color='g', edgecolor='b')
plt.xlabel('Tree Edit Distance')
plt.ylabel('Number of Patches')
#plt.legend()
plt.savefig(fname=f1)
# plt.show()
plt.figure()
plt.hist(eds, label='Token Distance', alpha=0.5, ls='solid', lw=3, color='b', edgecolor='b')
plt.xlabel('Token Edit Distance')
plt.ylabel('Number of Patches')
plt.savefig(fname=f2)
#plt.legend()
# plt.show()
#plt.figure()
#plt.hist(ntds, label='Normalized Tree Distance', alpha=0.5, ls='solid', edgecolor='b', lw=3, color='r')
#plt.savefig(fname=f3)
#plt.legend()
# plt.show()
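`calculate_edit_distance` operates on trees, while the token-level `EDIT_DISTANCE` column read from the CSVs is produced elsewhere. For reference, a token-level Levenshtein distance of the kind these histograms summarize can be sketched as follows; the whitespace-token semantics are an assumption, not the repository's implementation:

```python
def token_edit_distance(prev_code, next_code):
    """Levenshtein distance over whitespace tokens (illustrative sketch)."""
    a, b = prev_code.split(), next_code.split()
    prev_row = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        row = [i]
        for j, y in enumerate(b, 1):
            row.append(min(prev_row[j] + 1,                 # deletion
                           row[j - 1] + 1,                  # insertion
                           prev_row[j - 1] + (x != y)))     # substitution/match
        prev_row = row
    return prev_row[-1]

print(token_edit_distance('return x ;', 'return x + 1 ;'))
```

Inserting the two tokens `+` and `1` turns the first sequence into the second, so the distance here is 2.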
# M4nifest0-Report-V2.py (M4nifest0/M4nifest0-Report-instagramV2, MIT): pyarmor-obfuscated payload; encoded bytecode blob omitted.
74\x8a\x01\x2a\x09\xdf\x47\x29\x89\x15\xf4\x17\x77\x22\x4c\xe1\x45\xe7\x41\x4d\xde\x0d\x4b\x08\x2a\x29\xeb\xdc\x5b\xfa\x6e\x38\x10\x1a\x64\x37\xba\x5e\x39\x9b\x35\x68\x37\x4b\x3c\x94\x12\xe1\x48\xf7\xfb\x1f\x8c\x21\x62\xa7\x55\xcd\x75\x1e\xeb\xce\x27\xd8\x6d\xd1\xed\xf0\x26\x6c\xad\x7e\x71\x3d\x1c\x25\xf9\x74\x2b\x32\xb1\x30\xd0\x83\x7a\xe0\x9a\x31\xc3\x77\xaf\x7f\x22\x54\xaa\xaf\x07\x43\x2a\x65\x2b\x2b\x29\x5d\xd6\xcf\x1e\xb0\x11\x36\x20\x32\x99\xff\xbb\x14\x83\x1a\x46\xb4\xa7\x1e\xdc\xa6\x2c\x1b\x43\x8d\x9d\x45\xa8\x4f\xbb\x79\x19\xac\x14\x05\x77\xa4\x0c\xef\x9a\x4c\x81\x40\xbc\x8e\x78\x1e\x81\x0e\xbe\x3f\x71\x49\x1a\x89\x3c\xb5\xcb\x75\x26\x3d\x63\x6f\xbb\x2f\x6e\xe0\xc8\x1a\x04\x5d\x9b\xf3\x99\x81\x5e\x75\x66\x77\x21\x49\xc9\x21\x29\x29\x84\x17\x96\xd8\x56\xf9\x62\x3c\x12\x4c\xc7\x57\x1c\xc6\xe0\xb7\xfe\x58\xd9\xd9\x56\x97\x92\x31\x32\x89\x13\x57\x76\xfb\x46\x35\x99\xc2\x16\x94\x9c\x9b\x1d\xe8\x0b\xbd\x7c\x3f\x32\xb0\xbd\x68\xb7\xea\x03\x1e\x2b\xa2\xd4\x54\x37\xaa\xe6\x57\xe5\xb7\xc2\x2b\xb3\x1c\x2f\xec\x63\x32\x13\x53\xfe\x83\x62\xef\xe9\xf8\xda\xbd\x95\xb1\xc1\x03\xb0\xd6\xf3\x71\xe4\x19\xb1\xe4\xf6\x2f\x6b\x81\x53\xa4\x5a\x36\x42\xb0\xcf\x21\xf3\x21\xfe\x7b\xa9\x0d\xc8\xe6\x36\xe9\x85\xfd\xe7\xd7\x3d\x2f\x7c\x64\x14\x4a\xbf\x52\xbb\xb3\x49\x87\xa4\x4e\x2c\x37\x15\x17\x00\x5b\x65\x51\x96\xb8\x92\xfb\x9a\x96\x10\x59\xc5\x09\x49\xc4\xcb\x9a\x8a\x8b\x5f\x44\x55\xdb\x1d\x7a\x7c\xbd\x94\x17\xd0\xd8\x96\x38\x6f\xaf\xfd\xc8\xa0\x0c\x99\xe0\x04\xa3\x82\x3d\x8a\x75\xb4\xfd\xcd\xf4\xf8\x9d\x89\x55\x41\xf4\x8d\xff\xe5\xbe\xce\x7f\xe9\x93\xbd\xee\x66\xb0\x0f\xe5\x62\xe4\xf1\xab\x13\x7e\xa5\x1c\xf2\x28\xba\x1b\x3d\x9b\xac\x0a\x11\xce\x9a\x09\xeb\xd8\x46\xf7\x45\x98\xcd\x62\x2a\xf9\xee\x0e\x04\x01\x5c\xa4\xbf\x3c\xdf\x44\xca\x22\xf5\xe0\xbe\x52\xc3\xa2\x60\xb0\xcb\x82\xc2\x2f\x35\x9a\xf1\xd3\x56\x8f\x6a\xd2\x6d\xdc\xa5\x28\xa4\x1c\xaf\x47\xe3\x0e\x3e\x1c\xa2\x40\x75\x3d\x6e\xbf\x7d\xd6\xb2\x45\x71\xc2\x51\x09\x04\xfa\xac\x2f\xc3\x9a\x3d\xba\xdb\x7c\x5c\x69\x1c\x62\xb1\x39\x
5d\x23\x33\x84\x50\xea\x36\xc1\xd2\xd7\xf8\x96\xa6\x68\xca\x59\x99\x68\x21\xd8\xc4\xcd\x2a\x22\xba\xc2\x62\x42\xdd\xf8\x3c\x12\x4e\x28\x22\xab\x06\x5e\x0f\x62\x7e\xc6\x60\x31\x1a\xb6\xf7\xfa\xa0\x1c\x5e\xb7\x83\xa9\xe4\x03\xe0\x15\x8b\xce\xd5\xa3\xb7\xdf\x15\xcd', 2) | 8,109 | 24,267 | 0.750236 | 6,069 | 24,327 | 3.004943 | 0.043665 | 0.006251 | 0.006416 | 0.005264 | 0.002139 | 0.001316 | 0.001316 | 0 | 0 | 0 | 0 | 0.316569 | 0.000411 | 24,327 | 3 | 24,267 | 8,109 | 0.433401 | 0 | 0 | 0 | 0 | 0.333333 | 0.995971 | 0.995971 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |