# -*- coding: utf-8 -*-
# Source: src/openprocurement/api/tests/bidder.py
# Repo: Vanuan/openprocurement.api, revision 7116623574c558f76a5ac6f9ccbe720f418912a0 (Apache-2.0)
import unittest

from openprocurement.api.tests.base import BaseTenderWebTest, test_tender_data


class TenderBidderResourceTest(BaseTenderWebTest):
    initial_status = 'active.tendering'

    def test_create_tender_bidder_invalid(self):
        response = self.app.post_json('/tenders/some_id/bids', {
            'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}}, status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        request_path = '/tenders/{}/bids'.format(self.tender_id)
        response = self.app.post(request_path, 'data', status=415)
        self.assertEqual(response.status, '415 Unsupported Media Type')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description':
                u"Content-Type header should be one of ['application/json']", u'location': u'header', u'name': u'Content-Type'}
        ])
        response = self.app.post(
            request_path, 'data', content_type='application/json', status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'No JSON object could be decoded',
                u'location': u'body', u'name': u'data'}
        ])
        response = self.app.post_json(request_path, 'data', status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Data not available',
                u'location': u'body', u'name': u'data'}
        ])
        response = self.app.post_json(
            request_path, {'not_data': {}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Data not available',
                u'location': u'body', u'name': u'data'}
        ])
        response = self.app.post_json(request_path, {'data': {
            'invalid_field': 'invalid_value'}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Rogue field', u'location':
                u'body', u'name': u'invalid_field'}
        ])
        response = self.app.post_json(request_path, {
            'data': {'tenderers': [{'identifier': 'invalid_value'}]}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': {u'identifier': [
                u'Please use a mapping for this field or Identifier instance instead of unicode.']}, u'location': u'body', u'name': u'tenderers'}
        ])
        response = self.app.post_json(request_path, {
            'data': {'tenderers': [{'identifier': {}}]}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': [u'This field is required.'], u'location': u'body', u'name': u'value'},
            {u'description': [u'contactPoint', u'identifier', u'name', u'address'], u'location': u'body', u'name': u'tenderers'}
        ])
        response = self.app.post_json(request_path, {'data': {'tenderers': [{
            'name': 'name', 'identifier': {'uri': 'invalid_value'}}]}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': [u'This field is required.'], u'location': u'body', u'name': u'value'},
            {u'description': [u'contactPoint', u'identifier', u'address'], u'location': u'body', u'name': u'tenderers'}
        ])
        response = self.app.post_json(request_path, {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500, 'valueAddedTaxIncluded': False}}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': [u'valueAddedTaxIncluded of bid should be identical to valueAddedTaxIncluded of value of tender'], u'location': u'body', u'name': u'bids'},
        ])
        response = self.app.post_json(request_path, {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500, 'currency': "USD"}}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': [u'currency of bid should be identical to currency of value of tender'], u'location': u'body', u'name': u'bids'},
        ])

    def test_create_tender_bidder(self):
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}})
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        bidder = response.json['data']
        self.assertEqual(bidder['tenderers'][0]['name'], test_tender_data["procuringEntity"]['name'])
        self.assertTrue('id' in bidder)
        self.assertTrue(bidder['id'] in response.headers['Location'])
        self.set_status('complete')
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}}, status=403)
        self.assertEqual(response.status, '403 Forbidden')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['errors'][0]["description"], "Can't add bid in current tender status")

    def test_patch_tender_bidder(self):
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}})
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        bidder = response.json['data']
        response = self.app.patch_json('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']), {"data": {"value": {"amount": 600}}}, status=422)
        self.assertEqual(response.status, '422 Unprocessable Entity')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {"location": "body", "name": "bids", "description": ["value of bid should be less than value of tender"]}
        ])
        response = self.app.patch_json('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']), {"data": {"value": {"amount": 400}}})
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['data']["value"]["amount"], 400)
        response = self.app.patch_json('/tenders/{}/bids/some_id'.format(self.tender_id), {"data": {"value": {"amount": 400}}}, status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.patch_json('/tenders/some_id/bids/some_id', {"data": {"value": {"amount": 400}}}, status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        self.set_status('complete')
        response = self.app.get('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['data']["value"]["amount"], 400)
        response = self.app.patch_json('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']), {"data": {"value": {"amount": 400}}}, status=403)
        self.assertEqual(response.status, '403 Forbidden')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['errors'][0]["description"], "Can't update bid in current tender status")

    def test_get_tender_bidder(self):
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}})
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        bidder = response.json['data']
        response = self.app.get('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['data'], {})
        self.set_status('active.qualification')
        response = self.app.get('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        bidder_data = response.json['data']
        #self.assertTrue(u'participationUrl' in bidder_data)
        #bidder_data.pop(u'participationUrl')
        self.assertEqual(bidder_data, bidder)
        response = self.app.get('/tenders/{}/bids/some_id'.format(self.tender_id), status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.get('/tenders/some_id/bids/some_id', status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        response = self.app.delete('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']), status=403)
        self.assertEqual(response.status, '403 Forbidden')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['errors'][0]["description"], "Can't delete bid in current tender status")

    def test_delete_tender_bidder(self):
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}})
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        bidder = response.json['data']
        response = self.app.delete('/tenders/{}/bids/{}'.format(self.tender_id, bidder['id']))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['data'], bidder)
        revisions = self.db.get(self.tender_id).get('revisions')
        self.assertEqual(revisions[-2][u'changes'][0]['op'], u'remove')
        self.assertEqual(revisions[-2][u'changes'][0]['path'], u'/bids')
        self.assertEqual(revisions[-1][u'changes'][0]['op'], u'add')
        self.assertEqual(revisions[-1][u'changes'][0]['path'], u'/bids')
        response = self.app.delete('/tenders/{}/bids/some_id'.format(self.tender_id), status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.delete('/tenders/some_id/bids/some_id', status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])

    def test_get_tender_tenderers(self):
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}})
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        bidder = response.json['data']
        response = self.app.get('/tenders/{}/bids'.format(self.tender_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['data'], [])
        self.set_status('active.qualification')
        response = self.app.get('/tenders/{}/bids'.format(self.tender_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['data'][0], bidder)
        response = self.app.get('/tenders/some_id/bids', status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])


class TenderBidderDocumentResourceTest(BaseTenderWebTest):
    initial_status = 'active.tendering'

    def setUp(self):
        super(TenderBidderDocumentResourceTest, self).setUp()
        # Create bid
        response = self.app.post_json('/tenders/{}/bids'.format(
            self.tender_id), {'data': {'tenderers': [test_tender_data["procuringEntity"]], "value": {"amount": 500}}})
        bid = response.json['data']
        self.bid_id = bid['id']

    def test_not_found(self):
        response = self.app.post('/tenders/some_id/bids/some_id/documents', status=404, upload_files=[
            ('file', 'name.doc', 'content')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        response = self.app.post('/tenders/{}/bids/some_id/documents'.format(self.tender_id), status=404, upload_files=[('file', 'name.doc', 'content')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.post('/tenders/{}/bids/{}/documents'.format(self.tender_id, self.bid_id), status=404, upload_files=[
            ('invalid_value', 'name.doc', 'content')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'body', u'name': u'file'}
        ])
        response = self.app.get('/tenders/some_id/bids/some_id/documents', status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        response = self.app.get('/tenders/{}/bids/some_id/documents'.format(self.tender_id), status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.get('/tenders/some_id/bids/some_id/documents/some_id', status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        response = self.app.get('/tenders/{}/bids/some_id/documents/some_id'.format(self.tender_id), status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.get('/tenders/{}/bids/{}/documents/some_id'.format(self.tender_id, self.bid_id), status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'document_id'}
        ])
        response = self.app.put('/tenders/some_id/bids/some_id/documents/some_id', status=404,
                                upload_files=[('file', 'name.doc', 'content2')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'tender_id'}
        ])
        response = self.app.put('/tenders/{}/bids/some_id/documents/some_id'.format(self.tender_id), status=404, upload_files=[
            ('file', 'name.doc', 'content2')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'url', u'name': u'bid_id'}
        ])
        response = self.app.put('/tenders/{}/bids/{}/documents/some_id'.format(
            self.tender_id, self.bid_id), status=404, upload_files=[('file', 'name.doc', 'content2')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location': u'url', u'name': u'document_id'}
        ])

    def test_create_tender_bidder_document(self):
        response = self.app.post('/tenders/{}/bids/{}/documents'.format(
            self.tender_id, self.bid_id), upload_files=[('file', 'name.doc', 'content')])
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        doc_id = response.json["data"]['id']
        self.assertTrue(doc_id in response.headers['Location'])
        self.assertEqual('name.doc', response.json["data"]["title"])
        key = response.json["data"]["url"].split('?')[-1]
        response = self.app.get('/tenders/{}/bids/{}/documents'.format(self.tender_id, self.bid_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"][0]["id"])
        self.assertEqual('name.doc', response.json["data"][0]["title"])
        response = self.app.get('/tenders/{}/bids/{}/documents?all=true'.format(self.tender_id, self.bid_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"][0]["id"])
        self.assertEqual('name.doc', response.json["data"][0]["title"])
        response = self.app.get('/tenders/{}/bids/{}/documents/{}?download=some_id'.format(
            self.tender_id, self.bid_id, doc_id), status=404)
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location': u'url', u'name': u'download'}
        ])
        response = self.app.get('/tenders/{}/bids/{}/documents/{}?{}'.format(
            self.tender_id, self.bid_id, doc_id, key))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/msword')
        self.assertEqual(response.content_length, 7)
        self.assertEqual(response.body, 'content')
        response = self.app.get('/tenders/{}/bids/{}/documents/{}'.format(
            self.tender_id, self.bid_id, doc_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"]["id"])
        self.assertEqual('name.doc', response.json["data"]["title"])
        self.set_status('active.awarded')
        response = self.app.post('/tenders/{}/bids/{}/documents'.format(
            self.tender_id, self.bid_id), upload_files=[('file', 'name.doc', 'content')], status=403)
        self.assertEqual(response.status, '403 Forbidden')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['errors'][0]["description"], "Can't add document in current tender status")

    def test_put_tender_bidder_document(self):
        response = self.app.post('/tenders/{}/bids/{}/documents'.format(
            self.tender_id, self.bid_id), upload_files=[('file', 'name.doc', 'content')])
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        doc_id = response.json["data"]['id']
        self.assertTrue(doc_id in response.headers['Location'])
        response = self.app.put('/tenders/{}/bids/{}/documents/{}'.format(self.tender_id, self.bid_id, doc_id),
                                status=404,
                                upload_files=[('invalid_name', 'name.doc', 'content')])
        self.assertEqual(response.status, '404 Not Found')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['status'], 'error')
        self.assertEqual(response.json['errors'], [
            {u'description': u'Not Found', u'location':
                u'body', u'name': u'file'}
        ])
        response = self.app.put('/tenders/{}/bids/{}/documents/{}'.format(
            self.tender_id, self.bid_id, doc_id), upload_files=[('file', 'name.doc', 'content2')])
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"]["id"])
        key = response.json["data"]["url"].split('?')[-1]
        response = self.app.get('/tenders/{}/bids/{}/documents/{}?{}'.format(
            self.tender_id, self.bid_id, doc_id, key))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/msword')
        self.assertEqual(response.content_length, 8)
        self.assertEqual(response.body, 'content2')
        response = self.app.get('/tenders/{}/bids/{}/documents/{}'.format(
            self.tender_id, self.bid_id, doc_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"]["id"])
        self.assertEqual('name.doc', response.json["data"]["title"])
        response = self.app.put('/tenders/{}/bids/{}/documents/{}'.format(
            self.tender_id, self.bid_id, doc_id), 'content3', content_type='application/msword')
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"]["id"])
        key = response.json["data"]["url"].split('?')[-1]
        response = self.app.get('/tenders/{}/bids/{}/documents/{}?{}'.format(
            self.tender_id, self.bid_id, doc_id, key))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/msword')
        self.assertEqual(response.content_length, 8)
        self.assertEqual(response.body, 'content3')
        self.set_status('active.awarded')
        response = self.app.put('/tenders/{}/bids/{}/documents/{}'.format(
            self.tender_id, self.bid_id, doc_id), upload_files=[('file', 'name.doc', 'content3')], status=403)
        self.assertEqual(response.status, '403 Forbidden')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['errors'][0]["description"], "Can't update document in current tender status")

    def test_patch_tender_bidder_document(self):
        response = self.app.post('/tenders/{}/bids/{}/documents'.format(
            self.tender_id, self.bid_id), upload_files=[('file', 'name.doc', 'content')])
        self.assertEqual(response.status, '201 Created')
        self.assertEqual(response.content_type, 'application/json')
        doc_id = response.json["data"]['id']
        self.assertTrue(doc_id in response.headers['Location'])
        response = self.app.patch_json('/tenders/{}/bids/{}/documents/{}'.format(self.tender_id, self.bid_id, doc_id), {"data": {"description": "document description"}})
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"]["id"])
        response = self.app.get('/tenders/{}/bids/{}/documents/{}'.format(
            self.tender_id, self.bid_id, doc_id))
        self.assertEqual(response.status, '200 OK')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(doc_id, response.json["data"]["id"])
        self.assertEqual('document description', response.json["data"]["description"])
        self.set_status('active.awarded')
        response = self.app.patch_json('/tenders/{}/bids/{}/documents/{}'.format(self.tender_id, self.bid_id, doc_id), {"data": {"description": "document description"}}, status=403)
        self.assertEqual(response.status, '403 Forbidden')
        self.assertEqual(response.content_type, 'application/json')
        self.assertEqual(response.json['errors'][0]["description"], "Can't update document in current tender status")


def suite():
    suite = unittest.TestSuite()
    suite.addTest(unittest.makeSuite(TenderBidderDocumentResourceTest))
    suite.addTest(unittest.makeSuite(TenderBidderResourceTest))
    return suite


if __name__ == '__main__':
    unittest.main(defaultTest='suite')
# Source: tests/test_emails.py
# Repo: seattleopendata/scrubadub, revision 00522458640d1ba6eddf5b2772ebd0bbf62cb4e2 (MIT)
import unittest
from base import BaseTestCase


class EmailTestCase(unittest.TestCase, BaseTestCase):

    def test_clean_gmail_john(self):
        """
        BEFORE: My email is john@gmail.com
        AFTER: My email is {{EMAIL}}
        """
        self.compare_clean_before_after()

    def test_clean_fancy_gmail_john(self):
        """
        BEFORE: My email is john at gmail.com
        AFTER: My email is {{EMAIL}}
        """
        self.compare_clean_before_after()

    def test_scan_gmail_john(self):
        """
        BEFORE: My email is john@gmail.com
        AFTER: email
        """
        self.compare_scan_before_after()

    def test_scan_fancy_gmail_john(self):
        """
        BEFORE: My email is john at gmail.com
        AFTER: email
        """
        self.compare_scan_before_after()
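The tests above keep their fixtures in the docstrings: the inherited `compare_*_before_after` helpers read the `BEFORE:`/`AFTER:` lines and compare them. A minimal, hypothetical sketch of that convention (with a toy regex standing in for scrubadub's cleaner; scrubadub itself is not required here):

```python
import re

def parse_before_after(docstring):
    # extract the BEFORE and AFTER lines from a test docstring
    before = re.search(r"BEFORE:\s*(.*)", docstring).group(1).strip()
    after = re.search(r"AFTER:\s*(.*)", docstring).group(1).strip()
    return before, after

doc = """
BEFORE: My email is john@gmail.com
AFTER: My email is {{EMAIL}}
"""
before, after = parse_before_after(doc)
cleaned = re.sub(r"\S+@\S+", "{{EMAIL}}", before)  # toy stand-in for the real cleaner
assert cleaned == after
```

This is only an illustration of the docstring-driven comparison pattern, not the actual `BaseTestCase` implementation.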
# --- CRM-Project/crm/filters.py (from wiky-avis/trainee-domclick-test, MIT) ---
import django_filters
from django import forms

from config import settings

from .models import Request


class FilterRequestsDashboardView(django_filters.FilterSet):
    status = django_filters.MultipleChoiceFilter(
        field_name='status',
        choices=settings.STATUS,
        label='Request status:'
    )
    specific_date = django_filters.DateFilter(
        field_name='created',
        lookup_expr='date',
        widget=forms.SelectDateWidget(),
        label='Specific date:'
    )
    start_date = django_filters.DateFilter(
        field_name='created',
        lookup_expr='date__gt',
        widget=forms.SelectDateWidget(),
        label='Date after:'
    )
    end_date = django_filters.DateFilter(
        field_name='created',
        lookup_expr='date__lt',
        widget=forms.SelectDateWidget(),
        label='Date before:'
    )

    class Meta:
        model = Request
        fields = ['subject']


class FilterRequestsView(django_filters.FilterSet):
    status = django_filters.MultipleChoiceFilter(
        field_name='status',
        choices=settings.STATUS,
        label='Request status:'
    )
    specific_date = django_filters.DateFilter(
        field_name='created',
        lookup_expr='date',
        widget=forms.SelectDateWidget(),
        label='Specific date:'
    )
    start_date = django_filters.DateFilter(
        field_name='created',
        lookup_expr='date__gt',
        widget=forms.SelectDateWidget(),
        label='Date after:'
    )
    end_date = django_filters.DateFilter(
        field_name='created',
        lookup_expr='date__lt',
        widget=forms.SelectDateWidget(),
        label='Date before:'
    )

    class Meta:
        model = Request
        fields = ['status']
# --- my_library/__init__.py (from ShalyginaA/allennlp-language-predictor, MIT) ---
from my_library.reader import *
from my_library.model import *
from my_library.predictor import *

# --- tests/common.py (from taro-kayo/statelint, Apache-2.0) ---
import os.path

def get_path(data_file_name):
    return os.path.join(os.path.dirname(__file__), "data", data_file_name)
# --- autoPyTorch/utils/benchmarking/visualization_pipeline/__init__.py (from thomascherickal/Auto-PyTorch, BSD-3-Clause) ---
from autoPyTorch.utils.benchmarking.visualization_pipeline.collect_trajectories import CollectAutoNetConfigTrajectories, CollectRunTrajectories
from autoPyTorch.utils.benchmarking.visualization_pipeline.get_run_trajectories import GetRunTrajectories
from autoPyTorch.utils.benchmarking.visualization_pipeline.plot_trajectories import PlotTrajectories
from autoPyTorch.utils.benchmarking.visualization_pipeline.read_instance_info import ReadInstanceInfo
from autoPyTorch.utils.benchmarking.visualization_pipeline.visualization_settings import VisualizationSettings
from autoPyTorch.utils.benchmarking.visualization_pipeline.for_instance import ForInstance

# --- barcodes/tests/test_api_views.py (from bihealth/digestiflow-server, MIT) ---
# TODO: check timeline events
import json
from test_plus.test import APITestCase
from digestiflow.test_utils import SetupUserMixin, SetupProjectMixin, AuthenticatedRequestMixin
from ..models import BarcodeSet, BarcodeSetEntry
from ..tests import SetupBarcodeSetMixin

class BarcodeSetListCreateApiViewTest(
    SetupBarcodeSetMixin, SetupProjectMixin, SetupUserMixin, AuthenticatedRequestMixin, APITestCase
):
    """Tests for creation of barcode sets using REST API"""

    url_name = "api:barcodesets"

    def testGet(self):
        """Test that querying API for the barcode set list works (with super user)"""
        response = self.runGet(self.root)
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(len(data), 1)

    def testGetAccessDenied(self):
        """Test that access is denied if role assignment is missing"""
        self.runGet(None)
        self.response_401()
        for user in (self.norole, self.unrelated_owner):
            self.runGet(user)
            self.response_403()

    def testGetAccessAllowed(self):
        """Test that access is allowed if role assignment is correct"""
        for user in (self.guest, self.contributor, self.delegate, self.owner, self.root):
            response = self.runGet(user)
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(len(data), 1)

    def testPost(self):
        """Test that creating a barcode set via API works (with super user)"""
        response = self.runPost(self.root, data=self.barcode_set_api_post_data)
        self.response_201(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertIn("sodar_uuid", data)

    def testPostAccessDenied(self):
        """Test that creating a barcode set via API is denied if role assignment is missing"""
        self.runPost(None, data=self.barcode_set_api_post_data)
        self.response_401()
        for user in (self.guest, self.norole, self.unrelated_owner):
            self.runPost(user, data=self.barcode_set_api_post_data)
            self.response_403()

    def testPostAccessAllowed(self):
        """Test that creating a barcode set via API is allowed if role assignment is correct"""
        for user in (self.contributor, self.delegate, self.owner, self.root):
            response = self.runPost(user, data=self.barcode_set_api_post_data)
            self.response_201(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertIn("sodar_uuid", data)
            BarcodeSet.objects.filter(sodar_uuid=data["sodar_uuid"]).delete()

class BarcodeSetUpdateApiViewTest(
    SetupBarcodeSetMixin, SetupProjectMixin, SetupUserMixin, AuthenticatedRequestMixin, APITestCase
):
    """Tests for detail view, update, delete of barcode sets using REST API"""

    url_name = "api:barcodesets"

    def testGet(self):
        """Test that querying API for a single barcode set works (with super user)"""
        response = self.runGet(self.root, barcodeset=self.barcode_set.sodar_uuid)
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(data["sodar_uuid"], str(self.barcode_set.sodar_uuid))

    def testGetAccessDenied(self):
        """Test that access is denied if role assignment is missing"""
        self.runGet(None, barcodeset=self.barcode_set.sodar_uuid)
        self.response_401()
        for user in (self.norole, self.unrelated_owner):
            self.runGet(user, barcodeset=self.barcode_set.sodar_uuid)
            self.response_403()

    def testGetAccessAllowed(self):
        """Test that access is allowed if role assignment is correct"""
        for user in (self.guest, self.contributor, self.delegate, self.owner, self.root):
            response = self.runGet(user, barcodeset=self.barcode_set.sodar_uuid)
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(data["sodar_uuid"], str(self.barcode_set.sodar_uuid))

    def testUpdate(self):
        """Test that updating a barcode set via API works (with super user)"""
        response = self.runPut(
            self.root, barcodeset=self.barcode_set.sodar_uuid, data=self.barcode_set_api_post_data
        )
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(data["short_name"], self.barcode_set_api_post_data["short_name"])

    def testUpdateAccessDenied(self):
        """Test that updating a barcode set via API is denied if role assignment is missing"""
        self.runPut(
            None, barcodeset=self.barcode_set.sodar_uuid, data=self.barcode_set_api_post_data
        )
        self.response_401()
        for user in (self.guest, self.norole, self.unrelated_owner):
            self.runPut(
                user, barcodeset=self.barcode_set.sodar_uuid, data=self.barcode_set_api_post_data
            )
            self.response_403()

    def testUpdateAccessAllowed(self):
        """Test that updating a barcode set via API is allowed if role assignment is correct"""
        for user in (self.contributor, self.delegate, self.owner, self.root):
            response = self.runPut(
                user, barcodeset=self.barcode_set.sodar_uuid, data=self.barcode_set_api_post_data
            )
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(data["short_name"], self.barcode_set_api_post_data["short_name"])

    def testDelete(self):
        """Test that deleting a barcode set via API works (with super user)"""
        self.assertEqual(BarcodeSet.objects.count(), 1)
        response = self.runDelete(self.root, barcodeset=self.barcode_set.sodar_uuid)
        self.response_204(response)
        self.assertEqual(BarcodeSet.objects.count(), 0)

    def testDeleteAccessDenied(self):
        """Test that deleting a barcode set via API is denied if role assignment is missing"""
        self.assertEqual(BarcodeSet.objects.count(), 1)
        self.runDelete(None, barcodeset=self.barcode_set.sodar_uuid)
        self.assertEqual(BarcodeSet.objects.count(), 1)
        self.response_401()
        for user in (self.guest, self.norole, self.unrelated_owner):
            self.assertEqual(BarcodeSet.objects.count(), 1)
            self.runDelete(user, barcodeset=self.barcode_set.sodar_uuid)
            self.assertEqual(BarcodeSet.objects.count(), 1)
            self.response_403()

    def testDeleteAccessAllowed(self):
        """Test that deleting a barcode set via API is allowed if role assignment is correct"""
        for user in (self.contributor, self.delegate, self.owner, self.root):
            BarcodeSet.objects.all().delete()
            barcode_set = self.make_barcode_set()
            self.assertEqual(BarcodeSet.objects.count(), 1)
            response = self.runDelete(user, barcodeset=barcode_set.sodar_uuid)
            self.response_204(response)
            self.assertEqual(BarcodeSet.objects.count(), 0)

class BarcodeSetEntryListCreateApiViewTest(
    SetupBarcodeSetMixin, SetupProjectMixin, SetupUserMixin, AuthenticatedRequestMixin, APITestCase
):
    """Tests for creation of barcode set entries using REST API"""

    url_name = "api:barcodesetentries"

    def testGet(self):
        """Test that querying API for the barcode set entry list works (with super user)"""
        response = self.runGet(self.root, barcodeset=self.barcode_set.sodar_uuid)
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(len(data), 1)

    def testGetAccessDenied(self):
        """Test that access is denied if role assignment is missing"""
        self.runGet(None, barcodeset=self.barcode_set.sodar_uuid)
        self.response_401()
        for user in (self.norole, self.unrelated_owner):
            self.runGet(user, barcodeset=self.barcode_set.sodar_uuid)
            self.response_403()

    def testGetAccessAllowed(self):
        """Test that access is allowed if role assignment is correct"""
        for user in (self.guest, self.contributor, self.delegate, self.owner, self.root):
            response = self.runGet(user, barcodeset=self.barcode_set.sodar_uuid)
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(len(data), 1)

    def testPost(self):
        """Test that creating a barcode set entry via API works (with super user)"""
        response = self.runPost(
            self.root,
            barcodeset=self.barcode_set.sodar_uuid,
            data=self.barcode_set_entry_api_post_data,
        )
        self.response_201(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertIn("sodar_uuid", data)

    def testPostAccessDenied(self):
        """Test that creating a barcode set entry via API is denied if role assignment is missing"""
        self.runPost(
            None, barcodeset=self.barcode_set.sodar_uuid, data=self.barcode_set_entry_api_post_data
        )
        self.response_401()
        for user in (self.guest, self.norole, self.unrelated_owner):
            self.runPost(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                data=self.barcode_set_entry_api_post_data,
            )
            self.response_403()

    def testPostAccessAllowed(self):
        """Test that creating a barcode set entry via API is allowed if role assignment is correct"""
        for user in (self.contributor, self.delegate, self.owner, self.root):
            response = self.runPost(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                data=self.barcode_set_entry_api_post_data,
            )
            self.response_201(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertIn("sodar_uuid", data)
            BarcodeSetEntry.objects.filter(sodar_uuid=data["sodar_uuid"]).delete()

class BarcodeSetEntryUpdateDeleteApiViewTest(
    SetupBarcodeSetMixin, SetupProjectMixin, SetupUserMixin, AuthenticatedRequestMixin, APITestCase
):
    """Tests for update and delete of barcode set entries using REST API"""

    url_name = "api:barcodesetentries"

    def testGet(self):
        """Test that querying API for a single barcode set entry works (with super user)"""
        response = self.runGet(
            self.root,
            barcodeset=self.barcode_set.sodar_uuid,
            barcodesetentry=self.barcode_set_entry.sodar_uuid,
        )
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(data["sodar_uuid"], str(self.barcode_set_entry.sodar_uuid))

    def testGetAccessDenied(self):
        """Test that access is denied if role assignment is missing"""
        self.runGet(
            None,
            barcodeset=self.barcode_set.sodar_uuid,
            barcodesetentry=self.barcode_set_entry.sodar_uuid,
        )
        self.response_401()
        for user in (self.norole, self.unrelated_owner):
            self.runGet(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                barcodesetentry=self.barcode_set_entry.sodar_uuid,
            )
            self.response_403()

    def testGetAccessAllowed(self):
        """Test that access is allowed if role assignment is correct"""
        for user in (self.guest, self.contributor, self.delegate, self.owner, self.root):
            response = self.runGet(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                barcodesetentry=self.barcode_set_entry.sodar_uuid,
            )
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(data["sodar_uuid"], str(self.barcode_set_entry.sodar_uuid))

    def testUpdate(self):
        """Test that updating a barcode set entry via API works (with super user)"""
        response = self.runPut(
            self.root,
            barcodeset=self.barcode_set.sodar_uuid,
            barcodesetentry=self.barcode_set_entry.sodar_uuid,
            data=self.barcode_set_entry_api_post_data,
        )
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(data["name"], self.barcode_set_entry_api_post_data["name"])

    def testUpdateAccessDenied(self):
        """Test that updating a barcode set entry via API is denied if role assignment is missing"""
        self.runPut(
            None,
            barcodeset=self.barcode_set.sodar_uuid,
            barcodesetentry=self.barcode_set_entry.sodar_uuid,
            data=self.barcode_set_entry_api_post_data,
        )
        self.response_401()
        for user in (self.guest, self.norole, self.unrelated_owner):
            self.runPut(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                barcodesetentry=self.barcode_set_entry.sodar_uuid,
                data=self.barcode_set_entry_api_post_data,
            )
            self.response_403()

    def testUpdateAccessAllowed(self):
        """Test that updating a barcode set entry via API is allowed if role assignment is correct"""
        for user in (self.contributor, self.delegate, self.owner, self.root):
            response = self.runPut(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                barcodesetentry=self.barcode_set_entry.sodar_uuid,
                data=self.barcode_set_entry_api_post_data,
            )
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(data["name"], self.barcode_set_entry_api_post_data["name"])

    def testDelete(self):
        """Test that deleting a barcode set entry via API works (with super user)"""
        self.assertEqual(BarcodeSetEntry.objects.count(), 1)
        response = self.runDelete(
            self.root,
            barcodeset=self.barcode_set.sodar_uuid,
            barcodesetentry=self.barcode_set_entry.sodar_uuid,
        )
        self.response_204(response)
        self.assertEqual(BarcodeSetEntry.objects.count(), 0)

    def testDeleteAccessDenied(self):
        """Test that deleting a barcode set entry via API is denied if role assignment is missing"""
        self.assertEqual(BarcodeSetEntry.objects.count(), 1)
        self.runDelete(
            None,
            barcodeset=self.barcode_set.sodar_uuid,
            barcodesetentry=self.barcode_set_entry.sodar_uuid,
        )
        self.assertEqual(BarcodeSetEntry.objects.count(), 1)
        self.response_401()
        for user in (self.guest, self.norole, self.unrelated_owner):
            self.assertEqual(BarcodeSetEntry.objects.count(), 1)
            self.runDelete(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                barcodesetentry=self.barcode_set_entry.sodar_uuid,
            )
            self.assertEqual(BarcodeSetEntry.objects.count(), 1)
            self.response_403()

    def testDeleteAccessAllowed(self):
        """Test that deleting a barcode set entry via API is allowed if role assignment is correct"""
        for user in (self.contributor, self.delegate, self.owner, self.root):
            BarcodeSetEntry.objects.all().delete()
            barcode_set_entry = self.make_barcode_set_entry()
            self.assertEqual(BarcodeSetEntry.objects.count(), 1)
            response = self.runDelete(
                user,
                barcodeset=self.barcode_set.sodar_uuid,
                barcodesetentry=barcode_set_entry.sodar_uuid,
            )
            self.response_204(response)
            self.assertEqual(BarcodeSetEntry.objects.count(), 0)

class BarcodeSetEntryRetrieveApiViewTest(
    SetupBarcodeSetMixin, SetupProjectMixin, SetupUserMixin, AuthenticatedRequestMixin, APITestCase
):
    """Tests for the retrieve action using REST API"""

    url_name = "api:barcodesetentries-retrieve"

    def testGet(self):
        """Test that retrieving a single barcode set entry works (with super user)"""
        response = self.runGet(self.root, barcodesetentry=self.barcode_set_entry.sodar_uuid)
        self.response_200(response)
        data = json.loads(response.content.decode("utf-8"))
        self.assertEqual(data["sodar_uuid"], str(self.barcode_set_entry.sodar_uuid))

    def testGetAccessDenied(self):
        """Test that access is denied if role assignment is missing"""
        self.runGet(None, barcodesetentry=self.barcode_set_entry.sodar_uuid)
        self.response_401()
        for user in (self.norole, self.unrelated_owner):
            self.runGet(user, barcodesetentry=self.barcode_set_entry.sodar_uuid)
            self.response_403()

    def testGetAccessAllowed(self):
        """Test that access is allowed if role assignment is correct"""
        for user in (self.guest, self.contributor, self.delegate, self.owner, self.root):
            response = self.runGet(user, barcodesetentry=self.barcode_set_entry.sodar_uuid)
            self.response_200(response)
            data = json.loads(response.content.decode("utf-8"))
            self.assertEqual(data["sodar_uuid"], str(self.barcode_set_entry.sodar_uuid))
# --- src/secml/ml/classifiers/loss/c_loss_hinge.py (from zangobot/secml, Apache-2.0) ---
"""
.. module:: CLossHinge
   :synopsis: Hinge Loss Functions

.. moduleauthor:: Marco Melis <marco.melis@unica.it>

"""
from secml.ml.classifiers.loss import CLossClassification
from secml.ml.classifiers.loss.c_loss import _check_binary_score
from secml.ml.classifiers.clf_utils import convert_binary_labels
from secml.array import CArray

class CLossHinge(CLossClassification):
    """Hinge Loss Function.

    The function computes the average distance between the model and
    the data using hinge loss, a one-sided metric that considers only
    prediction errors.

    Hinge loss is used in maximal margin classifiers such as
    support vector machines.

    After converting the labels to {-1, +1},
    the hinge loss is defined as:

    .. math::

       L_\\text{Hinge}(y, s) = \\max \\left\\{ 1 - sy, 0 \\right\\}

    Attributes
    ----------
    class_type : 'hinge'
    suitable_for : 'classification'

    """
    __class_type = 'hinge'

    def loss(self, y_true, score, pos_label=1):
        """Computes the value of the hinge loss function.

        Parameters
        ----------
        y_true : CArray
            Ground truth (correct), targets. Vector-like array.
        score : CArray
            Outputs (predicted), targets.
            2-D array of shape (n_samples, n_classes) or 1-D flat array
            of shape (n_samples,). If 1-D array, the probabilities
            provided are assumed to be that of the positive class.
        pos_label : {0, 1}, optional
            The class wrt compute the loss function. Default 1.
            If `score` is a 1-D flat array, this parameter is ignored.

        Returns
        -------
        CArray
            Loss function. Vector-like array.

        """
        if pos_label not in (0, 1):
            raise ValueError("only {0, 1} are accepted for `pos_label`")

        y_true = convert_binary_labels(y_true).ravel()  # Convert to {-1, 1}
        score = _check_binary_score(score, pos_label)

        # max(0, 1 - y*s)
        h = 1.0 - y_true * score
        h[h < 0] = 0.0

        return h

    def dloss(self, y_true, score, pos_label=1):
        """Computes the derivative of the hinge loss function with respect to `score`.

        Parameters
        ----------
        y_true : CArray
            Ground truth (correct), targets. Vector-like array.
        score : CArray
            Outputs (predicted), targets.
            2-D array of shape (n_samples, n_classes) or 1-D flat array
            of shape (n_samples,). If 1-D array, the probabilities
            provided are assumed to be that of the positive class.
        pos_label : {0, 1}, optional
            The class wrt compute the loss function derivative. Default 1.
            If `score` is a 1-D flat array, this parameter is ignored.

        Returns
        -------
        CArray
            Derivative of the loss function. Vector-like array.

        """
        if pos_label not in (0, 1):
            raise ValueError("only {0, 1} are accepted for `pos_label`")

        y_true = convert_binary_labels(y_true).ravel()  # Convert to {-1, 1}
        score = _check_binary_score(score, pos_label)

        # 0 if (1 - y*s) < 0 else -y_true
        d = -y_true.astype(float)  # labels are generally int
        h = 1.0 - y_true * score
        d[h < 0] = 0.0

        return d

class CLossHingeSquared(CLossClassification):
    """Squared Hinge Loss Function.

    The function computes the average distance between the model and
    the data using hinge loss, a one-sided metric that considers only
    prediction errors.

    After converting the labels to {-1, +1}, the squared hinge loss
    is defined as:

    .. math::

       L^2_\\text{Hinge} (y, s) =
            {\\left( \\max \\left\\{ 1 - sy, 0 \\right\\} \\right)}^2

    Attributes
    ----------
    class_type : 'hinge-squared'
    suitable_for : 'classification'

    """
    __class_type = 'hinge-squared'

    def loss(self, y_true, score, pos_label=1):
        """Computes the value of the squared hinge loss function.

        Parameters
        ----------
        y_true : CArray
            Ground truth (correct), targets. Vector-like array.
        score : CArray
            Outputs (predicted), targets.
            2-D array of shape (n_samples, n_classes) or 1-D flat array
            of shape (n_samples,). If 1-D array, the probabilities
            provided are assumed to be that of the positive class.
        pos_label : {0, 1}, optional
            The class wrt compute the loss function. Default 1.
            If `score` is a 1-D flat array, this parameter is ignored.

        Returns
        -------
        CArray
            Loss function. Vector-like array.

        """
        if pos_label not in (0, 1):
            raise ValueError("only {0, 1} are accepted for `pos_label`")

        y_true = convert_binary_labels(y_true).ravel()  # Convert to {-1, 1}
        score = _check_binary_score(score, pos_label)

        # (max(0, 1 - y*s))^2
        h = 1.0 - y_true * score
        h[h < 0] = 0.0

        return h ** 2

    def dloss(self, y_true, score, pos_label=1):
        """Computes the derivative of the squared hinge loss function with respect to `score`.

        Parameters
        ----------
        y_true : CArray
            Ground truth (correct), targets. Vector-like array.
        score : CArray
            Outputs (predicted), targets.
            2-D array of shape (n_samples, n_classes) or 1-D flat array
            of shape (n_samples,). If 1-D array, the probabilities
            provided are assumed to be that of the positive class.
        pos_label : {0, 1}, optional
            The class wrt compute the loss function derivative. Default 1.
            If `score` is a 1-D flat array, this parameter is ignored.

        Returns
        -------
        CArray
            Derivative of the loss function. Vector-like array.

        """
        if pos_label not in (0, 1):
            raise ValueError("only {0, 1} are accepted for `pos_label`")

        y_true = convert_binary_labels(y_true).ravel()  # Convert to {-1, 1}
        score = _check_binary_score(score, pos_label)

        # 0 if (1 - y * s) < 0 else -2 * y * (1 - y * s)
        h = 1.0 - y_true * score
        d = -2.0 * y_true * h
        d[h < 0] = 0.0

        return d
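As a quick sanity check on the `max(0, 1 - y*s)` formula above, here is a dependency-free sketch of the same elementwise hinge computation on plain Python lists (the `hinge_loss` name is illustrative, not part of secml):

```python
def hinge_loss(y_true, scores):
    # map {0, 1} labels to {-1, +1} (as convert_binary_labels does),
    # then apply max(0, 1 - y * s) elementwise
    return [max(0.0, 1.0 - (2 * y - 1) * s) for y, s in zip(y_true, scores)]

# a confidently correct sample costs 0; a margin violation grows linearly
losses = hinge_loss([1, 1, 0], [2.0, 0.5, -1.0])
assert losses == [0.0, 0.5, 0.0]
```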
# --- oceans/sw_extras/__init__.py (from arnaldorusso/python-oceans, MIT) ---
# -*- coding: utf-8 -*-
from .sw_extras import *
from .waves import Waves
from .gamma_GP_from_SP_pt import gamma_GP_from_SP_pt
# --- old/vl8/functions/reverse.py (from rec/sorta, MIT) ---
import numpy as np

def reverse(src, axis=0):
    return np.flip(src, axis=axis)
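A small, self-contained usage sketch (assumes only that numpy is installed) showing what `reverse` does along each axis:

```python
import numpy as np

def reverse(src, axis=0):
    # same one-liner as above: flip the array along the given axis
    return np.flip(src, axis=axis)

src = np.array([[1, 2], [3, 4]])
assert reverse(src).tolist() == [[3, 4], [1, 2]]          # rows reversed (axis 0)
assert reverse(src, axis=1).tolist() == [[2, 1], [4, 3]]  # columns reversed (axis 1)
```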
# --- sdk/python/pulumi_aws/guardduty/publishing_destination.py (from dmelo/pulumi-aws, ECL-2.0/Apache-2.0) ---
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['PublishingDestinationArgs', 'PublishingDestination']
@pulumi.input_type
class PublishingDestinationArgs:
def __init__(__self__, *,
destination_arn: pulumi.Input[str],
detector_id: pulumi.Input[str],
kms_key_arn: pulumi.Input[str],
destination_type: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a PublishingDestination resource.
        :param pulumi.Input[str] destination_arn: The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
        :param pulumi.Input[str] detector_id: The ID of the GuardDuty detector.
        :param pulumi.Input[str] kms_key_arn: The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
        :param pulumi.Input[str] destination_type: Currently "S3" is the only available destination type, and it is also the default value.
"""
pulumi.set(__self__, "destination_arn", destination_arn)
pulumi.set(__self__, "detector_id", detector_id)
pulumi.set(__self__, "kms_key_arn", kms_key_arn)
if destination_type is not None:
pulumi.set(__self__, "destination_type", destination_type)
@property
@pulumi.getter(name="destinationArn")
def destination_arn(self) -> pulumi.Input[str]:
"""
        The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
"""
return pulumi.get(self, "destination_arn")
@destination_arn.setter
def destination_arn(self, value: pulumi.Input[str]):
pulumi.set(self, "destination_arn", value)
@property
@pulumi.getter(name="detectorId")
def detector_id(self) -> pulumi.Input[str]:
"""
        The ID of the GuardDuty detector.
"""
return pulumi.get(self, "detector_id")
@detector_id.setter
def detector_id(self, value: pulumi.Input[str]):
pulumi.set(self, "detector_id", value)
@property
@pulumi.getter(name="kmsKeyArn")
def kms_key_arn(self) -> pulumi.Input[str]:
"""
        The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
"""
return pulumi.get(self, "kms_key_arn")
@kms_key_arn.setter
def kms_key_arn(self, value: pulumi.Input[str]):
pulumi.set(self, "kms_key_arn", value)
@property
@pulumi.getter(name="destinationType")
def destination_type(self) -> Optional[pulumi.Input[str]]:
"""
        Currently "S3" is the only available destination type, and it is also the default value.
"""
return pulumi.get(self, "destination_type")
@destination_type.setter
def destination_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_type", value)
@pulumi.input_type
class _PublishingDestinationState:
def __init__(__self__, *,
destination_arn: Optional[pulumi.Input[str]] = None,
destination_type: Optional[pulumi.Input[str]] = None,
detector_id: Optional[pulumi.Input[str]] = None,
kms_key_arn: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering PublishingDestination resources.
        :param pulumi.Input[str] destination_arn: The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
        :param pulumi.Input[str] destination_type: Currently "S3" is the only available destination type, and it is also the default value.
        :param pulumi.Input[str] detector_id: The ID of the GuardDuty detector.
        :param pulumi.Input[str] kms_key_arn: The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
"""
if destination_arn is not None:
pulumi.set(__self__, "destination_arn", destination_arn)
if destination_type is not None:
pulumi.set(__self__, "destination_type", destination_type)
if detector_id is not None:
pulumi.set(__self__, "detector_id", detector_id)
if kms_key_arn is not None:
pulumi.set(__self__, "kms_key_arn", kms_key_arn)
@property
@pulumi.getter(name="destinationArn")
def destination_arn(self) -> Optional[pulumi.Input[str]]:
"""
        The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
"""
return pulumi.get(self, "destination_arn")
@destination_arn.setter
def destination_arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_arn", value)
@property
@pulumi.getter(name="destinationType")
def destination_type(self) -> Optional[pulumi.Input[str]]:
"""
        Currently "S3" is the only available destination type, and it is also the default value.
"""
return pulumi.get(self, "destination_type")
@destination_type.setter
def destination_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_type", value)
@property
@pulumi.getter(name="detectorId")
def detector_id(self) -> Optional[pulumi.Input[str]]:
"""
        The ID of the GuardDuty detector.
"""
return pulumi.get(self, "detector_id")
@detector_id.setter
def detector_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "detector_id", value)
@property
@pulumi.getter(name="kmsKeyArn")
def kms_key_arn(self) -> Optional[pulumi.Input[str]]:
"""
        The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
"""
return pulumi.get(self, "kms_key_arn")
@kms_key_arn.setter
def kms_key_arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "kms_key_arn", value)
class PublishingDestination(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
destination_arn: Optional[pulumi.Input[str]] = None,
destination_type: Optional[pulumi.Input[str]] = None,
detector_id: Optional[pulumi.Input[str]] = None,
kms_key_arn: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides a resource to manage a GuardDuty PublishingDestination. Requires an existing GuardDuty Detector.
## Example Usage
```python
import pulumi
import pulumi_aws as aws
current_caller_identity = aws.get_caller_identity()
current_region = aws.get_region()
gd_bucket = aws.s3.Bucket("gdBucket",
acl="private",
force_destroy=True)
bucket_pol = aws.iam.get_policy_document_output(statements=[
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow PutObject",
actions=["s3:PutObject"],
resources=[gd_bucket.arn.apply(lambda arn: f"{arn}/*")],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="Service",
identifiers=["guardduty.amazonaws.com"],
)],
),
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow GetBucketLocation",
actions=["s3:GetBucketLocation"],
resources=[gd_bucket.arn],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="Service",
identifiers=["guardduty.amazonaws.com"],
)],
),
])
kms_pol = aws.iam.get_policy_document(statements=[
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow GuardDuty to encrypt findings",
actions=["kms:GenerateDataKey"],
resources=[f"arn:aws:kms:{current_region.name}:{current_caller_identity.account_id}:key/*"],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="Service",
identifiers=["guardduty.amazonaws.com"],
)],
),
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow all users to modify/delete key (test only)",
actions=["kms:*"],
resources=[f"arn:aws:kms:{current_region.name}:{current_caller_identity.account_id}:key/*"],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="AWS",
identifiers=[f"arn:aws:iam::{current_caller_identity.account_id}:root"],
)],
),
])
test_gd = aws.guardduty.Detector("testGd", enable=True)
gd_bucket_policy = aws.s3.BucketPolicy("gdBucketPolicy",
bucket=gd_bucket.id,
policy=bucket_pol.json)
gd_key = aws.kms.Key("gdKey",
description="Temporary key for AccTest of TF",
deletion_window_in_days=7,
policy=kms_pol.json)
test = aws.guardduty.PublishingDestination("test",
detector_id=test_gd.id,
destination_arn=gd_bucket.arn,
kms_key_arn=gd_key.arn,
opts=pulumi.ResourceOptions(depends_on=[gd_bucket_policy]))
```
        > **Note:** Do not use this simple example bucket policy and KMS key policy in a production environment; they are much too permissive for such a use case. Refer to the AWS documentation here: https://docs.aws.amazon.com/guardduty/latest/ug/guardduty_exportfindings.html
## Import
        GuardDuty PublishingDestination can be imported using the master GuardDuty detector ID and the PublishingDestination ID, e.g.,
```sh
$ pulumi import aws:guardduty/publishingDestination:PublishingDestination test a4b86f26fa42e7e7cf0d1c333ea77777:a4b86f27a0e464e4a7e0516d242f1234
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] destination_arn: The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
        :param pulumi.Input[str] destination_type: Currently "S3" is the only available destination type, and it is also the default value.
        :param pulumi.Input[str] detector_id: The ID of the GuardDuty detector.
        :param pulumi.Input[str] kms_key_arn: The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: PublishingDestinationArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a resource to manage a GuardDuty PublishingDestination. Requires an existing GuardDuty Detector.
## Example Usage
```python
import pulumi
import pulumi_aws as aws
current_caller_identity = aws.get_caller_identity()
current_region = aws.get_region()
gd_bucket = aws.s3.Bucket("gdBucket",
acl="private",
force_destroy=True)
bucket_pol = aws.iam.get_policy_document_output(statements=[
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow PutObject",
actions=["s3:PutObject"],
resources=[gd_bucket.arn.apply(lambda arn: f"{arn}/*")],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="Service",
identifiers=["guardduty.amazonaws.com"],
)],
),
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow GetBucketLocation",
actions=["s3:GetBucketLocation"],
resources=[gd_bucket.arn],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="Service",
identifiers=["guardduty.amazonaws.com"],
)],
),
])
kms_pol = aws.iam.get_policy_document(statements=[
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow GuardDuty to encrypt findings",
actions=["kms:GenerateDataKey"],
resources=[f"arn:aws:kms:{current_region.name}:{current_caller_identity.account_id}:key/*"],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="Service",
identifiers=["guardduty.amazonaws.com"],
)],
),
aws.iam.GetPolicyDocumentStatementArgs(
sid="Allow all users to modify/delete key (test only)",
actions=["kms:*"],
resources=[f"arn:aws:kms:{current_region.name}:{current_caller_identity.account_id}:key/*"],
principals=[aws.iam.GetPolicyDocumentStatementPrincipalArgs(
type="AWS",
identifiers=[f"arn:aws:iam::{current_caller_identity.account_id}:root"],
)],
),
])
test_gd = aws.guardduty.Detector("testGd", enable=True)
gd_bucket_policy = aws.s3.BucketPolicy("gdBucketPolicy",
bucket=gd_bucket.id,
policy=bucket_pol.json)
gd_key = aws.kms.Key("gdKey",
description="Temporary key for AccTest of TF",
deletion_window_in_days=7,
policy=kms_pol.json)
test = aws.guardduty.PublishingDestination("test",
detector_id=test_gd.id,
destination_arn=gd_bucket.arn,
kms_key_arn=gd_key.arn,
opts=pulumi.ResourceOptions(depends_on=[gd_bucket_policy]))
```
        > **Note:** Do not use this simple example bucket policy and KMS key policy in a production environment; they are much too permissive for such a use case. Refer to the AWS documentation here: https://docs.aws.amazon.com/guardduty/latest/ug/guardduty_exportfindings.html
## Import
        GuardDuty PublishingDestination can be imported using the master GuardDuty detector ID and the PublishingDestination ID, e.g.,
```sh
$ pulumi import aws:guardduty/publishingDestination:PublishingDestination test a4b86f26fa42e7e7cf0d1c333ea77777:a4b86f27a0e464e4a7e0516d242f1234
```
:param str resource_name: The name of the resource.
:param PublishingDestinationArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(PublishingDestinationArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
destination_arn: Optional[pulumi.Input[str]] = None,
destination_type: Optional[pulumi.Input[str]] = None,
detector_id: Optional[pulumi.Input[str]] = None,
kms_key_arn: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = PublishingDestinationArgs.__new__(PublishingDestinationArgs)
if destination_arn is None and not opts.urn:
raise TypeError("Missing required property 'destination_arn'")
__props__.__dict__["destination_arn"] = destination_arn
__props__.__dict__["destination_type"] = destination_type
if detector_id is None and not opts.urn:
raise TypeError("Missing required property 'detector_id'")
__props__.__dict__["detector_id"] = detector_id
if kms_key_arn is None and not opts.urn:
raise TypeError("Missing required property 'kms_key_arn'")
__props__.__dict__["kms_key_arn"] = kms_key_arn
super(PublishingDestination, __self__).__init__(
'aws:guardduty/publishingDestination:PublishingDestination',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
destination_arn: Optional[pulumi.Input[str]] = None,
destination_type: Optional[pulumi.Input[str]] = None,
detector_id: Optional[pulumi.Input[str]] = None,
kms_key_arn: Optional[pulumi.Input[str]] = None) -> 'PublishingDestination':
"""
Get an existing PublishingDestination resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] destination_arn: The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
        :param pulumi.Input[str] destination_type: Currently "S3" is the only available destination type, and it is also the default value.
        :param pulumi.Input[str] detector_id: The ID of the GuardDuty detector.
        :param pulumi.Input[str] kms_key_arn: The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _PublishingDestinationState.__new__(_PublishingDestinationState)
__props__.__dict__["destination_arn"] = destination_arn
__props__.__dict__["destination_type"] = destination_type
__props__.__dict__["detector_id"] = detector_id
__props__.__dict__["kms_key_arn"] = kms_key_arn
return PublishingDestination(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="destinationArn")
def destination_arn(self) -> pulumi.Output[str]:
"""
        The bucket ARN and prefix under which the findings get exported. The bucket ARN is required; the prefix is optional and defaults to `AWSLogs/[Account-ID]/GuardDuty/[Region]/`.
"""
return pulumi.get(self, "destination_arn")
@property
@pulumi.getter(name="destinationType")
def destination_type(self) -> pulumi.Output[Optional[str]]:
"""
        Currently "S3" is the only available destination type, and it is also the default value.
"""
return pulumi.get(self, "destination_type")
@property
@pulumi.getter(name="detectorId")
def detector_id(self) -> pulumi.Output[str]:
"""
        The ID of the GuardDuty detector.
"""
return pulumi.get(self, "detector_id")
@property
@pulumi.getter(name="kmsKeyArn")
def kms_key_arn(self) -> pulumi.Output[str]:
"""
        The ARN of the KMS key used to encrypt GuardDuty findings. GuardDuty requires the findings to be encrypted.
"""
return pulumi.get(self, "kms_key_arn")
| 47.292793 | 279 | 0.6453 | 2,363 | 20,998 | 5.522641 | 0.102835 | 0.047203 | 0.057931 | 0.045517 | 0.861686 | 0.843831 | 0.837088 | 0.815632 | 0.805594 | 0.78659 | 0 | 0.006426 | 0.258929 | 20,998 | 443 | 280 | 47.399549 | 0.832209 | 0.50562 | 0 | 0.602273 | 1 | 0 | 0.11783 | 0.014282 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153409 | false | 0.005682 | 0.028409 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f8ffd753a5032de18f02854b669c8b5ca76cd73f | 2,502 | py | Python | olo/mixins/operations.py | Watch-Later/olo | c8bbdfffb64a9766df51a462a119a879c9030a4a | [
"Apache-2.0"
] | 81 | 2018-03-08T11:07:33.000Z | 2022-01-18T10:19:46.000Z | olo/mixins/operations.py | Watch-Later/olo | c8bbdfffb64a9766df51a462a119a879c9030a4a | [
"Apache-2.0"
] | 40 | 2018-07-04T09:22:34.000Z | 2021-09-03T09:30:02.000Z | olo/mixins/operations.py | Watch-Later/olo | c8bbdfffb64a9766df51a462a119a879c9030a4a | [
"Apache-2.0"
] | 13 | 2018-03-19T10:34:24.000Z | 2022-01-18T10:19:49.000Z | from __future__ import annotations
from typing import TYPE_CHECKING, Type
if TYPE_CHECKING:
from olo.expression import UnaryExpression, BinaryExpression
class UnaryOperationMixin(object):
UnaryExpression: Type[UnaryExpression]
def desc(self) -> UnaryExpression:
return self.UnaryExpression(self, 'DESC')
def asc(self) -> UnaryExpression:
return self.UnaryExpression(self, 'ASC')
class BinaryOperationMixin(object):
BinaryExpression: Type[BinaryExpression]
def __add__(self, other):
return self.BinaryExpression(self, other, '+')
__radd__ = __add__
def __sub__(self, other):
return self.BinaryExpression(self, other, '-')
__rsub__ = __sub__
def __mul__(self, other):
return self.BinaryExpression(self, other, '*')
def __div__(self, other):
return self.BinaryExpression(self, other, '/')
__truediv__ = __div__
def __mod__(self, other):
return self.BinaryExpression(self, other, '%')
def __eq__(self, other):
operator = '='
if other is None:
operator = 'IS'
return self.BinaryExpression(self, other, operator)
def __ne__(self, other):
operator = '!='
if other is None:
operator = 'IS NOT'
return self.BinaryExpression(self, other, operator)
def __gt__(self, other):
return self.BinaryExpression(self, other, '>')
def __ge__(self, other):
return self.BinaryExpression(self, other, '>=')
def __lt__(self, other):
return self.BinaryExpression(self, other, '<')
def __le__(self, other):
return self.BinaryExpression(self, other, '<=')
def in_(self, other):
return self.BinaryExpression(self, other, 'IN')
__lshift__ = in_
def not_in_(self, other):
return self.BinaryExpression(self, other, 'NOT IN')
def like_(self, other):
return self.BinaryExpression(self, other, 'LIKE')
def ilike_(self, other):
return self.BinaryExpression(self, other, 'ILIKE')
def regexp_(self, other):
return self.BinaryExpression(self, other, 'REGEXP')
def between_(self, other):
return self.BinaryExpression(self, other, 'BETWEEN')
def concat_(self, other):
return self.BinaryExpression(self, other, '||')
def is_(self, other):
return self.BinaryExpression(self, other, 'IS')
def is_not_(self, other):
return self.BinaryExpression(self, other, 'IS NOT')
| 26.617021 | 64 | 0.648281 | 266 | 2,502 | 5.759399 | 0.176692 | 0.234987 | 0.339426 | 0.391645 | 0.710836 | 0.710836 | 0.648172 | 0.387076 | 0.052219 | 0 | 0 | 0 | 0.236611 | 2,502 | 93 | 65 | 26.903226 | 0.802094 | 0 | 0 | 0.064516 | 0 | 0 | 0.027578 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.354839 | false | 0 | 0.048387 | 0.322581 | 0.887097 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
5d6a0f5f545b9ef01a6b3c4ff281f8eb4c79db84 | 294 | py | Python | translators/__init__.py | cjh0613/translators | 69a08f5f7b774ae0934b1f76f316df906ef49b4a | [
"MIT"
] | 1 | 2021-05-21T13:33:53.000Z | 2021-05-21T13:33:53.000Z | translators/__init__.py | cjh0613/translators | 69a08f5f7b774ae0934b1f76f316df906ef49b4a | [
"MIT"
] | null | null | null | translators/__init__.py | cjh0613/translators | 69a08f5f7b774ae0934b1f76f316df906ef49b4a | [
"MIT"
] | null | null | null | __version__ = "4.7.16"
__author__ = "UlionTse"
from translators.apis import alibaba, baidu, bing, deepl, google, sogou, tencent, yandex, youdao
from translators.apis import _alibaba, _baidu, _bing, _deepl, _google, _sogou, _tencent, _yandex, _youdao
from translators.apis import translate_html | 49 | 105 | 0.792517 | 38 | 294 | 5.657895 | 0.552632 | 0.209302 | 0.265116 | 0.348837 | 0.823256 | 0.823256 | 0.823256 | 0.823256 | 0.823256 | 0.823256 | 0 | 0.015385 | 0.115646 | 294 | 6 | 106 | 49 | 0.811538 | 0 | 0 | 0 | 0 | 0 | 0.047458 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 12 |
5d6cd203d0f208ffab95ca76bd32ec147091e75d | 202 | py | Python | lego/apps/meetings/constants.py | andrinelo/lego | 9b53c8fe538d9107b980a70e2a21fb487cc3b290 | [
"MIT"
] | null | null | null | lego/apps/meetings/constants.py | andrinelo/lego | 9b53c8fe538d9107b980a70e2a21fb487cc3b290 | [
"MIT"
] | null | null | null | lego/apps/meetings/constants.py | andrinelo/lego | 9b53c8fe538d9107b980a70e2a21fb487cc3b290 | [
"MIT"
] | null | null | null | NO_ANSWER = 'NO_ANSWER'
ATTENDING = 'ATTENDING'
NOT_ATTENDING = 'NOT_ATTENDING'
INVITATION_STATUS_TYPES = (
(NO_ANSWER, NO_ANSWER),
(ATTENDING, ATTENDING),
(NOT_ATTENDING, NOT_ATTENDING)
)
| 20.2 | 34 | 0.727723 | 23 | 202 | 5.956522 | 0.304348 | 0.233577 | 0.613139 | 0.233577 | 0.846715 | 0.846715 | 0.846715 | 0.846715 | 0.846715 | 0.846715 | 0 | 0 | 0.158416 | 202 | 9 | 35 | 22.444444 | 0.805882 | 0 | 0 | 0 | 0 | 0 | 0.153465 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
539800523eb174da0005ccab3cc1fd5e55228220 | 45,881 | py | Python | tasks/forms/form_task_foci_001.py | csdevsc/colcat_crowdsourcing_application | ad6015ca9cfc2a91063408280978a4c2eb4d6bc0 | [
"MIT"
] | null | null | null | tasks/forms/form_task_foci_001.py | csdevsc/colcat_crowdsourcing_application | ad6015ca9cfc2a91063408280978a4c2eb4d6bc0 | [
"MIT"
] | null | null | null | tasks/forms/form_task_foci_001.py | csdevsc/colcat_crowdsourcing_application | ad6015ca9cfc2a91063408280978a4c2eb4d6bc0 | [
"MIT"
] | null | null | null | from django import forms
from django.forms import ModelForm
from tasks.models import Task_Foci_001
class Form_Task_Foci_001_00(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a0', 'cell_b0', 'cell_c0', 'cell_d0', 'cell_e0',
'cell_f0', 'cell_g0', 'cell_h0', 'cell_i0', 'cell_j0',)
column = 0
cell_a0 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b0 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c0 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d0 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e0 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f0 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g0 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h0 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i0 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j0 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
class Form_Task_Foci_001_01(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a1', 'cell_b1', 'cell_c1', 'cell_d1', 'cell_e1',
'cell_f1', 'cell_g1', 'cell_h1', 'cell_i1', 'cell_j1',)
column = 1
cell_a1 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b1 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c1 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d1 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e1 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f1 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g1 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h1 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i1 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j1 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
class Form_Task_Foci_001_02(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a2', 'cell_b2', 'cell_c2', 'cell_d2', 'cell_e2',
'cell_f2', 'cell_g2', 'cell_h2', 'cell_i2', 'cell_j2',)
column = 2
cell_a2 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b2 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c2 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d2 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e2 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f2 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g2 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h2 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i2 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j2 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
class Form_Task_Foci_001_03(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a3', 'cell_b3', 'cell_c3', 'cell_d3', 'cell_e3',
'cell_f3', 'cell_g3', 'cell_h3', 'cell_i3', 'cell_j3',)
column = 3
cell_a3 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b3 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c3 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d3 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e3 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f3 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g3 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h3 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i3 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j3 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
class Form_Task_Foci_001_04(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a4', 'cell_b4', 'cell_c4', 'cell_d4', 'cell_e4',
'cell_f4', 'cell_g4', 'cell_h4', 'cell_i4', 'cell_j4',)
column = 4
cell_a4 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b4 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c4 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d4 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e4 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f4 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g4 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h4 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i4 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j4 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
class Form_Task_Foci_001_05(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a5', 'cell_b5', 'cell_c5', 'cell_d5', 'cell_e5',
'cell_f5', 'cell_g5', 'cell_h5', 'cell_i5', 'cell_j5',)
column = 5
cell_a5 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b5 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c5 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d5 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e5 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f5 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g5 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h5 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i5 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j5 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
class Form_Task_Foci_001_06(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a6', 'cell_b6', 'cell_c6', 'cell_d6', 'cell_e6',
                  'cell_f6', 'cell_g6', 'cell_h6', 'cell_i6', 'cell_j6')
    column = 6
    cell_a6 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b6 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c6 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d6 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e6 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f6 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g6 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h6 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i6 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j6 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_07(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a7', 'cell_b7', 'cell_c7', 'cell_d7', 'cell_e7',
                  'cell_f7', 'cell_g7', 'cell_h7', 'cell_i7', 'cell_j7')
    column = 7
    cell_a7 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b7 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c7 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d7 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e7 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f7 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g7 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h7 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i7 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j7 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_08(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a8', 'cell_b8', 'cell_c8', 'cell_d8', 'cell_e8',
                  'cell_f8', 'cell_g8', 'cell_h8', 'cell_i8', 'cell_j8')
    column = 8
    cell_a8 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b8 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c8 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d8 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e8 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f8 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g8 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h8 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i8 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j8 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_09(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a9', 'cell_b9', 'cell_c9', 'cell_d9', 'cell_e9',
                  'cell_f9', 'cell_g9', 'cell_h9', 'cell_i9', 'cell_j9')
    column = 9
    cell_a9 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b9 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c9 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d9 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e9 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f9 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g9 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h9 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i9 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j9 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_010(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a10', 'cell_b10', 'cell_c10', 'cell_d10', 'cell_e10',
                  'cell_f10', 'cell_g10', 'cell_h10', 'cell_i10', 'cell_j10')
    column = 10
    cell_a10 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b10 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c10 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d10 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e10 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f10 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g10 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h10 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i10 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j10 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_011(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a11', 'cell_b11', 'cell_c11', 'cell_d11', 'cell_e11',
                  'cell_f11', 'cell_g11', 'cell_h11', 'cell_i11', 'cell_j11')
    column = 11
    cell_a11 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b11 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c11 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d11 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e11 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f11 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g11 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h11 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i11 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j11 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_012(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a12', 'cell_b12', 'cell_c12', 'cell_d12', 'cell_e12',
                  'cell_f12', 'cell_g12', 'cell_h12', 'cell_i12', 'cell_j12')
    column = 12
    cell_a12 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b12 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c12 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d12 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e12 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f12 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g12 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h12 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i12 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j12 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_013(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a13', 'cell_b13', 'cell_c13', 'cell_d13', 'cell_e13',
                  'cell_f13', 'cell_g13', 'cell_h13', 'cell_i13', 'cell_j13')
    column = 13
    cell_a13 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b13 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c13 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d13 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e13 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f13 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g13 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h13 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i13 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j13 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_014(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a14', 'cell_b14', 'cell_c14', 'cell_d14', 'cell_e14',
                  'cell_f14', 'cell_g14', 'cell_h14', 'cell_i14', 'cell_j14')
    column = 14
    cell_a14 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b14 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c14 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d14 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e14 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f14 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g14 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h14 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i14 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j14 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_015(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a15', 'cell_b15', 'cell_c15', 'cell_d15', 'cell_e15',
                  'cell_f15', 'cell_g15', 'cell_h15', 'cell_i15', 'cell_j15')
    column = 15
    cell_a15 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b15 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c15 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d15 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e15 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f15 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g15 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h15 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i15 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j15 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_016(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a16', 'cell_b16', 'cell_c16', 'cell_d16', 'cell_e16',
                  'cell_f16', 'cell_g16', 'cell_h16', 'cell_i16', 'cell_j16')
    column = 16
    cell_a16 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b16 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c16 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d16 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e16 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f16 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g16 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h16 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i16 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j16 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_017(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a17', 'cell_b17', 'cell_c17', 'cell_d17', 'cell_e17',
                  'cell_f17', 'cell_g17', 'cell_h17', 'cell_i17', 'cell_j17')
    column = 17
    cell_a17 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b17 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c17 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d17 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e17 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f17 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g17 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h17 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i17 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j17 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_018(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a18', 'cell_b18', 'cell_c18', 'cell_d18', 'cell_e18',
                  'cell_f18', 'cell_g18', 'cell_h18', 'cell_i18', 'cell_j18')
    column = 18
    cell_a18 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b18 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c18 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d18 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e18 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f18 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g18 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h18 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i18 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j18 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_019(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a19', 'cell_b19', 'cell_c19', 'cell_d19', 'cell_e19',
                  'cell_f19', 'cell_g19', 'cell_h19', 'cell_i19', 'cell_j19')
    column = 19
    cell_a19 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b19 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c19 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d19 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e19 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f19 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g19 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h19 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i19 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j19 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_020(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a20', 'cell_b20', 'cell_c20', 'cell_d20', 'cell_e20',
                  'cell_f20', 'cell_g20', 'cell_h20', 'cell_i20', 'cell_j20')
    column = 20
    cell_a20 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b20 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c20 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d20 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e20 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f20 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g20 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h20 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i20 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j20 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_021(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a21', 'cell_b21', 'cell_c21', 'cell_d21', 'cell_e21',
                  'cell_f21', 'cell_g21', 'cell_h21', 'cell_i21', 'cell_j21')
    column = 21
    cell_a21 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b21 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c21 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d21 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e21 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f21 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g21 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h21 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i21 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j21 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_022(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a22', 'cell_b22', 'cell_c22', 'cell_d22', 'cell_e22',
                  'cell_f22', 'cell_g22', 'cell_h22', 'cell_i22', 'cell_j22')
    column = 22
    cell_a22 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b22 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c22 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d22 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e22 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f22 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g22 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h22 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i22 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j22 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_023(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a23', 'cell_b23', 'cell_c23', 'cell_d23', 'cell_e23',
                  'cell_f23', 'cell_g23', 'cell_h23', 'cell_i23', 'cell_j23')
    column = 23
    cell_a23 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b23 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c23 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d23 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e23 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f23 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g23 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h23 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i23 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j23 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_024(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a24', 'cell_b24', 'cell_c24', 'cell_d24', 'cell_e24',
                  'cell_f24', 'cell_g24', 'cell_h24', 'cell_i24', 'cell_j24')
    column = 24
    cell_a24 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b24 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c24 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d24 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e24 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f24 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g24 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h24 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i24 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j24 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_025(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a25', 'cell_b25', 'cell_c25', 'cell_d25', 'cell_e25',
                  'cell_f25', 'cell_g25', 'cell_h25', 'cell_i25', 'cell_j25')
    column = 25
    cell_a25 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b25 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c25 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d25 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e25 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f25 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g25 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h25 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i25 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j25 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_026(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a26', 'cell_b26', 'cell_c26', 'cell_d26', 'cell_e26',
                  'cell_f26', 'cell_g26', 'cell_h26', 'cell_i26', 'cell_j26')
    column = 26
    cell_a26 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b26 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c26 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d26 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e26 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f26 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g26 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h26 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i26 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j26 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_027(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a27', 'cell_b27', 'cell_c27', 'cell_d27', 'cell_e27',
                  'cell_f27', 'cell_g27', 'cell_h27', 'cell_i27', 'cell_j27')
    column = 27
    cell_a27 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b27 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c27 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d27 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e27 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f27 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g27 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h27 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i27 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j27 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_028(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a28', 'cell_b28', 'cell_c28', 'cell_d28', 'cell_e28',
                  'cell_f28', 'cell_g28', 'cell_h28', 'cell_i28', 'cell_j28')
    column = 28
    cell_a28 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b28 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c28 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d28 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e28 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f28 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g28 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h28 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i28 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j28 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_029(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a29', 'cell_b29', 'cell_c29', 'cell_d29', 'cell_e29',
                  'cell_f29', 'cell_g29', 'cell_h29', 'cell_i29', 'cell_j29')
    column = 29
    cell_a29 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b29 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c29 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d29 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e29 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f29 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g29 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h29 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i29 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j29 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_030(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a30', 'cell_b30', 'cell_c30', 'cell_d30', 'cell_e30',
                  'cell_f30', 'cell_g30', 'cell_h30', 'cell_i30', 'cell_j30')
    column = 30
    cell_a30 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b30 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c30 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d30 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e30 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f30 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g30 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h30 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i30 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j30 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_031(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a31', 'cell_b31', 'cell_c31', 'cell_d31', 'cell_e31',
                  'cell_f31', 'cell_g31', 'cell_h31', 'cell_i31', 'cell_j31')
    column = 31
    cell_a31 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b31 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c31 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d31 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e31 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f31 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g31 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h31 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i31 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j31 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_032(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a32', 'cell_b32', 'cell_c32', 'cell_d32', 'cell_e32',
                  'cell_f32', 'cell_g32', 'cell_h32', 'cell_i32', 'cell_j32')
    column = 32
    cell_a32 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b32 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c32 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d32 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e32 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f32 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g32 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h32 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i32 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j32 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_033(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a33', 'cell_b33', 'cell_c33', 'cell_d33', 'cell_e33',
                  'cell_f33', 'cell_g33', 'cell_h33', 'cell_i33', 'cell_j33')
    column = 33
    cell_a33 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b33 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c33 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d33 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e33 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f33 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g33 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h33 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i33 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j33 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_034(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a34', 'cell_b34', 'cell_c34', 'cell_d34', 'cell_e34',
                  'cell_f34', 'cell_g34', 'cell_h34', 'cell_i34', 'cell_j34')
    column = 34
    cell_a34 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b34 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c34 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d34 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e34 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f34 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g34 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h34 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i34 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j34 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_035(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a35', 'cell_b35', 'cell_c35', 'cell_d35', 'cell_e35',
                  'cell_f35', 'cell_g35', 'cell_h35', 'cell_i35', 'cell_j35')
    column = 35
    cell_a35 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b35 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c35 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d35 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e35 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f35 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g35 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h35 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i35 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j35 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_036(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a36', 'cell_b36', 'cell_c36', 'cell_d36', 'cell_e36',
                  'cell_f36', 'cell_g36', 'cell_h36', 'cell_i36', 'cell_j36')
    column = 36
    cell_a36 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b36 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c36 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d36 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e36 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f36 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g36 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h36 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i36 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j36 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_037(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a37', 'cell_b37', 'cell_c37', 'cell_d37', 'cell_e37',
                  'cell_f37', 'cell_g37', 'cell_h37', 'cell_i37', 'cell_j37')
    column = 37
    cell_a37 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b37 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c37 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d37 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e37 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f37 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g37 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h37 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i37 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j37 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_038(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a38', 'cell_b38', 'cell_c38', 'cell_d38', 'cell_e38',
                  'cell_f38', 'cell_g38', 'cell_h38', 'cell_i38', 'cell_j38')
    column = 38
    cell_a38 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b38 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c38 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d38 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e38 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f38 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g38 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h38 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i38 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j38 = forms.CharField(widget=forms.TextInput(), label="J", required=False)

class Form_Task_Foci_001_039(ModelForm):
    class Meta:
        model = Task_Foci_001
        fields = ('cell_a39', 'cell_b39', 'cell_c39', 'cell_d39', 'cell_e39',
                  'cell_f39', 'cell_g39', 'cell_h39', 'cell_i39', 'cell_j39')
    column = 39
    cell_a39 = forms.CharField(widget=forms.TextInput(), label="A", required=False)
    cell_b39 = forms.CharField(widget=forms.TextInput(), label="B", required=False)
    cell_c39 = forms.CharField(widget=forms.TextInput(), label="C", required=False)
    cell_d39 = forms.CharField(widget=forms.TextInput(), label="D", required=False)
    cell_e39 = forms.CharField(widget=forms.TextInput(), label="E", required=False)
    cell_f39 = forms.CharField(widget=forms.TextInput(), label="F", required=False)
    cell_g39 = forms.CharField(widget=forms.TextInput(), label="G", required=False)
    cell_h39 = forms.CharField(widget=forms.TextInput(), label="H", required=False)
    cell_i39 = forms.CharField(widget=forms.TextInput(), label="I", required=False)
    cell_j39 = forms.CharField(widget=forms.TextInput(), label="J", required=False)
class Form_Task_Foci_001_040(ModelForm):
class Meta:
model = Task_Foci_001
fields = ('cell_a40', 'cell_b40', 'cell_c40', 'cell_d40', 'cell_e40',
'cell_f40', 'cell_g40', 'cell_h40', 'cell_i40', 'cell_j40',)
column = 40
cell_a40 = forms.CharField(widget=forms.TextInput(), label = "A", required=False)
cell_b40 = forms.CharField(widget=forms.TextInput(), label = "B", required=False)
cell_c40 = forms.CharField(widget=forms.TextInput(), label = "C", required=False)
cell_d40 = forms.CharField(widget=forms.TextInput(), label = "D", required=False)
cell_e40 = forms.CharField(widget=forms.TextInput(), label = "E", required=False)
cell_f40 = forms.CharField(widget=forms.TextInput(), label = "F", required=False)
cell_g40 = forms.CharField(widget=forms.TextInput(), label = "G", required=False)
cell_h40 = forms.CharField(widget=forms.TextInput(), label = "H", required=False)
cell_i40 = forms.CharField(widget=forms.TextInput(), label = "I", required=False)
cell_j40 = forms.CharField(widget=forms.TextInput(), label = "J", required=False)
# src/teste.py (carlosdevice1/MarketplacePy, MIT)
from models.produto import Produto
ps4 = Produto('Playstation 4', 1789.44)
xbox = Produto('Xbox 360', 1699.99)
print(ps4, '\n')
print(xbox)
# tests/unit_tests/test_tethys_cli/test__init__.py (msouff/tethys, BSD-2-Clause)
import sys
import unittest
from unittest import mock
from io import StringIO
from tethys_cli import tethys_command

class TethysCommandTests(unittest.TestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def assert_returns_help(self, stdout):
        self.assertIn('usage: tethys', stdout)
        self.assertIn('scaffold', stdout)
        self.assertIn('gen', stdout)
        self.assertIn('manage', stdout)
        self.assertIn('schedulers', stdout)
        self.assertIn('services', stdout)
        self.assertIn('app_settings', stdout)
        self.assertIn('link', stdout)
        self.assertIn('test', stdout)
        self.assertIn('uninstall', stdout)
        self.assertIn('list', stdout)
        self.assertIn('syncstores', stdout)
        self.assertIn('docker', stdout)
    @mock.patch('sys.stderr', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    def test_tethys_with_no_subcommand(self, mock_exit, mock_stderr):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        self.assert_returns_help(mock_stderr.getvalue())

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    def test_tethys_help(self, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_exit.assert_called_with(0)
        self.assert_returns_help(mock_stdout.getvalue())
    @mock.patch('tethys_cli.scaffold_commands.scaffold_command')
    def test_scaffold_subcommand(self, mock_scaffold_command):
        testargs = ['tethys', 'scaffold', 'foo']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scaffold_command.assert_called()
        call_args = mock_scaffold_command.call_args_list
        self.assertEqual('foo', call_args[0][0][0].name)
        self.assertEqual('default', call_args[0][0][0].template)
        self.assertFalse(call_args[0][0][0].overwrite)
        self.assertFalse(call_args[0][0][0].extension)
        self.assertFalse(call_args[0][0][0].use_defaults)

    @mock.patch('tethys_cli.scaffold_commands.scaffold_command')
    def test_scaffold_subcommand_with_options(self, mock_scaffold_command):
        testargs = ['tethys', 'scaffold', 'foo', '-e', '-t', 'my_template', '-o', '-d']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scaffold_command.assert_called()
        call_args = mock_scaffold_command.call_args_list
        self.assertEqual('foo', call_args[0][0][0].name)
        self.assertEqual('my_template', call_args[0][0][0].template)
        self.assertTrue(call_args[0][0][0].overwrite)
        self.assertTrue(call_args[0][0][0].extension)
        self.assertTrue(call_args[0][0][0].use_defaults)

    @mock.patch('tethys_cli.scaffold_commands.scaffold_command')
    def test_scaffold_subcommand_with_verbose_options(self, mock_scaffold_command):
        testargs = ['tethys', 'scaffold', 'foo', '--extension', '--template', 'my_template', '--overwrite',
                    '--defaults']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scaffold_command.assert_called()
        call_args = mock_scaffold_command.call_args_list
        self.assertEqual('foo', call_args[0][0][0].name)
        self.assertEqual('my_template', call_args[0][0][0].template)
        self.assertTrue(call_args[0][0][0].overwrite)
        self.assertTrue(call_args[0][0][0].extension)
        self.assertTrue(call_args[0][0][0].use_defaults)

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.scaffold_commands.scaffold_command')
    def test_scaffold_subcommand_help(self, mock_scaffold_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'scaffold', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_scaffold_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--template', mock_stdout.getvalue())
        self.assertIn('--extension', mock_stdout.getvalue())
        self.assertIn('--defaults', mock_stdout.getvalue())
        self.assertIn('--overwrite', mock_stdout.getvalue())
    @mock.patch('tethys_cli.gen_commands.generate_command')
    def test_generate_subcommand_settings_defaults(self, mock_gen_command):
        testargs = ['tethys', 'gen', 'settings']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_gen_command.assert_called()
        call_args = mock_gen_command.call_args_list
        self.assertEqual(None, call_args[0][0][0].allowed_hosts)
        self.assertEqual('75M', call_args[0][0][0].client_max_body_size)
        self.assertEqual('pass', call_args[0][0][0].db_password)
        self.assertEqual('tethys_platform', call_args[0][0][0].db_name)
        self.assertEqual('127.0.0.1', call_args[0][0][0].db_host)
        self.assertEqual(5436, call_args[0][0][0].db_port)
        self.assertEqual('tethys_default', call_args[0][0][0].db_username)
        self.assertEqual(None, call_args[0][0][0].directory)
        self.assertEqual(4, call_args[0][0][0].asgi_processes)
        self.assertEqual('psql', call_args[0][0][0].db_dir)
        self.assertEqual(8000, call_args[0][0][0].tethys_port)
        self.assertEqual(840, call_args[0][0][0].session_warning)
        self.assertEqual(900, call_args[0][0][0].session_expire)
        self.assertEqual(None, call_args[0][0][0].add_apps)
        self.assertEqual('', call_args[0][0][0].channel_layer)
        self.assertEqual(None, call_args[0][0][0].recaptcha_private_key)
        self.assertEqual(None, call_args[0][0][0].recaptcha_public_key)
        self.assertFalse(call_args[0][0][0].open_portal)
        self.assertFalse(call_args[0][0][0].open_signup)
        self.assertFalse(call_args[0][0][0].bypass_portal_home)
        self.assertFalse(call_args[0][0][0].session_persist)
        self.assertFalse(call_args[0][0][0].captcha)
        self.assertFalse(call_args[0][0][0].overwrite)
        self.assertFalse(call_args[0][0][0].production)
        self.assertEqual('settings', call_args[0][0][0].type)

    @mock.patch('tethys_cli.gen_commands.generate_command')
    def test_generate_subcommand_settings_directory(self, mock_gen_command):
        testargs = ['tethys', 'gen', 'settings', '--directory', '/tmp/foo/bar']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_gen_command.assert_called()
        call_args = mock_gen_command.call_args_list
        self.assertEqual(None, call_args[0][0][0].allowed_hosts)
        self.assertEqual('75M', call_args[0][0][0].client_max_body_size)
        self.assertEqual('pass', call_args[0][0][0].db_password)
        self.assertEqual('tethys_platform', call_args[0][0][0].db_name)
        self.assertEqual('127.0.0.1', call_args[0][0][0].db_host)
        self.assertEqual(5436, call_args[0][0][0].db_port)
        self.assertEqual('tethys_default', call_args[0][0][0].db_username)
        self.assertEqual(4, call_args[0][0][0].asgi_processes)
        self.assertEqual('psql', call_args[0][0][0].db_dir)
        self.assertEqual(8000, call_args[0][0][0].tethys_port)
        self.assertEqual(840, call_args[0][0][0].session_warning)
        self.assertEqual(900, call_args[0][0][0].session_expire)
        self.assertEqual(None, call_args[0][0][0].add_apps)
        self.assertEqual('', call_args[0][0][0].channel_layer)
        self.assertEqual(None, call_args[0][0][0].recaptcha_private_key)
        self.assertEqual(None, call_args[0][0][0].recaptcha_public_key)
        self.assertFalse(call_args[0][0][0].open_portal)
        self.assertFalse(call_args[0][0][0].open_signup)
        self.assertFalse(call_args[0][0][0].bypass_portal_home)
        self.assertFalse(call_args[0][0][0].session_persist)
        self.assertFalse(call_args[0][0][0].captcha)
        self.assertFalse(call_args[0][0][0].overwrite)
        self.assertFalse(call_args[0][0][0].production)
        self.assertEqual('settings', call_args[0][0][0].type)
    @mock.patch('tethys_cli.gen_commands.generate_command')
    def test_generate_subcommand_nginx_settings_verbose_options(self, mock_gen_command):
        testargs = ['tethys', 'gen', 'nginx', '-d', '/tmp/foo/bar', '--db-name', 'tethys_db', '--db-host', '127.0.17.1',
                    '--allowed-hosts', 'localhost', '--client-max-body-size', '123M', '--asgi-processes', '4',
                    '--db-username', 'foo_user', '--db-password', 'foo_pass', '--db-port', '5555',
                    '--production', '--overwrite', '--open-portal', '--open-signup', '--tethys-port', '8080',
                    '--add-apps', 'django_registration', '--session-persist', '--session-warning', '1500',
                    '--session-expire', '1800', '--bypass-portal-home',
                    '--add-quota-handlers', 'tethysapp.inventory.quota_handler.QuotaHandler',
                    '--static-root', '/new/static', '--workspaces-root', '/new/workspace',
                    '--django-analytical', 'KEY:VALUE', '--add-backends', 'hydroshare', '--add-backends', 'hydroshare',
                    '--oauth-options', 'KEY1:VALUE1 KEY2:VALUE2',
                    '--channel-layer', 'channels_redis.core.RedisChannelLayer', '--captcha',
                    '--recaptcha-private-key', '123456', '--recaptcha-public-key', '123456']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_gen_command.assert_called()
        call_args = mock_gen_command.call_args_list
        self.assertEqual(['localhost'], call_args[0][0][0].allowed_hosts)
        self.assertEqual('123M', call_args[0][0][0].client_max_body_size)
        self.assertEqual('foo_pass', call_args[0][0][0].db_password)
        self.assertEqual('tethys_db', call_args[0][0][0].db_name)
        self.assertEqual('127.0.17.1', call_args[0][0][0].db_host)
        self.assertEqual('5555', call_args[0][0][0].db_port)
        self.assertEqual('foo_user', call_args[0][0][0].db_username)
        self.assertEqual('/tmp/foo/bar', call_args[0][0][0].directory)
        self.assertTrue(call_args[0][0][0].overwrite)
        self.assertTrue(call_args[0][0][0].production)
        self.assertEqual('nginx', call_args[0][0][0].type)
        self.assertEqual('4', call_args[0][0][0].asgi_processes)
        self.assertTrue(call_args[0][0][0].open_portal)
        self.assertTrue(call_args[0][0][0].open_signup)
        self.assertEqual('8080', call_args[0][0][0].tethys_port)
        self.assertEqual(['django_registration'], call_args[0][0][0].add_apps)
        self.assertTrue(call_args[0][0][0].session_persist)
        self.assertEqual('1500', call_args[0][0][0].session_warning)
        self.assertEqual('1800', call_args[0][0][0].session_expire)
        self.assertEqual('/new/static', call_args[0][0][0].static_root)
        self.assertEqual('/new/workspace', call_args[0][0][0].workspaces_root)
        self.assertEqual(['tethysapp.inventory.quota_handler.QuotaHandler'], call_args[0][0][0].add_quota_handlers)
        self.assertTrue(call_args[0][0][0].bypass_portal_home)
        self.assertEqual(['KEY:VALUE'], call_args[0][0][0].django_analytical)
        self.assertEqual(['hydroshare'], call_args[0][0][0].add_backends)
        self.assertEqual(['KEY1:VALUE1 KEY2:VALUE2'], call_args[0][0][0].oauth_options)
        self.assertEqual('channels_redis.core.RedisChannelLayer', call_args[0][0][0].channel_layer)
        self.assertTrue(call_args[0][0][0].captcha)
        self.assertEqual('123456', call_args[0][0][0].recaptcha_private_key)
        self.assertEqual('123456', call_args[0][0][0].recaptcha_public_key)
    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.gen_commands.generate_command')
    def test_generate_subcommand_help(self, mock_gen_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'gen', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_gen_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--directory', mock_stdout.getvalue())
        self.assertIn('--allowed-hosts', mock_stdout.getvalue())
        self.assertIn('--client-max-body-size', mock_stdout.getvalue())
        self.assertIn('--asgi-processes', mock_stdout.getvalue())
        self.assertIn('--db-username', mock_stdout.getvalue())
        self.assertIn('--db-password', mock_stdout.getvalue())
        self.assertIn('--db-name', mock_stdout.getvalue())
        self.assertIn('--db-host', mock_stdout.getvalue())
        self.assertIn('--db-port', mock_stdout.getvalue())
        self.assertIn('--production', mock_stdout.getvalue())
        self.assertIn('--overwrite', mock_stdout.getvalue())
        self.assertIn('--open-portal', mock_stdout.getvalue())
        self.assertIn('--open-signup', mock_stdout.getvalue())
        self.assertIn('--tethys-port', mock_stdout.getvalue())
        self.assertIn('--add-apps', mock_stdout.getvalue())
        self.assertIn('--session-persist', mock_stdout.getvalue())
        self.assertIn('--session-warning', mock_stdout.getvalue())
        self.assertIn('--session-expire', mock_stdout.getvalue())
        self.assertIn('--static-root', mock_stdout.getvalue())
        self.assertIn('--workspaces-root', mock_stdout.getvalue())
        self.assertIn('--bypass-portal-home', mock_stdout.getvalue())
        self.assertIn('--add-quota-handlers', mock_stdout.getvalue())
        self.assertIn('--django-analytical', mock_stdout.getvalue())
        self.assertIn('--add-backends', mock_stdout.getvalue())
        self.assertIn('--oauth-options', mock_stdout.getvalue())
        self.assertIn('--channel-layer', mock_stdout.getvalue())
        self.assertIn('--captcha', mock_stdout.getvalue())
        self.assertIn('--recaptcha-private-key', mock_stdout.getvalue())
        self.assertIn('--recaptcha-public-key', mock_stdout.getvalue())
    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_start(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'start']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('start', call_args[0][0][0].command)
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual(None, call_args[0][0][0].manage)
        self.assertEqual(False, call_args[0][0][0].noinput)
        self.assertEqual(None, call_args[0][0][0].port)

    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_start_options(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'start', '-m', '/foo/bar/manage.py', '-p', '5555', '-f']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('start', call_args[0][0][0].command)
        self.assertEqual(True, call_args[0][0][0].force)
        self.assertEqual('/foo/bar/manage.py', call_args[0][0][0].manage)
        self.assertEqual(False, call_args[0][0][0].noinput)
        self.assertEqual('5555', call_args[0][0][0].port)

    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_start_verbose_options(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'start', '--manage', '/foo/bar/manage.py', '--port', '5555', '--force',
                    '--noinput']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('start', call_args[0][0][0].command)
        self.assertEqual(True, call_args[0][0][0].force)
        self.assertEqual('/foo/bar/manage.py', call_args[0][0][0].manage)
        self.assertEqual(True, call_args[0][0][0].noinput)
        self.assertEqual('5555', call_args[0][0][0].port)

    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_collectstatic(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'collectstatic']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('collectstatic', call_args[0][0][0].command)
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual(None, call_args[0][0][0].manage)
        self.assertEqual(False, call_args[0][0][0].noinput)
        self.assertEqual(None, call_args[0][0][0].port)

    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_collectworkspaces(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'collectworkspaces']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('collectworkspaces', call_args[0][0][0].command)
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual(None, call_args[0][0][0].manage)
        self.assertEqual(False, call_args[0][0][0].noinput)
        self.assertEqual(None, call_args[0][0][0].port)

    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_collectall(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'collectall']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('collectall', call_args[0][0][0].command)
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual(None, call_args[0][0][0].manage)
        self.assertEqual(False, call_args[0][0][0].noinput)
        self.assertEqual(None, call_args[0][0][0].port)

    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_createsuperuser(self, mock_manage_command):
        testargs = ['tethys', 'manage', 'createsuperuser']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_manage_command.assert_called()
        call_args = mock_manage_command.call_args_list
        self.assertEqual('createsuperuser', call_args[0][0][0].command)
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual(None, call_args[0][0][0].manage)
        self.assertEqual(False, call_args[0][0][0].noinput)
        self.assertEqual(None, call_args[0][0][0].port)

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.manage_commands.manage_command')
    def test_manage_subcommand_help(self, mock_manage_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'manage', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_manage_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--manage', mock_stdout.getvalue())
        self.assertIn('--port', mock_stdout.getvalue())
        self.assertIn('--noinput', mock_stdout.getvalue())
        self.assertIn('--force', mock_stdout.getvalue())
    @mock.patch('tethys_cli.scheduler_commands.condor_scheduler_create_command')
    def test_scheduler_create_condor_command_options(self, mock_scheduler_create_command):
        testargs = ['tethys', 'schedulers', 'create-condor', '-n', 'foo_name', '-e', 'http://foo.foo_endpoint',
                    '-u', 'foo_user', '-p', 'foo_pass', '-k', 'private_foo_pass']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_create_command.assert_called()
        call_args = mock_scheduler_create_command.call_args_list
        self.assertEqual('http://foo.foo_endpoint', call_args[0][0][0].endpoint)
        self.assertEqual('foo_name', call_args[0][0][0].name)
        self.assertEqual('foo_pass', call_args[0][0][0].password)
        self.assertEqual('private_foo_pass', call_args[0][0][0].private_key_pass)
        self.assertEqual(None, call_args[0][0][0].private_key_path)
        self.assertEqual('foo_user', call_args[0][0][0].username)

    @mock.patch('tethys_cli.scheduler_commands.dask_scheduler_create_command')
    def test_scheduler_create_dask_command_options(self, mock_scheduler_create_command):
        testargs = ['tethys', 'schedulers', 'create-dask', '-n', 'foo_name', '-e', 'http://foo.foo_endpoint',
                    '-t', '10', '-b', '12', '-d', 'bar']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_create_command.assert_called()
        call_args = mock_scheduler_create_command.call_args_list
        self.assertEqual('http://foo.foo_endpoint', call_args[0][0][0].endpoint)
        self.assertEqual('foo_name', call_args[0][0][0].name)
        self.assertEqual(10, call_args[0][0][0].timeout)
        self.assertEqual(12, call_args[0][0][0].heartbeat_interval)
        self.assertEqual('bar', call_args[0][0][0].dashboard)

    @mock.patch('tethys_cli.scheduler_commands.condor_scheduler_create_command')
    def test_scheduler_create_condor_command_verbose_options(self, mock_scheduler_create_command):
        testargs = ['tethys', 'schedulers', 'create-condor', '--name', 'foo_name', '--endpoint',
                    'http://foo.foo_endpoint', '--username', 'foo_user', '--private-key-path', 'private_foo_path']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_create_command.assert_called()
        call_args = mock_scheduler_create_command.call_args_list
        self.assertEqual('http://foo.foo_endpoint', call_args[0][0][0].endpoint)
        self.assertEqual('foo_name', call_args[0][0][0].name)
        self.assertEqual(None, call_args[0][0][0].password)
        self.assertEqual(None, call_args[0][0][0].private_key_pass)
        self.assertEqual('private_foo_path', call_args[0][0][0].private_key_path)
        self.assertEqual('foo_user', call_args[0][0][0].username)

    @mock.patch('tethys_cli.scheduler_commands.dask_scheduler_create_command')
    def test_scheduler_create_dask_command_verbose_options(self, mock_scheduler_create_command):
        testargs = ['tethys', 'schedulers', 'create-dask', '--name', 'foo_name', '--endpoint',
                    'http://foo.foo_endpoint', '--timeout', '12', '--heartbeat-interval', '15',
                    '--dashboard', 'bar']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_create_command.assert_called()
        call_args = mock_scheduler_create_command.call_args_list
        self.assertEqual('http://foo.foo_endpoint', call_args[0][0][0].endpoint)
        self.assertEqual('foo_name', call_args[0][0][0].name)
        self.assertEqual(12, call_args[0][0][0].timeout)
        self.assertEqual(15, call_args[0][0][0].heartbeat_interval)
        self.assertEqual('bar', call_args[0][0][0].dashboard)

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.scheduler_commands.condor_scheduler_create_command')
    def test_condor_scheduler_create_command_help(self, mock_scheduler_create_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'schedulers', 'create-condor', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_scheduler_create_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--name', mock_stdout.getvalue())
        self.assertIn('--endpoint', mock_stdout.getvalue())
        self.assertIn('--username', mock_stdout.getvalue())
        self.assertIn('--password', mock_stdout.getvalue())
        self.assertIn('--private-key-path', mock_stdout.getvalue())
        self.assertIn('--private-key-pass', mock_stdout.getvalue())

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.scheduler_commands.dask_scheduler_create_command')
    def test_dask_scheduler_create_command_help(self, mock_scheduler_create_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'schedulers', 'create-dask', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_scheduler_create_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--name', mock_stdout.getvalue())
        self.assertIn('--endpoint', mock_stdout.getvalue())
        self.assertIn('--heartbeat-interval', mock_stdout.getvalue())
        self.assertIn('--dashboard', mock_stdout.getvalue())
    @mock.patch('tethys_cli.scheduler_commands.schedulers_list_command')
    def test_scheduler_dask_list_command(self, mock_scheduler_list_command):
        testargs = ['tethys', 'schedulers', 'list', '-t', 'Dask']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_list_command.assert_called()

    @mock.patch('tethys_cli.scheduler_commands.schedulers_list_command')
    def test_scheduler_condor_list_command(self, mock_scheduler_list_command):
        testargs = ['tethys', 'schedulers', 'list', '-t', 'Condor']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_list_command.assert_called()

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.scheduler_commands.schedulers_list_command')
    def test_scheduler_list_command_help(self, mock_scheduler_list_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'schedulers', 'list', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_scheduler_list_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())

    @mock.patch('tethys_cli.scheduler_commands.schedulers_remove_command')
    def test_scheduler_remove_command(self, mock_scheduler_remove_command):
        testargs = ['tethys', 'schedulers', 'remove', 'foo_name']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_remove_command.assert_called()
        call_args = mock_scheduler_remove_command.call_args_list
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual('foo_name', call_args[0][0][0].scheduler_name)

    @mock.patch('tethys_cli.scheduler_commands.schedulers_remove_command')
    def test_scheduler_remove_command_options(self, mock_scheduler_remove_command):
        testargs = ['tethys', 'schedulers', 'remove', 'foo_name', '-f']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_remove_command.assert_called()
        call_args = mock_scheduler_remove_command.call_args_list
        self.assertEqual(True, call_args[0][0][0].force)
        self.assertEqual('foo_name', call_args[0][0][0].scheduler_name)

    @mock.patch('tethys_cli.scheduler_commands.schedulers_remove_command')
    def test_scheduler_remove_command_verbose_options(self, mock_scheduler_remove_command):
        testargs = ['tethys', 'schedulers', 'remove', 'foo_name', '--force']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_scheduler_remove_command.assert_called()
        call_args = mock_scheduler_remove_command.call_args_list
        self.assertEqual(True, call_args[0][0][0].force)
        self.assertEqual('foo_name', call_args[0][0][0].scheduler_name)

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.scheduler_commands.schedulers_remove_command')
    def test_scheduler_list_command_help_2(self, mock_scheduler_remove_command, mock_exit, mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'schedulers', 'remove', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_scheduler_remove_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--force', mock_stdout.getvalue())
        self.assertIn('scheduler_name', mock_stdout.getvalue())
    @mock.patch('tethys_cli.services_commands.services_remove_persistent_command')
    def test_services_remove_persistent_command(self, mock_services_remove_persistent_command):
        testargs = ['tethys', 'services', 'remove', 'persistent', 'foo_service_uid']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_services_remove_persistent_command.assert_called()
        call_args = mock_services_remove_persistent_command.call_args_list
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual('foo_service_uid', call_args[0][0][0].service_uid)

    @mock.patch('tethys_cli.services_commands.services_remove_persistent_command')
    def test_services_remove_persistent_command_options(self, mock_services_remove_persistent_command):
        testargs = ['tethys', 'services', 'remove', 'persistent', '-f', 'foo_service_uid']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_services_remove_persistent_command.assert_called()
        call_args = mock_services_remove_persistent_command.call_args_list
        self.assertEqual(True, call_args[0][0][0].force)
        self.assertEqual('foo_service_uid', call_args[0][0][0].service_uid)

    @mock.patch('tethys_cli.services_commands.services_remove_persistent_command')
    def test_services_remove_persistent_command_verbose_options(self, mock_services_remove_persistent_command):
        testargs = ['tethys', 'services', 'remove', 'persistent', '--force', 'foo_service_uid']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_services_remove_persistent_command.assert_called()
        call_args = mock_services_remove_persistent_command.call_args_list
        self.assertEqual(True, call_args[0][0][0].force)
        self.assertEqual('foo_service_uid', call_args[0][0][0].service_uid)

    @mock.patch('sys.stdout', new_callable=StringIO)
    @mock.patch('tethys_cli.argparse._sys.exit')
    @mock.patch('tethys_cli.services_commands.services_remove_persistent_command')
    def test_services_remove_persistent_command_help(self, mock_services_remove_persistent_command, mock_exit,
                                                     mock_stdout):
        mock_exit.side_effect = SystemExit
        testargs = ['tethys', 'services', 'remove', 'persistent', '-h']

        with mock.patch.object(sys, 'argv', testargs):
            self.assertRaises(SystemExit, tethys_command)

        mock_services_remove_persistent_command.assert_not_called()
        mock_exit.assert_called_with(0)
        self.assertIn('--help', mock_stdout.getvalue())
        self.assertIn('--force', mock_stdout.getvalue())
        self.assertIn('service_uid', mock_stdout.getvalue())

    @mock.patch('tethys_cli.services_commands.services_remove_spatial_command')
    def test_services_remove_spatial_command(self, mock_services_remove_spatial_command):
        testargs = ['tethys', 'services', 'remove', 'spatial', 'foo_service_uid']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_services_remove_spatial_command.assert_called()
        call_args = mock_services_remove_spatial_command.call_args_list
        self.assertEqual(False, call_args[0][0][0].force)
        self.assertEqual('foo_service_uid', call_args[0][0][0].service_uid)

    @mock.patch('tethys_cli.services_commands.services_remove_spatial_command')
    def test_services_remove_spatial_command_options(self, mock_services_remove_spatial_command):
        testargs = ['tethys', 'services', 'remove', 'spatial', '-f', 'foo_service_uid']

        with mock.patch.object(sys, 'argv', testargs):
            tethys_command()

        mock_services_remove_spatial_command.assert_called()
        call_args = mock_services_remove_spatial_command.call_args_list
        self.assertEqual(True, call_args[0][0][0].force)
self.assertEqual('foo_service_uid', call_args[0][0][0].service_uid)
@mock.patch('tethys_cli.services_commands.services_remove_spatial_command')
def test_services_remove_spatial_command_verbose_options(self, mock_services_remove_spatial_command):
testargs = ['tethys', 'services', 'remove', 'spatial', '--force', 'foo_service_uid']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_remove_spatial_command.assert_called()
call_args = mock_services_remove_spatial_command.call_args_list
self.assertEqual(True, call_args[0][0][0].force)
self.assertEqual('foo_service_uid', call_args[0][0][0].service_uid)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.services_commands.services_remove_spatial_command')
def test_services_remove_spatial_command_help(self, mock_services_remove_spatial_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'services', 'remove', 'spatial', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_services_remove_spatial_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--force', mock_stdout.getvalue())
self.assertIn('service_uid', mock_stdout.getvalue())
@mock.patch('tethys_cli.services_commands.services_create_persistent_command')
def test_services_create_persistent_command_options(self, mock_services_create_persistent_command):
testargs = ['tethys', 'services', 'create', 'persistent', '-n', 'foo_name', '-c', 'foo:pass@foo.bar:5555']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_create_persistent_command.assert_called()
call_args = mock_services_create_persistent_command.call_args_list
self.assertEqual('foo:pass@foo.bar:5555', call_args[0][0][0].connection)
self.assertEqual('foo_name', call_args[0][0][0].name)
@mock.patch('tethys_cli.services_commands.services_create_persistent_command')
def test_services_create_persistent_command_verbose_options(self, mock_services_create_persistent_command):
testargs = ['tethys', 'services', 'create', 'persistent', '--name', 'foo_name',
'--connection', 'foo:pass@foo.bar:5555']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_create_persistent_command.assert_called()
call_args = mock_services_create_persistent_command.call_args_list
self.assertEqual('foo:pass@foo.bar:5555', call_args[0][0][0].connection)
self.assertEqual('foo_name', call_args[0][0][0].name)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.services_commands.services_create_persistent_command')
def test_services_create_persistent_command_help(self, mock_services_create_persistent_command, mock_exit,
mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'services', 'create', 'persistent', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_services_create_persistent_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--name', mock_stdout.getvalue())
self.assertIn('--connection', mock_stdout.getvalue())
@mock.patch('tethys_cli.services_commands.services_create_spatial_command')
def test_services_create_spatial_command_options(self, mock_services_create_spatial_command):
testargs = ['tethys', 'services', 'create', 'spatial', '-n', 'foo_name', '-c', 'foo:pass@http://foo.bar:5555']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_create_spatial_command.assert_called()
call_args = mock_services_create_spatial_command.call_args_list
self.assertEqual(None, call_args[0][0][0].apikey)
self.assertEqual('foo:pass@http://foo.bar:5555', call_args[0][0][0].connection)
self.assertEqual('foo_name', call_args[0][0][0].name)
self.assertEqual(None, call_args[0][0][0].public_endpoint)
@mock.patch('tethys_cli.services_commands.services_create_spatial_command')
def test_services_create_spatial_command_verbose_options(self, mock_services_create_spatial_command):
testargs = ['tethys', 'services', 'create', 'spatial', '--name', 'foo_name',
'--connection', 'foo:pass@http://foo.bar:5555', '--public-endpoint', 'foo.bar:1234',
'--apikey', 'foo_apikey']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_create_spatial_command.assert_called()
call_args = mock_services_create_spatial_command.call_args_list
self.assertEqual('foo_apikey', call_args[0][0][0].apikey)
self.assertEqual('foo:pass@http://foo.bar:5555', call_args[0][0][0].connection)
self.assertEqual('foo_name', call_args[0][0][0].name)
self.assertEqual('foo.bar:1234', call_args[0][0][0].public_endpoint)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.services_commands.services_create_spatial_command')
def test_services_create_spatial_command_help(self, mock_services_create_spatial_command, mock_exit,
mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'services', 'create', 'spatial', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_services_create_spatial_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--name', mock_stdout.getvalue())
self.assertIn('--connection', mock_stdout.getvalue())
self.assertIn('--public-endpoint', mock_stdout.getvalue())
self.assertIn('--apikey', mock_stdout.getvalue())
@mock.patch('tethys_cli.services_commands.services_list_command')
def test_services_list_command(self, mock_services_list_command):
testargs = ['tethys', 'services', 'list']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_list_command.assert_called()
call_args = mock_services_list_command.call_args_list
self.assertEqual(False, call_args[0][0][0].persistent)
self.assertEqual(False, call_args[0][0][0].spatial)
@mock.patch('tethys_cli.services_commands.services_list_command')
def test_services_list_command_options(self, mock_services_list_command):
testargs = ['tethys', 'services', 'list', '-p']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_list_command.assert_called()
call_args = mock_services_list_command.call_args_list
self.assertEqual(True, call_args[0][0][0].persistent)
self.assertEqual(False, call_args[0][0][0].spatial)
@mock.patch('tethys_cli.services_commands.services_list_command')
def test_services_list_command_verbose_options(self, mock_services_list_command):
testargs = ['tethys', 'services', 'list', '--spatial']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_services_list_command.assert_called()
call_args = mock_services_list_command.call_args_list
self.assertEqual(False, call_args[0][0][0].persistent)
self.assertEqual(True, call_args[0][0][0].spatial)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.services_commands.services_list_command')
def test_services_list_command_help(self, mock_services_list_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'services', 'list', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_services_list_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--persistent', mock_stdout.getvalue())
self.assertIn('--spatial', mock_stdout.getvalue())
@mock.patch('tethys_cli.app_settings_commands.app_settings_list_command')
def test_app_settings_list_command(self, mock_app_settings_list_command):
testargs = ['tethys', 'app_settings', 'list', 'foo_app']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_app_settings_list_command.assert_called()
call_args = mock_app_settings_list_command.call_args_list
self.assertEqual('foo_app', call_args[0][0][0].app)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.app_settings_commands.app_settings_list_command')
def test_app_settings_list_command_help(self, mock_app_settings_list_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'app_settings', 'list', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_app_settings_list_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('<app_package>', mock_stdout.getvalue())
@mock.patch('tethys_cli.app_settings_commands.app_settings_create_ps_database_command')
def test_app_settings_create_command_options(self, mock_app_settings_create_command):
testargs = ['tethys', 'app_settings', 'create', '-a', 'foo_app_package', '-n', 'foo', '-d', 'foo description',
'ps_database', '-s', '-y']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_app_settings_create_command.assert_called()
call_args = mock_app_settings_create_command.call_args_list
self.assertEqual('foo_app_package', call_args[0][0][0].app)
self.assertEqual('foo description', call_args[0][0][0].description)
self.assertEqual(True, call_args[0][0][0].dynamic)
self.assertEqual(False, call_args[0][0][0].initialized)
self.assertEqual(None, call_args[0][0][0].initializer)
self.assertEqual('foo', call_args[0][0][0].name)
self.assertEqual(False, call_args[0][0][0].required)
self.assertEqual(True, call_args[0][0][0].spatial)
@mock.patch('tethys_cli.app_settings_commands.app_settings_create_ps_database_command')
def test_app_settings_create_command_verbose_options(self, mock_app_settings_create_command):
testargs = ['tethys', 'app_settings', 'create', '--app', 'foo_app_package', '--name', 'foo', '--description',
'foo description', 'ps_database', '--spatial', '--dynamic']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_app_settings_create_command.assert_called()
call_args = mock_app_settings_create_command.call_args_list
self.assertEqual('foo_app_package', call_args[0][0][0].app)
self.assertEqual('foo description', call_args[0][0][0].description)
self.assertEqual(True, call_args[0][0][0].dynamic)
self.assertEqual(False, call_args[0][0][0].initialized)
self.assertEqual(None, call_args[0][0][0].initializer)
self.assertEqual('foo', call_args[0][0][0].name)
self.assertEqual(False, call_args[0][0][0].required)
self.assertEqual(True, call_args[0][0][0].spatial)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.app_settings_commands.app_settings_create_ps_database_command')
def test_app_settings_create_command_help(self, mock_app_settings_create_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'app_settings', 'create', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_app_settings_create_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--app', mock_stdout.getvalue())
self.assertIn('--name', mock_stdout.getvalue())
self.assertIn('--description', mock_stdout.getvalue())
self.assertIn('--required', mock_stdout.getvalue())
self.assertIn('--initializer', mock_stdout.getvalue())
self.assertIn('--initialized', mock_stdout.getvalue())
self.assertIn('{ps_database}', mock_stdout.getvalue())
@mock.patch('tethys_cli.app_settings_commands.app_settings_remove_command')
def test_app_settings_remove_command_options(self, mock_app_settings_remove_command):
testargs = ['tethys', 'app_settings', 'remove', '-n', 'foo', '-f', 'foo_app']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_app_settings_remove_command.assert_called()
call_args = mock_app_settings_remove_command.call_args_list
self.assertEqual('foo_app', call_args[0][0][0].app)
self.assertEqual(True, call_args[0][0][0].force)
self.assertEqual('foo', call_args[0][0][0].name)
@mock.patch('tethys_cli.app_settings_commands.app_settings_remove_command')
def test_app_settings_remove_command_verbose_options(self, mock_app_settings_remove_command):
testargs = ['tethys', 'app_settings', 'remove', '--name', 'foo', '--force', 'foo_app']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_app_settings_remove_command.assert_called()
call_args = mock_app_settings_remove_command.call_args_list
self.assertEqual('foo_app', call_args[0][0][0].app)
self.assertEqual(True, call_args[0][0][0].force)
self.assertEqual('foo', call_args[0][0][0].name)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.app_settings_commands.app_settings_remove_command')
def test_app_settings_remove_command_help(self, mock_app_settings_remove_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'app_settings', 'remove', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_app_settings_remove_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('<app_package>', mock_stdout.getvalue())
self.assertIn('--name', mock_stdout.getvalue())
self.assertIn('--force', mock_stdout.getvalue())
@mock.patch('tethys_cli.link_commands.link_command')
def test_link_command(self, mock_link_command):
testargs = ['tethys', 'link', 'spatial:foo_service', 'foo_package:database:foo_2']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_link_command.assert_called()
call_args = mock_link_command.call_args_list
self.assertEqual('spatial:foo_service', call_args[0][0][0].service)
self.assertEqual('foo_package:database:foo_2', call_args[0][0][0].setting)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.link_commands.link_command')
def test_link_command_help(self, mock_link_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'link', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_link_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('service', mock_stdout.getvalue())
self.assertIn('setting', mock_stdout.getvalue())
@mock.patch('tethys_cli.test_command.test_command')
def test_test_command(self, mock_test_command):
testargs = ['tethys', 'test']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_test_command.assert_called()
call_args = mock_test_command.call_args_list
self.assertEqual(False, call_args[0][0][0].coverage)
self.assertEqual(False, call_args[0][0][0].coverage_html)
self.assertEqual(None, call_args[0][0][0].file)
self.assertEqual(False, call_args[0][0][0].gui)
self.assertEqual(False, call_args[0][0][0].unit)
@mock.patch('tethys_cli.test_command.test_command')
def test_test_command_options(self, mock_test_command):
testargs = ['tethys', 'test', '-c', '-C', '-u', '-g', '-f', 'foo.bar']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_test_command.assert_called()
call_args = mock_test_command.call_args_list
self.assertEqual(True, call_args[0][0][0].coverage)
self.assertEqual(True, call_args[0][0][0].coverage_html)
self.assertEqual('foo.bar', call_args[0][0][0].file)
self.assertEqual(True, call_args[0][0][0].gui)
self.assertEqual(True, call_args[0][0][0].unit)
@mock.patch('tethys_cli.test_command.test_command')
def test_test_command_options_verbose(self, mock_test_command):
testargs = ['tethys', 'test', '--coverage', '--coverage-html', '--unit', '--gui', '--file', 'foo.bar']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_test_command.assert_called()
call_args = mock_test_command.call_args_list
self.assertEqual(True, call_args[0][0][0].coverage)
self.assertEqual(True, call_args[0][0][0].coverage_html)
self.assertEqual('foo.bar', call_args[0][0][0].file)
self.assertEqual(True, call_args[0][0][0].gui)
self.assertEqual(True, call_args[0][0][0].unit)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.test_command.test_command')
def test_test_command_help(self, mock_test_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'test', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_test_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--coverage', mock_stdout.getvalue())
self.assertIn('--coverage-html', mock_stdout.getvalue())
self.assertIn('--unit', mock_stdout.getvalue())
self.assertIn('--gui', mock_stdout.getvalue())
self.assertIn('--file', mock_stdout.getvalue())
@mock.patch('tethys_cli.uninstall_command.uninstall_command')
def test_uninstall_command(self, mock_uninstall_command):
testargs = ['tethys', 'uninstall', 'foo_app']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_uninstall_command.assert_called()
call_args = mock_uninstall_command.call_args_list
self.assertEqual('foo_app', call_args[0][0][0].app_or_extension)
self.assertEqual(False, call_args[0][0][0].is_extension)
@mock.patch('tethys_cli.uninstall_command.uninstall_command')
def test_uninstall_command_options(self, mock_uninstall_command):
testargs = ['tethys', 'uninstall', '-e', 'foo_ext']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_uninstall_command.assert_called()
call_args = mock_uninstall_command.call_args_list
self.assertEqual('foo_ext', call_args[0][0][0].app_or_extension)
self.assertEqual(True, call_args[0][0][0].is_extension)
@mock.patch('tethys_cli.uninstall_command.uninstall_command')
def test_uninstall_command_verbose_options(self, mock_uninstall_command):
testargs = ['tethys', 'uninstall', '--extension', 'foo_ext']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_uninstall_command.assert_called()
call_args = mock_uninstall_command.call_args_list
self.assertEqual('foo_ext', call_args[0][0][0].app_or_extension)
self.assertEqual(True, call_args[0][0][0].is_extension)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.uninstall_command.uninstall_command')
def test_uninstall_command_help(self, mock_uninstall_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'uninstall', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_uninstall_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--extension', mock_stdout.getvalue())
self.assertIn('app_or_extension', mock_stdout.getvalue())
@mock.patch('tethys_cli.list_command.list_command')
def test_list_command(self, mock_list_command):
testargs = ['tethys', 'list']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_list_command.assert_called()
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.list_command.list_command')
def test_list_command_help(self, mock_list_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'list', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_list_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
@mock.patch('tethys_cli.syncstores_command.syncstores_command')
def test_syncstores_command_single(self, mock_syncstores_command):
testargs = ['tethys', 'syncstores', 'foo_app1']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_syncstores_command.assert_called()
call_args = mock_syncstores_command.call_args_list
self.assertEqual(['foo_app1'], call_args[0][0][0].app)
self.assertEqual(None, call_args[0][0][0].database)
self.assertEqual(False, call_args[0][0][0].firstime)
self.assertEqual(False, call_args[0][0][0].firsttime)
self.assertEqual(None, call_args[0][0][0].manage)
self.assertEqual(False, call_args[0][0][0].refresh)
@mock.patch('tethys_cli.syncstores_command.syncstores_command')
def test_syncstores_command_multiple(self, mock_syncstores_command):
testargs = ['tethys', 'syncstores', 'foo_app1', 'foo_app2', 'foo_app3']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_syncstores_command.assert_called()
call_args = mock_syncstores_command.call_args_list
self.assertEqual(['foo_app1', 'foo_app2', 'foo_app3'], call_args[0][0][0].app)
self.assertEqual(None, call_args[0][0][0].database)
self.assertEqual(False, call_args[0][0][0].firstime)
self.assertEqual(False, call_args[0][0][0].firsttime)
self.assertEqual(None, call_args[0][0][0].manage)
self.assertEqual(False, call_args[0][0][0].refresh)
@mock.patch('tethys_cli.syncstores_command.syncstores_command')
def test_syncstores_command_all(self, mock_syncstores_command):
testargs = ['tethys', 'syncstores', 'all']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_syncstores_command.assert_called()
call_args = mock_syncstores_command.call_args_list
self.assertEqual(['all'], call_args[0][0][0].app)
self.assertEqual(None, call_args[0][0][0].database)
self.assertEqual(False, call_args[0][0][0].firstime)
self.assertEqual(False, call_args[0][0][0].firsttime)
self.assertEqual(None, call_args[0][0][0].manage)
self.assertEqual(False, call_args[0][0][0].refresh)
@mock.patch('tethys_cli.syncstores_command.syncstores_command')
def test_syncstores_command_options(self, mock_syncstores_command):
testargs = ['tethys', 'syncstores', '-r', '-f', '-d', 'foo_db', '-m', '/foo/bar/manage.py', 'foo_app1']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_syncstores_command.assert_called()
call_args = mock_syncstores_command.call_args_list
self.assertEqual(['foo_app1'], call_args[0][0][0].app)
self.assertEqual('foo_db', call_args[0][0][0].database)
self.assertEqual(False, call_args[0][0][0].firstime)
self.assertEqual(True, call_args[0][0][0].firsttime)
self.assertEqual('/foo/bar/manage.py', call_args[0][0][0].manage)
self.assertEqual(True, call_args[0][0][0].refresh)
@mock.patch('tethys_cli.syncstores_command.syncstores_command')
def test_syncstores_command_verbose_options(self, mock_syncstores_command):
testargs = ['tethys', 'syncstores', '--refresh', '--firsttime', '--database', 'foo_db',
'--manage', '/foo/bar/manage.py', 'foo_app1']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_syncstores_command.assert_called()
call_args = mock_syncstores_command.call_args_list
self.assertEqual(['foo_app1'], call_args[0][0][0].app)
self.assertEqual('foo_db', call_args[0][0][0].database)
self.assertEqual(False, call_args[0][0][0].firstime)
self.assertEqual(True, call_args[0][0][0].firsttime)
self.assertEqual('/foo/bar/manage.py', call_args[0][0][0].manage)
self.assertEqual(True, call_args[0][0][0].refresh)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.syncstores_command.syncstores_command')
def test_syncstores_command_help(self, mock_syncstores_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'syncstores', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_syncstores_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('app', mock_stdout.getvalue())
self.assertIn('--refresh', mock_stdout.getvalue())
self.assertIn('--firsttime', mock_stdout.getvalue())
self.assertIn('--database', mock_stdout.getvalue())
self.assertIn('--manage', mock_stdout.getvalue())
@mock.patch('tethys_cli.docker_commands.docker_command')
def test_docker_command(self, mock_docker_command):
testargs = ['tethys', 'docker', 'init']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_docker_command.assert_called()
call_args = mock_docker_command.call_args_list
self.assertEqual(False, call_args[0][0][0].boot2docker)
self.assertEqual('init', call_args[0][0][0].command)
self.assertEqual(None, call_args[0][0][0].containers)
self.assertEqual(False, call_args[0][0][0].defaults)
@mock.patch('tethys_cli.docker_commands.docker_command')
def test_docker_command_options(self, mock_docker_command):
testargs = ['tethys', 'docker', 'start', '-c', 'postgis', 'geoserver']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_docker_command.assert_called()
call_args = mock_docker_command.call_args_list
self.assertEqual(False, call_args[0][0][0].boot2docker)
self.assertEqual('start', call_args[0][0][0].command)
self.assertEqual(['postgis', 'geoserver'], call_args[0][0][0].containers)
self.assertEqual(False, call_args[0][0][0].defaults)
@mock.patch('tethys_cli.docker_commands.docker_command')
def test_docker_command_verbose_options(self, mock_docker_command):
testargs = ['tethys', 'docker', 'stop', '--defaults', '--boot2docker']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_docker_command.assert_called()
call_args = mock_docker_command.call_args_list
self.assertEqual(True, call_args[0][0][0].boot2docker)
self.assertEqual('stop', call_args[0][0][0].command)
self.assertEqual(None, call_args[0][0][0].containers)
self.assertEqual(True, call_args[0][0][0].defaults)
@mock.patch('sys.stdout', new_callable=StringIO)
@mock.patch('tethys_cli.argparse._sys.exit')
@mock.patch('tethys_cli.docker_commands.docker_command')
def test_docker_command_help(self, mock_docker_command, mock_exit, mock_stdout):
mock_exit.side_effect = SystemExit
testargs = ['tethys', 'docker', '-h']
with mock.patch.object(sys, 'argv', testargs):
self.assertRaises(SystemExit, tethys_command)
mock_docker_command.assert_not_called()
mock_exit.assert_called_with(0)
self.assertIn('--help', mock_stdout.getvalue())
self.assertIn('--defaults', mock_stdout.getvalue())
self.assertIn('--containers', mock_stdout.getvalue())
self.assertIn('--boot2docker', mock_stdout.getvalue())
@mock.patch('tethys_cli.site_commands.gen_site_content')
def test_site_commands_gen_site_content_options(self, mock_gen_site_content):
testargs = ['tethys', 'site', '--tab-title', 'Tethys Portal', '--favicon', '/tethys_portal/images/favicon.png',
'--title', 'Tethys Portal', '--logo', '/tethys_portal/images/tethys-logo.png',
'--logo-height', '60', '--logo-width', '60', '--logo-padding', '0',
'--library-title', 'Apps Library', '--primary-color', '#0a62a9', '--secondary-color', '#1b95dc',
'--background-color', 'white', '--copyright', 'copyright', '--hero-text', 'Hero Text',
'--blurb-text', 'Blurb Text', '--feature1-heading', 'Feature 1 Heading',
'--feature1-body', 'Feature 1 Content', '--feature1-image', '/tethys_portal/images/feature1.png',
'--feature2-heading', 'Feature 2 Heading', '--feature2-body', 'Feature 2 Content',
'--feature2-image', '/tethys_portal/images/feature2.png', '--feature3-heading', 'Feature 3 Heading',
'--feature3-body', 'Feature 3 Content', '--feature3-image', '/tethys_portal/images/feature3.png',
'--action-text', 'Ready?', '--action-button', 'Signup!']
with mock.patch.object(sys, 'argv', testargs):
tethys_command()
mock_gen_site_content.assert_called()
call_args = mock_gen_site_content.call_args_list
self.assertEqual('Tethys Portal', call_args[0][0][0].tab_title)
self.assertEqual('/tethys_portal/images/favicon.png', call_args[0][0][0].favicon)
self.assertEqual('Tethys Portal', call_args[0][0][0].title)
self.assertEqual('/tethys_portal/images/tethys-logo.png', call_args[0][0][0].logo)
self.assertEqual('60', call_args[0][0][0].logo_height)
self.assertEqual('60', call_args[0][0][0].logo_width)
self.assertEqual('0', call_args[0][0][0].logo_padding)
self.assertEqual('Apps Library', call_args[0][0][0].library_title)
self.assertEqual('#0a62a9', call_args[0][0][0].primary_color)
self.assertEqual('#1b95dc', call_args[0][0][0].secondary_color)
self.assertEqual('white', call_args[0][0][0].background_color)
self.assertEqual('copyright', call_args[0][0][0].footer_copyright)
self.assertEqual('Hero Text', call_args[0][0][0].hero_text)
self.assertEqual('Blurb Text', call_args[0][0][0].blurb_text)
self.assertEqual('Feature 1 Heading', call_args[0][0][0].feature1_heading)
self.assertEqual('Feature 1 Content', call_args[0][0][0].feature1_body)
self.assertEqual('/tethys_portal/images/feature1.png', call_args[0][0][0].feature1_image)
self.assertEqual('Feature 2 Heading', call_args[0][0][0].feature2_heading)
self.assertEqual('Feature 2 Content', call_args[0][0][0].feature2_body)
self.assertEqual('/tethys_portal/images/feature2.png', call_args[0][0][0].feature2_image)
self.assertEqual('Feature 3 Heading', call_args[0][0][0].feature3_heading)
self.assertEqual('Feature 3 Content', call_args[0][0][0].feature3_body)
self.assertEqual('/tethys_portal/images/feature3.png', call_args[0][0][0].feature3_image)
self.assertEqual('Ready?', call_args[0][0][0].action_text)
self.assertEqual('Signup!', call_args[0][0][0].action_button)
self.assertFalse(call_args[0][0][0].restore_defaults)
| 49.716276 | 120 | 0.678823 | 8,615 | 67,813 | 5.050958 | 0.032153 | 0.027761 | 0.062256 | 0.069173 | 0.904054 | 0.872914 | 0.829595 | 0.807763 | 0.782001 | 0.756377 | 0 | 0.021627 | 0.17358 | 67,813 | 1,363 | 121 | 49.752751 | 0.754827 | 0 | 0 | 0.631963 | 0 | 0 | 0.184507 | 0.079542 | 0 | 0 | 0 | 0 | 0.502283 | 1 | 0.075799 | false | 0.024658 | 0.004566 | 0 | 0.081279 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# test/test_speech_api.py (SamyMe/edenai-python, Apache-2.0)
# coding: utf-8
"""
Eden AI API Documentation
<a href=\"https://app.edenai.run/user/login\" target=\"_blank\"><img src=\"/static/images/welcome.png\"></a>. # Welcome Eden AI simplifies the use and integration of AI technologies by providing a unique API connected to the best AI engines and combined with a powerful management platform. The platform covers a wide range of AI technologies: * Vision: <a href=\"https://www.edenai.co/vision\" target=\"_blank\">www.edenai.co/vision</a>. * Text & NLP: <a href=\"https://www.edenai.co/text\" target=\"_blank\">www.edenai.co/text</a>. * Speech & Audio: <a href=\"https://www.edenai.co/speech\" target=\"_blank\">www.edenai.co/speech</a>. * OCR: <a href=\"https://www.edenai.co/ocr\" target=\"_blank\">www.edenai.co/ocr</a>. * Machine Translation: <a href=\"https://www.edenai.co/translation\" target=\"_blank\">www.edenai.co/translation</a>. * Prediction: <a href=\"https://www.edenai.co/prediction\" target=\"_blank\">www.edenai.co/prediction</a>. For all the proposed technologies, we provide a single endpoint: the service provider is only a parameter that can be changed very easily. All the engines available on Eden AI are listed here: www.edenai.co/catalog # Support & community ### 1- Support If you have any problems, please contact us at this email address: contact@edenai.co. We will be happy to help you in the use of Eden AI. ### 2- Community You can interact personally with other people actively using and working with Eden AI and join our <a href=\"https://join.slack.com/t/edenai/shared_invite/zt-t68c2pr9-4lDKQ_qEqmLiWNptQzB_6w\" target=\"_blank\">Slack community</a>. We are always updating our docs, so a good way to always stay up to date is to watch our documentation repo on Github: <a href=\"https://github.com/edenai\" target=\"_blank\">https://github.com/edenai</a>. ### 3- Blog We also regularly publish various articles with Eden AI news and technical articles on the different AI engines that exist. 
You can find these articles here: <a href=\"https://www.edenai.co/blog\" target=\"_blank\">https://www.edenai.co/blog</a>. # Authentication ## Create account  To create an account, please go to this link: <a href=\"https://app.edenai.run/user/login\" target=\"_blank\">app.edenai.run/user/login</a>. You can create an account with your email address or by using your account on available platforms (Gmail, Github, etc.). By creating an account with your email address, you will receive a confirmation email with a link to click. Check your spam if needed and contact us if you have any problem: contact@edenai.co  ## API key By going to your account page on the platform: <a href=\"https://app.edenai.run/admin/account\" target=\"_blank\">https://app.edenai.run/admin/account</a>, you will have access to your API key to start using the different AI engines offered by Eden AI.  # Portal Guide Eden AI provides a web portal that allows you to do several tasks:  ### 1- Benchmark and test The platform allows you to easily compare competing engines without having to code. By uploading your data, you have access to the prediction results of the different engines. This gives you a first overview of the performance of AI engines.  ### 2- Cost management The <a href=\"https://app.edenai.run/admin/cost-management\" target=\"_blank\">cost management page</a> also allows you to centralize the costs associated with the different engines with various filters to simplify the analysis. This page also allows you to define monthly budget limits not to be exceeded to secure the use of different AI engines.  ### 3- Account The <a href=\"https://app.edenai.run/admin/account\" target=\"_blank\">account page</a> allows you to change your information and password. It also gives you access to your API key that you can renew if needed. This page also allows you to add a credit card and to buy with credits to use all the engines offered by Eden AI.  
# API Guide Eden AI API has different endpoints that refer to different AI services. The connected providers are thus parameters that the user can easily change. # noqa: E501
OpenAPI spec version: v1
Contact: contact@edenai.co
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import

import os
import unittest

from dotenv import load_dotenv

import edenai
from edenai import Speech  # noqa: E501
from edenai.rest import ApiException

load_dotenv()


class TestSpeechApi(unittest.TestCase):
    """SpeechApi unit test stubs"""

    def setUp(self):
        self.api = Speech(os.getenv("API_KEY"))

    def tearDown(self):
        pass

    def test_speech_recognition(self):
        """Test case for speech_recognition."""
        pass

    def test_text_to_speech(self):
        """Test case for text_to_speech."""
        pass


if __name__ == '__main__':
    unittest.main()
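The two stubs above are left as `pass` by the code generator. A minimal sketch of how they could be filled in without network access is to swap the real client for a `unittest.mock.Mock` and assert only on the call pattern; note that the method name `speech_recognition` and its argument here are assumptions for illustration, not the confirmed edenai SDK signature.

```python
import unittest
from unittest import mock


class TestSpeechApiOffline(unittest.TestCase):
    """Exercise the generated stubs against a mocked client, not the live API."""

    def setUp(self):
        # Stand-in for Speech(os.getenv("API_KEY")); no API key or network needed.
        self.api = mock.Mock(name='Speech')

    def test_speech_recognition(self):
        # Assert only on the call pattern, not on any provider response.
        self.api.speech_recognition('sample.wav')
        self.api.speech_recognition.assert_called_once_with('sample.wav')
```

This keeps the suite runnable in CI without credentials; real integration tests against the API can live behind an environment-variable guard.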
# tests/test_year_2009.py (l0pht511/jpholiday, MIT)
# coding: utf-8
import datetime
import unittest

import jpholiday


class TestYear2009(unittest.TestCase):
    def test_holiday(self):
        """Holidays in 2009."""
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 1, 1)), '元日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 1, 12)), '成人の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 2, 11)), '建国記念の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 3, 20)), '春分の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 4, 29)), '昭和の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 5, 3)), '憲法記念日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 5, 4)), 'みどりの日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 5, 5)), 'こどもの日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 5, 6)), '憲法記念日 振替休日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 7, 20)), '海の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 9, 21)), '敬老の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 9, 22)), '国民の休日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 9, 23)), '秋分の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 10, 12)), '体育の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 11, 3)), '文化の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 11, 23)), '勤労感謝の日')
        self.assertEqual(jpholiday.is_holiday_name(datetime.date(2009, 12, 23)), '天皇誕生日')

    def test_count_month(self):
        """Number of holidays per month in 2009."""
        self.assertEqual(len(jpholiday.month_holidays(2009, 1)), 2)
        self.assertEqual(len(jpholiday.month_holidays(2009, 2)), 1)
        self.assertEqual(len(jpholiday.month_holidays(2009, 3)), 1)
        self.assertEqual(len(jpholiday.month_holidays(2009, 4)), 1)
        self.assertEqual(len(jpholiday.month_holidays(2009, 5)), 4)
        self.assertEqual(len(jpholiday.month_holidays(2009, 6)), 0)
        self.assertEqual(len(jpholiday.month_holidays(2009, 7)), 1)
        self.assertEqual(len(jpholiday.month_holidays(2009, 8)), 0)
        self.assertEqual(len(jpholiday.month_holidays(2009, 9)), 3)
        self.assertEqual(len(jpholiday.month_holidays(2009, 10)), 1)
        self.assertEqual(len(jpholiday.month_holidays(2009, 11)), 2)
        self.assertEqual(len(jpholiday.month_holidays(2009, 12)), 1)

    def test_count_year(self):
        """Total number of holidays in 2009."""
        self.assertEqual(len(jpholiday.year_holidays(2009)), 17)
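The long runs of near-identical assertions above can be made table-driven with `unittest.subTest`, so one failing date reports without masking the rest. A sketch of the pattern, where a plain `HOLIDAYS` dict stands in for `jpholiday.is_holiday_name` so the example runs without the library installed:

```python
import datetime
import unittest

# Stand-in data; in the real suite each lookup would call
# jpholiday.is_holiday_name(date) instead of this dict.
HOLIDAYS = {
    datetime.date(2009, 1, 1): '元日',
    datetime.date(2009, 1, 12): '成人の日',
}


def holiday_name(day):
    """Return the holiday name for a date, or None on a working day."""
    return HOLIDAYS.get(day)


class TestHolidayTable(unittest.TestCase):
    def test_holiday_names(self):
        for day, expected in HOLIDAYS.items():
            # subTest keeps iterating after a failure and labels it by date.
            with self.subTest(day=day):
                self.assertEqual(holiday_name(day), expected)

    def test_non_holiday(self):
        self.assertIsNone(holiday_name(datetime.date(2009, 1, 2)))
```

Adding a year is then a data change, not seventeen new assertion lines.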
# geokey/socialinteractions/tests/test_views.py (universityofsussex/geokey, Apache-2.0)
"""Tests for views of social interactions."""
import importlib

from django.test import TestCase, override_settings
from django.conf import settings
from django.http import HttpRequest, QueryDict
from django.core.urlresolvers import reverse
from django.template.loader import render_to_string
from django.contrib.auth.models import AnonymousUser
from django.contrib.sites.shortcuts import get_current_site
from django.contrib.messages import get_messages
from django.contrib.messages.storage.fallback import FallbackStorage

from allauth.socialaccount.models import SocialAccount
from allauth.socialaccount.providers import registry

from geokey import version
from geokey.core.tests.helpers import render_helpers
from geokey.users.tests.model_factories import UserFactory
from geokey.projects.tests.model_factories import ProjectFactory
from geokey.socialinteractions.base import freq_dic, STATUS

from .model_factories import (
    SocialInteractionFactory,
    SocialInteractionPullFactory
)
from ..models import SocialInteractionPost, SocialInteractionPull
from ..views import (
    SocialInteractionList,
    SocialInteractionPostCreate,
    SocialInteractionPostSettings,
    SocialInteractionPostDelete,
    SocialInteractionPost,
    SocialInteractionPullCreate,
    SocialInteractionPullDelete,
    SocialInteractionPullSettings
)
def install_required_apps():
    """Install Twitter and Facebook providers for django-allauth."""
    installed_apps = settings.INSTALLED_APPS
    apps_to_install = [
        'allauth.socialaccount.providers.twitter',
        'allauth.socialaccount.providers.facebook',
    ]
    for app in apps_to_install:
        if app not in installed_apps:
            installed_apps = installed_apps + (app,)
            importlib.import_module(app + '.provider')
    return installed_apps
class SocialInteractionsListTest(TestCase):
    """Test the list of social interactions page."""

    def setUp(self):
        """Set up tests."""
        self.view = SocialInteractionList.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = AnonymousUser()

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])

    def test_get_with_user(self):
        """
        Accessing the view with a normal user.

        It should render the page with an error message.
        """
        user = UserFactory.create()
        project = ProjectFactory.create()
        self.request.user = user
        response = self.view(self.request, project_id=project.id).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_list.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    def test_get_with_admin(self):
        """
        Accessing the view with the project admin.

        It should render the page.
        """
        project = ProjectFactory.create()
        user = project.creator
        self.request.user = user
        response = self.view(self.request, project_id=project.id).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_list.html',
            {
                'project': project,
                'user': user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)
@override_settings(INSTALLED_APPS=install_required_apps())
class SocialInteractionCreateTest(TestCase):
    """Test creating a new social interaction."""

    def setUp(self):
        """Set up tests."""
        self.anonymous_user = AnonymousUser()
        self.regular_user = UserFactory.create()
        self.admin_user = UserFactory.create()
        self.project = ProjectFactory.create(creator=self.admin_user)
        self.socialaccount_1 = SocialAccount.objects.create(
            user=self.regular_user, provider='twitter', uid='1')
        self.socialaccount_2 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='2')

        self.view = SocialInteractionPostCreate.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = self.anonymous_user

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])

    def test_get_with_user(self):
        """
        Accessing the view with a normal user.

        It should render the page with an error message.
        """
        self.request.user = self.regular_user
        response = self.view(self.request, project_id=self.project.id).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_post_create.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    # def test_get_with_admin(self):
    #     """
    #     Accessing the view with the project admin.
    #
    #     It should render the page.
    #     """
    #     self.request.user = self.admin_user
    #     response = self.view(self.request, project_id=self.project.id).render()
    #     rendered = render_to_string(
    #         'socialinteractions/socialinteraction_post_create.html',
    #         {
    #             'project': self.project,
    #             'auth_users': [self.socialaccount_2],
    #             'user': self.admin_user,
    #             'PLATFORM_NAME': get_current_site(self.request).name,
    #             'GEOKEY_VERSION': version.get_version()
    #         }
    #     )
    #     self.assertEqual(response.status_code, 200)
    #     response = render_helpers.remove_csrf(response.content.decode('utf-8'))
    #     self.assertEqual(response, rendered)

    def test_post_with_anonymous(self):
        """
        Updating with AnonymousUser.

        It should redirect to the login page.
        """
        self.request.method = 'POST'
        self.request.POST = {
            'name': 'My social interaction',
            'description': '',
            'socialaccount': self.socialaccount_2.id
        }
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])
        self.assertEqual(0, SocialInteractionPost.objects.count())

    def test_post_with_user(self):
        """
        Updating with a normal user.

        It should render the page with an error message.
        """
        self.request.method = 'POST'
        self.request.POST = {
            'name': 'My social interaction',
            'description': '',
            'socialaccount': self.socialaccount_2.id
        }
        self.request.user = self.regular_user
        response = self.view(self.request, project_id=self.project.id).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_post_create.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)
        self.assertEqual(0, SocialInteractionPost.objects.count())

    def test_post_with_admin(self):
        """
        Updating with the project admin.

        It should create the social interaction and redirect to the social
        interaction page to set the post message.
        """
        self.request.method = 'POST'
        post = QueryDict('socialaccount=%s&text_post=%s&text_link=%s' % (
            self.socialaccount_2.id,
            'New contribution created for #project. Check it out here $link$',
            'http://www.mymapfrontend/$project_id$/$contribution_id$'
        ))
        self.request.POST = post
        self.request.user = self.admin_user
        response = self.view(self.request, project_id=self.project.id)

        self.assertEqual(1, SocialInteractionPost.objects.count())
        socialinteraction = SocialInteractionPost.objects.first()
        self.assertEqual(
            socialinteraction.text_to_post,
            'New contribution created for #project. Check it out here $link$')
        self.assertEqual(
            socialinteraction.link,
            'http://www.mymapfrontend/$project_id$/$contribution_id$')
        self.assertEqual(socialinteraction.project, self.project)
        self.assertEqual(socialinteraction.creator, self.admin_user)
        self.assertEqual(self.socialaccount_2, socialinteraction.socialaccount)
        self.assertEqual(response.status_code, 302)
        self.assertIn(
            '/admin/projects/%s/socialinteractions/' % self.project.id,
            response['location']
        )

    def test_post_on_locked_project_with_admin(self):
        """
        Updating with the project admin when the project is locked.

        It should redirect to the create social interaction page.
        """
        self.request.method = 'POST'
        post = QueryDict('name=%s&description=%s&socialaccount=%s' % (
            'My social interaction',
            '',
            self.socialaccount_2.id
        ))
        self.request.POST = post
        self.project.islocked = True
        self.project.save()
        self.request.user = self.admin_user
        response = self.view(self.request, project_id=self.project.id)

        self.assertEqual(0, SocialInteractionPost.objects.count())
        self.assertEqual(response.status_code, 302)
        self.assertIn(
            '/admin/projects/%s/socialinteractions/post/create' % self.project.id,
            response['location']
        )

    def test_post_when_social_accounts_do_not_exist_with_admin(self):
        """
        Updating with the project admin when the social account is not found.

        It should redirect to the create social interaction page.
        """
        self.request.method = 'POST'
        post = QueryDict('name=%s&description=%s&socialaccount=%s' % (
            'My social interaction',
            '',
            74746464
        ))
        self.request.POST = post
        self.request.user = self.admin_user
        response = self.view(self.request, project_id=self.project.id)

        self.assertEqual(0, SocialInteractionPost.objects.count())
        self.assertEqual(response.status_code, 302)
        self.assertIn(
            '/admin/projects/%s/socialinteractions/post/create' % self.project.id,
            response['location'])
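The `QueryDict('name=%s&…' % …)` constructions in the tests above build urlencoded request bodies by hand. For readers unfamiliar with Django's parsing, the standard library's `urllib.parse.parse_qs` performs the equivalent decoding (minus QueryDict's multi-value dict API), which makes it easy to see exactly what the view receives:

```python
from urllib.parse import parse_qs

# A urlencoded body like the ones passed to QueryDict in the tests above.
raw = 'name=My+social+interaction&description=&socialaccount=2'

# keep_blank_values=True preserves empty fields such as description=''.
parsed = parse_qs(raw, keep_blank_values=True)
# Every value arrives as a list, since urlencoded keys may repeat.
```

Without `keep_blank_values=True`, the empty `description` key would be dropped entirely, which is why the tests must rely on QueryDict's Django-side behavior of keeping blanks.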
@override_settings(INSTALLED_APPS=install_required_apps())
class SocialInteractionSettingsTest(TestCase):
    """Test the social interaction settings page."""

    def setUp(self):
        """Set up tests."""
        self.anonymous_user = AnonymousUser()
        self.regular_user = UserFactory.create()
        self.admin_user = UserFactory.create()
        self.project = ProjectFactory.create(creator=self.admin_user)
        self.socialaccount_2 = SocialAccount.objects.create(
            user=self.admin_user, provider='facebook', uid='2')
        self.socialaccount_1 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='1')
        self.socialaccount_3 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='3')
        self.socialinteraction = SocialInteractionFactory.create(
            socialaccount=self.socialaccount_1,
            project=self.project,
            creator=self.admin_user
        )

        self.view = SocialInteractionPostSettings.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = self.anonymous_user

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])
        self.assertEqual(SocialInteractionPost.objects.count(), 1)

    def test_get_with_user(self):
        """
        Accessing the view with a normal user.

        It should render the page with an error message.
        """
        self.request.user = self.regular_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteraction_id=self.socialinteraction.id
        ).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_post_settings.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    # def test_get_with_admin(self):
    #     """
    #     Accessing the view with the project admin.
    #
    #     It should render the page.
    #     """
    #     self.socialinteraction.creator = self.admin_user
    #     self.socialinteraction.project = self.project
    #     self.request.user = self.socialinteraction.creator
    #     self.socialinteraction.save()
    #     response = self.view(
    #         self.request,
    #         project_id=self.project.id,
    #         socialinteraction_id=self.socialinteraction.id
    #     ).render()
    #     socialaccounts_log = SocialAccount.objects.filter(
    #         user=self.admin_user,
    #         provider__in=[id for id, name in registry.as_choices()
    #                       if id in ['twitter', 'facebook']]
    #     )
    #     rendered = render_to_string(
    #         'socialinteractions/socialinteraction_post_settings.html',
    #         {
    #             'project': self.socialinteraction.project,
    #             'socialinteraction': self.socialinteraction,
    #             'auth_users': socialaccounts_log,
    #             'user': self.admin_user,
    #             'PLATFORM_NAME': get_current_site(self.request).name,
    #             'GEOKEY_VERSION': version.get_version()
    #         }
    #     )
    #     self.assertEqual(response.status_code, 200)
    #     response = render_helpers.remove_csrf(response.content.decode('utf-8'))
    #     self.assertEqual(response, rendered)

    def test_post_with_anonymous(self):
        """
        Updating with AnonymousUser.

        It should redirect to the login page.
        """
        self.request.method = 'POST'
        post = QueryDict('socialaccount=%s&text_post=%s&text_link=%s' % (
            self.socialaccount_3.id,
            'New contribution created for #project. Check it out here $link$',
            'http://www.mymapfrontend/$project_id$/$contribution_id$'
        ))
        self.request.POST = post
        response = self.view(
            self.request,
            project_id=self.socialinteraction.project.id,
            socialinteraction_id=self.socialinteraction.id)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])

        reference = SocialInteractionPost.objects.get(pk=self.socialinteraction.id)
        self.assertNotEqual(reference.text_to_post, post['text_post'])
        self.assertNotEqual(reference.link, post['text_link'])
        self.assertNotEqual(self.socialaccount_3, reference.socialaccount)

    def test_post_with_user(self):
        """
        Updating with a normal user.

        It should render the page with an error message.
        """
        self.request.method = 'POST'
        post = QueryDict('socialaccount=%s&text_post=%s&text_link=%s' % (
            self.socialaccount_3.id,
            'New contribution created for #project. Check it out here $link$',
            'http://www.mymapfrontend/$project_id$/$contribution_id$'
        ))
        self.request.POST = post
        self.request.user = self.regular_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteraction_id=self.socialinteraction.id
        ).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_post_settings.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

        reference = SocialInteractionPost.objects.get(pk=self.socialinteraction.id)
        self.assertNotEqual(
            reference.text_to_post,
            'New contribution created for #project. Check it out here $link$')
        self.assertNotEqual(
            reference.link,
            'http://www.mymapfrontend/$project_id$/$contribution_id$')
        self.assertNotEqual(self.socialaccount_3, reference.socialaccount)
class SocialInteractionDeleteTest(TestCase):
    """Test the social interaction delete view."""

    def setUp(self):
        """Set up test."""
        self.anonymous_user = AnonymousUser()
        self.regular_user = UserFactory.create()
        self.admin_user = UserFactory.create()
        self.project = ProjectFactory.create(creator=self.admin_user)
        self.socialaccount_2 = SocialAccount.objects.create(
            user=self.admin_user, provider='facebook', uid='2')
        self.socialaccount_1 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='1')
        self.socialaccount_3 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='3')
        self.socialinteraction = SocialInteractionFactory.create(
            socialaccount=self.socialaccount_1,
            project=self.project,
            creator=self.admin_user
        )

        self.view = SocialInteractionPostDelete.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = self.anonymous_user

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])
        self.assertEqual(SocialInteractionPost.objects.count(), 1)

    def test_get_with_user(self):
        """
        Accessing the view with a normal user.

        It should render the page with an error message.
        """
        self.request.user = self.regular_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteraction_id=self.socialinteraction.id
        ).render()

        rendered = render_to_string(
            'base.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    def test_get_with_admin(self):
        """
        Accessing the view with the project admin.

        It should delete the social interaction and redirect.
        """
        self.socialinteraction.project = self.project
        self.socialinteraction.creator = self.admin_user
        self.socialinteraction.save()
        self.request.user = self.admin_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteraction_id=self.socialinteraction.id
        )
        self.assertEqual(response.status_code, 302)
        self.assertEqual(SocialInteractionPost.objects.count(), 0)

    def test_delete_with_admin_when_project_is_locked(self):
        """
        Accessing the view with the project admin when the project is locked.

        It should redirect without deleting the social interaction.
        """
        self.project.islocked = True
        self.project.save()
        self.socialinteraction.project = self.project
        self.socialinteraction.creator = self.admin_user
        self.socialinteraction.save()
        self.request.user = self.admin_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteraction_id=self.socialinteraction.id
        )
        self.assertEqual(response.status_code, 302)
        self.assertIn(
            reverse(
                'admin:socialinteraction_list',
                args=(self.project.id,)
            ),
            response['location']
        )
        self.assertEqual(SocialInteractionPost.objects.count(), 1)

    def test_delete_with_admin_when_project_does_not_exist(self):
        """
        Accessing the view with the project admin when the project does not exist.

        It should render the page with an error message.
        """
        self.socialinteraction.project = self.project
        self.socialinteraction.creator = self.admin_user
        self.socialinteraction.save()
        self.request.user = self.admin_user
        response = self.view(
            self.request,
            project_id=634842156456,
            socialinteraction_id=self.socialinteraction.id
        ).render()

        rendered = render_to_string(
            'base.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.admin_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)
        self.assertEqual(SocialInteractionPost.objects.count(), 1)
@override_settings(INSTALLED_APPS=install_required_apps())
class SocialInteractionPullCreateTest(TestCase):
    """Test creating a new social interaction pull."""

    def setUp(self):
        """Set up tests."""
        self.anonymous_user = AnonymousUser()
        self.regular_user = UserFactory.create()
        self.admin_user = UserFactory.create()
        self.project = ProjectFactory.create(creator=self.admin_user)
        self.socialaccount_1 = SocialAccount.objects.create(
            user=self.regular_user, provider='twitter', uid='1')
        self.socialaccount_2 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='2')

        self.view = SocialInteractionPullCreate.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = self.anonymous_user

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])

    def test_get_with_user(self):
        """
        Accessing the view with a normal user.

        It should render the page with an error message.
        """
        self.request.user = self.regular_user
        response = self.view(self.request, project_id=self.project.id).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_pull_create.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    # def test_get_with_admin(self):
    #     """
    #     Accessing the view with the project admin.
    #
    #     It should render the page.
    #     """
    #     self.request.user = self.admin_user
    #     response = self.view(self.request, project_id=self.project.id).render()
    #     rendered = render_to_string(
    #         'socialinteractions/socialinteraction_pull_create.html',
    #         {
    #             'project': self.project,
    #             'auth_users': [
    #                 self.socialaccount_2,
    #                 self.socialaccount_1
    #             ],
    #             'user': self.admin_user,
    #             'PLATFORM_NAME': get_current_site(self.request).name,
    #             'GEOKEY_VERSION': version.get_version()
    #         }
    #     )
    #     self.assertEqual(response.status_code, 200)
    #     response = render_helpers.remove_csrf(response.content.decode('utf-8'))
    #     self.assertEqual(response, rendered)


class SocialInteractionPullDeleteTest(TestCase):
    """Test social interaction pull delete view."""

    def setUp(self):
        """Set up test."""
        self.anonymous_user = AnonymousUser()
        self.regular_user = UserFactory.create()
        self.admin_user = UserFactory.create()
        self.project = ProjectFactory.create(creator=self.admin_user)
        self.socialaccount_2 = SocialAccount.objects.create(
            user=self.admin_user, provider='facebook', uid='2')
        self.socialaccount_1 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='1')
        self.socialaccount_3 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='3')
        self.si_pull = SocialInteractionPullFactory.create(
            socialaccount=self.socialaccount_1,
            project=self.project,
            creator=self.admin_user
        )
        self.view = SocialInteractionPullDelete.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = self.anonymous_user

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])
        self.assertEqual(SocialInteractionPull.objects.count(), 1)

    def test_get_with_user(self):
        """
        Accessing the view with normal user.

        It should render the page with an error message.
        """
        self.request.user = self.regular_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteractionpull_id=self.si_pull.id
        ).render()

        rendered = render_to_string(
            'base.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    def test_get_with_admin(self):
        """
        Accessing the view with project admin.

        It should delete the social interaction pull and redirect.
        """
        self.si_pull.project = self.project
        self.si_pull.creator = self.admin_user
        self.si_pull.save()

        self.request.user = self.admin_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteractionpull_id=self.si_pull.id
        )
        self.assertEqual(response.status_code, 302)
        self.assertEqual(SocialInteractionPull.objects.count(), 0)

    def test_delete_with_admin_when_project_is_locked(self):
        """
        Accessing the view with project admin when the project is locked.

        It should redirect to the social interaction list page without
        deleting the pull.
        """
        self.project.islocked = True
        self.project.save()
        self.si_pull.project = self.project
        self.si_pull.creator = self.admin_user
        self.si_pull.save()

        self.request.user = self.admin_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteractionpull_id=self.si_pull.id
        )
        self.assertEqual(response.status_code, 302)
        self.assertIn(
            reverse(
                'admin:socialinteraction_list',
                args=(self.project.id,)
            ),
            response['location']
        )
        self.assertEqual(SocialInteractionPull.objects.count(), 1)

    def test_delete_with_admin_when_project_does_not_exist(self):
        """
        Accessing the view with project admin when project does not exist.

        It should render the page with an error message.
        """
        self.si_pull.project = self.project
        self.si_pull.creator = self.admin_user
        self.si_pull.save()

        self.request.user = self.admin_user
        response = self.view(
            self.request,
            project_id=634842156456,
            socialinteractionpull_id=self.si_pull.id
        ).render()

        rendered = render_to_string(
            'base.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.admin_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)
        self.assertEqual(SocialInteractionPull.objects.count(), 1)


@override_settings(INSTALLED_APPS=install_required_apps())
class SocialInteractionPullSettingsTest(TestCase):
    """Test social interaction pull settings page."""

    def setUp(self):
        """Set up tests."""
        self.anonymous_user = AnonymousUser()
        self.regular_user = UserFactory.create()
        self.admin_user = UserFactory.create()
        self.project = ProjectFactory.create(creator=self.admin_user)
        self.socialaccount_2 = SocialAccount.objects.create(
            user=self.admin_user, provider='facebook', uid='2')
        self.socialaccount_1 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='1')
        self.socialaccount_3 = SocialAccount.objects.create(
            user=self.admin_user, provider='twitter', uid='3')
        self.si_pull = SocialInteractionPullFactory.create(
            socialaccount=self.socialaccount_1,
            project=self.project,
            creator=self.admin_user
        )
        self.view = SocialInteractionPullSettings.as_view()
        self.request = HttpRequest()
        self.request.method = 'GET'
        self.request.user = self.anonymous_user

        self.freq = freq_dic.keys()
        status_dict = {value: key for key, value in STATUS}
        self.status_types = status_dict.keys()

        setattr(self.request, 'session', 'session')
        messages = FallbackStorage(self.request)
        setattr(self.request, '_messages', messages)

    def test_get_with_anonymous(self):
        """
        Accessing the view with AnonymousUser.

        It should redirect to the login page.
        """
        response = self.view(self.request)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])
        self.assertEqual(SocialInteractionPull.objects.count(), 1)

    def test_get_with_user(self):
        """
        Accessing the view with normal user.

        It should render the page with an error message.
        """
        self.request.user = self.regular_user
        response = self.view(
            self.request,
            project_id=self.project.id,
            socialinteractionpull_id=self.si_pull.id
        ).render()

        rendered = render_to_string(
            'socialinteractions/socialinteraction_pull.html',
            {
                'error_description': 'Project matching query does not exist.',
                'error': 'Not found.',
                'user': self.regular_user,
                'PLATFORM_NAME': get_current_site(self.request).name,
                'GEOKEY_VERSION': version.get_version()
            }
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.content.decode('utf-8'), rendered)

    # def test_get_with_admin(self):
    #     """
    #     Accessing the view with project admin.
    #
    #     It should render the page.
    #     """
    #     socialaccounts_log = SocialAccount.objects.filter(
    #         user=self.admin_user,
    #         provider__in=[id for id, name in registry.as_choices()
    #                       if id in ['twitter', 'facebook']]
    #     )
    #     self.si_pull.creator = self.admin_user
    #     self.si_pull.project = self.project
    #     self.request.user = self.si_pull.creator
    #     self.si_pull.save()
    #     response = self.view(
    #         self.request,
    #         project_id=self.project.id,
    #         socialinteractionpull_id=self.si_pull.id
    #     ).render()
    #     socialaccounts_log = SocialAccount.objects.filter(
    #         user=self.admin_user,
    #         provider__in=[id for id, name in registry.as_choices()
    #                       if id in ['twitter', 'facebook']]
    #     )
    #     rendered = render_to_string(
    #         'socialinteractions/socialinteraction_pull.html',
    #         {
    #             'project': self.si_pull.project,
    #             'auth_users': socialaccounts_log,
    #             'socialinteraction_pull': self.si_pull,
    #             'user': self.admin_user,
    #             'status_types': self.status_types,
    #             'freq': self.freq,
    #             'PLATFORM_NAME': get_current_site(self.request).name,
    #             'GEOKEY_VERSION': version.get_version()
    #         }
    #     )
    #     self.assertEqual(response.status_code, 200)
    #     response = render_helpers.remove_csrf(response.content.decode('utf-8'))
    #     self.assertEqual(response, rendered)

    def test_post_with_anonymous(self):
        """
        Updating with AnonymousUser.

        It should redirect to the login page.
        """
        self.request.method = 'POST'
        post = QueryDict(
            'frequency=%s&text_pull=%s&status_type=%s&socialaccount=%s' % (
                'fortnigthly',
                '#hastag',
                'inactive',
                self.socialaccount_3.id,
            )
        )
        self.request.POST = post

        response = self.view(
            self.request,
            project_id=self.si_pull.project.id,
            socialinteractionpull_id=self.si_pull.id)
        self.assertEqual(response.status_code, 302)
        self.assertIn('/admin/account/login/', response['location'])

        reference = SocialInteractionPull.objects.get(id=self.si_pull.id)
        self.assertNotEqual(reference.frequency, 'fortnigthly')
        self.assertNotEqual(reference.text_to_pull, '#hastag')
        self.assertNotEqual(reference.status, 'inactive')
        socialaccount = reference.socialaccount
        self.assertNotEqual(self.socialaccount_3, socialaccount)
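The POST tests here build their payloads as raw form-encoded strings passed to `QueryDict`. The same payload can be constructed and inspected with only the standard library; the field names below mirror the test above, and the values are the same illustrative ones the test uses:

```python
from urllib.parse import parse_qs, urlencode

# Build the same kind of form-encoded string the tests feed to QueryDict
payload = urlencode({
    'frequency': 'fortnigthly',
    'text_pull': '#hastag',
    'status_type': 'inactive',
    'socialaccount': 3,
})

# parse_qs maps every field to a list of values, much like QueryDict.getlist()
fields = parse_qs(payload)
```

Note that `urlencode` percent-escapes the `#` in `#hastag` (as `%23`), while the hand-built string in the test leaves it literal; `parse_qs` decodes the escape back to `#`.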
    # def test_post_with_user(self):
    #     """
    #     Updating with normal user.
    #
    #     It should render the page with an error message.
    #     """
    #     self.request.method = 'POST'
    #     post = QueryDict(
    #         'frequency=%s&text_pull=%s&status_type=%s&socialaccount=%s' % (
    #             'fortnigthly',
    #             '#hastag',
    #             'inactive',
    #             self.socialaccount_3.id,
    #         )
    #     )
    #     self.request.POST = post
    #
    #     self.request.user = self.regular_user
    #     response = self.view(
    #         self.request,
    #         project_id=self.si_pull.project.id,
    #         socialinteractionpull_id=self.si_pull.id
    #     ).render()
    #
    #     rendered = render_to_string(
    #         'socialinteractions/socialinteraction_pull.html',
    #         {
    #             'error_description': 'Project matching query does not exist.',
    #             'error': 'Not found.',
    #             'user': self.regular_user,
    #             'PLATFORM_NAME': get_current_site(self.request).name,
    #             'GEOKEY_VERSION': version.get_version()
    #         }
    #     )
    #     self.assertEqual(response.status_code, 200)
    #     self.assertEqual(response.content.decode('utf-8'), rendered)
    #
    #     reference = SocialInteractionPull.objects.get(id=self.si_pull.id)
    #     self.assertEqual(str(reference.frequency), 'fortnigthly')
    #     self.assertEqual(str(reference.status), 'inactive')
    #     self.assertEqual(str(reference.text_to_pull), '#hastag')
    #     socialaccount = reference.socialaccount
    #     self.assertEqual(self.socialaccount_3, socialaccount)

    # def test_post_with_admin(self):
    #     """
    #     Updating with admin user.
    #
    #     It should render the page with an error message.
    #     """
    #     self.request.method = 'POST'
    #     post = QueryDict(
    #         'frequency=%s&text_pull=%s&status_type=%s&socialaccount=%s' % (
    #             'hourly',
    #             '#hastag',
    #             'inactive',
    #             self.socialaccount_3.id,
    #         )
    #     )
    #     self.request.POST = post
    #     socialaccounts_log = SocialAccount.objects.filter(
    #         user=self.admin_user,
    #         provider__in=[id for id, name in registry.as_choices()
    #                       if id in ['twitter', 'facebook']]
    #     )
    #     self.request.user = self.admin_user
    #     response = self.view(
    #         self.request,
    #         project_id=self.si_pull.project.id,
    #         socialinteractionpull_id=self.si_pull.id
    #     ).render()
    #     rendered = render_to_string(
    #         'socialinteractions/socialinteraction_pull.html',
    #         {
    #             'project': self.si_pull.project,
    #             'auth_users': socialaccounts_log,
    #             'socialinteraction_pull': self.si_pull,
    #             'user': self.admin_user,
    #             'status_types': self.status_types,
    #             'freq': self.freq,
    #             'PLATFORM_NAME': get_current_site(self.request).name,
    #             'GEOKEY_VERSION': version.get_version()
    #         }
    #     )
    #     self.assertEqual(response.status_code, 200)
    #     self.assertEqual(response.content.decode('utf-8'), rendered)
    #     reference = SocialInteractionPull.objects.get(id=self.si_pull.id)
    #     self.assertEqual(str(reference.frequency), 'hourly')
    #     self.assertEqual(str(reference.status), 'inactive')
    #     self.assertEqual(str(reference.text_to_pull), '#hastag')
    #     socialaccount = reference.socialaccount
    #     self.assertEqual(self.socialaccount_3, socialaccount)
# File: z2/part2/interactive/jm/random_fuzzy_arrows_1/751716764.py (kozakusek/ipp-2020-testy, MIT)

from part1 import (
    gamma_board,
    gamma_busy_fields,
    gamma_delete,
    gamma_free_fields,
    gamma_golden_move,
    gamma_golden_possible,
    gamma_move,
    gamma_new,
)
"""
scenario: test_random_actions
uuid: 751716764
"""
"""
random actions, total chaos
"""
board = gamma_new(8, 7, 5, 7)
assert board is not None
assert gamma_move(board, 1, 1, 0) == 1
assert gamma_move(board, 1, 3, 2) == 1
assert gamma_move(board, 2, 2, 6) == 1
assert gamma_move(board, 3, 1, 7) == 0
assert gamma_move(board, 3, 5, 4) == 1
assert gamma_move(board, 4, 4, 6) == 1
assert gamma_move(board, 4, 2, 5) == 1
assert gamma_move(board, 5, 4, 1) == 1
assert gamma_move(board, 1, 3, 4) == 1
assert gamma_move(board, 1, 7, 6) == 1
assert gamma_move(board, 2, 4, 3) == 1
assert gamma_free_fields(board, 2) == 46
assert gamma_move(board, 3, 6, 6) == 1
assert gamma_move(board, 4, 0, 7) == 0
assert gamma_move(board, 5, 4, 2) == 1
assert gamma_move(board, 5, 7, 3) == 1
assert gamma_busy_fields(board, 5) == 3
assert gamma_free_fields(board, 5) == 43
assert gamma_golden_possible(board, 5) == 1
assert gamma_move(board, 1, 5, 6) == 1
assert gamma_move(board, 1, 4, 5) == 1
assert gamma_golden_possible(board, 1) == 1
assert gamma_move(board, 2, 3, 5) == 1
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 2, 7) == 0
assert gamma_move(board, 4, 4, 4) == 1
assert gamma_move(board, 4, 3, 4) == 0
assert gamma_move(board, 5, 2, 0) == 1
assert gamma_move(board, 5, 4, 4) == 0
board427163421 = gamma_board(board)
assert board427163421 is not None
assert board427163421 == ("..2.4131\n"
"..421...\n"
"...143..\n"
"....2..5\n"
"...15...\n"
"....5...\n"
".15.....\n")
del board427163421
board427163421 = None
assert gamma_move(board, 1, 2, 2) == 1
assert gamma_move(board, 1, 6, 1) == 1
board401140187 = gamma_board(board)
assert board401140187 is not None
assert board401140187 == ("..2.4131\n"
"..421...\n"
"...143..\n"
"....2..5\n"
"..115...\n"
"....5.1.\n"
".15.....\n")
del board401140187
board401140187 = None
assert gamma_move(board, 2, 6, 0) == 1
assert gamma_move(board, 3, 1, 6) == 1
assert gamma_move(board, 3, 0, 3) == 1
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 4, 6) == 0
assert gamma_move(board, 5, 2, 5) == 0
assert gamma_move(board, 5, 7, 2) == 1
assert gamma_golden_possible(board, 5) == 1
assert gamma_move(board, 2, 5, 0) == 1
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_free_fields(board, 2) == 31
assert gamma_move(board, 3, 2, 0) == 0
assert gamma_move(board, 4, 4, 0) == 1
assert gamma_move(board, 4, 6, 5) == 1
assert gamma_move(board, 5, 5, 2) == 1
assert gamma_busy_fields(board, 5) == 6
assert gamma_move(board, 1, 0, 3) == 0
assert gamma_move(board, 2, 3, 3) == 1
assert gamma_move(board, 2, 5, 3) == 1
assert gamma_move(board, 3, 5, 0) == 0
assert gamma_move(board, 3, 2, 1) == 1
assert gamma_move(board, 5, 1, 7) == 0
assert gamma_move(board, 1, 7, 4) == 0
assert gamma_move(board, 2, 6, 0) == 0
assert gamma_move(board, 2, 4, 1) == 0
assert gamma_move(board, 3, 5, 2) == 0
assert gamma_move(board, 4, 2, 0) == 0
assert gamma_move(board, 5, 6, 0) == 0
assert gamma_move(board, 1, 7, 6) == 0
assert gamma_move(board, 1, 0, 2) == 0
assert gamma_free_fields(board, 1) == 11
assert gamma_move(board, 2, 4, 0) == 0
assert gamma_move(board, 3, 5, 6) == 0
assert gamma_move(board, 3, 2, 3) == 1
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 4, 0) == 0
assert gamma_move(board, 4, 0, 5) == 1
assert gamma_move(board, 5, 0, 0) == 1
assert gamma_move(board, 5, 1, 4) == 1
assert gamma_busy_fields(board, 5) == 8
assert gamma_move(board, 1, 1, 0) == 0
assert gamma_move(board, 1, 5, 2) == 0
assert gamma_move(board, 2, 1, 5) == 1
assert gamma_move(board, 3, 1, 5) == 0
assert gamma_move(board, 4, 1, 0) == 0
assert gamma_move(board, 5, 0, 6) == 1
assert gamma_move(board, 1, 5, 3) == 0
assert gamma_move(board, 2, 3, 6) == 1
assert gamma_move(board, 3, 1, 7) == 0
assert gamma_move(board, 3, 1, 1) == 1
assert gamma_move(board, 4, 1, 3) == 1
assert gamma_move(board, 4, 7, 3) == 0
assert gamma_busy_fields(board, 4) == 7
assert gamma_free_fields(board, 4) == 7
assert gamma_golden_move(board, 4, 6, 2) == 0
board811563880 = gamma_board(board)
assert board811563880 is not None
assert board811563880 == ("53224131\n"
"42421.4.\n"
".5.143..\n"
"343222.5\n"
"..1155.5\n"
".33.5.1.\n"
"515.422.\n")
del board811563880
board811563880 = None
assert gamma_move(board, 5, 5, 7) == 0
assert gamma_move(board, 5, 7, 5) == 1
assert gamma_busy_fields(board, 5) == 10
assert gamma_free_fields(board, 5) == 10
assert gamma_move(board, 1, 4, 4) == 0
assert gamma_move(board, 1, 1, 5) == 0
assert gamma_move(board, 2, 6, 3) == 1
assert gamma_move(board, 2, 6, 2) == 1
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 2, 0) == 0
assert gamma_move(board, 5, 2, 5) == 0
assert gamma_free_fields(board, 5) == 8
assert gamma_move(board, 1, 4, 6) == 0
assert gamma_move(board, 1, 1, 2) == 1
assert gamma_move(board, 2, 5, 5) == 1
assert gamma_move(board, 3, 4, 0) == 0
assert gamma_move(board, 4, 3, 1) == 0
assert gamma_move(board, 4, 5, 6) == 0
assert gamma_move(board, 5, 0, 7) == 0
assert gamma_busy_fields(board, 5) == 10
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_move(board, 2, 0, 3) == 0
assert gamma_move(board, 2, 0, 0) == 0
assert gamma_move(board, 3, 1, 7) == 0
assert gamma_move(board, 3, 4, 3) == 0
assert gamma_move(board, 4, 6, 5) == 0
assert gamma_move(board, 4, 4, 5) == 0
assert gamma_move(board, 5, 6, 6) == 0
assert gamma_move(board, 3, 7, 1) == 1
assert gamma_move(board, 4, 1, 1) == 0
assert gamma_golden_possible(board, 4) == 1
assert gamma_move(board, 5, 4, 7) == 0
assert gamma_move(board, 5, 5, 5) == 0
assert gamma_free_fields(board, 5) == 7
assert gamma_move(board, 1, 6, 1) == 0
assert gamma_free_fields(board, 1) == 4
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_move(board, 3, 7, 2) == 0
assert gamma_move(board, 3, 6, 4) == 1
assert gamma_move(board, 4, 4, 0) == 0
assert gamma_busy_fields(board, 4) == 7
assert gamma_move(board, 5, 4, 2) == 0
assert gamma_free_fields(board, 1) == 4
assert gamma_move(board, 2, 4, 7) == 0
assert gamma_move(board, 3, 0, 3) == 0
assert gamma_move(board, 3, 0, 1) == 1
assert gamma_move(board, 4, 4, 7) == 0
assert gamma_move(board, 4, 5, 0) == 0
assert gamma_free_fields(board, 4) == 3
assert gamma_move(board, 5, 4, 2) == 0
assert gamma_busy_fields(board, 5) == 10
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 1, 5, 6) == 0
assert gamma_golden_possible(board, 1) == 1
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_move(board, 3, 4, 0) == 0
assert gamma_free_fields(board, 3) == 6
assert gamma_golden_move(board, 3, 6, 7) == 0
assert gamma_move(board, 4, 4, 2) == 0
assert gamma_move(board, 5, 4, 7) == 0
assert gamma_move(board, 5, 0, 6) == 0
assert gamma_move(board, 1, 0, 3) == 0
assert gamma_busy_fields(board, 1) == 9
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 2, 1, 6) == 0
assert gamma_move(board, 3, 1, 5) == 0
assert gamma_move(board, 3, 0, 6) == 0
assert gamma_move(board, 4, 4, 0) == 0
assert gamma_move(board, 4, 5, 3) == 0
assert gamma_golden_possible(board, 4) == 1
assert gamma_move(board, 5, 4, 0) == 0
assert gamma_busy_fields(board, 5) == 10
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_move(board, 2, 4, 0) == 0
assert gamma_move(board, 2, 5, 2) == 0
assert gamma_busy_fields(board, 2) == 12
assert gamma_move(board, 3, 0, 7) == 0
assert gamma_move(board, 4, 6, 6) == 0
assert gamma_free_fields(board, 4) == 3
assert gamma_move(board, 5, 4, 2) == 0
assert gamma_move(board, 5, 6, 1) == 0
assert gamma_move(board, 1, 0, 3) == 0
assert gamma_busy_fields(board, 1) == 9
assert gamma_golden_possible(board, 1) == 1
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 3, 4, 7) == 0
assert gamma_move(board, 4, 0, 2) == 0
assert gamma_move(board, 4, 0, 5) == 0
assert gamma_move(board, 5, 1, 5) == 0
assert gamma_busy_fields(board, 5) == 10
assert gamma_move(board, 1, 1, 3) == 0
assert gamma_move(board, 1, 3, 6) == 0
assert gamma_busy_fields(board, 1) == 9
assert gamma_busy_fields(board, 2) == 12
assert gamma_move(board, 3, 0, 7) == 0
assert gamma_move(board, 3, 6, 3) == 0
assert gamma_move(board, 4, 4, 0) == 0
assert gamma_move(board, 5, 1, 5) == 0
assert gamma_busy_fields(board, 5) == 10
assert gamma_free_fields(board, 5) == 6
assert gamma_move(board, 1, 1, 3) == 0
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_golden_move(board, 1, 4, 5) == 0
assert gamma_move(board, 2, 3, 2) == 0
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 1, 5) == 0
assert gamma_move(board, 4, 5, 4) == 0
assert gamma_move(board, 4, 4, 4) == 0
assert gamma_golden_possible(board, 4) == 1
assert gamma_move(board, 5, 2, 5) == 0
board310680201 = gamma_board(board)
assert board310680201 is not None
assert board310680201 == ("53224131\n"
"42421245\n"
".5.1433.\n"
"34322225\n"
".1115525\n"
"333.5.13\n"
"515.422.\n")
del board310680201
board310680201 = None
assert gamma_move(board, 1, 1, 5) == 0
assert gamma_move(board, 2, 0, 7) == 0
gamma_delete(board)
# File: test/coco.py (rra94/maskmatrix, MIT)

import os
import cv2
import json
import numpy as np
import torch
import matplotlib.pyplot as plt
from tqdm import tqdm

from config import system_configs
from utils import crop_image, normalize_
from external.nms import soft_nms, soft_nms_merge


def _rescale_dets(detections, ratios, borders, sizes):
    xs, ys = detections[..., 0:4:2], detections[..., 1:4:2]
    xs /= ratios[:, 1][:, None, None]
    ys /= ratios[:, 0][:, None, None]
    xs -= borders[:, 2][:, None, None]
    ys -= borders[:, 0][:, None, None]
    np.clip(xs, 0, sizes[:, 1][:, None, None], out=xs)
    np.clip(ys, 0, sizes[:, 0][:, None, None], out=ys)
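`_rescale_dets` maps boxes from the padded network input back to original image coordinates in place: divide the x/y coordinates by the output/input ratios, subtract the crop border offsets, then clip to the original image size. A small NumPy check of that arithmetic with made-up numbers (a local copy of the function, no network involved):

```python
import numpy as np

def rescale_dets(detections, ratios, borders, sizes):
    # same arithmetic as _rescale_dets above; slicing yields views,
    # so the in-place ops modify `detections` directly
    xs, ys = detections[..., 0:4:2], detections[..., 1:4:2]
    xs /= ratios[:, 1][:, None, None]
    ys /= ratios[:, 0][:, None, None]
    xs -= borders[:, 2][:, None, None]
    ys -= borders[:, 0][:, None, None]
    np.clip(xs, 0, sizes[:, 1][:, None, None], out=xs)
    np.clip(ys, 0, sizes[:, 0][:, None, None], out=ys)

# one image, one detection at output resolution: (x0, y0, x1, y1, score, ...)
dets = np.array([[[10., 5., 20., 15., 0.9, 0., 0., 0.]]])
ratios = np.array([[0.125, 0.125]])       # output/input ratio of 1/8
borders = np.array([[16., 0., 32., 0.]])  # index 0 is a vertical offset, index 2 horizontal
sizes = np.array([[100., 120.]])          # original (height, width)

rescale_dets(dets, ratios, borders, sizes)
# box is now in original image coordinates, clipped to the 120x100 image
```

Dividing by the 1/8 ratio scales (10, 5, 20, 15) up to (80, 40, 160, 120); subtracting the (32, 16) offsets gives (48, 24, 128, 104); clipping to the 120x100 image yields (48, 24, 120, 100).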


def save_image(data, fn):
    sizes = np.shape(data)
    height = float(sizes[0])
    width = float(sizes[1])

    fig = plt.figure()
    fig.set_size_inches(width / height, 1, forward=False)
    ax = plt.Axes(fig, [0., 0., 1., 1.])
    ax.set_axis_off()
    fig.add_axes(ax)

    ax.imshow(data)
    plt.savefig(fn, dpi=height)
    plt.close()
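The figure sizing in `save_image` is a common matplotlib trick: a figure one inch tall rendered at `dpi == height` produces an output image whose pixel height roughly matches the array's, with the axes frame suppressed. A self-contained, headless variant for quick checks (assumes matplotlib is installed; the Agg backend avoids needing a display):

```python
import os
import tempfile

import matplotlib
matplotlib.use('Agg')  # headless backend, no display required
import matplotlib.pyplot as plt
import numpy as np

def save_array_as_image(data, fn):
    # 1-inch-tall figure at dpi == array height: ~1 output pixel per cell
    height, width = float(data.shape[0]), float(data.shape[1])
    fig = plt.figure()
    fig.set_size_inches(width / height, 1, forward=False)
    ax = plt.Axes(fig, [0., 0., 1., 1.])  # axes spanning the full figure
    ax.set_axis_off()
    fig.add_axes(ax)
    ax.imshow(data)
    plt.savefig(fn, dpi=height)
    plt.close(fig)

out = os.path.join(tempfile.mkdtemp(), 'demo.png')
save_array_as_image(np.random.rand(64, 96, 3), out)
```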


def kp_decode(nnet, images, K, matching_threshold=0.5, kernel=3, layers_range=None,
              output_kernel_size=None, output_sizes=None, input_size=None,
              base_layer_range=None):
    detections = nnet.test(
        [[images]], dist_threshold=matching_threshold, K=K, kernel=kernel,
        layers_range=layers_range, output_kernel_size=output_kernel_size,
        output_sizes=output_sizes, input_size=input_size,
        base_layer_range=base_layer_range)
    detections = detections.data.cpu().numpy()
    return detections


def test_MatrixNetCorners(db, nnet, result_dir, debug=False, decode_func=kp_decode):
    debug_dir = os.path.join(result_dir, "debug")
    if not os.path.exists(debug_dir):
        os.makedirs(debug_dir)

    if db.split != "trainval":
        db_inds = db.db_inds[:200] if debug else db.db_inds
    else:
        db_inds = db.db_inds[:100] if debug else db.db_inds[:100]
    num_images = db_inds.size

    K = db.configs["top_k"]
    matching_threshold = db.configs["matching_threshold"]
    nms_kernel = db.configs["nms_kernel"]
    flag_flip_images = db.configs["test_flip_images"]
    max_dim = db.configs["test_image_max_dim"]
    scales = db.configs["test_scales"]
    weight_exp = db.configs["weight_exp"]
    merge_bbox = db.configs["merge_bbox"]
    categories = db.configs["categories"]
    nms_threshold = db.configs["nms_threshold"]
    max_per_image = db.configs["max_per_image"]
    layers_range = db.configs["layers_range"]
    input_size = db.configs["input_size"]
    output_kernel_size = db.configs["output_kernel_size"]

    # flatten the layers_range matrix in row-major order, skipping entries
    # marked -1, and express each range relative to its layer's output size
    _dict = {}
    output_sizes = []
    for i, l in enumerate(layers_range):
        for j, e in enumerate(l):
            if e != -1:
                output_sizes.append([input_size[0] // (8 * 2 ** j), input_size[1] // (8 * 2 ** i)])
                _dict[(i + 1) * 10 + (j + 1)] = e
    layers_range = [_dict[i] for i in sorted(_dict)]
    layers_range = [[lr[0] * os[0] / input_size[0], lr[1] * os[0] / input_size[0],
                     lr[2] * os[1] / input_size[1], lr[3] * os[1] / input_size[1]]
                    for (lr, os) in zip(layers_range, output_sizes)]

    nms_algorithm = {
        "nms": 0,
        "linear_soft_nms": 1,
        "exp_soft_nms": 2
    }[db.configs["nms_algorithm"]]

    top_bboxes = {}
    for ind in tqdm(range(0, num_images), ncols=80, desc="locating kps"):
        db_ind = db_inds[ind]

        image_id = db.image_ids(db_ind)
        image_file = db.image_file(db_ind)
        image = cv2.imread(image_file)

        height, width = image.shape[0:2]

        detections = []
        for scale in scales:
            org_scale = scale
            scale = scale * min(max_dim / float(height), max_dim / float(width))
            new_height = int(height * scale)
            new_width = int(width * scale)
            new_center = np.array([new_height // 2, new_width // 2])

            inp_height = ((new_height // 128) + 1) * 128
            inp_width = ((new_width // 128) + 1) * 128

            images = np.zeros((1, 3, inp_height, inp_width), dtype=np.float32)
            ratios = np.zeros((1, 2), dtype=np.float32)
            borders = np.zeros((1, 4), dtype=np.float32)
            sizes = np.zeros((1, 2), dtype=np.float32)

            out_height, out_width = inp_height // 8, inp_width // 8
            height_ratio = out_height / inp_height
            width_ratio = out_width / inp_width

            resized_image = cv2.resize(image, (new_width, new_height))
            resized_image, border, offset = crop_image(resized_image, new_center, [inp_height, inp_width])
            resized_image = resized_image / 255.

            images[0] = resized_image.transpose((2, 0, 1))
            borders[0] = border
            sizes[0] = [int(height * scale), int(width * scale)]
            ratios[0] = [height_ratio, width_ratio]

            if flag_flip_images:
                images = np.concatenate((images, images[:, :, :, ::-1]), axis=0)

            images = torch.from_numpy(images)
            dets = decode_func(nnet, images, K, matching_threshold=matching_threshold,
                               kernel=nms_kernel, layers_range=layers_range,
                               output_kernel_size=output_kernel_size,
                               output_sizes=output_sizes, input_size=input_size)

            if flag_flip_images:
                dets = dets.reshape(2, -1, 8)
                dets[1, :, [0, 2]] = out_width - dets[1, :, [2, 0]]
                dets = dets.reshape(1, -1, 8)

            _rescale_dets(dets, ratios, borders, sizes)
            dets[:, :, 0:4] /= scale
            detections.append(dets)

        detections = np.concatenate(detections, axis=1)

        classes = detections[..., -1]
        classes = classes[0]
        detections = detections[0]

        # reject detections with negative scores
        keep_inds = (detections[:, 4] > 0)
        detections = detections[keep_inds]
        classes = classes[keep_inds]

        top_bboxes[image_id] = {}
        for j in range(categories):
            keep_inds = (classes == j)
            top_bboxes[image_id][j + 1] = detections[keep_inds][:, 0:7].astype(np.float32)
            if merge_bbox:
                soft_nms_merge(top_bboxes[image_id][j + 1], Nt=nms_threshold,
                               method=nms_algorithm, weight_exp=weight_exp)
            else:
                soft_nms(top_bboxes[image_id][j + 1], Nt=nms_threshold, method=nms_algorithm)
            top_bboxes[image_id][j + 1] = top_bboxes[image_id][j + 1][:, 0:5]

        scores = np.hstack([
            top_bboxes[image_id][j][:, -1]
            for j in range(1, categories + 1)
        ])
        if len(scores) > max_per_image:
            kth = len(scores) - max_per_image
            thresh = np.partition(scores, kth)[kth]
            for j in range(1, categories + 1):
                keep_inds = (top_bboxes[image_id][j][:, -1] >= thresh)
                top_bboxes[image_id][j] = top_bboxes[image_id][j][keep_inds]

        if debug:
            image_file = db.image_file(db_ind)
            image = cv2.imread(image_file)
            bboxes = {}
            for j in range(categories, 0, -1):
                keep_inds = (top_bboxes[image_id][j][:, -1] > 0.2)
                cat_name = db.class_name(j)
                cat_size = cv2.getTextSize(cat_name, cv2.FONT_HERSHEY_SIMPLEX, 0.5, 2)[0]
                color = np.random.random((3,)) * 0.6 + 0.4
                color = color * 255
                color = color.astype(np.int32).tolist()

                for bbox in top_bboxes[image_id][j][keep_inds]:
                    bbox = bbox[0:4].astype(np.int32)
                    if bbox[1] - cat_size[1] - 2 < 0:
                        cv2.rectangle(image,
                                      (bbox[0], bbox[1] + 2),
                                      (bbox[0] + cat_size[0], bbox[1] + cat_size[1] + 2),
                                      color, -1
                                      )
                        cv2.putText(image, cat_name,
                                    (bbox[0], bbox[1] + cat_size[1] + 2),
                                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), thickness=1
                                    )
                    else:
                        cv2.rectangle(image,
                                      (bbox[0], bbox[1] - cat_size[1] - 2),
                                      (bbox[0] + cat_size[0], bbox[1] - 2),
                                      color, -1
                                      )
                        cv2.putText(image, cat_name,
                                    (bbox[0], bbox[1] - 2),
                                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), thickness=1
                                    )
                    cv2.rectangle(image,
                                  (bbox[0], bbox[1]),
                                  (bbox[2], bbox[3]),
                                  color, 2
                                  )
            debug_file = os.path.join(debug_dir, "{}.jpg".format(db_ind))
            print(debug_file)
            cv2.imwrite(debug_file, image)

    result_json = os.path.join(result_dir, "results.json")
    detections = db.convert_to_coco(top_bboxes)
    with open(result_json, "w") as f:
        json.dump(detections, f)

    cls_ids = list(range(1, categories + 1))
    image_ids = [db.image_ids(ind) for ind in db_inds]
    db.evaluate(result_json, cls_ids, image_ids)
    return 0
def test_MatrixNetAnchors_(db, nnet, result_dir, debug=False, decode_func=kp_decode):
debug_dir = os.path.join(result_dir, "debug")
if not os.path.exists(debug_dir):
os.makedirs(debug_dir)
if db.split != "trainval":
db_inds = db.db_inds[:200] if debug else db.db_inds
else:
db_inds = db.db_inds[:100] if debug else db.db_inds[:100]
num_images = db_inds.size
K = db.configs["top_k"]
matching_threshold = db.configs["matching_threshold"]
nms_kernel = db.configs["nms_kernel"]
flag_flip_images=db.configs["test_flip_images"]
max_dim = db.configs["test_image_max_dim"]
scales = db.configs["test_scales"]
weight_exp = db.configs["weight_exp"]
merge_bbox = db.configs["merge_bbox"]
categories = db.configs["categories"]-1
nms_threshold = db.configs["nms_threshold"]
max_per_image = db.configs["max_per_image"]
layers_range = db.configs["layers_range"]
input_size = db.configs["input_size"]
output_kernel_size = db.configs["output_kernel_size"]
base_layer_range = db.configs["base_layer_range"]
_dict={}
output_sizes=[]
for i,l in enumerate(layers_range):
for j,e in enumerate(l):
if e !=-1:
output_sizes.append([input_size[0]//(8*2**(j)), input_size[1]//(8*2**(i))])
_dict[(i+1)*10+(j+1)]=e
layers_range=[_dict[i] for i in sorted(_dict)]
layers_range = [[lr[0] * os[0]/input_size[0], lr[1] * os[0]/input_size[0],
lr[2] * os[1]/input_size[1], lr[3] * os[1]/input_size[1]] for (lr, os) in zip (layers_range, output_sizes)]
nms_algorithm = {
"nms": 0,
"linear_soft_nms": 1,
"exp_soft_nms": 2
}[db.configs["nms_algorithm"]]
top_bboxes = {}
for ind in tqdm(range(0, num_images), ncols=80, desc="locating kps"):
db_ind = db_inds[ind]
image_id = db.image_ids(db_ind)
image_file = db.image_file(db_ind)
image = cv2.imread(image_file)
height, width = image.shape[0:2]
detections = []
for scale in scales:
org_scale = scale
scale = scale * min((max_dim)/float(height), (max_dim)/float(width))
new_height = int(height * scale)
new_width = int(width * scale)
new_center = np.array([new_height // 2, new_width // 2])
if len(scales) == 0:
inp_height = input_size[0]
inp_width = input_size[1]
else:
if (new_height % 128) == 0:
inp_height = new_height
else:
inp_height = ((new_height // 128) + 1) * 128
if (new_width % 128) == 0:
inp_width = new_width
else:
inp_width = ((new_width // 128) + 1) * 128
images = np.zeros((1, 3, inp_height, inp_width), dtype=np.float32)
ratios = np.zeros((1, 2), dtype=np.float32)
borders = np.zeros((1, 4), dtype=np.float32)
sizes = np.zeros((1, 2), dtype=np.float32)
out_height, out_width = ((inp_height) // 8, (inp_width) // 8)
height_ratio = out_height / inp_height
width_ratio = out_width / inp_width
resized_image = cv2.resize(image, (new_width, new_height))
resized_image, border, offset = crop_image(resized_image, new_center, [inp_height, inp_width])
resized_image = resized_image / 255.
images[0] = resized_image.transpose((2, 0, 1))
borders[0] = border
sizes[0] = [int(height * scale), int(width * scale)]
ratios[0] = [height_ratio, width_ratio]
if flag_flip_images:
images = np.concatenate((images, images[:, :, :, ::-1]), axis=0)
images = torch.from_numpy(images)
            dets = decode_func(nnet, images, K, matching_threshold=matching_threshold, kernel=nms_kernel,
                               layers_range=layers_range, output_kernel_size=output_kernel_size,
                               output_sizes=output_sizes, input_size=input_size, base_layer_range=base_layer_range)
if flag_flip_images:
dets = dets.reshape(2, -1, 8)
dets[1, :, [0, 2]] = out_width - dets[1, :, [2, 0]]
dets = dets.reshape(1, -1, 8)
_rescale_dets(dets, ratios, borders, sizes)
            dets[:, :, 0:4] /= scale
detections.append(dets)
detections = np.concatenate(detections, axis=1)
classes = detections[..., -1]
classes = classes[0]
detections = detections[0]
#detections[:, 0:4] = detections[:, 0:4] / 8
# reject detections with negative scores
keep_inds = (detections[:, 4] > 0)
detections = detections[keep_inds]
classes = classes[keep_inds]
top_bboxes[image_id] = {}
for j in range(categories):
keep_inds = (classes == j)
top_bboxes[image_id][j + 1] = detections[keep_inds][:, 0:7].astype(np.float32)
#if merge_bbox:
# soft_nms_merge(top_bboxes[image_id][j + 1], Nt=nms_threshold, method=nms_algorithm, weight_exp=weight_exp)
#else:
# soft_nms(top_bboxes[image_id][j + 1], Nt=nms_threshold, method=nms_algorithm)
top_bboxes[image_id][j + 1] = top_bboxes[image_id][j + 1][:, 0:5]
#print(top_bboxes)
scores = np.hstack([
top_bboxes[image_id][j][:, -1]
for j in range(1, categories + 1)
])
if len(scores) > max_per_image:
kth = len(scores) - max_per_image
thresh = np.partition(scores, kth)[kth]
for j in range(1, categories + 1):
keep_inds = (top_bboxes[image_id][j][:, -1] >= thresh)
top_bboxes[image_id][j] = top_bboxes[image_id][j][keep_inds]
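The block above keeps at most `max_per_image` detections per image. A small sketch with made-up scores shows how `np.partition` finds the cut-off score in linear time, without fully sorting:

```python
import numpy as np

scores = np.array([0.9, 0.1, 0.5, 0.7, 0.3])
max_per_image = 2

# After partitioning, index kth holds the (kth+1)-smallest score, i.e.
# the lowest score still inside the top max_per_image detections.
kth = len(scores) - max_per_image
thresh = np.partition(scores, kth)[kth]
print(thresh)                    # 0.7
print(scores[scores >= thresh])  # [0.9 0.7] — exactly the top-2
```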
#print(top_bboxes)
if debug:
image_file = db.image_file(db_ind)
image = cv2.imread(image_file)
bboxes = {}
        for j in range(categories, 0, -1):
keep_inds = (top_bboxes[image_id][j][:, -1] > 0.)
#print(db.class_name)
cat_name = db.class_name(j)
cat_size = cv2.getTextSize(cat_name, cv2.FONT_HERSHEY_SIMPLEX, 0.5, 2)[0]
color = np.random.random((3, )) * 0.6 + 0.4
color = color * 255
color = color.astype(np.int32).tolist()
for bbox in top_bboxes[image_id][j][keep_inds]:
#print(bbox)
bbox = bbox[0:4].astype(np.int32)
if bbox[1] - cat_size[1] - 2 < 0:
cv2.rectangle(image,
(bbox[0], bbox[1] + 2),
(bbox[0] + cat_size[0], bbox[1] + cat_size[1] + 2),
color, -1
)
cv2.putText(image, cat_name,
(bbox[0], bbox[1] + cat_size[1] + 2),
cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), thickness=1
)
else:
cv2.rectangle(image,
(bbox[0], bbox[1] - cat_size[1] - 2),
(bbox[0] + cat_size[0], bbox[1] - 2),
color, -1
)
cv2.putText(image, cat_name,
(bbox[0], bbox[1] - 2),
cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), thickness=1
)
cv2.rectangle(image,
(bbox[0], bbox[1]),
(bbox[2], bbox[3]),
color, 2
)
debug_file = os.path.join(debug_dir, "{}.jpg".format(db_ind))
print(debug_file)
            cv2.imwrite(debug_file, image)
result_json = os.path.join(result_dir, "results.json")
detections = db.convert_to_coco(top_bboxes)
with open(result_json, "w") as f:
json.dump(detections, f)
cls_ids = list(range(1, categories+1))
image_ids = [db.image_ids(ind) for ind in db_inds]
#print(image_ids)
#detections=db.convert_to_numpy(top_bboxes)
#print(detections.shape)
db.evaluate(detections, cls_ids, image_ids)
return 0
def test_MatrixNetAnchors(db, nnet, result_dir, debug=False, decode_func=kp_decode):
debug_dir = os.path.join(result_dir, "debug")
if not os.path.exists(debug_dir):
os.makedirs(debug_dir)
if db.split != "trainval":
db_inds = db.db_inds[:200] if debug else db.db_inds
else:
        db_inds = db.db_inds[:100]  # trainval split: always capped at 100 images (the debug flag made no difference here)
num_images = db_inds.size
K = db.configs["top_k"]
matching_threshold = db.configs["matching_threshold"]
nms_kernel = db.configs["nms_kernel"]
    flag_flip_images = db.configs["test_flip_images"]
    max_dim = db.configs["test_image_max_dim"]
    scales = db.configs["test_scales"]
    weight_exp = db.configs["weight_exp"]
    merge_bbox = db.configs["merge_bbox"]
    categories = db.configs["categories"] - 1
nms_threshold = db.configs["nms_threshold"]
max_per_image = db.configs["max_per_image"]
layers_range = db.configs["layers_range"]
input_size = db.configs["input_size"]
output_kernel_size = db.configs["output_kernel_size"]
base_layer_range = db.configs["base_layer_range"]
    _dict = {}
    output_sizes = []
    for i, l in enumerate(layers_range):
        for j, e in enumerate(l):
            if e != -1:
                output_sizes.append([input_size[0] // (8 * 2 ** j), input_size[1] // (8 * 2 ** i)])
                _dict[(i + 1) * 10 + (j + 1)] = e
    layers_range = [_dict[i] for i in sorted(_dict)]
    layers_range = [[lr[0] * os[0] / input_size[0], lr[1] * os[0] / input_size[0],
                     lr[2] * os[1] / input_size[1], lr[3] * os[1] / input_size[1]]
                    for (lr, os) in zip(layers_range, output_sizes)]
nms_algorithm = {
"nms": 0,
"linear_soft_nms": 1,
"exp_soft_nms": 2
}[db.configs["nms_algorithm"]]
top_bboxes = {}
for ind in tqdm(range(0, num_images), ncols=80, desc="locating kps"):
db_ind = db_inds[ind]
image_id = db.image_ids(db_ind)
image_file = db.image_file(db_ind)
image = cv2.imread(image_file)
height, width = image.shape[0:2]
detections = []
for scale in scales:
org_scale = scale
scale = scale * min((max_dim)/float(height), (max_dim)/float(width))
new_height = int(height * scale)
new_width = int(width * scale)
new_center = np.array([new_height // 2, new_width // 2])
if len(scales) == 0:
inp_height = input_size[0]
inp_width = input_size[1]
else:
if (new_height % 128) == 0:
inp_height = new_height
else:
inp_height = ((new_height // 128) + 1) * 128
if (new_width % 128) == 0:
inp_width = new_width
else:
inp_width = ((new_width // 128) + 1) * 128
images = np.zeros((1, 3, inp_height, inp_width), dtype=np.float32)
ratios = np.zeros((1, 2), dtype=np.float32)
borders = np.zeros((1, 4), dtype=np.float32)
sizes = np.zeros((1, 2), dtype=np.float32)
out_height, out_width = ((inp_height) // 8, (inp_width) // 8)
height_ratio = out_height / inp_height
width_ratio = out_width / inp_width
resized_image = cv2.resize(image, (new_width, new_height))
resized_image, border, offset = crop_image(resized_image, new_center, [inp_height, inp_width])
resized_image = resized_image / 255.
images[0] = resized_image.transpose((2, 0, 1))
borders[0] = border
sizes[0] = [int(height * scale), int(width * scale)]
ratios[0] = [height_ratio, width_ratio]
if flag_flip_images:
images = np.concatenate((images, images[:, :, :, ::-1]), axis=0)
images = torch.from_numpy(images)
            dets = decode_func(nnet, images, K, matching_threshold=matching_threshold, kernel=nms_kernel,
                               layers_range=layers_range, output_kernel_size=output_kernel_size,
                               output_sizes=output_sizes, input_size=input_size, base_layer_range=base_layer_range)
if flag_flip_images:
dets = dets.reshape(2, -1, 8)
dets[1, :, [0, 2]] = out_width - dets[1, :, [2, 0]]
dets = dets.reshape(1, -1, 8)
_rescale_dets(dets, ratios, borders, sizes)
dets[:, :, 0:4] /= scale
detections.append(dets)
detections = np.concatenate(detections, axis=1)
classes = detections[..., -1]
classes = classes[0]
detections = detections[0]
# reject detections with negative scores
keep_inds = (detections[:, 4] > 0)
detections = detections[keep_inds]
classes = classes[keep_inds]
top_bboxes[image_id] = {}
for j in range(categories):
keep_inds = (classes == j)
top_bboxes[image_id][j + 1] = detections[keep_inds][:, 0:7].astype(np.float32)
if merge_bbox:
soft_nms_merge(top_bboxes[image_id][j + 1], Nt=nms_threshold, method=nms_algorithm, weight_exp=weight_exp)
else:
soft_nms(top_bboxes[image_id][j + 1], Nt=nms_threshold, method=nms_algorithm)
top_bboxes[image_id][j + 1] = top_bboxes[image_id][j + 1][:, 0:5]
scores = np.hstack([
top_bboxes[image_id][j][:, -1]
for j in range(1, categories + 1)
])
if len(scores) > max_per_image:
kth = len(scores) - max_per_image
thresh = np.partition(scores, kth)[kth]
for j in range(1, categories + 1):
keep_inds = (top_bboxes[image_id][j][:, -1] >= thresh)
top_bboxes[image_id][j] = top_bboxes[image_id][j][keep_inds]
if debug:
image_file = db.image_file(db_ind)
image = cv2.imread(image_file)
bboxes = {}
for j in range(categories, 0, -1):
keep_inds = (top_bboxes[image_id][j][:, -1] > 0.3)
cat_name = db.class_name(j)
cat_size = cv2.getTextSize(cat_name, cv2.FONT_HERSHEY_SIMPLEX, 0.5, 2)[0]
color = np.random.random((3, )) * 0.6 + 0.4
color = color * 255
color = color.astype(np.int32).tolist()
for bbox in top_bboxes[image_id][j][keep_inds]:
# print(bbox)
bbox = bbox[0:4].astype(np.int32)
if bbox[1] - cat_size[1] - 2 < 0:
cv2.rectangle(image,
(bbox[0], bbox[1] + 2),
(bbox[0] + cat_size[0], bbox[1] + cat_size[1] + 2),
color, -1
)
cv2.putText(image, cat_name,
(bbox[0], bbox[1] + cat_size[1] + 2),
cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), thickness=1
)
else:
cv2.rectangle(image,
(bbox[0], bbox[1] - cat_size[1] - 2),
(bbox[0] + cat_size[0], bbox[1] - 2),
color, -1
)
cv2.putText(image, cat_name,
(bbox[0], bbox[1] - 2),
cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), thickness=1
)
cv2.rectangle(image,
(bbox[0], bbox[1]),
(bbox[2], bbox[3]),
color, 2
)
debug_file = os.path.join(debug_dir, "{}.jpg".format(db_ind))
#print(debug_file)
            cv2.imwrite(debug_file, image)
result_json = os.path.join(result_dir, "results.json")
detections = db.convert_to_coco(top_bboxes)
with open(result_json, "w") as f:
json.dump(detections, f)
print(result_json)
cls_ids = list(range(1, categories + 1))
image_ids = [db.image_ids(ind) for ind in db_inds]
db.evaluate(result_json, cls_ids, image_ids)
return 0
def testing(db, nnet, result_dir, debug=False):
return globals()["test_"+system_configs.model_name](db, nnet, result_dir, debug=debug)
# tests/test_finder.py — from ShantamShorewala/aizynthfinder (MIT license)
import logging
from aizynthfinder.aizynthfinder import AiZynthFinder
def state_smiles(state):
return [mol.smiles for mol in state.mols]
def test_reset_tree():
finder = AiZynthFinder()
finder.target_smiles = "CCCO"
finder.prepare_tree()
assert finder.tree is not None
finder.target_smiles = "CCO"
assert finder.tree is None
def test_dead_end_expansion(setup_aizynthfinder):
"""
Test the building of this tree:
root
root cannot be expanded
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
lookup = {root_smi: []}
finder = setup_aizynthfinder(lookup, [])
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 1
assert state_smiles(nodes[0].state) == [root_smi]
assert finder.search_stats["iterations"] == 100
def test_one_expansion(setup_aizynthfinder):
"""
Test the building of this tree:
root
|
child 1
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
lookup = {root_smi: {"smiles": ".".join(child1_smi), "prior": 1.0}}
finder = setup_aizynthfinder(lookup, child1_smi)
# Test first with return_first
finder.config.return_first = True
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 2
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert finder.search_stats["iterations"] == 1
assert finder.search_stats["returned_first"]
# then test with iteration limit
finder.config.return_first = False
finder.config.iteration_limit = 45
finder.prepare_tree()
finder.tree_search()
assert len(finder.tree.graph()) == 2
assert finder.search_stats["iterations"] == 45
assert not finder.search_stats["returned_first"]
def test_two_expansions(setup_aizynthfinder):
"""
Test the building of this tree:
root
|
child 1
|
child 2
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
lookup = {
root_smi: {"smiles": ".".join(child1_smi), "prior": 1.0},
child1_smi[1]: {"smiles": ".".join(child2_smi), "prior": 1.0},
}
finder = setup_aizynthfinder(lookup, [child1_smi[0], child1_smi[2]] + child2_smi)
finder.config.return_first = True
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 3
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + child2_smi
assert finder.search_stats["iterations"] == 1
def test_two_expansions_two_children(setup_aizynthfinder):
"""
Test the building of this tree:
root
/ \
child 1 child 2
| |
grandchild 1 grandchild 2
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F"]
grandchild_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
lookup = {
root_smi: [
{"smiles": ".".join(child1_smi), "prior": 0.7},
{"smiles": ".".join(child2_smi), "prior": 0.3},
],
child1_smi[1]: {"smiles": ".".join(grandchild_smi), "prior": 0.7},
child2_smi[1]: {"smiles": ".".join(grandchild_smi), "prior": 0.7},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2]] + grandchild_smi
)
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 5
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert (
state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + grandchild_smi
)
assert state_smiles(nodes[3].state) == child2_smi
assert state_smiles(nodes[4].state) == [child2_smi[0]] + grandchild_smi
assert finder.search_stats["iterations"] == 100
def test_three_expansions(setup_aizynthfinder):
"""
Test the building of this tree:
root
|
child 1
|
child 2
|
child 3 (*)
- child 3 state is solved
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
child3_smi = ["O=C(Cl)c1ccccc1"]
lookup = {
root_smi: {"smiles": ".".join(child1_smi), "prior": 1.0},
child1_smi[1]: {"smiles": ".".join(child2_smi), "prior": 1.0},
child2_smi[1]: {"smiles": child3_smi[0], "prior": 1.0},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2], child2_smi[0]] + child3_smi
)
finder.config.return_first = True
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 4
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + child2_smi
expected_list = [child1_smi[0], child1_smi[2], child2_smi[0]] + child3_smi
assert state_smiles(nodes[3].state) == expected_list
assert nodes[3].state.is_solved
assert finder.search_stats["iterations"] == 1
def test_three_expansions_not_solved(setup_aizynthfinder):
"""
Test the building of this tree:
root
|
child 1
|
child 2
|
child 3
- child 3 state is not solved (not in stock)
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
child3_smi = ["O=C(Cl)c1ccccc1"]
lookup = {
root_smi: {"smiles": ".".join(child1_smi), "prior": 1.0},
child1_smi[1]: {"smiles": ".".join(child2_smi), "prior": 1.0},
child2_smi[1]: {"smiles": child3_smi[0], "prior": 1.0},
}
finder = setup_aizynthfinder(lookup, [child1_smi[0], child1_smi[2], child2_smi[0]])
finder.config.return_first = True
finder.config.max_transforms = 2
finder.config.iteration_limit = 15
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 4
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + child2_smi
expected_list = [child1_smi[0], child1_smi[2], child2_smi[0]] + child3_smi
assert state_smiles(nodes[3].state) == expected_list
assert not nodes[3].state.is_solved
assert finder.search_stats["iterations"] == 15
def test_two_expansions_no_expandable_root(setup_aizynthfinder):
"""
Test the following scenario:
root
|
child 1 (+)
- child 1 will be selected first for expansion (iteration 1)
- it has no children that can be expanded (marked by +)
-- end of iteration 1
- iteration 2 starts but selecting a leaf will raise an exception
-- will continue to iterate until reached number of iteration (set 10 in the test)
* nodes in tree will be root, child 1
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
lookup = {
root_smi: {
"smiles": ".".join(child1_smi),
"prior": 1.0,
},
child1_smi[1]: {
"smiles": "",
"prior": 0.3,
},
}
finder = setup_aizynthfinder(lookup, [child1_smi[0], child1_smi[2]])
finder.config.return_first = True
finder.config.iteration_limit = 10
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 2
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert finder.search_stats["iterations"] == 10
def test_two_expansions_no_reactants_first_child(setup_aizynthfinder):
"""
Test the following scenario:
root
/ \
child 1 (+) child 2
|
grandchild 1 (*)
- child 1 will be selected first for expansion (iteration 1)
- it has no children that can be expanded (marked by +)
-- end of iteration 1
- child 2 will be selected for expansion (iteration 2)
- grandchild 1 will be selected next and it is in stock (marked by *)
-- a solution is found and the tree search is terminated
* nodes in tree will be root, child1, child2, grandchild 1
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1CF"]
grandchild1_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
lookup = {
root_smi: [
{"smiles": ".".join(child1_smi), "prior": 0.7},
{"smiles": ".".join(child2_smi), "prior": 0.3},
],
child1_smi[1]: {
"smiles": "",
"prior": 0.3,
},
child2_smi[1]: {
"smiles": ".".join(grandchild1_smi),
"prior": 0.3,
},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2]] + grandchild1_smi
)
finder.config.return_first = True
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 4
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert state_smiles(nodes[2].state) == child2_smi
assert state_smiles(nodes[3].state) == [child2_smi[0]] + grandchild1_smi
assert finder.search_stats["iterations"] == 2
def test_three_expansions_no_reactants_first_child(setup_aizynthfinder):
"""
Test the following scenario:
root
/ \
child 1 (+) child 2
|
grandchild 1
|
grandchild 2 (*)
- child 1 will be selected first for expansion (iteration 1)
- it has no children that can be expanded (marked by +)
-- end of iteration 1
- child 2 will be selected for expansion (iteration 2)
- grandchild 1 will be selected next
- grandchild 2 will be selected next and it is in stock (marked by *)
-- a solution is found and the tree search is terminated
* nodes in tree will be root, child1, child2, grandchild 1, grandchild 2
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1CF"]
grandchild1_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
grandchild2_smi = ["O=C(Cl)c1ccccc1"]
lookup = {
root_smi: [
{"smiles": ".".join(child1_smi), "prior": 0.7},
{"smiles": ".".join(child2_smi), "prior": 0.3},
],
child1_smi[1]: {
"smiles": "",
"prior": 0.3,
},
child2_smi[1]: {
"smiles": ".".join(grandchild1_smi),
"prior": 0.3,
},
grandchild1_smi[1]: {
"smiles": ".".join(grandchild2_smi),
"prior": 1.0,
},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2], grandchild1_smi[0]] + grandchild2_smi
)
finder.config.return_first = True
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 5
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert state_smiles(nodes[2].state) == child2_smi
assert state_smiles(nodes[3].state) == [child2_smi[0]] + grandchild1_smi
expected_list = [child2_smi[0], grandchild1_smi[0]] + grandchild2_smi
assert state_smiles(nodes[4].state) == expected_list
assert finder.search_stats["iterations"] == 2
def test_three_expansions_no_reactants_second_level(setup_aizynthfinder):
"""
Test the following scenario:
root
/ \
child 1 child 2
| |
grandchild 1 (+) grandchild 2 (*)
- child 1 will be selected first for expansion (iteration 1)
- grandchild 1 will be selected next,
- it has no children that can be expanded (marked by x)
-- end of iteration 1
- child 2 will be selected for expansion (iteration 2)
- grandchild 2 will be selected next and it is in stock (marked by *)
-- a solution is found and the tree search is terminated
* nodes in tree will be root, child1, grandchild 1, child2, grandchild 2
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1CF"]
grandchild1_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
grandchild2_smi = ["N#Cc1cccc(N)c1", "O=C(Cl)c1ccc(F)c(F)c1"]
lookup = {
root_smi: [
{"smiles": ".".join(child1_smi), "prior": 0.7},
{"smiles": ".".join(child2_smi), "prior": 0.3},
],
child1_smi[1]: {
"smiles": ".".join(grandchild1_smi),
"prior": 0.3,
},
grandchild1_smi[1]: {
"smiles": "",
"prior": 1.0,
},
child2_smi[1]: {
"smiles": ".".join(grandchild2_smi),
"prior": 0.3,
},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2], grandchild1_smi[0]] + grandchild2_smi
)
finder.config.return_first = True
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 5
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert (
state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + grandchild1_smi
)
assert state_smiles(nodes[3].state) == child2_smi
assert state_smiles(nodes[4].state) == [child2_smi[0]] + grandchild2_smi
assert finder.search_stats["iterations"] == 2
def test_two_expansions_no_reactants_second_child(setup_aizynthfinder):
"""
Test the following scenario:
root
/ \
child 1 child 2 (+)
|
grandchild 1 (*)
- child 1 will be selected first for expansion (iteration 1)
- grandchild 1 will be selected next and it is in stock (marked by *)
-- end of iteration 1
- child 2 will be selected for expansion (iteration 2)
- it has no children that can be expanded (marked with +)
-- will continue to iterate until reached number of iteration (set 10 in the test)
* nodes in tree will be root, child1, grandchild 1, child2
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1CF"]
grandchild1_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
lookup = {
root_smi: [
{"smiles": ".".join(child1_smi), "prior": 0.7},
{"smiles": ".".join(child2_smi), "prior": 0.3},
],
child1_smi[1]: {
"smiles": ".".join(grandchild1_smi),
"prior": 0.3,
},
grandchild1_smi[1]: {
"smiles": "",
"prior": 1.0,
},
child2_smi[1]: {
"smiles": "",
"prior": 0.3,
},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2]] + grandchild1_smi
)
finder.config.iteration_limit = 10
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 4
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert (
state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + grandchild1_smi
)
assert state_smiles(nodes[3].state) == child2_smi
assert finder.search_stats["iterations"] == 10
def test_two_expansions_cyclic(setup_aizynthfinder):
"""
Test the building of this tree:
root
|
child 1
|
child 2
But making child 2 should be rejected because child 2 == root
"""
root_smi = "COc1cc2cc(-c3ccc(OC(C)=O)c(OC(C)=O)c3)[n+](C)c(C)c2cc1OC"
child1_smi = ["COc1cc2cc(-c3ccc(O)c(OC(C)=O)c3)[n+](C)c(C)c2cc1OC"]
lookup = {
root_smi: {"smiles": child1_smi[0], "prior": 0.1},
child1_smi[0]: {
"smiles": root_smi,
"prior": 1.0,
},
}
finder = setup_aizynthfinder(lookup, [])
finder.config.iteration_limit = 1
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 2
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert finder.search_stats["iterations"] == 1
def test_two_expansions_prune_cyclic(setup_aizynthfinder):
"""
Test the building of this tree:
root
|
child 1
|
child 2
Child 2 will not be rejected, but the tree search will not end, so it will
continue to expand until reaching maximum depth
"""
root_smi = "COc1cc2cc(-c3ccc(OC(C)=O)c(OC(C)=O)c3)[n+](C)c(C)c2cc1OC"
child1_smi = ["COc1cc2cc(-c3ccc(O)c(OC(C)=O)c3)[n+](C)c(C)c2cc1OC"]
lookup = {
root_smi: {"smiles": child1_smi[0], "prior": 0.1},
child1_smi[0]: {
"smiles": root_smi,
"prior": 1.0,
},
}
finder = setup_aizynthfinder(lookup, [])
finder.config.iteration_limit = 1
finder.config.prune_cycles_in_search = False
finder.tree_search()
nodes = list(finder.tree.graph())
assert len(nodes) == 8
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert state_smiles(nodes[2].state) == [root_smi]
assert finder.search_stats["iterations"] == 1
def test_two_expansions_two_children_one_filtered(setup_aizynthfinder, caplog):
"""
Test the building of this tree:
root
/ \
child 1 child 2 (*)
| |
grandchild 1 grandchild 2
child 2 will not be created as that reaction is filtered away
"""
root_smi = "CN1CCC(C(=O)c2cccc(NC(=O)c3ccc(F)cc3)c2F)CC1"
child1_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F", "O"]
child2_smi = ["CN1CCC(Cl)CC1", "N#Cc1cccc(NC(=O)c2ccc(F)cc2)c1F"]
grandchild_smi = ["N#Cc1cccc(N)c1F", "O=C(Cl)c1ccc(F)cc1"]
lookup = {
root_smi: [
{"smiles": ".".join(child1_smi), "prior": 0.7},
{"smiles": ".".join(child2_smi), "prior": 0.3},
],
child1_smi[1]: {"smiles": ".".join(grandchild_smi), "prior": 0.7},
child2_smi[1]: {"smiles": ".".join(grandchild_smi), "prior": 0.7},
}
finder = setup_aizynthfinder(
lookup, [child1_smi[0], child1_smi[2]] + grandchild_smi
)
finder.filter_policy[finder.filter_policy.selection[0]].lookup = {
f"{root_smi}>>{'.'.join(child2_smi)}": 0.2
}
finder.config.iteration_limit = 10
with caplog.at_level(logging.DEBUG):
finder.tree_search()
assert not any(
rec.message.startswith("Reject retro reaction") for rec in caplog.records
)
nodes = list(finder.tree.graph())
assert len(nodes) == 5
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert (
state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + grandchild_smi
)
assert state_smiles(nodes[3].state) == child2_smi
assert state_smiles(nodes[4].state) == [child2_smi[0]] + grandchild_smi
assert finder.search_stats["iterations"] == 10
# Now raise the filter threshold to remove child 2, grandchild 2
finder.config.filter_cutoff = 0.5
finder.target_smiles = finder.target_smiles # Trigger re-set
with caplog.at_level(logging.DEBUG):
finder.tree_search()
assert any(
rec.message.startswith("Reject retro reaction") for rec in caplog.records
)
nodes = list(finder.tree.graph())
assert len(nodes) == 3
assert state_smiles(nodes[0].state) == [root_smi]
assert state_smiles(nodes[1].state) == child1_smi
assert (
state_smiles(nodes[2].state) == [child1_smi[0], child1_smi[2]] + grandchild_smi
)
assert finder.search_stats["iterations"] == 10
# gradio/test_data/blocks_configs.py — from the gradio repo (Apache-2.0 license)
XRAY_CONFIG = {
"mode": "blocks",
"components": [
{
"id": 1,
"type": "markdown",
"props": {
"default_value": "<h1>Detect Disease From Scan</h1>\n<p>With this model you can lorem ipsum</p>\n<ul>\n<li>ipsum 1</li>\n<li>ipsum 2</li>\n</ul>\n",
"name": "markdown",
"css": {},
},
},
{
"id": 2,
"type": "checkboxgroup",
"props": {
"choices": ["Covid", "Malaria", "Lung Cancer"],
"default_value": [],
"name": "checkboxgroup",
"label": "Disease to Scan For",
"css": {},
},
},
{"id": 3, "type": "tabs", "props": {"css": {}, "default_value": True}},
{
"id": 4,
"type": "tabitem",
"props": {"label": "X-ray", "css": {}, "default_value": True},
},
{
"id": 5,
"type": "row",
"props": {"type": "row", "css": {}, "default_value": True},
},
{
"id": 6,
"type": "image",
"props": {
"image_mode": "RGB",
"source": "upload",
"tool": "editor",
"name": "image",
"css": {},
},
},
{
"id": 7,
"type": "json",
"props": {
"default_value": '""',
"name": "json",
"css": {},
},
},
{
"id": 8,
"type": "button",
"props": {
"default_value": "Run",
"name": "button",
"css": {"background-color": "red", "--hover-color": "orange"},
},
},
{
"id": 9,
"type": "tabitem",
"props": {"label": "CT Scan", "css": {}, "default_value": True},
},
{
"id": 10,
"type": "row",
"props": {"type": "row", "css": {}, "default_value": True},
},
{
"id": 11,
"type": "image",
"props": {
"image_mode": "RGB",
"source": "upload",
"tool": "editor",
"name": "image",
"css": {},
},
},
{
"id": 12,
"type": "json",
"props": {
"default_value": '""',
"name": "json",
"css": {},
},
},
{
"id": 13,
"type": "button",
"props": {
"default_value": "Run",
"name": "button",
"css": {},
},
},
{
"id": 14,
"type": "textbox",
"props": {
"lines": 1,
"max_lines": 20,
"default_value": "",
"name": "textbox",
"css": {},
},
},
],
"theme": "default",
"layout": {
"id": 0,
"children": [
{"id": 1},
{"id": 2},
{
"id": 3,
"children": [
{
"id": 4,
"children": [
{"id": 5, "children": [{"id": 6}, {"id": 7}]},
{"id": 8},
],
},
{
"id": 9,
"children": [
{"id": 10, "children": [{"id": 11}, {"id": 12}]},
{"id": 13},
],
},
],
},
{"id": 14},
],
},
"dependencies": [
{
"targets": [8],
"trigger": "click",
"inputs": [2, 6],
"outputs": [7],
"queue": False,
"status_tracker": None,
},
{
"targets": [13],
"trigger": "click",
"inputs": [2, 11],
"outputs": [12],
"queue": False,
"status_tracker": None,
},
{
"targets": [],
"trigger": "load",
"inputs": [],
"outputs": [14],
"queue": False,
"status_tracker": None,
},
],
}
XRAY_CONFIG_DIFF_IDS = {
"mode": "blocks",
"components": [
{
"id": 1,
"type": "markdown",
"props": {
"default_value": "<h1>Detect Disease From Scan</h1>\n<p>With this model you can lorem ipsum</p>\n<ul>\n<li>ipsum 1</li>\n<li>ipsum 2</li>\n</ul>\n",
"name": "markdown",
"css": {},
},
},
{
"id": 22,
"type": "checkboxgroup",
"props": {
"choices": ["Covid", "Malaria", "Lung Cancer"],
"default_value": [],
"name": "checkboxgroup",
"label": "Disease to Scan For",
"css": {},
},
},
{"id": 3, "type": "tabs", "props": {"css": {}, "default_value": True}},
{
"id": 444,
"type": "tabitem",
"props": {"label": "X-ray", "css": {}, "default_value": True},
},
{
"id": 5,
"type": "row",
"props": {"type": "row", "css": {}, "default_value": True},
},
{
"id": 6,
"type": "image",
"props": {
"image_mode": "RGB",
"source": "upload",
"tool": "editor",
"name": "image",
"css": {},
},
},
{
"id": 7,
"type": "json",
"props": {
"default_value": '""',
"name": "json",
"css": {},
},
},
{
"id": 8888,
"type": "button",
"props": {
"default_value": "Run",
"name": "button",
"css": {"background-color": "red", "--hover-color": "orange"},
},
},
{
"id": 9,
"type": "tabitem",
"props": {"label": "CT Scan", "css": {}, "default_value": True},
},
{
"id": 10,
"type": "row",
"props": {"type": "row", "css": {}, "default_value": True},
},
{
"id": 11,
"type": "image",
"props": {
"image_mode": "RGB",
"source": "upload",
"tool": "editor",
"name": "image",
"css": {},
},
},
{
"id": 12,
"type": "json",
"props": {
"default_value": '""',
"name": "json",
"css": {},
},
},
{
"id": 13,
"type": "button",
"props": {
"default_value": "Run",
"name": "button",
"css": {},
},
},
{
"id": 141,
"type": "textbox",
"props": {
"lines": 1,
"default_value": "",
"name": "textbox",
"max_lines": 20,
"css": {},
},
},
],
"theme": "default",
"layout": {
"id": 0,
"children": [
{"id": 1},
{"id": 22},
{
"id": 3,
"children": [
{
"id": 444,
"children": [
{"id": 5, "children": [{"id": 6}, {"id": 7}]},
{"id": 8888},
],
},
{
"id": 9,
"children": [
{"id": 10, "children": [{"id": 11}, {"id": 12}]},
{"id": 13},
],
},
],
},
{"id": 141},
],
},
"dependencies": [
{
"targets": [8888],
"trigger": "click",
"inputs": [22, 6],
"outputs": [7],
"queue": False,
"status_tracker": None,
},
{
"targets": [13],
"trigger": "click",
"inputs": [22, 11],
"outputs": [12],
"queue": False,
"status_tracker": None,
},
],
}
XRAY_CONFIG_WITH_MISTAKE = {
"mode": "blocks",
"components": [
{
"id": 1,
"type": "markdown",
"props": {
"default_value": "<h1>Detect Disease From Scan</h1>\n<p>With this model you can lorem ipsum</p>\n<ul>\n<li>ipsum 1</li>\n<li>ipsum 2</li>\n</ul>\n",
"name": "markdown",
"css": {},
},
},
{
"id": 2,
"type": "checkboxgroup",
"props": {
"choices": ["Covid", "Malaria", "Lung Cancer"],
"default_value": [],
"name": "checkboxgroup",
"label": "Disease to Scan For",
"css": {},
},
},
{"id": 3, "type": "tabs", "props": {"css": {}, "default_value": True}},
{
"id": 4,
"type": "tabitem",
"props": {"label": "X-ray", "css": {}, "default_value": True},
},
{
"id": 5,
"type": "row",
"props": {"type": "row", "css": {}, "default_value": True},
},
{
"id": 6,
"type": "image",
"props": {
"image_mode": "RGB",
"source": "upload",
"tool": "editor",
"name": "image",
"css": {},
},
},
{
"id": 7,
"type": "json",
"props": {
"default_value": '""',
"name": "json",
"css": {},
},
},
{
"id": 8,
"type": "button",
"props": {
"default_value": "Run",
"name": "button",
"css": {"background-color": "red", "--hover-color": "orange"},
},
},
{
"id": 9,
"type": "tabitem",
"props": {"label": "CT Scan", "css": {}, "default_value": True},
},
{
"id": 10,
"type": "row",
"props": {"type": "row", "css": {}, "default_value": True},
},
{
"id": 11,
"type": "image",
"props": {
"image_mode": "RGB",
"source": "upload",
"tool": "editor",
"name": "image",
"css": {},
},
},
{
"id": 12,
"type": "json",
"props": {
"default_value": '""',
"name": "json",
"css": {},
},
},
{
"id": 13,
"type": "button",
"props": {
"default_value": "Run",
"name": "button",
"css": {},
},
},
{
"id": 14,
"type": "textbox",
"props": {
"lines": 1,
"default_value": "",
"name": "textbox",
"css": {},
},
},
],
"theme": "default",
"layout": {
"id": 0,
"children": [
{"id": 1},
{"id": 2},
{
"id": 3,
"children": [
{
"id": 4,
"children": [
{"id": 5, "children": [{"id": 6}, {"id": 7}]},
{"id": 8},
],
},
{
"id": 9,
"children": [
{"id": 10, "children": [{"id": 12}, {"id": 11}]},
{"id": 13},
],
},
],
},
{"id": 14},
],
},
"dependencies": [
{
"targets": [8],
"trigger": "click",
"inputs": [2, 6],
"outputs": [7],
"queue": False,
"status_tracker": None,
},
{
"targets": [13],
"trigger": "click",
"inputs": [2, 11],
"outputs": [12],
"queue": False,
"status_tracker": None,
},
],
}
# File: src/examples.py (repo: polynoman/Game-of-Life-Simulator, license: MIT)
# This file contains example Game of Life patterns
blinker = [
[0,0,0,0,0],
[0,0,1,0,0],
[0,0,1,0,0],
[0,0,1,0,0],
[0,0,0,0,0],
]
block = [
[0,0,0,0],
[0,1,1,0],
[0,1,1,0],
[0,0,0,0],
]
blinker2 = [
[0,0,0,0,0],
[0,0,0,0,0],
[0,1,1,1,0],
[0,0,0,0,0],
[0,0,0,0,0],
]
empty = [
[0,0,0,0,0],
[0,0,0,0,0],
[0,0,0,0,0],
[0,0,0,0,0],
[0,0,0,0,0],
]
dead = [
[0,0,0,0,0],
[0,0,0,0,0],
[0,0,1,0,0],
[0,0,0,0,0],
[0,0,0,0,0],
]
gosperGliderGun = [
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,1,1,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,1,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
]
big = [
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
]
big2 = [
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
]
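The patterns above are plain 2D lists of 0/1 cells, so they can be evolved directly. Below is a minimal generation-step helper for such grids — a sketch for illustration only: the `step` function and its dead-border edge handling are assumptions, not part of this module.

```python
def step(grid):
    """Return the next Game of Life generation for a 2D list of 0/1 cells.

    Cells outside the grid are treated as dead (no wrap-around).
    """
    rows, cols = len(grid), len(grid[0])

    def live_neighbours(r, c):
        # Count live cells in the 8-neighbourhood, clamped at the edges.
        return sum(
            grid[rr][cc]
            for rr in range(max(0, r - 1), min(rows, r + 2))
            for cc in range(max(0, c - 1), min(cols, c + 2))
            if (rr, cc) != (r, c)
        )

    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = live_neighbours(r, c)
            # B3/S23 rules: a dead cell with exactly 3 live neighbours is
            # born; a live cell with 2 or 3 live neighbours survives.
            new[r][c] = 1 if n == 3 or (grid[r][c] == 1 and n == 2) else 0
    return new
```

Applying `step` twice to `blinker` returns the original pattern, since a blinker oscillates with period 2, while `block` is a still life and maps to itself.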
# File: sdk/python/pulumi_libvirt/cloud_init_disk.py (repo: dustinspecker/pulumi-libvirt, licenses: ECL-2.0, Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['CloudInitDiskArgs', 'CloudInitDisk']
@pulumi.input_type
class CloudInitDiskArgs:
def __init__(__self__, *,
meta_data: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
network_config: Optional[pulumi.Input[str]] = None,
pool: Optional[pulumi.Input[str]] = None,
user_data: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a CloudInitDisk resource.
:param pulumi.Input[str] meta_data: cloud-init user data.
:param pulumi.Input[str] name: A unique name for the resource, required by libvirt.
:param pulumi.Input[str] network_config: cloud-init network-config data.
:param pulumi.Input[str] pool: The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
:param pulumi.Input[str] user_data: cloud-init user data.
"""
if meta_data is not None:
pulumi.set(__self__, "meta_data", meta_data)
if name is not None:
pulumi.set(__self__, "name", name)
if network_config is not None:
pulumi.set(__self__, "network_config", network_config)
if pool is not None:
pulumi.set(__self__, "pool", pool)
if user_data is not None:
pulumi.set(__self__, "user_data", user_data)
@property
@pulumi.getter(name="metaData")
def meta_data(self) -> Optional[pulumi.Input[str]]:
"""
cloud-init user data.
"""
return pulumi.get(self, "meta_data")
@meta_data.setter
def meta_data(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "meta_data", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
A unique name for the resource, required by libvirt.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="networkConfig")
def network_config(self) -> Optional[pulumi.Input[str]]:
"""
cloud-init network-config data.
"""
return pulumi.get(self, "network_config")
@network_config.setter
def network_config(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_config", value)
@property
@pulumi.getter
def pool(self) -> Optional[pulumi.Input[str]]:
"""
The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
"""
return pulumi.get(self, "pool")
@pool.setter
def pool(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "pool", value)
@property
@pulumi.getter(name="userData")
def user_data(self) -> Optional[pulumi.Input[str]]:
"""
cloud-init user data.
"""
return pulumi.get(self, "user_data")
@user_data.setter
def user_data(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user_data", value)
@pulumi.input_type
class _CloudInitDiskState:
def __init__(__self__, *,
meta_data: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
network_config: Optional[pulumi.Input[str]] = None,
pool: Optional[pulumi.Input[str]] = None,
user_data: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering CloudInitDisk resources.
:param pulumi.Input[str] meta_data: cloud-init user data.
:param pulumi.Input[str] name: A unique name for the resource, required by libvirt.
:param pulumi.Input[str] network_config: cloud-init network-config data.
:param pulumi.Input[str] pool: The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
:param pulumi.Input[str] user_data: cloud-init user data.
"""
if meta_data is not None:
pulumi.set(__self__, "meta_data", meta_data)
if name is not None:
pulumi.set(__self__, "name", name)
if network_config is not None:
pulumi.set(__self__, "network_config", network_config)
if pool is not None:
pulumi.set(__self__, "pool", pool)
if user_data is not None:
pulumi.set(__self__, "user_data", user_data)
@property
@pulumi.getter(name="metaData")
def meta_data(self) -> Optional[pulumi.Input[str]]:
"""
cloud-init user data.
"""
return pulumi.get(self, "meta_data")
@meta_data.setter
def meta_data(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "meta_data", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
A unique name for the resource, required by libvirt.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="networkConfig")
def network_config(self) -> Optional[pulumi.Input[str]]:
"""
cloud-init network-config data.
"""
return pulumi.get(self, "network_config")
@network_config.setter
def network_config(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_config", value)
@property
@pulumi.getter
def pool(self) -> Optional[pulumi.Input[str]]:
"""
The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
"""
return pulumi.get(self, "pool")
@pool.setter
def pool(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "pool", value)
@property
@pulumi.getter(name="userData")
def user_data(self) -> Optional[pulumi.Input[str]]:
"""
cloud-init user data.
"""
return pulumi.get(self, "user_data")
@user_data.setter
def user_data(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user_data", value)
class CloudInitDisk(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
meta_data: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
network_config: Optional[pulumi.Input[str]] = None,
pool: Optional[pulumi.Input[str]] = None,
user_data: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Create a CloudInitDisk resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] meta_data: cloud-init user data.
:param pulumi.Input[str] name: A unique name for the resource, required by libvirt.
:param pulumi.Input[str] network_config: cloud-init network-config data.
:param pulumi.Input[str] pool: The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
:param pulumi.Input[str] user_data: cloud-init user data.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[CloudInitDiskArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Create a CloudInitDisk resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param CloudInitDiskArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(CloudInitDiskArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
meta_data: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
network_config: Optional[pulumi.Input[str]] = None,
pool: Optional[pulumi.Input[str]] = None,
user_data: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = CloudInitDiskArgs.__new__(CloudInitDiskArgs)
__props__.__dict__["meta_data"] = meta_data
__props__.__dict__["name"] = name
__props__.__dict__["network_config"] = network_config
__props__.__dict__["pool"] = pool
__props__.__dict__["user_data"] = user_data
super(CloudInitDisk, __self__).__init__(
'libvirt:index/cloudInitDisk:CloudInitDisk',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
meta_data: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
network_config: Optional[pulumi.Input[str]] = None,
pool: Optional[pulumi.Input[str]] = None,
user_data: Optional[pulumi.Input[str]] = None) -> 'CloudInitDisk':
"""
Get an existing CloudInitDisk resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] meta_data: cloud-init user data.
:param pulumi.Input[str] name: A unique name for the resource, required by libvirt.
:param pulumi.Input[str] network_config: cloud-init network-config data.
:param pulumi.Input[str] pool: The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
:param pulumi.Input[str] user_data: cloud-init user data.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _CloudInitDiskState.__new__(_CloudInitDiskState)
__props__.__dict__["meta_data"] = meta_data
__props__.__dict__["name"] = name
__props__.__dict__["network_config"] = network_config
__props__.__dict__["pool"] = pool
__props__.__dict__["user_data"] = user_data
return CloudInitDisk(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="metaData")
def meta_data(self) -> pulumi.Output[Optional[str]]:
"""
cloud-init user data.
"""
return pulumi.get(self, "meta_data")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
A unique name for the resource, required by libvirt.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="networkConfig")
def network_config(self) -> pulumi.Output[Optional[str]]:
"""
cloud-init network-config data.
"""
return pulumi.get(self, "network_config")
@property
@pulumi.getter
def pool(self) -> pulumi.Output[Optional[str]]:
"""
The pool where the resource will be created.
If not given, the `default` pool will be used.
For user_data, network_config and meta_data parameters have a look at upstream doc:
http://cloudinit.readthedocs.io/en/latest/topics/datasources/nocloud.html#datasource-nocloud
"""
return pulumi.get(self, "pool")
@property
@pulumi.getter(name="userData")
def user_data(self) -> pulumi.Output[Optional[str]]:
"""
cloud-init user data.
"""
return pulumi.get(self, "user_data")
# File: cs3/sharing/collaboration/v1beta1/collaboration_api_pb2_grpc.py (repo: cs3org/python-cs3apis, license: Apache-2.0)
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from cs3.sharing.collaboration.v1beta1 import collaboration_api_pb2 as cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2
class CollaborationAPIStub(object):
"""User Share Provider API
The User Share Provider API is meant to manipulate share
resources for a specific share type (user, group, ocm, ...)
    from the perspective of the creator of the share and
from the perspective of the receiver of the share.
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL
NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and
"OPTIONAL" in this document are to be interpreted as described in
RFC 2119.
The following are global requirements that apply to all methods:
    Any method MUST return CODE_OK on a successful operation.
Any method MAY return NOT_IMPLEMENTED.
Any method MAY return INTERNAL.
Any method MAY return UNKNOWN.
Any method MAY return UNAUTHENTICATED.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.CreateShare = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/CreateShare',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.CreateShareRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.CreateShareResponse.FromString,
)
self.RemoveShare = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/RemoveShare',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.RemoveShareRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.RemoveShareResponse.FromString,
)
self.GetShare = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/GetShare',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetShareRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetShareResponse.FromString,
)
self.ListShares = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/ListShares',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListSharesRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListSharesResponse.FromString,
)
self.UpdateShare = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/UpdateShare',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateShareRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateShareResponse.FromString,
)
self.ListReceivedShares = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/ListReceivedShares',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListReceivedSharesRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListReceivedSharesResponse.FromString,
)
self.UpdateReceivedShare = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/UpdateReceivedShare',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateReceivedShareRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateReceivedShareResponse.FromString,
)
self.GetReceivedShare = channel.unary_unary(
'/cs3.sharing.collaboration.v1beta1.CollaborationAPI/GetReceivedShare',
request_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetReceivedShareRequest.SerializeToString,
response_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetReceivedShareResponse.FromString,
)
class CollaborationAPIServicer(object):
"""User Share Provider API
The User Share Provider API is meant to manipulate share
resources for a specific share type (user, group, ocm, ...)
    from the perspective of the creator of the share and
from the perspective of the receiver of the share.
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL
NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and
"OPTIONAL" in this document are to be interpreted as described in
RFC 2119.
The following are global requirements that apply to all methods:
    Any method MUST return CODE_OK on a successful operation.
Any method MAY return NOT_IMPLEMENTED.
Any method MAY return INTERNAL.
Any method MAY return UNKNOWN.
Any method MAY return UNAUTHENTICATED.
"""
def CreateShare(self, request, context):
"""Creates a new share.
MUST return CODE_NOT_FOUND if the resource reference does not exist.
MUST return CODE_ALREADY_EXISTS if the share already exists for the 4-tuple consisting of
(owner, shared_resource, grantee).
New shares MUST be created in the state SHARE_STATE_PENDING.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def RemoveShare(self, request, context):
"""Removes a share.
MUST return CODE_NOT_FOUND if the share reference does not exist.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetShare(self, request, context):
"""Gets share information for a single share.
MUST return CODE_NOT_FOUND if the share reference does not exist.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListShares(self, request, context):
"""List the shares the authenticated principal has created,
both as owner and creator. If a filter is specified, only
shares satisfying the filter MUST be returned.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def UpdateShare(self, request, context):
"""Updates a share.
MUST return CODE_NOT_FOUND if the share reference does not exist.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListReceivedShares(self, request, context):
"""List all shares the authenticated principal has received.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def UpdateReceivedShare(self, request, context):
"""Update the received share to change the share state or the display name.
MUST return CODE_NOT_FOUND if the share reference does not exist.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetReceivedShare(self, request, context):
"""Get the information for the given received share reference.
MUST return CODE_NOT_FOUND if the received share reference does not exist.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_CollaborationAPIServicer_to_server(servicer, server):
rpc_method_handlers = {
'CreateShare': grpc.unary_unary_rpc_method_handler(
servicer.CreateShare,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.CreateShareRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.CreateShareResponse.SerializeToString,
),
'RemoveShare': grpc.unary_unary_rpc_method_handler(
servicer.RemoveShare,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.RemoveShareRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.RemoveShareResponse.SerializeToString,
),
'GetShare': grpc.unary_unary_rpc_method_handler(
servicer.GetShare,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetShareRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetShareResponse.SerializeToString,
),
'ListShares': grpc.unary_unary_rpc_method_handler(
servicer.ListShares,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListSharesRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListSharesResponse.SerializeToString,
),
'UpdateShare': grpc.unary_unary_rpc_method_handler(
servicer.UpdateShare,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateShareRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateShareResponse.SerializeToString,
),
'ListReceivedShares': grpc.unary_unary_rpc_method_handler(
servicer.ListReceivedShares,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListReceivedSharesRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListReceivedSharesResponse.SerializeToString,
),
'UpdateReceivedShare': grpc.unary_unary_rpc_method_handler(
servicer.UpdateReceivedShare,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateReceivedShareRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateReceivedShareResponse.SerializeToString,
),
'GetReceivedShare': grpc.unary_unary_rpc_method_handler(
servicer.GetReceivedShare,
request_deserializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetReceivedShareRequest.FromString,
response_serializer=cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetReceivedShareResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'cs3.sharing.collaboration.v1beta1.CollaborationAPI', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
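The registration above builds a table from RPC names to handler chains (deserialize the request, invoke the servicer method, serialize the response). A dependency-free sketch of that dispatch pattern, with JSON and a toy `EchoServicer` standing in for protobuf messages and a real CS3 servicer:

```python
import json

class EchoServicer:
    def GetShare(self, request, context):
        # A real servicer would look up the share; here we just echo the id.
        return {"share_id": request["share_id"], "state": "SHARE_STATE_PENDING"}

def add_servicer_to_server(servicer, server):
    # server is modeled as a plain dict of RPC name -> callable chain.
    rpc_method_handlers = {
        "GetShare": (json.loads, servicer.GetShare, json.dumps),
    }
    for name, (deserialize, handle, serialize) in rpc_method_handlers.items():
        def invoke(raw, context=None, d=deserialize, h=handle, s=serialize):
            return s(h(d(raw), context))
        server[name] = invoke

server = {}
add_servicer_to_server(EchoServicer(), server)
raw = server["GetShare"](json.dumps({"share_id": "42"}))
```

The generated code does the same wiring, only with `grpc.unary_unary_rpc_method_handler` and the protobuf `FromString`/`SerializeToString` pairs in place of `json`.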
# This class is part of an EXPERIMENTAL API.
class CollaborationAPI(object):
"""User Share Provider API
The User Share Provider API is meant to manipulate share
resources for a specific share type (user, group, ocm, ...)
from the perspective of the creator of the share and
from the perspective of the receiver of the share.
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL
NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and
"OPTIONAL" in this document are to be interpreted as described in
RFC 2119.
The following are global requirements that apply to all methods:
Any method MUST return CODE_OK on a successful operation.
Any method MAY return NOT_IMPLEMENTED.
Any method MAY return INTERNAL.
Any method MAY return UNKNOWN.
Any method MAY return UNAUTHENTICATED.
"""
@staticmethod
def CreateShare(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/CreateShare',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.CreateShareRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.CreateShareResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def RemoveShare(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/RemoveShare',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.RemoveShareRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.RemoveShareResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetShare(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/GetShare',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetShareRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetShareResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListShares(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/ListShares',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListSharesRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListSharesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def UpdateShare(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/UpdateShare',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateShareRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateShareResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListReceivedShares(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/ListReceivedShares',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListReceivedSharesRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.ListReceivedSharesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def UpdateReceivedShare(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/UpdateReceivedShare',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateReceivedShareRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.UpdateReceivedShareResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetReceivedShare(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/GetReceivedShare',
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetReceivedShareRequest.SerializeToString,
cs3_dot_sharing_dot_collaboration_dot_v1beta1_dot_collaboration__api__pb2.GetReceivedShareResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
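Each static method above repeats the same unary-unary call shape, varying only the RPC path and the request/response message classes. A sketch of factoring that repetition into a stub factory — `fake_transport` and the string "messages" are illustrative stand-ins, not the real `grpc.experimental` API:

```python
def make_unary_unary(method_path, serialize, deserialize, transport):
    """Build a unary-unary stub bound to one RPC path (sketch)."""
    def call(request, target, timeout=None):
        wire = serialize(request)
        raw = transport(target, method_path, wire)  # real code: a gRPC channel
        return deserialize(raw)
    return call

def fake_transport(target, method_path, wire):
    # Echo "server": returns the request bytes unchanged.
    return wire

GetShare = make_unary_unary(
    '/cs3.sharing.collaboration.v1beta1.CollaborationAPI/GetShare',
    serialize=lambda req: req.encode('utf-8'),
    deserialize=lambda raw: raw.decode('utf-8'),
    transport=fake_transport,
)
reply = GetShare('share-ref-123', target='localhost:9000')
```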
# === tests/primitives/test_togglesets.py (repo: SergeyTsaplin/kopf, license: MIT) ===
import asyncio
import pytest
from kopf.structs.primitives import Toggle, ToggleSet
@pytest.mark.parametrize('fn, expected', [(all, True), (any, False)])
async def test_created_empty(fn, expected):
toggleset = ToggleSet(fn)
assert len(toggleset) == 0
assert set(toggleset) == set()
assert Toggle() not in toggleset
assert toggleset.is_on() == expected
assert toggleset.is_off() == (not expected)
@pytest.mark.parametrize('fn', [all, any])
async def test_making_a_default_toggle(fn):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle()
assert len(toggleset) == 1
assert set(toggleset) == {toggle}
assert toggle in toggleset
assert Toggle() not in toggleset
assert toggleset.is_on() == False
assert toggleset.is_off() == True
@pytest.mark.parametrize('fn', [all, any])
async def test_making_a_turned_off_toggle(fn):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(False)
assert len(toggleset) == 1
assert set(toggleset) == {toggle}
assert toggle in toggleset
assert Toggle() not in toggleset
assert toggleset.is_on() == False
assert toggleset.is_off() == True
@pytest.mark.parametrize('fn', [all, any])
async def test_making_a_turned_on_toggle(fn):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(True)
assert len(toggleset) == 1
assert set(toggleset) == {toggle}
assert toggle in toggleset
assert Toggle() not in toggleset
assert toggleset.is_on() == True
assert toggleset.is_off() == False
@pytest.mark.parametrize('fn, expected', [(all, True), (any, False)])
async def test_dropping_a_turned_off_toggle(fn, expected):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(False)
await toggle.turn_to(True)
await toggleset.drop_toggle(toggle)
assert len(toggleset) == 0
assert set(toggleset) == set()
assert toggle not in toggleset
assert toggleset.is_on() == expected
assert toggleset.is_off() == (not expected)
@pytest.mark.parametrize('fn, expected', [(all, True), (any, False)])
async def test_dropping_a_turned_on_toggle(fn, expected):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(True)
await toggleset.drop_toggle(toggle)
assert len(toggleset) == 0
assert set(toggleset) == set()
assert toggle not in toggleset
assert toggleset.is_on() == expected
assert toggleset.is_off() == (not expected)
@pytest.mark.parametrize('fn, expected', [(all, True), (any, False)])
async def test_dropping_an_unexistent_toggle(fn, expected):
toggleset = ToggleSet(fn)
toggle = Toggle()
await toggleset.drop_toggle(toggle)
assert len(toggleset) == 0
assert set(toggleset) == set()
assert toggle not in toggleset
assert toggleset.is_on() == expected
assert toggleset.is_off() == (not expected)
@pytest.mark.parametrize('fn, expected', [(all, True), (any, False)])
async def test_dropping_multiple_toggles(fn, expected):
toggleset = ToggleSet(fn)
toggle1 = await toggleset.make_toggle(True)
toggle2 = Toggle()
await toggleset.drop_toggles([toggle1, toggle2])
assert len(toggleset) == 0
assert set(toggleset) == set()
assert toggle1 not in toggleset
assert toggle2 not in toggleset
assert toggleset.is_on() == expected
assert toggleset.is_off() == (not expected)
@pytest.mark.parametrize('fn', [all, any])
async def test_turning_a_toggle_on_turns_the_toggleset_on(fn):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
await toggle.turn_to(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
@pytest.mark.parametrize('fn', [all, any])
async def test_turning_a_toggle_off_turns_the_toggleset_off(fn):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
await toggle.turn_to(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
@pytest.mark.parametrize('fn', [all])
async def test_all_toggles_must_be_on_for_alltoggleset_to_be_on(fn):
toggleset = ToggleSet(fn)
toggle1 = await toggleset.make_toggle(False)
toggle2 = await toggleset.make_toggle(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
await toggle1.turn_to(True)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
await toggle2.turn_to(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
@pytest.mark.parametrize('fn', [all])
async def test_any_toggle_must_be_off_for_alltoggleset_to_be_off(fn):
toggleset = ToggleSet(fn)
toggle1 = await toggleset.make_toggle(True)
toggle2 = await toggleset.make_toggle(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
await toggle1.turn_to(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
await toggle2.turn_to(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
@pytest.mark.parametrize('fn', [any])
async def test_any_toggle_must_be_on_for_anytoggleset_to_be_on(fn):
toggleset = ToggleSet(fn)
toggle1 = await toggleset.make_toggle(False)
toggle2 = await toggleset.make_toggle(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
await toggle1.turn_to(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
await toggle2.turn_to(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
@pytest.mark.parametrize('fn', [any])
async def test_all_toggles_must_be_off_for_anytoggleset_to_be_off(fn):
toggleset = ToggleSet(fn)
toggle1 = await toggleset.make_toggle(True)
toggle2 = await toggleset.make_toggle(True)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
await toggle1.turn_to(False)
assert toggleset.is_on() == True
assert toggleset.is_off() == False
await toggle2.turn_to(False)
assert toggleset.is_on() == False
assert toggleset.is_off() == True
@pytest.mark.parametrize('fn', [all, any])
async def test_waiting_until_on_fails_when_not_turned_on(fn):
toggleset = ToggleSet(fn)
await toggleset.make_toggle(False)
with pytest.raises(asyncio.TimeoutError):
await asyncio.wait_for(toggleset.wait_for(True), timeout=0.1)
assert toggleset.is_off()
@pytest.mark.parametrize('fn', [all, any])
async def test_waiting_until_off_fails_when_not_turned_off(fn):
toggleset = ToggleSet(fn)
await toggleset.make_toggle(True)
with pytest.raises(asyncio.TimeoutError):
await asyncio.wait_for(toggleset.wait_for(False), timeout=0.1)
assert toggleset.is_on()
@pytest.mark.parametrize('fn', [all, any])
async def test_waiting_until_on_wakes_when_turned_on(fn, timer):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(False)
async def delayed_turning_on(delay: float):
await asyncio.sleep(delay)
await toggle.turn_to(True)
with timer:
asyncio.create_task(delayed_turning_on(0.05))
await asyncio.wait_for(toggleset.wait_for(True), timeout=1.0)
assert toggleset.is_on()
assert timer.seconds < 0.5 # approx. 0.05 plus some code overhead
@pytest.mark.parametrize('fn', [all, any])
async def test_waiting_until_off_wakes_when_turned_off(fn, timer):
toggleset = ToggleSet(fn)
toggle = await toggleset.make_toggle(True)
async def delayed_turning_off(delay: float):
await asyncio.sleep(delay)
await toggle.turn_to(False)
with timer:
asyncio.create_task(delayed_turning_off(0.05))
await asyncio.wait_for(toggleset.wait_for(False), timeout=1.0)
assert toggleset.is_off()
assert timer.seconds < 0.5 # approx. 0.05 plus some code overhead
@pytest.mark.parametrize('fn', [all, any])
async def test_secures_against_usage_as_a_boolean(fn):
toggle = ToggleSet(fn)
with pytest.raises(NotImplementedError):
bool(toggle)
@pytest.mark.parametrize('fn', [all, any])
async def test_repr_when_empty(fn):
toggleset = ToggleSet(fn)
assert repr(toggleset) == "set()"
@pytest.mark.parametrize('fn', [all, any])
async def test_repr_when_unnamed_and_off(fn):
toggleset = ToggleSet(fn)
await toggleset.make_toggle(False)
assert repr(toggleset) == "{<Toggle: off>}"
@pytest.mark.parametrize('fn', [all, any])
async def test_repr_when_unnamed_and_on(fn):
toggleset = ToggleSet(fn)
await toggleset.make_toggle(True)
assert repr(toggleset) == "{<Toggle: on>}"
@pytest.mark.parametrize('fn', [all, any])
async def test_repr_when_named_and_off(fn):
toggleset = ToggleSet(fn)
await toggleset.make_toggle(False, name='xyz')
assert repr(toggleset) == "{<Toggle: xyz: off>}"
@pytest.mark.parametrize('fn', [all, any])
async def test_repr_when_named_and_on(fn):
toggleset = ToggleSet(fn)
await toggleset.make_toggle(True, name='xyz')
assert repr(toggleset) == "{<Toggle: xyz: on>}"
# === lib/installed_clients/RAST_SDKClient.py (repo: sagoyal2/kb_genomeclassification, license: MIT) ===
############################################################
#
# Autogenerated by the KBase type compiler -
# any changes made here will be overwritten
#
############################################################
from __future__ import print_function
# the following is a hack to get the baseclient to import whether we're in a
# package or not. This makes pep8 unhappy hence the annotations.
try:
# baseclient and this client are in a package
from .baseclient import BaseClient as _BaseClient # @UnusedImport
except ImportError:
# no they aren't
from baseclient import BaseClient as _BaseClient # @Reimport
class RAST_SDK(object):
def __init__(
self, url=None, timeout=30 * 60, user_id=None,
password=None, token=None, ignore_authrc=False,
trust_all_ssl_certificates=False,
auth_svc='https://ci.kbase.us/services/auth/api/legacy/KBase/Sessions/Login',
service_ver='beta',
async_job_check_time_ms=100, async_job_check_time_scale_percent=150,
async_job_check_max_time_ms=300000):
if url is None:
raise ValueError('A url is required')
self._service_ver = service_ver
self._client = _BaseClient(
url, timeout=timeout, user_id=user_id, password=password,
token=token, ignore_authrc=ignore_authrc,
trust_all_ssl_certificates=trust_all_ssl_certificates,
auth_svc=auth_svc,
async_job_check_time_ms=async_job_check_time_ms,
async_job_check_time_scale_percent=async_job_check_time_scale_percent,
async_job_check_max_time_ms=async_job_check_max_time_ms)
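The three `async_job_check_*` parameters describe a geometric polling schedule: start at 100 ms, grow each check by 150%, and cap at 300 s. The actual retry loop lives in `baseclient`; this sketch only illustrates the arithmetic the parameter names imply:

```python
def polling_schedule(start_ms=100, scale_percent=150, max_ms=300000, checks=10):
    """Delays (ms) between successive async-job status checks (sketch)."""
    delays, delay = [], start_ms
    for _ in range(checks):
        delays.append(min(delay, max_ms))   # never wait longer than the cap
        delay = delay * scale_percent / 100.0
    return delays

polling_schedule(checks=6)
# -> [100, 150.0, 225.0, 337.5, 506.25, 759.375] (milliseconds)
```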
def annotate_genome(self, params, context=None):
"""
annotate genome
params - a param hash that includes the workspace id and options
:param params: instance of type "AnnotateGenomeParams" (Parameters
for the annotate_genome method. ncbi_taxon_id - the numeric ID of
the NCBI taxon to which this genome belongs. If this is included
scientific_name is ignored. relation_engine_timestamp_ms - the
timestamp to send to the Relation Engine when looking up taxon
information in milliseconds since the epoch. scientific_name - the
scientific name of the genome. Overridden by ncbi_taxon_id. TODO:
document remainder of parameters.) -> structure: parameter
"workspace" of String, parameter "input_genome" of type
"genome_id" (A string representing a genome id.), parameter
"input_contigset" of type "contigset_id" (A string representing a
ContigSet id.), parameter "genetic_code" of Long, parameter
"domain" of String, parameter "ncbi_taxon_id" of Long, parameter
"relation_engine_timestamp_ms" of Long, parameter
"scientific_name" of String, parameter "output_genome" of String,
parameter "call_features_rRNA_SEED" of type "bool" (A binary
boolean), parameter "call_features_tRNA_trnascan" of type "bool"
(A binary boolean), parameter "call_selenoproteins" of type "bool"
(A binary boolean), parameter "call_pyrrolysoproteins" of type
"bool" (A binary boolean), parameter
"call_features_repeat_region_SEED" of type "bool" (A binary
boolean), parameter "call_features_insertion_sequences" of type
"bool" (A binary boolean), parameter
"call_features_strep_suis_repeat" of type "bool" (A binary
boolean), parameter "call_features_strep_pneumo_repeat" of type
"bool" (A binary boolean), parameter "call_features_crispr" of
type "bool" (A binary boolean), parameter
"call_features_CDS_glimmer3" of type "bool" (A binary boolean),
parameter "call_features_CDS_prodigal" of type "bool" (A binary
boolean), parameter "call_features_CDS_genemark" of type "bool" (A
binary boolean), parameter "annotate_proteins_kmer_v2" of type
"bool" (A binary boolean), parameter "kmer_v1_parameters" of type
"bool" (A binary boolean), parameter
"annotate_proteins_similarity" of type "bool" (A binary boolean),
parameter "resolve_overlapping_features" of type "bool" (A binary
boolean), parameter "call_features_prophage_phispy" of type "bool"
(A binary boolean), parameter "retain_old_anno_for_hypotheticals"
of type "bool" (A binary boolean)
:returns: instance of type "AnnotateGenomeResults" -> structure:
parameter "workspace" of type "workspace_name" (A string
representing a workspace name.), parameter "id" of String,
parameter "report_name" of String, parameter "report_ref" of String
"""
return self._client.run_job('RAST_SDK.annotate_genome',
[params], self._service_ver, context)
def annotate_genomes(self, params, context=None):
"""
annotate genomes
params - a param hash that includes the workspace id and options
:param params: instance of type "AnnotateGenomesParams" (Parameters
for the annotate_genomes method. ncbi_taxon_id - the numeric ID of
the NCBI taxon to which this genome belongs. If this is included
scientific_name is ignored. relation_engine_timestamp_ms - the
timestamp to send to the Relation Engine when looking up taxon
information in milliseconds since the epoch. scientific_name - the
scientific name of the genome. Overridden by ncbi_taxon_id. TODO:
document remainder of parameters.) -> structure: parameter
"workspace" of String, parameter "input_genomes" of list of type
"GenomeParams" -> structure: parameter "input_contigset" of type
"contigset_id" (A string representing a ContigSet id.), parameter
"input_genome" of type "genome_id" (A string representing a genome
id.), parameter "output_genome" of type "genome_id" (A string
representing a genome id.), parameter "genetic_code" of Long,
parameter "domain" of String, parameter "scientific_name" of
String, parameter "genetic_code" of Long, parameter "domain" of
String, parameter "ncbi_taxon_id" of Long, parameter
"relation_engine_timestamp_ms" of Long, parameter
"scientific_name" of String, parameter "genome_text" of String,
parameter "output_genome" of String, parameter
"call_features_rRNA_SEED" of type "bool" (A binary boolean),
parameter "call_features_tRNA_trnascan" of type "bool" (A binary
boolean), parameter "call_selenoproteins" of type "bool" (A binary
boolean), parameter "call_pyrrolysoproteins" of type "bool" (A
binary boolean), parameter "call_features_repeat_region_SEED" of
type "bool" (A binary boolean), parameter
"call_features_insertion_sequences" of type "bool" (A binary
boolean), parameter "call_features_strep_suis_repeat" of type
"bool" (A binary boolean), parameter
"call_features_strep_pneumo_repeat" of type "bool" (A binary
boolean), parameter "call_features_crispr" of type "bool" (A
binary boolean), parameter "call_features_CDS_glimmer3" of type
"bool" (A binary boolean), parameter "call_features_CDS_prodigal"
of type "bool" (A binary boolean), parameter
"call_features_CDS_genemark" of type "bool" (A binary boolean),
parameter "annotate_proteins_kmer_v2" of type "bool" (A binary
boolean), parameter "kmer_v1_parameters" of type "bool" (A binary
boolean), parameter "annotate_proteins_similarity" of type "bool"
(A binary boolean), parameter "resolve_overlapping_features" of
type "bool" (A binary boolean), parameter
"call_features_prophage_phispy" of type "bool" (A binary boolean),
parameter "retain_old_anno_for_hypotheticals" of type "bool" (A
binary boolean)
:returns: instance of type "AnnotateGenomesResults" -> structure:
parameter "workspace" of type "workspace_name" (A string
representing a workspace name.), parameter "report_name" of
String, parameter "report_ref" of String
"""
return self._client.run_job('RAST_SDK.annotate_genomes',
[params], self._service_ver, context)
def annotate_proteins(self, params, context=None):
"""
annotate proteins - returns a list of the RAST annotations for the input protein sequences
:param params: instance of type "AnnotateProteinParams" -> structure:
parameter "proteins" of list of String
:returns: instance of type "AnnotateProteinResults" -> structure:
parameter "functions" of list of list of String
"""
return self._client.run_job('RAST_SDK.annotate_proteins',
[params], self._service_ver, context)
def annotate_metagenome(self, params, context=None):
"""
:param params: instance of type "MetagenomeAnnotateParams" (Required
parameters: object_ref - reference to Assembly or Genome object,
output_workspace - output workspace name, output_metagenome_name -
output object name,) -> structure: parameter "object_ref" of type
"data_obj_ref" (For RAST annotating metagenomes (borrowed and
simplified from the ProkkaAnnotation module) Reference to an Assembly or
Genome object in the workspace @id ws
KBaseGenomeAnnotations.Assembly @id ws KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"output_workspace" of String, parameter "output_metagenome_name"
of String, parameter "create_report" of type "bool" (A binary
boolean)
:returns: instance of type "MetagenomeAnnotateOutput" -> structure:
parameter "output_metagenome_ref" of type "metagenome_ref"
(Reference to a Annotated Metagenome Assembly object in the
workspace @id ws KBaseMetagenomes.AnnotatedMetagenomeAssembly),
parameter "output_workspace" of String, parameter "report_name" of
String, parameter "report_ref" of String
"""
return self._client.run_job('RAST_SDK.annotate_metagenome',
[params], self._service_ver, context)
def annotate_metagenomes(self, params, context=None):
"""
:param params: instance of type "BulkAnnotateMetagenomesParams" ->
structure: parameter "input_AMAs" of list of type "data_obj_ref"
(For RAST annotating metagenomes (borrowed and simplified from the
ProkkaAnnotation module) Reference to an Assembly or Genome object
in the workspace @id ws KBaseGenomeAnnotations.Assembly @id ws
KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"input_assemblies" of list of type "data_obj_ref" (For RAST
annotating metagenomes (borrowed and simplified from the
ProkkaAnnotation module) Reference to an Assembly or Genome object
in the workspace @id ws KBaseGenomeAnnotations.Assembly @id ws
KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"input_text" of String, parameter "output_workspace" of String,
parameter "output_AMASet_name" of String, parameter
"create_report" of type "bool" (A binary boolean)
:returns: instance of type "BulkMetagenomesAnnotateOutput" ->
structure: parameter "output_AMASet_ref" of type "data_obj_ref"
(For RAST annotating metagenomes (borrowed and simplified from the
ProkkaAnnotation module) Reference to an Assembly or Genome object
in the workspace @id ws KBaseGenomeAnnotations.Assembly @id ws
KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"output_workspace" of String
"""
return self._client.run_job('RAST_SDK.annotate_metagenomes',
[params], self._service_ver, context)
def rast_genome_assembly(self, params, context=None):
"""
:param params: instance of type "RastGenomeAssemblyParams" (Required
parameters for rast_genome_assembly: object_ref - reference to a
Genome or Assembly object, output_workspace - output workspace
name, output_genome_name - output object name Optional parameters
for rast_genome_assembly: ncbi_taxon_id - the numeric ID of the
NCBI taxon to which this genome belongs. If this is included
scientific_name is ignored. relation_engine_timestamp_ms - the
timestamp to send to the Relation Engine when looking up taxon
information in milliseconds since the epoch. scientific_name - the
scientific name of the genome. Overridden by ncbi_taxon_id.) ->
structure: parameter "object_ref" of type "data_obj_ref" (For RAST
annotating metagenomes (borrowed and simplified from
ProkkaAnnotation module) Reference to an Assembly or Genome object
in the workspace @id ws KBaseGenomeAnnotations.Assembly @id ws
KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"output_workspace" of String, parameter "genetic_code" of Long,
parameter "domain" of String, parameter "ncbi_taxon_id" of Long,
parameter "relation_engine_timestamp_ms" of Long, parameter
"scientific_name" of String, parameter "output_genome_name" of
String, parameter "create_report" of type "bool" (A binary boolean)
:returns: instance of type "RastGenomeAssemblyOutput" -> structure:
parameter "output_genome_ref" of type "genome_id" (A string
representing a genome id.), parameter "output_workspace" of
String, parameter "report_name" of String, parameter "report_ref"
of String
"""
return self._client.run_job('RAST_SDK.rast_genome_assembly',
[params], self._service_ver, context)
def rast_genomes_assemblies(self, params, context=None):
"""
:param params: instance of type "BulkRastGenomesAssembliesParams" ->
structure: parameter "input_genomes" of list of type
"data_obj_ref" (For RAST annotating metagenomes (borrowed and
simplified from ProkkaAnnotation module) Reference to an Assembly or
Genome object in the workspace @id ws
KBaseGenomeAnnotations.Assembly @id ws KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"input_assemblies" of list of type "data_obj_ref" (For RAST
annotating metagenomes (borrowed and simplified from
ProkkaAnnotation module) Reference to an Assembly or Genome object
in the workspace @id ws KBaseGenomeAnnotations.Assembly @id ws
KBaseGenomes.Genome @id ws
KBaseMetagenomes.AnnotatedMetagenomeAssembly), parameter
"input_text" of String, parameter "scientific_name" of String,
parameter "genetic_code" of Long, parameter "domain" of String,
parameter "ncbi_taxon_id" of Long, parameter
"relation_engine_timestamp_ms" of Long, parameter
"output_workspace" of String, parameter "output_GenomeSet_name" of
String
:returns: instance of type "BulkRastGenomesAssembliesOutput" ->
structure: parameter "output_GenomeSet_ref" of type
"genomeSet_ref" (For RAST annotating genomes/assemblies Reference
to a set of annotated Genome and/or Assembly objects in the
workspace @id ws KBaseSearch.GenomeSet), parameter
"output_workspace" of String, parameter "report_name" of String,
parameter "report_ref" of String
"""
return self._client.run_job('RAST_SDK.rast_genomes_assemblies',
[params], self._service_ver, context)
def status(self, context=None):
return self._client.run_job('RAST_SDK.status',
[], self._service_ver, context)
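The docstrings above spell out the `params` dict each job method expects. A minimal sketch of assembling the `MetagenomeAnnotateParams` structure for `annotate_metagenome` follows; the object reference and workspace names are made-up placeholders, and no client call is made:

```python
# Sketch only: builds the params dict described in the annotate_metagenome
# docstring. The reference and names below are hypothetical placeholders.
def build_annotate_metagenome_params(object_ref, output_workspace, output_name):
    """Assemble the MetagenomeAnnotateParams structure as a plain dict."""
    return {
        "object_ref": object_ref,               # Assembly/Genome/AMA workspace reference
        "output_workspace": output_workspace,   # target workspace name
        "output_metagenome_name": output_name,  # name for the annotated result
        "create_report": 1,                     # the "bool" type is a binary 0/1 flag
    }

params = build_annotate_metagenome_params(
    "12345/6/7", "my_workspace", "my_metagenome.RAST")
print(sorted(params))
```

The resulting dict would be passed as the single positional argument, e.g. `client.annotate_metagenome(params)`.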
# === tests/IO.py (hadizakialqattan/simple-api, MIT) ===
from app import models
def BASE_URL():
"""
summary: return base_url
"""
return "http://127.0.0.1:5057"
def URLs(integration: bool):
"""
summary: return URLs dict (funcName: url)
arguments: (integration: bool [test type])
"""
URLs = {
"get_access_token": "/token",
"GetMe": "/users/me/",
"ListUsers": "/users",
"GetAdmins": "/users/admins",
"CreateUser": "/user",
"GetUser": "/user/%s",
"UpdateUser": "/user/%s/%s",
"DeleteUser": "/user/%s",
"List": "/configs",
"Create": "/configs",
"Get": "/configs/%s",
"Update": "/configs/%s",
"Delete": "/configs/%s",
"Query": "/search/metadata.%s=%s",
}
if integration:
bUrl = BASE_URL()
for key, value in URLs.items():
URLs[key] = bUrl + value
return URLs
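The endpoint templates above are plain %-style format strings, so resolving one is simple string interpolation. A tiny standalone copy of three entries (not an import of this module) shows how a test fills in the placeholders:

```python
# Standalone mini-copy of three URL templates from URLs(), showing how
# the %s placeholders are filled before a request is sent.
TEMPLATES = {
    "GetUser": "/user/%s",
    "UpdateUser": "/user/%s/%s",
    "Query": "/search/metadata.%s=%s",
}

get_url = TEMPLATES["GetUser"] % "test08"
update_url = TEMPLATES["UpdateUser"] % ("test08", "false")
query_url = TEMPLATES["Query"] % ("database.enabled", "true")
print(get_url, update_url, query_url)
# → /user/test08 /user/test08/false /search/metadata.database.enabled=true
```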
def INPUTS_USERS():
"""
summary: return test inputs for the users testcase dict
arguments: None
return: Inputs dict
"""
return {
"test_01_get_access_token_success": {
"headers": {
"accept": "application/json",
"Content-Type": "application/x-www-form-urlencoded",
},
"admin": {
"grant_type": "",
"username": "admin",
"password": "admin",
"scope": "",
"client_id": "",
"client_secret": "",
},
"user2": {
"grant_type": "",
"username": "user2",
"password": "user2pass",
"scope": "",
"client_id": "",
"client_secret": "",
},
},
"test_02_get_access_token_401": {
"headers": {
"accept": "application/json",
"Content-Type": "application/x-www-form-urlencoded",
},
"data": {
"grant_type": "",
"username": "admin",
"password": "401pass",
"scope": "",
"client_id": "",
"client_secret": "",
},
},
"test_03_GetMe_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"}
},
"test_04_ListUsers_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"}
},
"test_05_ListUsers_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"}
},
"test_06_GetAdmins_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"}
},
"test_07_GetAdmins_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"}
},
"test_08_CreateUser_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {"username": "test08", "isadmin": True, "password": "test08pass"},
},
"test_09_CreateUser_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {"username": "test09", "isadmin": True, "password": "test09pass"},
},
"test_10_CreateUser_400": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {"username": "test08", "isadmin": True, "password": "test08pass"},
},
"test_11_GetUser_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "test08",
},
"test_12_GetUser_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "test08",
},
"test_13_GetUser_404": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "404user",
},
"test_14_UpdateUser_success": {
"username": "test08",
"set_admin": "false",
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
"test_15_UpdateUser_401": {
"username": "test08",
"set_admin": "false",
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
"test_16_UpdateUser_404": {
"username": "404user",
"set_admin": "false",
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
"test_17_Delete_success": {
"username": "test08",
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
"test_18_Delete_401": {
"username": "test08",
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
"test_19_Delete_404": {
"username": "404user",
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
}
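The `Authorization: Bearer %s` headers in these fixtures are templates as well: a test substitutes the access token obtained in test_01 before sending each request. A sketch of that substitution, using a fake token in place of the real JWT returned by `/token`:

```python
# Sketch: fill the bearer-token placeholder in a fixture header template.
# "faketoken123" stands in for the JWT returned by the /token endpoint.
headers = {"accept": "application/json", "Authorization": "Bearer %s"}
token = "faketoken123"

# Build the headers actually sent, leaving the fixture template untouched.
sent = dict(headers, Authorization=headers["Authorization"] % token)
print(sent["Authorization"])
# → Bearer faketoken123
```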
def OUTPUTS_USERS():
"""
summary: return expected test outputs for the users testcase dict
arguments: None
return: Output dict
"""
return {
"test_01_get_access_token_success": {"status_code": 200},
"test_02_get_access_token_401": {
"status_code": 401,
"json": {"detail": "Incorrect username or password"},
},
"test_03_GetMe_success": {
"status_code": 200,
"json": {"username": "admin", "isadmin": True, "configs": []},
},
"test_04_ListUsers_success": {
"status_code": 200,
"json": {
"Users": [
{"username": "admin", "isadmin": True, "configs": []},
{"username": "user2", "isadmin": False, "configs": []},
]
},
},
"test_05_ListUsers_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_06_GetAdmins_success": {
"status_code": 200,
"json": {
"Admins": [
{"username": "admin", "isadmin": True, "configs": []},
]
},
},
"test_07_GetAdmins_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_08_CreateUser_success": {
"status_code": 200,
"json": {"created": {"configs": [], "isadmin": True, "username": "test08"}},
},
"test_09_CreateUser_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_10_CreateUser_400": {
"status_code": 400,
"json": {"detail": "username already exists"},
},
"test_11_GetUser_success": {
"status_code": 200,
"json": {"username": "test08", "isadmin": True, "configs": []},
},
"test_12_GetUser_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_13_GetUser_404": {
"status_code": 404,
"json": {"detail": "The user doesn't exists"},
},
"test_14_UpdateUser_success": {
"status_code": 200,
"json": {"Updated": {"username": "test08", "isadmin": False}},
},
"test_15_UpdateUser_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_16_UpdateUser_404": {
"status_code": 404,
"json": {"detail": "The user doesn't exists"},
},
"test_17_Delete_success": {
"status_code": 200,
"json": {"Deleted": {"username": "test08"}},
},
"test_18_Delete_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_19_Delete_404": {
"status_code": 404,
"json": {"detail": "The user doesn't exists"},
},
}
def INPUTS_CONFIGS():
"""
summary: return test inputs for the configs testcase dict
arguments: None
return: Inputs dict
"""
return {
"test_00_create_access_token_for_admin_and_user2": {
"headers": {
"accept": "application/json",
"Content-Type": "application/x-www-form-urlencoded",
},
"admin": {
"grant_type": "",
"username": "admin",
"password": "admin",
"scope": "",
"client_id": "",
"client_secret": "",
},
"user2": {
"grant_type": "",
"username": "user2",
"password": "user2pass",
"scope": "",
"client_id": "",
"client_secret": "",
},
},
"test_01_List_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
},
"test_02_List_success_owner": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"owner": "user2",
},
"test_03_List_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"owner": "notme",
},
"test_04_Create_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {
"owner": "user2",
"name": "api-3",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "everything is running.",
},
},
"test_05_Create_400": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {
"owner": "user2",
"name": "api-3",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "everything is running.",
},
},
"test_06_Get_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "api-1",
},
"test_07_Get_success_owner": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "api-1",
"owner": "admin",
},
"test_08_Get_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "api-1",
"owner": "notme",
},
"test_09_Get_404": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "404config",
"owner": "user2",
},
"test_10_Update_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "true",
},
"enabled": "false",
"running": "true",
},
"note": "every thing has disabled.",
"name": "api-3",
"owner": "user2",
},
"name": "api-3",
},
"test_11_Update_success_owner": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "false",
},
"enabled": "false",
"running": "false",
},
"note": "every thing has disabeld and nothing is running.",
},
"owner": "user2",
"name": "api-3",
},
"test_12_Update_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "true",
},
"enabled": "false",
"running": "true",
},
"note": "every thing has disabled.",
},
"name": "api-3",
"owner": "user2",
},
"test_13_Update_404": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"json": {
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "true",
},
"enabled": "false",
"running": "true",
},
"note": "every thing has disabled.",
},
"name": "404config",
"owner": "admin",
},
"test_14_Delete_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "api-3",
},
"test_15_Delete_success_owner": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "api-2",
"owner": "user2",
},
"test_16_Delete_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "doesnotmatter",
"owner": "notme",
},
"test_17_Delete_404": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"name": "404config",
"owner": "404user",
},
"test_18_Query_success": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"key": "database.enabled",
"value": "true",
"all": "false",
},
"test_19_Query_success_all": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"key": "running",
"value": "true",
"all": "true",
},
"test_20_Query_success_owner": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"key": "database.running",
"value": "true",
"all": "false",
"owner": "admin",
},
"test_21_Query_401": {
"headers": {"accept": "application/json", "Authorization": "Bearer %s"},
"key": "database.running",
"value": "true",
"all": "true",
},
}
def OUTPUTS_CONFIGS():
"""
summary: return expected test outputs for the configs testcase dict
arguments: None
return: Output dict
"""
return {
"test_00_create_access_token_for_admin_and_user2": {
"status_code": 200,
},
"test_01_List_success": {
"status_code": 200,
"json": {
"Configs": [
{
"owner": "admin",
"name": "api-1",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
},
{
"owner": "user2",
"name": "api-2",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "false",
},
"enabled": "true",
"running": "false",
},
"note": "The api has been enabled without the DB!",
},
]
},
},
"test_02_List_success_owner": {
"status_code": 200,
"json": {
"Configs": [
{
"owner": "user2",
"name": "api-2",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "false",
},
"enabled": "true",
"running": "false",
},
"note": "The api has been enabled without the DB!",
}
]
},
},
"test_03_List_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_04_Create_success": {
"status_code": 200,
"json": {
"Created": {
"owner": "user2",
"name": "api-3",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "everything is running.",
}
},
},
"test_05_Create_400": {
"status_code": 400,
"json": {"detail": "name already exists"},
},
"test_06_Get_success": {
"status_code": 200,
"json": {
"owner": "admin",
"name": "api-1",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
},
},
"test_07_Get_success_owner": {
"status_code": 200,
"json": {
"owner": "admin",
"name": "api-1",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
},
},
"test_08_Get_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_09_Get_404": {
"status_code": 404,
"json": {"detail": "name doesn't exists"},
},
"test_10_Update_success": {
"status_code": 200,
"json": {
"Update": {
"owner": "user2",
"name": "api-3",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "true",
},
"enabled": "false",
"running": "true",
},
"note": "every thing has disabled.",
}
},
},
"test_11_Update_success_owner": {
"status_code": 200,
"json": {
"Update": {
"owner": "user2",
"name": "api-3",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "false",
},
"enabled": "false",
"running": "false",
},
"note": "every thing has disabeld and nothing is running.",
}
},
},
"test_12_Update_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_13_Update_404": {
"status_code": 404,
"json": {"detail": "name doesn't exists"},
},
"test_14_Delete_success": {
"status_code": 200,
"json": {"Delete": {"owner": "user2", "name": "api-3"}},
},
"test_15_Delete_success_owner": {
"status_code": 200,
"json": {"Delete": {"owner": "user2", "name": "api-2"}},
},
"test_16_Delete_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
"test_17_Delete_404": {
"status_code": 404,
"json": {"detail": "name doesn't exists"},
},
"test_18_Query_success": {
"status_code": 200,
"json": {
"Configs": [
{
"owner": "admin",
"name": "api-1",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
}
]
},
},
"test_19_Query_success_all": {
"status_code": 200,
"json": {
"Configs": [
{
"owner": "admin",
"name": "api-1",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
}
]
},
},
"test_20_Query_success_owner": {
"status_code": 200,
"json": {
"Configs": [
{
"owner": "admin",
"name": "api-1",
"metadata": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
}
]
},
},
"test_21_Query_401": {
"status_code": 401,
"json": {"detail": "Only admins can perform this function"},
},
}
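Each expected-output entry pairs a `status_code` with the exact `json` body, so a testcase checks both fields against the live response. A sketch of that assertion pattern with a hand-built stand-in response object (the real tests would use an HTTP client response):

```python
# Sketch: the two-field check a testcase runs against an entry from
# OUTPUTS_CONFIGS(). StubResponse is a hand-built stand-in, not a real
# HTTP response.
class StubResponse:
    def __init__(self, status_code, payload):
        self.status_code = status_code
        self._payload = payload

    def json(self):
        return self._payload

expected = {"status_code": 404, "json": {"detail": "name doesn't exists"}}
resp = StubResponse(404, {"detail": "name doesn't exists"})

assert resp.status_code == expected["status_code"]
assert resp.json() == expected["json"]
print("match")
```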
def MOCK_USERS():
"""
Mock db for users tests
"""
return {
"admin": models.User(
**{"username": "admin", "password": "admin", "isadmin": True, "configs": []}
),
"user2": models.User(
**{
"username": "user2",
"password": "user2pass",
"isadmin": False,
"configs": [],
}
),
"test08": models.User(
**{
"username": "test08",
"password": "test08pass",
"isadmin": True,
"configs": [],
}
),
}
def MOCK_CONFIGS():
"""
Mock db for configs tests
"""
return {
"admin": models.User(
**{"username": "admin", "password": "admin", "isadmin": True, "configs": []}
),
"user2": models.User(
**{
"username": "user2",
"password": "user2pass",
"isadmin": False,
"configs": [],
}
),
"api-1": models.Config(
**{
"owner": "admin",
"name": "api-1",
"metadatac": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "The api has been enabled.",
}
),
"api-2": models.Config(
**{
"owner": "user2",
"name": "api-2",
"metadatac": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "false",
},
"enabled": "true",
"running": "false",
},
"note": "The api has been enabled without the DB!",
}
),
"api-3": models.Config(
**{
"owner": "user2",
"name": "api-3",
"metadatac": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "true",
"running": "true",
},
"enabled": "true",
"running": "true",
},
"note": "everything is running.",
}
),
"api-32": models.Config(
**{
"owner": "user2",
"name": "api-3",
"metadatac": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "true",
},
"enabled": "false",
"running": "true",
},
"note": "every thing has disabled.",
}
),
"api-33": models.Config(
**{
"owner": "user2",
"name": "api-3",
"metadatac": {
"name": "SimpleAPI",
"url": "http://127.0.0.1:5057",
"database": {
"name": "apidb",
"type": "sql",
"ms": "postgresql",
"host": "0.0.0.0",
"port": "5432",
"enabled": "false",
"running": "false",
},
"enabled": "false",
"running": "false",
},
"note": "every thing has disabeld and nothing is running.",
}
),
}
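Both mock factories return a freshly built dict on every call, so each test gets an unshared copy of the store. A sketch of that isolation property, using `SimpleNamespace` as a stand-in for `app.models.User` (which is not available here):

```python
# Sketch: calling a mock-db factory twice yields independent stores, so
# one test's mutations cannot leak into another. FakeUser stands in for
# app.models.User.
from types import SimpleNamespace as FakeUser

def mock_users():
    return {"admin": FakeUser(username="admin", isadmin=True, configs=[])}

db1, db2 = mock_users(), mock_users()
db1["admin"].isadmin = False          # mutate one store only
print(db1["admin"].isadmin, db2["admin"].isadmin)
# → False True
```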
# === xu/compa/Parapluie/src/Adapter/__init__.py (sonnts996/XuCompa-Request, Apache-2.0) ===
from xu.compa.Parapluie.src.Adapter.PGridAdapter import PGridAdapter
from xu.compa.Parapluie.src.Adapter.PListAdapter import PListAdapter
# === sib_api_v3_sdk/api/sms_campaigns_api.py (alemangui/APIv3-python-library, MIT) ===
# coding: utf-8
"""
SendinBlue API
SendinBlue provide a RESTFul API that can be used with any languages. With this API, you will be able to : - Manage your campaigns and get the statistics - Manage your contacts - Send transactional Emails and SMS - and much more... You can download our wrappers at https://github.com/orgs/sendinblue **Possible responses** | Code | Message | | :-------------: | ------------- | | 200 | OK. Successful Request | | 201 | OK. Successful Creation | | 202 | OK. Request accepted | | 204 | OK. Successful Update/Deletion | | 400 | Error. Bad Request | | 401 | Error. Authentication Needed | | 402 | Error. Not enough credit, plan upgrade needed | | 403 | Error. Permission denied | | 404 | Error. Object does not exist | | 405 | Error. Method not allowed | | 406 | Error. Not Acceptable | # noqa: E501
OpenAPI spec version: 3.0.0
Contact: contact@sendinblue.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from sib_api_v3_sdk.api_client import ApiClient
class SMSCampaignsApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_sms_campaign(self, create_sms_campaign, **kwargs): # noqa: E501
"""Creates an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_sms_campaign(create_sms_campaign, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CreateSmsCampaign create_sms_campaign: Values to create an SMS Campaign (required)
:return: CreateModel
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_sms_campaign_with_http_info(create_sms_campaign, **kwargs) # noqa: E501
else:
(data) = self.create_sms_campaign_with_http_info(create_sms_campaign, **kwargs) # noqa: E501
return data
def create_sms_campaign_with_http_info(self, create_sms_campaign, **kwargs): # noqa: E501
"""Creates an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_sms_campaign_with_http_info(create_sms_campaign, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CreateSmsCampaign create_sms_campaign: Values to create an SMS Campaign (required)
:return: CreateModel
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['create_sms_campaign'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_sms_campaign" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'create_sms_campaign' is set
if ('create_sms_campaign' not in params or
params['create_sms_campaign'] is None):
raise ValueError("Missing the required parameter `create_sms_campaign` when calling `create_sms_campaign`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'create_sms_campaign' in params:
body_params = params['create_sms_campaign']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CreateModel', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
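The loop over `params['kwargs']` above rejects any keyword that is not listed in `all_params`. The same pattern in isolation, as a standalone sketch rather than the generated class:

```python
# Standalone sketch of the swagger-codegen kwargs check: every keyword
# argument must appear in all_params, otherwise a TypeError is raised.
def check_kwargs(all_params, **kwargs):
    params = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
        params[key] = val
    return params

ok = check_kwargs(["async_req", "_request_timeout"], async_req=True)
try:
    check_kwargs(["async_req"], retries=3)   # 'retries' is not accepted
except TypeError as exc:
    print(exc)
```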
def delete_sms_campaign(self, campaign_id, **kwargs): # noqa: E501
"""Delete an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_sms_campaign(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the SMS campaign (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_sms_campaign_with_http_info(campaign_id, **kwargs) # noqa: E501
else:
(data) = self.delete_sms_campaign_with_http_info(campaign_id, **kwargs) # noqa: E501
return data
def delete_sms_campaign_with_http_info(self, campaign_id, **kwargs): # noqa: E501
"""Delete an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_sms_campaign_with_http_info(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the SMS campaign (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_sms_campaign" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `delete_sms_campaign`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_sms_campaign(self, campaign_id, **kwargs): # noqa: E501
"""Get an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sms_campaign(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the SMS campaign (required)
:return: GetSmsCampaign
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sms_campaign_with_http_info(campaign_id, **kwargs) # noqa: E501
else:
(data) = self.get_sms_campaign_with_http_info(campaign_id, **kwargs) # noqa: E501
return data
def get_sms_campaign_with_http_info(self, campaign_id, **kwargs): # noqa: E501
"""Get an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sms_campaign_with_http_info(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the SMS campaign (required)
:return: GetSmsCampaign
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sms_campaign" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `get_sms_campaign`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetSmsCampaign', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_sms_campaigns(self, **kwargs): # noqa: E501
"""Returns the information for all your created SMS campaigns # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sms_campaigns(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str status: Status of campaign.
:param datetime start_date: Mandatory if endDate is used. Starting (urlencoded) UTC date-time (YYYY-MM-DDTHH:mm:ss.SSSZ) used to filter the sent SMS campaigns. Pass your timezone in the date-time for accurate results (only available if 'status' is not passed or is set to 'sent')
:param datetime end_date: Mandatory if startDate is used. Ending (urlencoded) UTC date-time (YYYY-MM-DDTHH:mm:ss.SSSZ) used to filter the sent SMS campaigns. Pass your timezone in the date-time for accurate results (only available if 'status' is not passed or is set to 'sent')
:param int limit: Maximum number of results returned (must be lower than or equal to 1000)
:param int offset: Beginning point in the list to retrieve from.
:param str sort: Sort the results in the ascending/descending order of record creation. Default order is **descending** if `sort` is not passed
:return: GetSmsCampaigns
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sms_campaigns_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_sms_campaigns_with_http_info(**kwargs) # noqa: E501
return data
def get_sms_campaigns_with_http_info(self, **kwargs): # noqa: E501
"""Returns the information for all your created SMS campaigns # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sms_campaigns_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str status: Status of campaign.
:param datetime start_date: Mandatory if endDate is used. Starting (urlencoded) UTC date-time (YYYY-MM-DDTHH:mm:ss.SSSZ) used to filter the sent SMS campaigns. Pass your timezone in the date-time for accurate results (only available if 'status' is not passed or is set to 'sent')
:param datetime end_date: Mandatory if startDate is used. Ending (urlencoded) UTC date-time (YYYY-MM-DDTHH:mm:ss.SSSZ) used to filter the sent SMS campaigns. Pass your timezone in the date-time for accurate results (only available if 'status' is not passed or is set to 'sent')
:param int limit: Maximum number of results returned (must be lower than or equal to 1000)
:param int offset: Beginning point in the list to retrieve from.
:param str sort: Sort the results in the ascending/descending order of record creation. Default order is **descending** if `sort` is not passed
:return: GetSmsCampaigns
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['status', 'start_date', 'end_date', 'limit', 'offset', 'sort'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sms_campaigns" % key
)
params[key] = val
del params['kwargs']
if 'limit' in params and params['limit'] > 1000: # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `get_sms_campaigns`, must be a value less than or equal to `1000`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'status' in params:
query_params.append(('status', params['status'])) # noqa: E501
if 'start_date' in params:
query_params.append(('startDate', params['start_date'])) # noqa: E501
if 'end_date' in params:
query_params.append(('endDate', params['end_date'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'sort' in params:
query_params.append(('sort', params['sort'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetSmsCampaigns', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def request_sms_recipient_export(self, campaign_id, **kwargs): # noqa: E501
"""Export an SMS campaign's recipients # noqa: E501
Returns the ID of a background process which, on completion, calls the notify URL that you have set in the input. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.request_sms_recipient_export(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:param RequestSmsRecipientExport recipient_export: Values to send for a recipient export request
:return: CreatedProcessId
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.request_sms_recipient_export_with_http_info(campaign_id, **kwargs) # noqa: E501
else:
(data) = self.request_sms_recipient_export_with_http_info(campaign_id, **kwargs) # noqa: E501
return data
def request_sms_recipient_export_with_http_info(self, campaign_id, **kwargs): # noqa: E501
"""Export an SMS campaign's recipients # noqa: E501
Returns the ID of a background process which, on completion, calls the notify URL that you have set in the input. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.request_sms_recipient_export_with_http_info(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:param RequestSmsRecipientExport recipient_export: Values to send for a recipient export request
:return: CreatedProcessId
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id', 'recipient_export'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method request_sms_recipient_export" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `request_sms_recipient_export`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'recipient_export' in params:
body_params = params['recipient_export']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}/exportRecipients', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CreatedProcessId', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def send_sms_campaign_now(self, campaign_id, **kwargs): # noqa: E501
"""Send your SMS campaign immediately # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.send_sms_campaign_now(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.send_sms_campaign_now_with_http_info(campaign_id, **kwargs) # noqa: E501
else:
(data) = self.send_sms_campaign_now_with_http_info(campaign_id, **kwargs) # noqa: E501
return data
def send_sms_campaign_now_with_http_info(self, campaign_id, **kwargs): # noqa: E501
"""Send your SMS campaign immediately # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.send_sms_campaign_now_with_http_info(campaign_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method send_sms_campaign_now" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `send_sms_campaign_now`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}/sendNow', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def send_sms_report(self, campaign_id, send_report, **kwargs): # noqa: E501
"""Send an SMS campaign's report # noqa: E501
Send the report of a Sent or Archived campaign to the specified email addresses, with the campaign data and a detailed PDF attachment. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.send_sms_report(campaign_id, send_report, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:param SendReport send_report: Values to send for the report (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.send_sms_report_with_http_info(campaign_id, send_report, **kwargs) # noqa: E501
else:
(data) = self.send_sms_report_with_http_info(campaign_id, send_report, **kwargs) # noqa: E501
return data
def send_sms_report_with_http_info(self, campaign_id, send_report, **kwargs): # noqa: E501
"""Send an SMS campaign's report # noqa: E501
Send the report of a Sent or Archived campaign to the specified email addresses, with the campaign data and a detailed PDF attachment. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.send_sms_report_with_http_info(campaign_id, send_report, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:param SendReport send_report: Values to send for the report (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id', 'send_report'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method send_sms_report" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `send_sms_report`") # noqa: E501
# verify the required parameter 'send_report' is set
if ('send_report' not in params or
params['send_report'] is None):
raise ValueError("Missing the required parameter `send_report` when calling `send_sms_report`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'send_report' in params:
body_params = params['send_report']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}/sendReport', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def send_test_sms(self, campaign_id, phone_number, **kwargs): # noqa: E501
"""Send a test SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.send_test_sms(campaign_id, phone_number, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: Id of the SMS campaign (required)
:param SendTestSms phone_number: Mobile number of the recipient with the country code. This number must belong to one of the contacts in your SendinBlue account and must not be blacklisted (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.send_test_sms_with_http_info(campaign_id, phone_number, **kwargs) # noqa: E501
else:
(data) = self.send_test_sms_with_http_info(campaign_id, phone_number, **kwargs) # noqa: E501
return data
def send_test_sms_with_http_info(self, campaign_id, phone_number, **kwargs): # noqa: E501
"""Send a test SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.send_test_sms_with_http_info(campaign_id, phone_number, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: Id of the SMS campaign (required)
:param SendTestSms phone_number: Mobile number of the recipient with the country code. This number must belong to one of the contacts in your SendinBlue account and must not be blacklisted (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id', 'phone_number'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method send_test_sms" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `send_test_sms`") # noqa: E501
# verify the required parameter 'phone_number' is set
if ('phone_number' not in params or
params['phone_number'] is None):
raise ValueError("Missing the required parameter `phone_number` when calling `send_test_sms`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'phone_number' in params:
body_params = params['phone_number']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}/sendTest', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_sms_campaign(self, campaign_id, update_sms_campaign, **kwargs): # noqa: E501
"""Update an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_sms_campaign(campaign_id, update_sms_campaign, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the SMS campaign (required)
:param UpdateSmsCampaign update_sms_campaign: Values to update an SMS Campaign (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_sms_campaign_with_http_info(campaign_id, update_sms_campaign, **kwargs) # noqa: E501
else:
(data) = self.update_sms_campaign_with_http_info(campaign_id, update_sms_campaign, **kwargs) # noqa: E501
return data
def update_sms_campaign_with_http_info(self, campaign_id, update_sms_campaign, **kwargs): # noqa: E501
"""Update an SMS campaign # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_sms_campaign_with_http_info(campaign_id, update_sms_campaign, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the SMS campaign (required)
:param UpdateSmsCampaign update_sms_campaign: Values to update an SMS Campaign (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id', 'update_sms_campaign'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_sms_campaign" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `update_sms_campaign`") # noqa: E501
# verify the required parameter 'update_sms_campaign' is set
if ('update_sms_campaign' not in params or
params['update_sms_campaign'] is None):
raise ValueError("Missing the required parameter `update_sms_campaign` when calling `update_sms_campaign`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'update_sms_campaign' in params:
body_params = params['update_sms_campaign']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_sms_campaign_status(self, campaign_id, status, **kwargs): # noqa: E501
"""Update a campaign's status # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_sms_campaign_status(campaign_id, status, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:param UpdateCampaignStatus status: Status of the campaign. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_sms_campaign_status_with_http_info(campaign_id, status, **kwargs) # noqa: E501
else:
(data) = self.update_sms_campaign_status_with_http_info(campaign_id, status, **kwargs) # noqa: E501
return data
def update_sms_campaign_status_with_http_info(self, campaign_id, status, **kwargs): # noqa: E501
"""Update a campaign's status # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_sms_campaign_status_with_http_info(campaign_id, status, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int campaign_id: id of the campaign (required)
:param UpdateCampaignStatus status: Status of the campaign. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['campaign_id', 'status'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_sms_campaign_status" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'campaign_id' is set
if ('campaign_id' not in params or
params['campaign_id'] is None):
raise ValueError("Missing the required parameter `campaign_id` when calling `update_sms_campaign_status`") # noqa: E501
# verify the required parameter 'status' is set
if ('status' not in params or
params['status'] is None):
raise ValueError("Missing the required parameter `status` when calling `update_sms_campaign_status`") # noqa: E501
collection_formats = {}
path_params = {}
if 'campaign_id' in params:
path_params['campaignId'] = params['campaign_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'status' in params:
body_params = params['status']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api-key', 'partner-key'] # noqa: E501
return self.api_client.call_api(
'/smsCampaigns/{campaignId}/status', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
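# Every *_with_http_info method above repeats the same keyword-argument
# validation: build an `all_params` whitelist, iterate over `kwargs`, and
# raise TypeError on unknown keys before checking the required parameters.
# The sketch below illustrates that pattern in isolation; `validate_kwargs`
# is a hypothetical helper, not part of the generated client.

```python
def validate_kwargs(method_name, allowed, **kwargs):
    """Reject unknown keyword arguments, mirroring the check each
    generated *_with_http_info method performs before building the
    HTTP request."""
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
    return dict(kwargs)


# Accepted: 'limit' is in the whitelist for this method.
params = validate_kwargs(
    "get_sms_campaigns", ["status", "limit", "offset"], limit=50
)
print(params)  # {'limit': 50}
```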
import json
from model_bakery import baker
from datetime import date, timedelta
from unittest.mock import patch, PropertyMock
from django.conf import settings
from django.contrib import messages
from django.contrib.messages import get_messages
from django.core import mail
from django.core.handlers.wsgi import WSGIRequest
from django.http import HttpResponse
from django.test import TestCase
from django.urls import reverse
from .utils import assertMessage
from ..models import Sponsorship, Contract, SponsorshipBenefit, SponsorBenefit
from ..forms import SponsorshipReviewAdminForm, SponsorshipsListForm, SignedSponsorshipReviewAdminForm
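The `assertMessage` helper imported from the tests' local `utils` module is used throughout to check both the rendered text and the level of a Django message. A minimal stand-in illustrating the expected contract (an assumption, not the project's actual helper):

```python
# Django's messages framework assigns numeric levels; SUCCESS is 25.
SUCCESS = 25


class FakeMessage:
    """Minimal stand-in for django.contrib.messages' Message."""

    def __init__(self, level, message):
        self.level = level
        self.message = message

    def __str__(self):
        return self.message


def assertMessage(msg, expected_text, expected_level):
    # Compare both the rendered text and the numeric level of the message.
    assert str(msg) == expected_text, str(msg)
    assert msg.level == expected_level, msg.level


assertMessage(FakeMessage(SUCCESS, "Sponsorship was approved!"),
              "Sponsorship was approved!", SUCCESS)
```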
class RollbackSponsorshipToEditingAdminViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.sponsorship = baker.make(
Sponsorship,
status=Sponsorship.APPROVED,
submited_by=self.user,
_fill_optional=True,
)
self.url = reverse(
"admin:sponsors_sponsorship_rollback_to_edit", args=[self.sponsorship.pk]
)
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(
response, "sponsors/admin/rollback_sponsorship_to_editing.html"
)
self.assertEqual(context["sponsorship"], self.sponsorship)
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPLIED
) # did not update
def test_rollback_sponsorship_to_applied_on_post(self):
data = {"confirm": "yes"}
response = self.client.post(self.url, data=data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.APPLIED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Sponsorship is now editable!", messages.SUCCESS)
def test_do_not_rollback_if_invalid_post(self):
response = self.client.post(self.url, data={})
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(
response, "sponsors/admin/rollback_sponsorship_to_editing.html"
)
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPLIED
) # did not update
response = self.client.post(self.url, data={"confirm": "invalid"})
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(
response, "sponsors/admin/rollback_sponsorship_to_editing.html"
)
self.assertNotEqual(self.sponsorship.status, Sponsorship.APPLIED)
def test_404_if_sponsorship_does_not_exist(self):
self.sponsorship.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
def test_message_user_if_rejecting_invalid_sponsorship(self):
self.sponsorship.status = Sponsorship.FINALIZED
self.sponsorship.save()
data = {"confirm": "yes"}
response = self.client.post(self.url, data=data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.FINALIZED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(
msg, "Can't rollback to edit a Finalized sponsorship.", messages.ERROR
)
class RejectedSponsorshipAdminViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.sponsorship = baker.make(
Sponsorship,
status=Sponsorship.APPLIED,
submited_by=self.user,
_fill_optional=True,
)
self.url = reverse(
"admin:sponsors_sponsorship_reject", args=[self.sponsorship.pk]
)
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/reject_application.html")
self.assertEqual(context["sponsorship"], self.sponsorship)
self.assertNotEqual(
self.sponsorship.status, Sponsorship.REJECTED
) # did not update
def test_reject_sponsorship_on_post(self):
data = {"confirm": "yes"}
response = self.client.post(self.url, data=data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertTrue(mail.outbox)
self.assertEqual(self.sponsorship.status, Sponsorship.REJECTED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Sponsorship was rejected!", messages.SUCCESS)
def test_do_not_reject_if_invalid_post(self):
response = self.client.post(self.url, data={})
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/reject_application.html")
self.assertNotEqual(
self.sponsorship.status, Sponsorship.REJECTED
) # did not update
response = self.client.post(self.url, data={"confirm": "invalid"})
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/reject_application.html")
self.assertNotEqual(self.sponsorship.status, Sponsorship.REJECTED)
def test_404_if_sponsorship_does_not_exist(self):
self.sponsorship.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
def test_message_user_if_rejecting_invalid_sponsorship(self):
self.sponsorship.status = Sponsorship.FINALIZED
self.sponsorship.save()
data = {"confirm": "yes"}
response = self.client.post(self.url, data=data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.FINALIZED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Can't reject a Finalized sponsorship.", messages.ERROR)
class ApproveSponsorshipAdminViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.sponsorship = baker.make(
Sponsorship, status=Sponsorship.APPLIED, _fill_optional=True
)
self.url = reverse(
"admin:sponsors_sponsorship_approve", args=[self.sponsorship.pk]
)
today = date.today()
self.package = baker.make("sponsors.SponsorshipPackage")
self.data = {
"confirm": "yes",
"start_date": today,
"end_date": today + timedelta(days=100),
"package": self.package.pk,
"sponsorship_fee": 500,
}
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
form = context["form"]
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertEqual(context["sponsorship"], self.sponsorship)
self.assertIsInstance(form, SponsorshipReviewAdminForm)
self.assertEqual(form.initial["package"], self.sponsorship.package)
self.assertEqual(form.initial["start_date"], self.sponsorship.start_date)
self.assertEqual(form.initial["end_date"], self.sponsorship.end_date)
self.assertEqual(
form.initial["sponsorship_fee"], self.sponsorship.sponsorship_fee
)
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPROVED
) # did not update
def test_approve_sponsorship_on_post(self):
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.APPROVED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Sponsorship was approved!", messages.SUCCESS)
def test_do_not_approve_if_no_confirmation_in_the_post(self):
self.data.pop("confirm")
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPROVED
) # did not update
self.data["confirm"] = "invalid"
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertNotEqual(self.sponsorship.status, Sponsorship.APPROVED)
def test_do_not_approve_if_form_with_invalid_data(self):
self.data = {"confirm": "yes"}
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPROVED
) # did not update
self.assertTrue(response.context["form"].errors)
def test_404_if_sponsorship_does_not_exist(self):
self.sponsorship.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
def test_message_user_if_approving_invalid_sponsorship(self):
self.sponsorship.status = Sponsorship.FINALIZED
self.sponsorship.save()
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.FINALIZED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Can't approve a Finalized sponsorship.", messages.ERROR)
class ApproveSignedSponsorshipAdminViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.sponsorship = baker.make(
Sponsorship, status=Sponsorship.APPLIED, _fill_optional=True
)
self.url = reverse(
"admin:sponsors_sponsorship_approve_existing_contract", args=[self.sponsorship.pk]
)
today = date.today()
self.package = baker.make("sponsors.SponsorshipPackage")
self.data = {
"confirm": "yes",
"start_date": today,
"end_date": today + timedelta(days=100),
"package": self.package.pk,
"sponsorship_fee": 500,
"signed_contract": io.BytesIO(b"Signed contract")
}
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
form = context["form"]
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertEqual(context["sponsorship"], self.sponsorship)
self.assertIsInstance(form, SignedSponsorshipReviewAdminForm)
self.assertEqual(form.initial["package"], self.sponsorship.package)
self.assertEqual(form.initial["start_date"], self.sponsorship.start_date)
self.assertEqual(form.initial["end_date"], self.sponsorship.end_date)
self.assertEqual(
form.initial["sponsorship_fee"], self.sponsorship.sponsorship_fee
)
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPROVED
) # did not update
def test_approve_sponsorship_and_execute_contract_on_post(self):
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
contract = self.sponsorship.contract
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.FINALIZED)
self.assertEqual(contract.status, Contract.EXECUTED)
self.assertEqual(contract.signed_document.read(), b"Signed contract")
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Signed sponsorship was approved!", messages.SUCCESS)
def test_do_not_approve_if_no_confirmation_in_the_post(self):
self.data.pop("confirm")
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPROVED
) # did not update
self.data["confirm"] = "invalid"
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertNotEqual(self.sponsorship.status, Sponsorship.APPROVED)
def test_do_not_approve_if_form_with_invalid_data(self):
self.data = {"confirm": "yes"}
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/approve_application.html")
self.assertNotEqual(
self.sponsorship.status, Sponsorship.APPROVED
) # did not update
self.assertTrue(response.context["form"].errors)
def test_404_if_sponsorship_does_not_exist(self):
self.sponsorship.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
def test_message_user_if_approving_invalid_sponsorship(self):
self.sponsorship.status = Sponsorship.FINALIZED
self.sponsorship.save()
response = self.client.post(self.url, data=self.data)
self.sponsorship.refresh_from_db()
expected_url = reverse(
"admin:sponsors_sponsorship_change", args=[self.sponsorship.pk]
)
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.sponsorship.status, Sponsorship.FINALIZED)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Can't approve a Finalized sponsorship.", messages.ERROR)
class SendContractViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.contract = baker.make_recipe("sponsors.tests.empty_contract")
self.url = reverse(
"admin:sponsors_contract_send", args=[self.contract.pk]
)
self.data = {
"confirm": "yes",
}
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
self.assertTemplateUsed(response, "sponsors/admin/send_contract.html")
self.assertEqual(context["contract"], self.contract)
@patch.object(
Sponsorship, "verified_emails", PropertyMock(return_value=["email@email.com"])
)
def test_approve_sponsorship_on_post(self):
response = self.client.post(self.url, data=self.data)
expected_url = reverse(
"admin:sponsors_contract_change", args=[self.contract.pk]
)
self.contract.refresh_from_db()
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertTrue(self.contract.document.name)
self.assertEqual(1, len(mail.outbox))
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "Contract was sent!", messages.SUCCESS)
@patch.object(
Sponsorship, "verified_emails", PropertyMock(return_value=["email@email.com"])
)
def test_display_error_message_to_user_if_invalid_status(self):
self.contract.status = Contract.AWAITING_SIGNATURE
self.contract.save()
expected_url = reverse(
"admin:sponsors_contract_change", args=[self.contract.pk]
)
response = self.client.post(self.url, data=self.data)
self.contract.refresh_from_db()
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(0, len(mail.outbox))
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(
msg,
"Contract with status Awaiting Signature can't be sent.",
messages.ERROR,
)
def test_do_not_send_if_no_confirmation_in_the_post(self):
self.data.pop("confirm")
response = self.client.post(self.url, data=self.data)
self.contract.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/send_contract.html")
self.assertFalse(self.contract.document.name)
self.data["confirm"] = "invalid"
response = self.client.post(self.url, data=self.data)
self.assertTemplateUsed(response, "sponsors/admin/send_contract.html")
self.assertFalse(self.contract.document.name)
self.assertEqual(0, len(mail.outbox))
def test_404_if_contract_does_not_exist(self):
self.contract.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
class ExecuteContractViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.contract = baker.make_recipe("sponsors.tests.empty_contract", status=Contract.AWAITING_SIGNATURE)
self.url = reverse(
"admin:sponsors_contract_execute", args=[self.contract.pk]
)
self.data = {
"confirm": "yes",
}
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
self.assertTemplateUsed(response, "sponsors/admin/execute_contract.html")
self.assertEqual(context["contract"], self.contract)
def test_execute_sponsorship_on_post(self):
response = self.client.post(self.url, data=self.data)
expected_url = reverse(
"admin:sponsors_contract_change", args=[self.contract.pk]
)
self.contract.refresh_from_db()
msg = list(get_messages(response.wsgi_request))[0]
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.contract.status, Contract.EXECUTED)
assertMessage(msg, "Contract was executed!", messages.SUCCESS)
def test_display_error_message_to_user_if_invalid_status(self):
self.contract.status = Contract.DRAFT
self.contract.save()
expected_url = reverse(
"admin:sponsors_contract_change", args=[self.contract.pk]
)
response = self.client.post(self.url, data=self.data)
self.contract.refresh_from_db()
msg = list(get_messages(response.wsgi_request))[0]
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.contract.status, Contract.DRAFT)
assertMessage(
msg,
"Contract with status Draft can't be executed.",
messages.ERROR,
)
def test_do_not_execute_contract_if_no_confirmation_in_the_post(self):
self.data.pop("confirm")
response = self.client.post(self.url, data=self.data)
self.contract.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/execute_contract.html")
self.assertEqual(self.contract.status, Contract.AWAITING_SIGNATURE)
self.data["confirm"] = "invalid"
response = self.client.post(self.url, data=self.data)
self.assertTemplateUsed(response, "sponsors/admin/execute_contract.html")
self.contract.refresh_from_db()
self.assertEqual(self.contract.status, Contract.AWAITING_SIGNATURE)
def test_404_if_contract_does_not_exist(self):
self.contract.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
class NullifyContractViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.contract = baker.make_recipe("sponsors.tests.empty_contract", status=Contract.AWAITING_SIGNATURE)
self.url = reverse(
"admin:sponsors_contract_nullify", args=[self.contract.pk]
)
self.data = {
"confirm": "yes",
}
def test_display_confirmation_form_on_get(self):
response = self.client.get(self.url)
context = response.context
self.assertTemplateUsed(response, "sponsors/admin/nullify_contract.html")
self.assertEqual(context["contract"], self.contract)
def test_nullify_sponsorship_on_post(self):
response = self.client.post(self.url, data=self.data)
expected_url = reverse(
"admin:sponsors_contract_change", args=[self.contract.pk]
)
self.contract.refresh_from_db()
msg = list(get_messages(response.wsgi_request))[0]
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.contract.status, Contract.NULLIFIED)
assertMessage(msg, "Contract was nullified!", messages.SUCCESS)
def test_display_error_message_to_user_if_invalid_status(self):
self.contract.status = Contract.DRAFT
self.contract.save()
expected_url = reverse(
"admin:sponsors_contract_change", args=[self.contract.pk]
)
response = self.client.post(self.url, data=self.data)
self.contract.refresh_from_db()
msg = list(get_messages(response.wsgi_request))[0]
self.assertRedirects(response, expected_url, fetch_redirect_response=True)
self.assertEqual(self.contract.status, Contract.DRAFT)
assertMessage(
msg,
"Contract with status Draft can't be nullified.",
messages.ERROR,
)
def test_do_not_nullify_contract_if_no_confirmation_in_the_post(self):
self.data.pop("confirm")
response = self.client.post(self.url, data=self.data)
self.contract.refresh_from_db()
self.assertTemplateUsed(response, "sponsors/admin/nullify_contract.html")
self.assertEqual(self.contract.status, Contract.AWAITING_SIGNATURE)
self.data["confirm"] = "invalid"
response = self.client.post(self.url, data=self.data)
self.assertTemplateUsed(response, "sponsors/admin/nullify_contract.html")
self.contract.refresh_from_db()
self.assertEqual(self.contract.status, Contract.AWAITING_SIGNATURE)
def test_404_if_contract_does_not_exist(self):
self.contract.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
class UpdateRelatedSponsorshipsTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.benefit = baker.make(SponsorshipBenefit)
self.sponsor_benefit = baker.make(
SponsorBenefit,
sponsorship_benefit=self.benefit,
sponsorship__sponsor__name="Foo",
added_by_user=True, # to make sure we keep previous fields
)
self.url = reverse(
"admin:sponsors_sponsorshipbenefit_update_related", args=[self.benefit.pk]
)
self.data = {"sponsorships": [self.sponsor_benefit.sponsorship.pk]}
def test_display_form_from_benefit_on_get(self):
response = self.client.get(self.url)
context = response.context
self.assertTemplateUsed(response, "sponsors/admin/update_related_sponsorships.html")
self.assertEqual(context["benefit"], self.benefit)
self.assertIsInstance(context["form"], SponsorshipsListForm)
self.assertEqual(context["form"].sponsorship_benefit, self.benefit)
def test_list_related_sponsorships_with_initial(self):
baker.make(Sponsorship) # unrelated-sponsorship
other_sponsor_benefit = baker.make(
SponsorBenefit,
sponsorship_benefit=self.benefit,
sponsorship__sponsor__name="Bar",
)
response = self.client.get(self.url)
initial = response.context["form"].initial
self.assertEqual(2, len(initial["sponsorships"]))
self.assertIn(self.sponsor_benefit.sponsorship.pk, initial["sponsorships"])
self.assertIn(other_sponsor_benefit.sponsorship.pk, initial["sponsorships"])
def test_bad_request_if_invalid_post_data(self):
self.data["sponsorships"] = []
response = self.client.post(self.url, data=self.data)
self.assertTrue(response.context["form"].errors)
def test_redirect_back_to_benefit_page_if_success(self):
redirect_url = reverse(
"admin:sponsors_sponsorshipbenefit_change", args=[self.benefit.pk]
)
response = self.client.post(self.url, data=self.data)
self.assertRedirects(response, redirect_url)
msg = list(get_messages(response.wsgi_request))[0]
assertMessage(msg, "1 related sponsorships updated!", messages.SUCCESS)
def test_update_selected_sponsorships_only(self):
other_sponsor_benefit = baker.make(
SponsorBenefit,
sponsorship_benefit=self.benefit,
sponsorship__sponsor__name="Bar",
name=self.benefit.name,
description=self.benefit.description,
)
prev_name, prev_description = self.benefit.name, self.benefit.description
self.benefit.name = 'New name'
self.benefit.description = 'New description'
self.benefit.save()
response = self.client.post(self.url, data=self.data)
# delete existing sponsor benefit
self.assertFalse(SponsorBenefit.objects.filter(id=self.sponsor_benefit.id).exists())
# make sure a new one was created
new_sponsor_benefit = SponsorBenefit.objects.get(
sponsorship=self.sponsor_benefit.sponsorship,
sponsorship_benefit=self.benefit,
)
self.assertEqual(new_sponsor_benefit.name, "New name")
self.assertEqual(new_sponsor_benefit.description, "New description")
self.assertTrue(new_sponsor_benefit.added_by_user)
# make sure sponsor benefit from unselected sponsorships wasn't deleted
other_sponsor_benefit.refresh_from_db()
self.assertEqual(other_sponsor_benefit.name, prev_name)
self.assertEqual(other_sponsor_benefit.description, prev_description)
def test_404_if_benefit_does_not_exist(self):
self.benefit.delete()
response = self.client.get(self.url)
self.assertEqual(response.status_code, 404)
def test_login_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.client.logout()
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url)
def test_staff_required(self):
login_url = reverse("admin:login")
redirect_url = f"{login_url}?next={self.url}"
self.user.is_staff = False
self.user.save()
self.client.force_login(self.user)
r = self.client.get(self.url)
self.assertRedirects(r, redirect_url, fetch_redirect_response=False)
class PreviewContractViewTests(TestCase):
def setUp(self):
self.user = baker.make(
settings.AUTH_USER_MODEL, is_staff=True, is_superuser=True
)
self.client.force_login(self.user)
self.contract = baker.make_recipe(
"sponsors.tests.empty_contract", sponsorship__start_date=date.today()
)
self.url = reverse(
"admin:sponsors_contract_preview", args=[self.contract.pk]
)
@patch("sponsors.views_admin.render_contract_to_pdf_response")
def test_render_pdf_by_default(self, mocked_render):
response = HttpResponse()
mocked_render.return_value = response
r = self.client.get(self.url)
self.assertEqual(r, response)
self.assertEqual(r.get("X-Frame-Options"), "SAMEORIGIN")
self.assertEqual(mocked_render.call_count, 1)
self.assertEqual(mocked_render.call_args[0][1], self.contract)
self.assertIsInstance(mocked_render.call_args[0][0], WSGIRequest)
@patch("sponsors.views_admin.render_contract_to_docx_response")
def test_render_docx_if_specified_in_the_querystring(self, mocked_render):
response = HttpResponse()
mocked_render.return_value = response
r = self.client.get(self.url + "?format=docx")
self.assertEqual(r, response)
self.assertEqual(r.get("X-Frame-Options"), "SAMEORIGIN")
self.assertEqual(mocked_render.call_count, 1)
self.assertEqual(mocked_render.call_args[0][1], self.contract)
self.assertIsInstance(mocked_render.call_args[0][0], WSGIRequest)
# --- File: tests/cupy_tests/linalg_tests/test_product.py (repo: mdeegen/chainer, license: MIT) ---
import unittest
from cupy import testing
@testing.parameterize(*testing.product({
'shape': [
((2, 3, 4), (3, 4, 2)),
((1, 1), (1, 1)),
((1, 1), (1, 2)),
((1, 2), (2, 1)),
((2, 1), (1, 1)),
((1, 2), (2, 3)),
((2, 1), (1, 3)),
((2, 3), (3, 1)),
((2, 3), (3, 4)),
((0, 3), (3, 4)),
((2, 3), (3, 0)),
((0, 3), (3, 0)),
((3, 0), (0, 4)),
((2, 3, 0), (3, 0, 2)),
((0, 0), (0, 0)),
],
'trans_a': [True, False],
'trans_b': [True, False],
}))
@testing.gpu
class TestDot(unittest.TestCase):
_multiprocess_can_split_ = True
@testing.for_all_dtypes_combination(['dtype_a', 'dtype_b'])
@testing.numpy_cupy_allclose()
def test_dot(self, xp, dtype_a, dtype_b):
shape_a, shape_b = self.shape
if self.trans_a:
a = testing.shaped_arange(shape_a[::-1], xp, dtype_a).T
else:
a = testing.shaped_arange(shape_a, xp, dtype_a)
if self.trans_b:
b = testing.shaped_arange(shape_b[::-1], xp, dtype_b).T
else:
b = testing.shaped_arange(shape_b, xp, dtype_b)
return xp.dot(a, b)
@testing.for_float_dtypes(name='dtype_a')
@testing.for_float_dtypes(name='dtype_b')
@testing.for_float_dtypes(name='dtype_c')
@testing.numpy_cupy_allclose()
def test_dot_with_out(self, xp, dtype_a, dtype_b, dtype_c):
shape_a, shape_b = self.shape
if self.trans_a:
a = testing.shaped_arange(shape_a[::-1], xp, dtype_a).T
else:
a = testing.shaped_arange(shape_a, xp, dtype_a)
if self.trans_b:
b = testing.shaped_arange(shape_b[::-1], xp, dtype_b).T
else:
b = testing.shaped_arange(shape_b, xp, dtype_b)
shape_c = shape_a[:-1] + shape_b[:-2] + shape_b[-1:]
c = xp.empty(shape_c, dtype=dtype_c)
xp.dot(a, b, out=c)
return c
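The `shape_c` arithmetic in `test_dot_with_out` above follows the shape rule of `numpy.dot` for multidimensional operands, which CuPy mirrors: the last axis of `a` is contracted with the second-to-last axis of `b`, and the remaining axes are concatenated. A NumPy-only illustration of that rule (CPU side only, no GPU required):

```python
import numpy as np

# dot contracts a's last axis (length 4) with b's second-to-last axis (length 4)
a = np.arange(24).reshape(2, 3, 4)
b = np.arange(24).reshape(3, 4, 2)
c = np.dot(a, b)

# result shape = a.shape[:-1] + b.shape[:-2] + b.shape[-1:]
assert c.shape == (2, 3, 3, 2)
```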
@testing.gpu
class TestProduct(unittest.TestCase):
_multiprocess_can_split_ = True
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_dot_vec1(self, xp, dtype):
a = testing.shaped_arange((2,), xp, dtype)
b = testing.shaped_arange((2,), xp, dtype)
return xp.dot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_dot_vec2(self, xp, dtype):
a = testing.shaped_arange((2,), xp, dtype)
b = testing.shaped_arange((2, 1), xp, dtype)
return xp.dot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_dot_vec3(self, xp, dtype):
a = testing.shaped_arange((1, 2), xp, dtype)
b = testing.shaped_arange((2,), xp, dtype)
return xp.dot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_dot(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype).transpose(1, 0, 2)
b = testing.shaped_arange((2, 3, 4), xp, dtype).transpose(0, 2, 1)
return xp.dot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_dot_with_out(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype).transpose(1, 0, 2)
b = testing.shaped_arange((4, 2, 3), xp, dtype).transpose(2, 0, 1)
c = xp.ndarray((3, 2, 3, 2), dtype=dtype)
xp.dot(a, b, out=c)
return c
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_dot_with_out2(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype).transpose(1, 0, 2)
b = testing.shaped_arange((4, 2, 3), xp, dtype).transpose(2, 0, 1)
c = xp.ndarray((3, 2, 3, 2)[::-1], dtype=dtype).T
xp.dot(a, b, out=c)
return c
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_dot_with_single_elem_array1(self, xp, dtype):
a = testing.shaped_arange((3, 1), xp, dtype)
b = xp.array([[2]], dtype=dtype)
return xp.dot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_dot_with_single_elem_array2(self, xp, dtype):
a = xp.array([[2]], dtype=dtype)
b = testing.shaped_arange((1, 3), xp, dtype)
return xp.dot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_vdot(self, xp, dtype):
a = testing.shaped_arange((5,), xp, dtype)
b = testing.shaped_reverse_arange((5,), xp, dtype)
return xp.vdot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_reversed_vdot(self, xp, dtype):
a = testing.shaped_arange((5,), xp, dtype)[::-1]
b = testing.shaped_reverse_arange((5,), xp, dtype)[::-1]
return xp.vdot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_multidim_vdot(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype)
b = testing.shaped_arange((2, 2, 2, 3), xp, dtype)
return xp.vdot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_multidim_vdot(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype).transpose(2, 0, 1)
b = testing.shaped_arange(
(2, 2, 2, 3), xp, dtype).transpose(1, 3, 0, 2)
return xp.vdot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_inner(self, xp, dtype):
a = testing.shaped_arange((5,), xp, dtype)
b = testing.shaped_reverse_arange((5,), xp, dtype)
return xp.inner(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_reversed_inner(self, xp, dtype):
a = testing.shaped_arange((5,), xp, dtype)[::-1]
b = testing.shaped_reverse_arange((5,), xp, dtype)[::-1]
return xp.inner(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_multidim_inner(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype)
b = testing.shaped_arange((3, 2, 4), xp, dtype)
return xp.inner(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_higher_order_inner(self, xp, dtype):
a = testing.shaped_arange((2, 4, 3), xp, dtype).transpose(2, 0, 1)
b = testing.shaped_arange((4, 2, 3), xp, dtype).transpose(1, 2, 0)
return xp.inner(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_outer(self, xp, dtype):
a = testing.shaped_arange((5,), xp, dtype)
b = testing.shaped_arange((4,), xp, dtype)
return xp.outer(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_reversed_outer(self, xp, dtype):
a = testing.shaped_arange((5,), xp, dtype)
b = testing.shaped_arange((4,), xp, dtype)
return xp.outer(a[::-1], b[::-1])
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_multidim_outer(self, xp, dtype):
a = testing.shaped_arange((2, 3), xp, dtype)
b = testing.shaped_arange((4, 5), xp, dtype)
return xp.outer(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_tensordot(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype)
b = testing.shaped_arange((3, 4, 5), xp, dtype)
return xp.tensordot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_tensordot(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4), xp, dtype).transpose(1, 0, 2)
b = testing.shaped_arange((4, 3, 2), xp, dtype).transpose(2, 0, 1)
return xp.tensordot(a, b)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_tensordot_with_int_axes(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4, 5), xp, dtype)
b = testing.shaped_arange((3, 4, 5, 2), xp, dtype)
return xp.tensordot(a, b, axes=3)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_tensordot_with_int_axes(self, xp, dtype):
a = testing.shaped_arange(
(2, 3, 4, 5), xp, dtype).transpose(2, 0, 3, 1)
b = testing.shaped_arange(
(5, 4, 3, 2), xp, dtype).transpose(3, 0, 2, 1)
return xp.tensordot(a, b, axes=3)
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_tensordot_with_list_axes(self, xp, dtype):
a = testing.shaped_arange((2, 3, 4, 5), xp, dtype)
b = testing.shaped_arange((3, 5, 4, 2), xp, dtype)
return xp.tensordot(a, b, axes=([3, 2, 1], [1, 2, 0]))
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_transposed_tensordot_with_list_axes(self, xp, dtype):
a = testing.shaped_arange(
(2, 3, 4, 5), xp, dtype).transpose(2, 0, 3, 1)
b = testing.shaped_arange(
(3, 5, 4, 2), xp, dtype).transpose(3, 0, 2, 1)
return xp.tensordot(a, b, axes=([2, 0, 3], [3, 2, 1]))
@testing.for_all_dtypes()
@testing.numpy_cupy_allclose()
def test_tensordot_zero_dim(self, xp, dtype):
a = xp.array(2, dtype=dtype)
b = testing.shaped_arange((3, 4, 2), xp, dtype)
return xp.tensordot(a, b, axes=0)
# File: tests/test_helpers_general.py (repo: mattstibbs/changebot, license: MIT)
from app.helpers.general import represents_an_int
def test_represents_an_int_returns_true_if_int():
    assert not represents_an_int("1234a")
    assert represents_an_int("1234")
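The helper under test is imported from `app.helpers.general`; a minimal sketch of such a predicate, with the behaviour inferred from the two assertions above (an assumed implementation, not the project's actual code):

```python
def represents_an_int(s):
    # True when the whole string parses as an integer, False otherwise
    try:
        int(s)
        return True
    except (TypeError, ValueError):
        return False

assert not represents_an_int("1234a")
assert represents_an_int("1234")
```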
# File: ckanext/example_igroupform/tests/test_controllers.py (repo: okfde/ckankrzn, license: Apache-2.0)
# encoding: utf-8
from nose.tools import assert_equal, assert_in
from routes import url_for
import ckan.plugins as plugins
import ckan.tests.helpers as helpers
import ckan.model as model
from ckan.tests import factories
webtest_submit = helpers.webtest_submit
submit_and_follow = helpers.submit_and_follow
custom_group_type = u'grup'
group_type = u'group'
def _get_group_new_page(app, group_type):
user = factories.User()
env = {'REMOTE_USER': user['name'].encode('ascii')}
response = app.get(
url_for('%s_new' % group_type),
extra_environ=env,
)
return env, response
class TestGroupController(helpers.FunctionalTestBase):
@classmethod
def setup_class(cls):
super(TestGroupController, cls).setup_class()
plugins.load('example_igroupform')
@classmethod
def teardown_class(cls):
plugins.unload('example_igroupform')
super(TestGroupController, cls).teardown_class()
def test_about(self):
app = self._get_test_app()
user = factories.User()
group = factories.Group(user=user, type=custom_group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_about' % custom_group_type,
id=group_name)
response = app.get(url=url, extra_environ=env)
response.mustcontain(group_name)
def test_bulk_process(self):
app = self._get_test_app()
user = factories.User()
group = factories.Group(user=user, type=custom_group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_bulk_process' % custom_group_type,
id=group_name)
try:
response = app.get(url=url, extra_environ=env)
except Exception as e:
assert (e.args == ('Must be an organization', ))
else:
raise Exception("Response should have raised an exception")
def test_delete(self):
app = self._get_test_app()
user = factories.User()
group = factories.Group(user=user, type=custom_group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_action' % custom_group_type, action='delete',
id=group_name)
response = app.get(url=url, extra_environ=env)
class TestOrganizationController(helpers.FunctionalTestBase):
@classmethod
def setup_class(cls):
super(TestOrganizationController, cls).setup_class()
plugins.load('example_igroupform_organization')
@classmethod
def teardown_class(cls):
plugins.unload('example_igroupform_organization')
super(TestOrganizationController, cls).teardown_class()
def test_about(self):
app = self._get_test_app()
user = factories.User()
group = factories.Organization(user=user, type=custom_group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_about' % custom_group_type,
id=group_name)
response = app.get(url=url, extra_environ=env)
response.mustcontain(group_name)
def test_bulk_process(self):
app = self._get_test_app()
user = factories.User()
group = factories.Organization(user=user, type=custom_group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_bulk_process' % custom_group_type,
id=group_name)
response = app.get(url=url, extra_environ=env)
def test_delete(self):
app = self._get_test_app()
user = factories.User()
group = factories.Organization(user=user, type=custom_group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_action' % custom_group_type, action='delete',
id=group_name)
response = app.get(url=url, extra_environ=env)
class TestGroupControllerNew(helpers.FunctionalTestBase):
@classmethod
def setup_class(cls):
super(TestGroupControllerNew, cls).setup_class()
plugins.load('example_igroupform')
@classmethod
def teardown_class(cls):
plugins.unload('example_igroupform')
super(TestGroupControllerNew, cls).teardown_class()
def test_save(self):
app = self._get_test_app()
env, response = _get_group_new_page(app, custom_group_type)
form = response.forms['group-edit']
form['name'] = u'saved'
response = submit_and_follow(app, form, env, 'save')
# check correct redirect
assert_equal(response.req.url,
'http://localhost/%s/saved' % custom_group_type)
# check saved ok
group = model.Group.by_name(u'saved')
assert_equal(group.title, u'')
assert_equal(group.type, custom_group_type)
assert_equal(group.state, 'active')
def test_custom_group_form(self):
'''Our custom group form is being used for new groups.'''
app = self._get_test_app()
env, response = _get_group_new_page(app, custom_group_type)
assert_in('My Custom Group Form!', response,
msg="Custom group form not being used.")
class TestGroupControllerNew_DefaultGroupType(helpers.FunctionalTestBase):
@classmethod
def setup_class(cls):
super(TestGroupControllerNew_DefaultGroupType, cls).setup_class()
plugins.load('example_igroupform_default_group_type')
@classmethod
def teardown_class(cls):
plugins.unload('example_igroupform_default_group_type')
super(TestGroupControllerNew_DefaultGroupType, cls).teardown_class()
def test_save(self):
app = self._get_test_app()
env, response = _get_group_new_page(app, group_type)
form = response.forms['group-edit']
form['name'] = u'saved'
response = submit_and_follow(app, form, env, 'save')
# check correct redirect
assert_equal(response.req.url,
'http://localhost/%s/saved' % group_type)
# check saved ok
group = model.Group.by_name(u'saved')
assert_equal(group.title, u'')
assert_equal(group.type, group_type)
assert_equal(group.state, 'active')
def test_custom_group_form(self):
'''Our custom group form is being used for new groups.'''
app = self._get_test_app()
env, response = _get_group_new_page(app, group_type)
assert_in('My Custom Group Form!', response,
msg="Custom group form not being used.")
def _get_group_edit_page(app, group_type, group_name=None):
user = factories.User()
if group_name is None:
group = factories.Group(user=user, type=group_type)
group_name = group['name']
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_edit' % group_type,
id=group_name)
response = app.get(url=url, extra_environ=env)
return env, response, group_name
class TestGroupControllerEdit(helpers.FunctionalTestBase):
@classmethod
def setup_class(cls):
super(TestGroupControllerEdit, cls).setup_class()
plugins.load('example_igroupform')
@classmethod
def teardown_class(cls):
plugins.unload('example_igroupform')
super(TestGroupControllerEdit, cls).teardown_class()
def test_group_doesnt_exist(self):
app = self._get_test_app()
user = factories.User()
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_edit' % custom_group_type,
id='doesnt_exist')
app.get(url=url, extra_environ=env,
status=404)
def test_save(self):
app = self._get_test_app()
env, response, group_name = \
_get_group_edit_page(app, custom_group_type)
form = response.forms['group-edit']
response = submit_and_follow(app, form, env, 'save')
group = model.Group.by_name(group_name)
assert_equal(group.state, 'active')
assert_equal(group.type, custom_group_type)
def test_custom_group_form(self):
'''Our custom group form is being used to edit groups.'''
app = self._get_test_app()
env, response, group_name = \
_get_group_edit_page(app, custom_group_type)
assert_in('My Custom Group Form!', response,
msg="Custom group form not being used.")
class TestGroupControllerEdit_DefaultGroupType(helpers.FunctionalTestBase):
@classmethod
def setup_class(cls):
super(TestGroupControllerEdit_DefaultGroupType, cls).setup_class()
plugins.load('example_igroupform_default_group_type')
@classmethod
def teardown_class(cls):
plugins.unload('example_igroupform_default_group_type')
super(TestGroupControllerEdit_DefaultGroupType, cls).teardown_class()
def test_group_doesnt_exist(self):
app = self._get_test_app()
user = factories.User()
env = {'REMOTE_USER': user['name'].encode('ascii')}
url = url_for('%s_edit' % group_type,
id='doesnt_exist')
app.get(url=url, extra_environ=env,
status=404)
def test_save(self):
app = self._get_test_app()
env, response, group_name = _get_group_edit_page(app, group_type)
form = response.forms['group-edit']
response = submit_and_follow(app, form, env, 'save')
group = model.Group.by_name(group_name)
assert_equal(group.state, 'active')
assert_equal(group.type, group_type)
def test_custom_group_form(self):
'''Our custom group form is being used to edit groups.'''
app = self._get_test_app()
env, response, group_name = _get_group_edit_page(app, group_type)
assert_in('My Custom Group Form!', response,
msg="Custom group form not being used.")
# File: knapsack.py (repo: bozkurthan/Cache-Replacement-Algorithms, license: Apache-2.0)
# A dynamic programming algorithm for the 0-1 knapsack problem and
# a greedy algorithm for the fractional knapsack problem
# A dynamic programming algorithm for the 0-1 knapsack problem.
# Python3 code for Dynamic Programming
# based solution for 0-1 Knapsack problem
# Prints the items which are put in a
# knapsack of capacity W
def knapSack(W, wt, val, n):
    list_of_item = []
    K = [[0 for x in range(W + 1)] for x in range(n + 1)]
    # Build table K[][] in bottom-up manner
    for i in range(n + 1):
        for w in range(W + 1):
            if i == 0 or w == 0:
                K[i][w] = 0
            elif wt[i - 1] <= w:
                K[i][w] = max(val[i - 1] + K[i - 1][w - wt[i - 1]], K[i - 1][w])
            else:
                K[i][w] = K[i - 1][w]

    # stores the result of Knapsack
    res = K[n][W]
    print(res)

    # Backtrack through the table to recover which items were included
    w = W
    for i in range(n, 0, -1):
        if res <= 0:
            break
        # Either the result comes from the top (K[i-1][w]) or from
        # (val[i-1] + K[i-1][w-wt[i-1]]) as in the Knapsack table.
        # If it comes from the latter, the item is included.
        if res == K[i - 1][w]:
            continue
        else:
            # This item is included.
            list_of_item.append(i - 1)
            # Since this weight is included, its value is deducted
            res = res - val[i - 1]
            w = w - wt[i - 1]
    return list_of_item
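A quick self-contained sanity check of the same bottom-up recurrence on a classic toy instance (the values and weights here are illustrative, not taken from the driver data below):

```python
def toy_knapsack(W, wt, val):
    # Same DP recurrence as knapSack above, on a small hand-checkable instance
    n = len(val)
    K = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            if wt[i - 1] <= w:
                K[i][w] = max(val[i - 1] + K[i - 1][w - wt[i - 1]], K[i - 1][w])
            else:
                K[i][w] = K[i - 1][w]
    return K[n][W]

# capacity 50; the optimum takes the 100- and 120-value items: 220
assert toy_knapsack(50, [10, 20, 30], [60, 100, 120]) == 220
```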
val = [0.00001711185795978930, 0.00124729160343982717, 0.05131953586193025579, 0.00113390145767256986,
0.00003334617021901253, 0.01228548886266175866, 0.00001168762923283197, 0.02394094222852853268,
0.00391453536228139349, 0.01486544152382072992, 0.00150922284016219107, 0.05645148944812328901,
0.03186539410617148654, 0.09091568827109706985, 0.00007862852507161364, 0.00070406359331675651,
0.00015322475135483228, 0.00200877560025587666, 0.00630439834630780981, 0.00058187073827831105,
0.00001062511748439270, 0.02633503645138138630, 0.00036129594866117617, 0.00018540194913934711,
0.00000798280802734237, 0.00093710864270460316, 0.00573127122391618987, 0.00243061847630961095,
0.01798718424382308306, 0.00521024656719653624, 0.00267368032394057261, 0.00010465456687031777,
0.00430598889850953345, 0.00008649137757877499, 0.01635198567620280499, 0.03505193351678864006,
0.02896854009651952597, 0.00294104835633463013, 0.00323515319196809349, 0.04665412351084568393,
0.00032845086241925107, 0.00077446995264843221, 0.00009514051533665250, 0.06209663839293562415,
0.00002755881836282027, 0.00004882212781765627, 0.02176449293502593502, 0.00011512002355734955,
0.00003668078724091380, 0.00007148047733783056, 0.00013929522850439298, 0.00000878108883007661,
0.00029859169310841004, 0.00048088490766802565, 0.00923026961882926662, 0.01351403774892793314,
0.00024676999430447104, 0.00003031470019910230, 0.00064005781210614218, 0.08265062570099733497,
0.01015329658071219450, 0.00006498225212530050, 0.06830630223222917963, 0.00182615963659625157,
0.00220965316028146454, 0.00005907477465936409, 0.00002277588294447956, 0.00022433635845861002,
0.00002070534813134505, 0.01116862623878341586, 0.00137202076378380991, 0.00043716809788002325,
0.00001555623450889936, 0.00001285639215611517, 0.00001882304375576823, 0.00052897339843482817,
0.00001414203137172669, 0.00473658778836048670, 0.03855712686846749643, 0.00012663202591308451,
0.01978590266820539484, 0.00000965919771308427, 0.00166014512417841020, 0.00085191694791327552,
0.00027144699373491820, 0.00103081950697506359, 0.00762832199903245092, 0.00839115419893569740,
0.00355866851116490301, 0.00005370434059942190, 0.00002505347123892752, 0.00020394214405328184,
0.00000725709820667488, 0.00004034886596500518, 0.07513693245545211008, 0.04241283955531425370,
0.00693483818093859183, 0.00016854722649031553, 0.00039742554352729383, 0.00004438375256150569]
wt = [0.00113390145767256964, 0.00839115419893569567, 0.02394094222852852574, 0.00124729160343982695,
0.00137202076378380947, 0.00036129594866117612, 0.00000798280802734237, 0.04241283955531424676,
0.00000725709820667488, 0.00003334617021901253, 0.01015329658071219276, 0.00355866851116490215,
0.03505193351678862618, 0.00013929522850439295, 0.00001414203137172669, 0.00000878108883007661,
0.00077446995264843211, 0.00085191694791327531, 0.07513693245545209620, 0.00693483818093859009,
0.00004034886596500517, 0.00630439834630780807, 0.00006498225212530049, 0.00007862852507161361,
0.00220965316028146411, 0.00762832199903244919, 0.00008649137757877498, 0.00002277588294447956,
0.00009514051533665249, 0.00243061847630961052, 0.00002505347123892751, 0.01351403774892793140,
0.00573127122391618900, 0.00048088490766802554, 0.00001062511748439270, 0.01486544152382072645,
0.00473658778836048584, 0.00004438375256150568, 0.01978590266820539137, 0.00200877560025587622,
0.00001555623450889936, 0.01228548886266175519, 0.06830630223222916575, 0.00020394214405328179,
0.00923026961882926489, 0.00064005781210614207, 0.00010465456687031774, 0.00150922284016219064,
0.00043716809788002314, 0.00003031470019910230, 0.06209663839293561027, 0.01116862623878341412,
0.00024676999430447099, 0.08265062570099732109, 0.05645148944812327513, 0.00294104835633462970,
0.00016854722649031547, 0.03855712686846748949, 0.00029859169310840993, 0.00039742554352729373,
0.00004882212781765625, 0.00430598889850953258, 0.00267368032394057218, 0.05131953586193024192,
0.02633503645138137936, 0.00323515319196809306, 0.00005907477465936408, 0.00002070534813134505,
0.00166014512417840977, 0.03186539410617147960, 0.00002755881836282027, 0.00007148047733783054,
0.00027144699373491814, 0.00015322475135483226, 0.00001711185795978930, 0.09091568827109705597,
0.00011512002355734952, 0.00012663202591308448, 0.01798718424382307959, 0.00070406359331675640,
0.00018540194913934706, 0.00052897339843482806, 0.00058187073827831095, 0.00032845086241925102,
0.00000965919771308427, 0.00391453536228139262, 0.04665412351084567699, 0.00182615963659625113,
0.00003668078724091379, 0.00001882304375576823, 0.00005370434059942189, 0.01635198567620280152,
0.00022433635845860996, 0.02176449293502593155, 0.02896854009651952250, 0.00103081950697506337,
0.00093710864270460294, 0.00001168762923283197, 0.00001285639215611517, 0.00521024656719653451]
# Driver code: precompute the minima once, then scale the fractional
# values/weights to integers for the DP table (recomputing min() inside
# the loops would drift as list elements are overwritten)
min_val = min(val)
for i in range(100):
    val[i] = int(val[i] / min_val)
min_wt = min(wt)
W = int(0.1 / min_wt)
for i in range(100):
    wt[i] = int(wt[i] / min_wt)
n = len(val)
list_of_store = knapSack(W, wt, val, n)
print(list_of_store)
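The file's header mentions a greedy algorithm for the fractional knapsack problem, but only the 0-1 DP version is implemented here. A minimal sketch of the greedy variant (the name `fractional_knapsack` and the toy data are illustrative assumptions): sort by value-to-weight ratio and take a fraction of the last item that fits.

```python
def fractional_knapsack(W, wt, val):
    # Sort items by value/weight ratio, best first
    items = sorted(zip(val, wt), key=lambda p: p[0] / p[1], reverse=True)
    total = 0.0
    for v, w in items:
        if W <= 0:
            break
        take = min(w, W)          # whole item, or whatever capacity remains
        total += v * (take / w)   # fractional credit for a partial item
        W -= take
    return total

# ratios: 60/10=6, 100/20=5, 120/30=4; capacity 50 takes the first two
# items fully and 2/3 of the third: 60 + 100 + 80 = 240
assert abs(fractional_knapsack(50, [10, 20, 30], [60, 100, 120]) - 240.0) < 1e-9
```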
# File: tests/integration/insights/v1/conference/test_conference_participant.py (repo: angmunpri/twilio-python, license: MIT)
# coding=utf-8
r"""
This code was generated by
\ / _ _ _| _ _
| (_)\/(_)(_|\/| |(/_ v1.0.0
/ /
"""
from tests import IntegrationTestCase
from tests.holodeck import Request
from twilio.base.exceptions import TwilioException
from twilio.http.response import Response
class ConferenceParticipantTestCase(IntegrationTestCase):
def test_fetch_request(self):
self.holodeck.mock(Response(500, ''))
with self.assertRaises(TwilioException):
self.client.insights.v1.conferences("CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.conference_participants("CPXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").fetch()
self.holodeck.assert_has_request(Request(
'get',
'https://insights.twilio.com/v1/Conferences/CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/Participants/CPXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX',
))
def test_fetch_response(self):
self.holodeck.mock(Response(
200,
'''
{
"participant_sid": "CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"label": null,
"conference_sid": "CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_direction": "outbound",
"from": "+10000000000",
"to": "+1000000001",
"call_status": "completed",
"country_code": "US",
"is_moderator": true,
"join_time": "2021-10-08T02:58:59Z",
"leave_time": "2021-10-08T03:00:02Z",
"duration_seconds": 64,
"outbound_queue_length": 0,
"outbound_time_in_queue": 965,
"jitter_buffer_size": null,
"is_coach": false,
"coached_participants": null,
"participant_region": "us1",
"conference_region": "us1",
"call_type": "carrier",
"processing_state": "complete",
"properties": {
"start_conference_on_enter": false,
"end_conference_on_exit": false,
"play_early_media": false,
"enter_muted": true,
"beep_on_enter": false,
"beep_on_exit": false
},
"events": {
"mute": [
1633705131000
]
},
"metrics": {
"inbound": {
"total_packets_lost": 0,
"total_packets_received": 49,
"packet_loss_percentage": 0.0,
"jitter": {
"avg": 0.34,
"max": 0.53
},
"latency": {
"avg": 0.0,
"max": 0.0
},
"mos": 4.4
},
"outbound": {
"total_packets_lost": 0,
"total_packets_received": 126,
"packet_loss_percentage": 0,
"jitter": {
"avg": 0.01,
"max": 0.01
},
"latency": {
"avg": 0,
"max": 0
},
"mos": 4.4
}
},
"url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants/CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
}
'''
))
actual = self.client.insights.v1.conferences("CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.conference_participants("CPXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").fetch()
self.assertIsNotNone(actual)
def test_list_request(self):
self.holodeck.mock(Response(500, ''))
with self.assertRaises(TwilioException):
self.client.insights.v1.conferences("CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.conference_participants.list()
self.holodeck.assert_has_request(Request(
'get',
'https://insights.twilio.com/v1/Conferences/CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/Participants',
))
def test_read_full_response(self):
self.holodeck.mock(Response(
200,
'''
{
"meta": {
"page": 0,
"page_size": 25,
"first_page_url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants?PageSize=25&Page=0",
"previous_page_url": null,
"url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants?PageSize=25&Page=0",
"next_page_url": null,
"key": "participants"
},
"participants": [
{
"participant_sid": "CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"label": null,
"conference_sid": "CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_direction": "outbound",
"from": "+10000000000",
"to": "+10000000001",
"call_status": "completed",
"country_code": "US",
"is_moderator": true,
"join_time": "2021-10-08T02:58:51Z",
"leave_time": "2021-10-08T02:59:55Z",
"duration_seconds": 65,
"outbound_queue_length": 0,
"outbound_time_in_queue": 3361,
"jitter_buffer_size": null,
"is_coach": false,
"coached_participants": null,
"participant_region": "us1",
"conference_region": "us1",
"call_type": "carrier",
"processing_state": "complete",
"properties": {
"start_conference_on_enter": true,
"end_conference_on_exit": false,
"play_early_media": true,
"enter_muted": false,
"beep_on_enter": false,
"beep_on_exit": false
},
"metrics": {
"inbound": {
"total_packets_lost": 0,
"total_packets_received": 70,
"packet_loss_percentage": 0.0,
"jitter": {
"avg": 0.41,
"max": 0.84
},
"latency": {
"avg": 0.0,
"max": 0.0
},
"mos": 4.4
},
"outbound": {
"total_packets_lost": 0,
"total_packets_received": 126,
"packet_loss_percentage": 0,
"jitter": {
"avg": 0.01,
"max": 0.01
},
"latency": {
"avg": 0,
"max": 0
},
"mos": 4.4
}
},
"events": null,
"url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants/CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
},
{
"participant_sid": "CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaab",
"label": null,
"conference_sid": "CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaab",
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_direction": "outbound",
"from": "+10000000000",
"to": "+10000000002",
"call_status": "completed",
"country_code": "US",
"is_moderator": true,
"join_time": "2021-10-08T02:58:52Z",
"leave_time": "2021-10-08T02:59:54Z",
"duration_seconds": 63,
"outbound_queue_length": 0,
"outbound_time_in_queue": 321,
"jitter_buffer_size": null,
"is_coach": false,
"coached_participants": null,
"participant_region": "us1",
"conference_region": "us1",
"call_type": "carrier",
"processing_state": "complete",
"properties": {
"start_conference_on_enter": false,
"end_conference_on_exit": false,
"early_media": false,
"enter_muted": true,
"beep_on_enter": false,
"beep_on_exit": false
},
"metrics": {
"inbound": {
"total_packets_lost": 0,
"total_packets_received": 16,
"packet_loss_percentage": 0,
"jitter": {
"avg": 0.26,
"max": 0.45
},
"latency": {
"avg": 0,
"max": 0
},
"mos": 4.4
},
"outbound": {
"total_packets_lost": 0,
"total_packets_received": 42,
"packet_loss_percentage": 0,
"jitter": {
"avg": 0.03,
"max": 0.08
},
"latency": {
"avg": 0,
"max": 0
},
"mos": 4.4,
"tags": [
"silent"
]
}
},
"events": {
"mute": [
1633705131000
]
},
"url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants/CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaab"
}
]
}
'''
))
actual = self.client.insights.v1.conferences("CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.conference_participants.list()
self.assertIsNotNone(actual)
def test_read_with_label_response(self):
self.holodeck.mock(Response(
200,
'''
{
"meta": {
"page": 0,
"page_size": 25,
"first_page_url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants?Label=client&PageSize=25&Page=0",
"previous_page_url": null,
"url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants?Label=client&PageSize=25&Page=0",
"next_page_url": null,
"key": "participants"
},
"participants": [
{
"participant_sid": "CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"conference_sid": "CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_direction": "outbound",
"from": "+10000000000",
"to": "+10000000001",
"call_status": "completed",
"country_code": "US",
"is_moderator": true,
"join_time": "2021-10-08T02:58:51Z",
"leave_time": "2021-10-08T02:59:55Z",
"duration_seconds": 65,
"label": "client",
"outbound_queue_length": 0,
"outbound_time_in_queue": 3361,
"jitter_buffer_size": null,
"is_coach": false,
"coached_participants": null,
"participant_region": "us1",
"conference_region": "us1",
"call_type": "carrier",
"processing_state": "complete",
"properties": {
"start_conference_on_enter": true,
"end_conference_on_exit": false,
"play_early_media": true,
"enter_muted": false,
"beep_on_enter": false,
"beep_on_exit": false
},
"metrics": {
"inbound": {
"total_packets_lost": 0,
"total_packets_received": 70,
"packet_loss_percentage": 0.0,
"jitter": {
"avg": 0.41,
"max": 0.84
},
"latency": {
"avg": 0.0,
"max": 0.0
},
"mos": 4.4
},
"outbound": {
"total_packets_lost": 0,
"total_packets_received": 96,
"packet_loss_percentage": 0,
"jitter": {
"avg": 0.01,
"max": 0.01
},
"latency": {
"avg": 0,
"max": 0
},
"mos": 4.4
}
},
"events": null,
"url": "https://insights.twilio.com/v1/Conferences/CFaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Participants/CPaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
}
]
}
'''
))
actual = self.client.insights.v1.conferences("CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.conference_participants.list()
self.assertIsNotNone(actual)
| 44.565684 | 163 | 0.380316 | 985 | 16,623 | 6.174619 | 0.173604 | 0.031569 | 0.03124 | 0.036172 | 0.895594 | 0.889839 | 0.886222 | 0.859257 | 0.834923 | 0.799244 | 0 | 0.054944 | 0.524815 | 16,623 | 372 | 164 | 44.685484 | 0.715027 | 0.006557 | 0 | 0.711111 | 1 | 0 | 0.203457 | 0.105496 | 0 | 0 | 0 | 0 | 0.155556 | 1 | 0.111111 | false | 0 | 0.088889 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
22e6eb15d6dccdccb5346a36fdce97ef96f9ed76 | 135 | py | Python | answers/pandas_answer1.py | monocilindro/foss4g-geopandas | 12afbc787c1f65cc046234b41166bd62bbb6ac29 | [
"Apache-2.0"
] | null | null | null | answers/pandas_answer1.py | monocilindro/foss4g-geopandas | 12afbc787c1f65cc046234b41166bd62bbb6ac29 | [
"Apache-2.0"
] | null | null | null | answers/pandas_answer1.py | monocilindro/foss4g-geopandas | 12afbc787c1f65cc046234b41166bd62bbb6ac29 | [
"Apache-2.0"
] | null | null | null | print (boroughs['Code'][boroughs['Population_density_(per_hectare)_2017'] == boroughs['Population_density_(per_hectare)_2017'].max()])
| 67.5 | 134 | 0.785185 | 16 | 135 | 6.125 | 0.5625 | 0.367347 | 0.510204 | 0.571429 | 0.795918 | 0.795918 | 0 | 0 | 0 | 0 | 0 | 0.061069 | 0.02963 | 135 | 1 | 135 | 135 | 0.687023 | 0 | 0 | 0 | 0 | 0 | 0.577778 | 0.548148 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
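The one-liner above presupposes a `boroughs` DataFrame loaded earlier in the exercise; a self-contained sketch with illustrative stand-in data (the column names follow the original, the values and borough codes are made up for demonstration):

```python
import pandas as pd

# Illustrative stand-in for the exercise's boroughs table.
boroughs = pd.DataFrame({
    "Code": ["E09000007", "E09000019", "E09000033"],
    "Population_density_(per_hectare)_2017": [111.8, 158.7, 105.2],
})

# Select the Code of the borough(s) with the maximum density,
# using the same boolean-mask selection as the answer.
densest = boroughs["Code"][
    boroughs["Population_density_(per_hectare)_2017"]
    == boroughs["Population_density_(per_hectare)_2017"].max()
]
print(densest.tolist())  # prints ['E09000019']
```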
fe165a0271c075038bc87daa6d45c32dd69fb295 | 3,507 | py | Python | tests/parser/10-Crossing-Minimization.asp.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/10-Crossing-Minimization.asp.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/10-Crossing-Minimization.asp.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | input = """
%%guess
pvalue(L,0) :- width(L,_).
pvalue(L,X+1) :- pvalue(L,X), width(L,T), X < T.
position( Node, Pos ) | not_position( Node, Pos ) :- in_layer( Layer, Node ), width( Layer, T ), Pos = P + 1,
pvalue(Layer,P), P < T.
%%check
%a node must be assigned at most at one position.
:- position( Node1, Pos1 ), position( Node1, Pos2 ), Pos1 < Pos2.
%two nodes of the same layer cannot be assigned at the same position.
:- in_layer( Layer1, Node1 ), in_layer( Layer1, Node2 ), position( Node1, Pos1 ), position( Node2, Pos1 ), Node1 != Node2.
%a node must be assigned at least at one position.
node_assigned_at_position( Node ) :- position( Node, Pos ).
:- in_layer( Layer1, Node1 ), not node_assigned_at_position( Node1 ).
%%optimization
%Computing the edges from same layers.
edge_from_same_layers(Node1,Node2,Node3,Node4):- edge(Node1,Node2), edge(Node3,Node4), Node1 < Node3, Node2 != Node4, in_layer(Layer,Node1), in_layer(Layer,Node3).
%Computing all the crossings.
crossing(Node1,Node2,Node3,Node4) :- edge_from_same_layers(Node1,Node2,Node3,Node4), antecedent(Node1,Node3), antecedent(Node4,Node2).
crossing(Node1,Node2,Node3,Node4) :- edge_from_same_layers(Node1,Node2,Node3,Node4), antecedent(Node3,Node1), antecedent(Node2,Node4).
% A node Node1 is an antecedent of a node Node2 if they are in the same layer and the Node1 position is antecedent of the Node2 position.
antecedent(Node1,Node2):- in_layer(Layer,Node1), in_layer(Layer,Node2), Node1 != Node2, position(Node1,Pos1), position(Node2,Pos2), Pos1 < Pos2.
% Assign a penalty to each violation of the crossing.
:~ crossing(Node1, Node2, Node3, Node4 ). [1,Node1,Node2,Node3,Node4]
"""
output = """
%%guess
pvalue(L,0) :- width(L,_).
pvalue(L,X+1) :- pvalue(L,X), width(L,T), X < T.
position( Node, Pos ) | not_position( Node, Pos ) :- in_layer( Layer, Node ), width( Layer, T ), Pos = P + 1,
pvalue(Layer,P), P < T.
%%check
%a node must be assigned at most at one position.
:- position( Node1, Pos1 ), position( Node1, Pos2 ), Pos1 < Pos2.
%two nodes of the same layer cannot be assigned at the same position.
:- in_layer( Layer1, Node1 ), in_layer( Layer1, Node2 ), position( Node1, Pos1 ), position( Node2, Pos1 ), Node1 != Node2.
%a node must be assigned at least at one position.
node_assigned_at_position( Node ) :- position( Node, Pos ).
:- in_layer( Layer1, Node1 ), not node_assigned_at_position( Node1 ).
%%optimization
%Computing the edges from same layers.
edge_from_same_layers(Node1,Node2,Node3,Node4):- edge(Node1,Node2), edge(Node3,Node4), Node1 < Node3, Node2 != Node4, in_layer(Layer,Node1), in_layer(Layer,Node3).
%Computing all the crossings.
crossing(Node1,Node2,Node3,Node4) :- edge_from_same_layers(Node1,Node2,Node3,Node4), antecedent(Node1,Node3), antecedent(Node4,Node2).
crossing(Node1,Node2,Node3,Node4) :- edge_from_same_layers(Node1,Node2,Node3,Node4), antecedent(Node3,Node1), antecedent(Node2,Node4).
% A node Node1 is an antecedent of a node Node2 if they are in the same layer and the Node1 position is antecedent of the Node2 position.
antecedent(Node1,Node2):- in_layer(Layer,Node1), in_layer(Layer,Node2), Node1 != Node2, position(Node1,Pos1), position(Node2,Pos2), Pos1 < Pos2.
% Assign a penalty to each violation of the crossing.
:~ crossing(Node1, Node2, Node3, Node4 ). [1,Node1,Node2,Node3,Node4]
"""
| 52.343284 | 164 | 0.688052 | 518 | 3,507 | 4.561776 | 0.11583 | 0.093102 | 0.08887 | 0.118493 | 0.995345 | 0.995345 | 0.995345 | 0.995345 | 0.995345 | 0.995345 | 0 | 0.058844 | 0.176219 | 3,507 | 66 | 165 | 53.136364 | 0.759086 | 0 | 0 | 0.96 | 0 | 0.36 | 0.991001 | 0.312337 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
fe1d6ac124be432ad965e05805611a10bc1e6543 | 244 | py | Python | python/kata/7-kyu/List Filtering/main.py | Carlososuna11/codewars-handbook | a0e7c9ac5ad19cfaed3ad463c04616daa3fed82e | [
"MIT"
] | null | null | null | python/kata/7-kyu/List Filtering/main.py | Carlososuna11/codewars-handbook | a0e7c9ac5ad19cfaed3ad463c04616daa3fed82e | [
"MIT"
] | null | null | null | python/kata/7-kyu/List Filtering/main.py | Carlososuna11/codewars-handbook | a0e7c9ac5ad19cfaed3ad463c04616daa3fed82e | [
"MIT"
] | null | null | null | import codewars_test as test
from solution import filter_list
test.assert_equals(filter_list([1,2,'a','b']),[1,2])
test.assert_equals(filter_list([1,'a','b',0,15]),[1,0,15])
test.assert_equals(filter_list([1,2,'aasf','1','123',123]),[1,2,123]) | 40.666667 | 69 | 0.704918 | 48 | 244 | 3.416667 | 0.375 | 0.243902 | 0.292683 | 0.402439 | 0.506098 | 0.506098 | 0.341463 | 0 | 0 | 0 | 0 | 0.111588 | 0.045082 | 244 | 6 | 69 | 40.666667 | 0.592275 | 0 | 0 | 0 | 0 | 0 | 0.04898 | 0 | 0 | 0 | 0 | 0 | 0.6 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
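The tests above import `filter_list` from a `solution` module that is not included in this file; a minimal sketch satisfying them (one possible implementation, not necessarily the original author's):

```python
def filter_list(l):
    """Return a new list with only the integers from l (strings removed)."""
    # bool is a subclass of int; the kata's lists contain only ints and
    # strings, but excluding bools keeps the behaviour unambiguous.
    return [x for x in l if isinstance(x, int) and not isinstance(x, bool)]
```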
a3ca05ecbea0ad7399376f6ff6122976912d906d | 2,951 | py | Python | udacity-linear-algebra/linsys_test1.py | cromversion/data-analyst | 7350c94e0b1f1c1f5f9e7b6100b8a0c3797efff4 | [
"Apache-2.0"
] | null | null | null | udacity-linear-algebra/linsys_test1.py | cromversion/data-analyst | 7350c94e0b1f1c1f5f9e7b6100b8a0c3797efff4 | [
"Apache-2.0"
] | null | null | null | udacity-linear-algebra/linsys_test1.py | cromversion/data-analyst | 7350c94e0b1f1c1f5f9e7b6100b8a0c3797efff4 | [
"Apache-2.0"
] | null | null | null | from linsys import LinearSystem
from plane import Plane, MyDecimal
from vector import Vector
p0 = Plane(normal_vector=Vector(['1','1','1']), constant_term='1')
p1 = Plane(normal_vector=Vector(['0','1','0']), constant_term='2')
p2 = Plane(normal_vector=Vector(['1','1','-1']), constant_term='3')
p3 = Plane(normal_vector=Vector(['1','0','-2']), constant_term='2')
s = LinearSystem([p0,p1,p2,p3])
print(s.indices_of_first_nonzero_terms_in_each_row())
print('{},{},{},{}'.format(s[0], s[1], s[2], s[3]))
print(len(s))
print(s)
s[0] = p1
print(s)
print(MyDecimal('1e-9').is_near_zero())
print(MyDecimal('1e-11').is_near_zero())
p0 = Plane(normal_vector=Vector(['1','1','1']), constant_term='1')
p1 = Plane(normal_vector=Vector(['0','1','0']), constant_term='2')
p2 = Plane(normal_vector=Vector(['1','1','-1']), constant_term='3')
p3 = Plane(normal_vector=Vector(['1','0','-2']), constant_term='2')
s = LinearSystem([p0,p1,p2,p3])
s.swap_rows(0,1)
if not (s[0] == p1 and s[1] == p0 and s[2] == p2 and s[3] == p3):
    print('test case 1 failed')
s.swap_rows(1,3)
if not (s[0] == p1 and s[1] == p3 and s[2] == p2 and s[3] == p0):
    print('test case 2 failed')
s.swap_rows(3,1)
if not (s[0] == p1 and s[1] == p0 and s[2] == p2 and s[3] == p3):
    print('test case 3 failed')
s.multiply_coefficient_and_row(1,0)
if not (s[0] == p1 and s[1] == p0 and s[2] == p2 and s[3] == p3):
    print('test case 4 failed')
s.multiply_coefficient_and_row(-1,2)
if not (s[0] == p1 and
s[1] == p0 and
s[2] == Plane(normal_vector=Vector(['-1','-1','1']), constant_term='-3') and
s[3] == p3):
    print('test case 5 failed')
s.multiply_coefficient_and_row(10,1)
if not (s[0] == p1 and
s[1] == Plane(normal_vector=Vector(['10','10','10']), constant_term='10') and
s[2] == Plane(normal_vector=Vector(['-1','-1','1']), constant_term='-3') and
s[3] == p3):
    print('test case 6 failed')
s.add_multiple_times_row_to_row(0,0,1)
if not (s[0] == p1 and
s[1] == Plane(normal_vector=Vector(['10','10','10']), constant_term='10') and
s[2] == Plane(normal_vector=Vector(['-1','-1','1']), constant_term='-3') and
s[3] == p3):
    print('test case 7 failed')
s.add_multiple_times_row_to_row(1,0,1)
if not (s[0] == p1 and
s[1] == Plane(normal_vector=Vector(['10','11','10']), constant_term='12') and
s[2] == Plane(normal_vector=Vector(['-1','-1','1']), constant_term='-3') and
s[3] == p3):
    print('test case 8 failed')
s.add_multiple_times_row_to_row(-1,1,0)
if not (s[0] == Plane(normal_vector=Vector(['-10','-10','-10']), constant_term='-10') and
s[1] == Plane(normal_vector=Vector(['10','11','10']), constant_term='12') and
s[2] == Plane(normal_vector=Vector(['-1','-1','1']), constant_term='-3') and
s[3] == p3):
    print('test case 9 failed') | 37.833333 | 93 | 0.57777 | 511 | 2,951 | 3.191781 | 0.111546 | 0.066217 | 0.187615 | 0.253832 | 0.817903 | 0.817903 | 0.792152 | 0.744329 | 0.716738 | 0.677498 | 0 | 0.089226 | 0.194849 | 2,951 | 78 | 94 | 37.833333 | 0.597222 | 0 | 0 | 0.52381 | 0 | 0 | 0.10061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.047619 | null | null | 0.253968 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
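The row operations exercised above live in the (not included) `linsys` module; a minimal sketch of the same three elementary row operations on plain coefficient lists, purely for illustration (`SimpleLinearSystem` is a hypothetical name, not the original class):

```python
class SimpleLinearSystem:
    """Hypothetical stand-in for linsys.LinearSystem, sketching the three
    elementary row operations on coefficient lists [a, b, c, k]
    representing ax + by + cz = k."""

    def __init__(self, rows):
        self.rows = [list(r) for r in rows]

    def swap_rows(self, row1, row2):
        # Exchange two equations; the solution set is unchanged.
        self.rows[row1], self.rows[row2] = self.rows[row2], self.rows[row1]

    def multiply_coefficient_and_row(self, coefficient, row):
        # Scale one equation by a constant.
        self.rows[row] = [coefficient * x for x in self.rows[row]]

    def add_multiple_times_row_to_row(self, coefficient, row_to_add, target):
        # Add a multiple of one equation to another.
        scaled = [coefficient * x for x in self.rows[row_to_add]]
        self.rows[target] = [a + b for a, b in zip(scaled, self.rows[target])]
```

Each operation preserves the solution set, which is why the test cases above only check that rows end up where (and how) they were moved or combined.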
430ed3c5279156eb87901314663e5a8d8f333bb3 | 612 | py | Python | example/src/functions.py | csci-arch/stibium | 6c78f941d2e7c5523072bbd27199c66ea584c472 | [
"Apache-2.0"
] | 5 | 2021-03-02T12:06:37.000Z | 2021-03-23T20:13:48.000Z | example/src/functions.py | csci-arch/stibnite | 6c78f941d2e7c5523072bbd27199c66ea584c472 | [
"Apache-2.0"
] | 14 | 2021-03-02T18:59:06.000Z | 2021-04-23T14:47:58.000Z | example/src/subfolder/functions.py | csci-arch/stibnite | 6c78f941d2e7c5523072bbd27199c66ea584c472 | [
"Apache-2.0"
] | null | null | null | import numpy as np
def mini(x, y):
"""
    Take the min between two numbers
**Parameters**
> **x:** `float` -- Description of parameter `x`.
> **y:** `float` -- Description of parameter `y`.
**Returns**
> `float` -- Description of returned object.
"""
    return np.minimum(x, y)
def mini2peutetre(x, y):
"""
    Take the min between two numbers
**Parameters**
> **x:** `float` -- Description of parameter `x`.
> **y:** `float` -- Description of parameter `y`.
**Returns**
> `float` -- Description of returned object.
"""
    return np.minimum(x, y)
| 17 | 53 | 0.54902 | 72 | 612 | 4.666667 | 0.347222 | 0.035714 | 0.321429 | 0.321429 | 0.886905 | 0.886905 | 0.886905 | 0.886905 | 0.886905 | 0.886905 | 0 | 0.002252 | 0.27451 | 612 | 35 | 54 | 17.485714 | 0.754505 | 0.684641 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 10 |
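The two helpers above are intended to return the smaller of two numbers. In NumPy the right call for that is `np.minimum` (element-wise comparison of two arguments); `np.min` instead reduces a single array and treats its second positional argument as `axis`. A self-contained illustration:

```python
import numpy as np

# np.minimum compares two scalars or arrays element-wise.
assert np.minimum(3.0, 5.0) == 3.0
assert np.minimum([1, 4, 6], [2, 3, 9]).tolist() == [1, 3, 6]

# np.min reduces a single array to its smallest element.
assert np.min([3.0, 5.0]) == 3.0
```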
431a75591fff3137392d964388a6e6f6c8625e2d | 10,764 | py | Python | wnba/endpoints/player.py | rozzac90/wnba | 8e60bf915c482759b372a457ab519de6ddf531ec | [
"MIT"
] | 3 | 2018-07-23T17:02:57.000Z | 2019-05-24T18:18:35.000Z | wnba/endpoints/player.py | rozzac90/wnba | 8e60bf915c482759b372a457ab519de6ddf531ec | [
"MIT"
] | null | null | null | wnba/endpoints/player.py | rozzac90/wnba | 8e60bf915c482759b372a457ab519de6ddf531ec | [
"MIT"
] | 1 | 2018-07-23T22:54:53.000Z | 2018-07-23T22:54:53.000Z |
from wnba import enums
from wnba.utils import clean_locals
from wnba.endpoints.baseendpoint import BaseEndpoint
class Player(BaseEndpoint):
def all_advanced_stats(self, last_n_days=enums.LastNGames.Default, last_n_games=enums.LastNGames.Default,
league_id=enums.LeagueID.Default, location=enums.Location.Default, month=enums.Month.Default,
outcome=enums.Outcome.Default, per_mode=enums.PerMode.Totals, season=enums.Season.Default,
season_segment=enums.SeasonSegment.Default, season_type=enums.SeasonType.Default,
vs_team_id='', player_experience=enums.PlayerExperience.Default, days_rest='',
conference=enums.Conference.Default, starter_bench=enums.StarterBench.Default,
player_position=enums.PlayerPosition.Default):
"""
        Player advanced stats breakdown.
:param league_id: ID of the league to get data for. Default 00. Required.
:type league_id: wnba.enums.LeagueID
:param season: Season to get players from. Required.
:type season: wnba.enums.Season
:param season_type: part of season to pull data from. Required.
:type season_type: wnba.enums.SeasonType
:param per_mode: grouping of stat data. Totals or PerGame accepted. Required.
:type per_mode: wnba.enums.PerMode
:param outcome: Filter to only include stats for won or lost games. Default '' returns all. Required.
:type outcome: wnba.enums.Outcome
:param location: Filter for home or road games only. Default '' returns all. Required.
:type location: wnba.enums.Location
:param month: Filter for games occurring in a specific month (relative to season start). Default 0 returns all. Required.
:type month: wnba.enums.Month
:param season_segment: Filter to only include stats from Post/Pre all star break. Default '' returns all. Required
:type season_segment: wnba.enums.SeasonSegment
:param last_n_days: Filter stats for only those occurring in the last n days. Default '' includes all games. Required.
:type last_n_days: wnba.enums.LastNGames
:param last_n_games: Filter stats for only those occurring in the last n games. Default '' includes entire games. Required.
:type last_n_games: wnba.enums.LastNGames
:param vs_team_id: filter for stats vs a specific team.
:type vs_team_id: int
:param days_rest: minimum number of days rest prior to game for stats to be included.
:type days_rest: int
:param player_experience: Filter to only include players of specific experience level. Default '' returns all.
:type player_experience: wnba.enums.PlayerExperience
:param player_position: Filter to only include players of certain position. Default '' returns all.
:type player_position: wnba.enums.PlayerPosition
:param starter_bench: Filter to only include starts or bench. Default '' returns all.
:type starter_bench: wnba.enums.StarterBench
:param conference: Filter for teams from specific conference. Default '' returns all.
:type conference: wnba.enums.Conference
        :returns: Player stats after applying all filters.
:rtype: DataFrame
"""
params = clean_locals(locals())
url = '%s%s' % (self.client.stats_url, 'wnbaseasonsortableplayeradvanced')
r = self.request(url, self.client.stats_headers, params=params)
df = self.process_response(r, 0, 'resultSets')
return df
def all_raw_stats(self, last_n_days=enums.LastNGames.Default, last_n_games=enums.LastNGames.Default,
league_id=enums.LeagueID.Default, location=enums.Location.Default, month=enums.Month.Default,
outcome=enums.Outcome.Default, per_mode=enums.PerMode.Totals, season=enums.Season.Default,
season_segment=enums.SeasonSegment.Default, season_type=enums.SeasonType.Default,
vs_team_id='', player_experience=enums.PlayerExperience.Default, days_rest='',
conference=enums.Conference.Default, starter_bench=enums.StarterBench.Default,
player_position=enums.PlayerPosition.Default):
"""
Player stats breakdown.
:param league_id: ID of the league to get data for. Default 00. Required.
:type league_id: wnba.enums.LeagueID
:param season: Season to get players from. Required.
:type season: wnba.enums.Season
:param season_type: part of season to pull data from. Required.
:type season_type: wnba.enums.SeasonType
:param per_mode: grouping of stat data. Totals or PerGame accepted. Required.
:type per_mode: wnba.enums.PerMode
:param outcome: Filter to only include stats for won or lost games. Default '' returns all. Required.
:type outcome: wnba.enums.Outcome
:param location: Filter for home or road games only. Default '' returns all. Required.
:type location: wnba.enums.Location
:param month: Filter for games occurring in a specific month (relative to season start). Default 0 returns all. Required.
:type month: wnba.enums.Month
:param season_segment: Filter to only include stats from Post/Pre all star break. Default '' returns all. Required
:type season_segment: wnba.enums.SeasonSegment
:param last_n_days: Filter stats for only those occurring in the last n days. Default '' includes all games. Required.
:type last_n_days: wnba.enums.LastNGames
:param last_n_games: Filter stats for only those occurring in the last n games. Default '' includes entire games. Required.
:type last_n_games: wnba.enums.LastNGames
:param vs_team_id: filter for stats vs a specific team.
:type vs_team_id: int
:param days_rest: minimum number of days rest prior to game for stats to be included.
:type days_rest: int
:param player_experience: Filter to only include players of specific experience level. Default '' returns all.
:type player_experience: wnba.enums.PlayerExperience
:param player_position: Filter to only include players of certain position. Default '' returns all.
:type player_position: wnba.enums.PlayerPosition
:param starter_bench: Filter to only include starts or bench. Default '' returns all.
:type starter_bench: wnba.enums.StarterBench
:param conference: Filter for teams from specific conference. Default '' returns all.
:type conference: wnba.enums.Conference
        :returns: Player stats after applying all filters.
:rtype: DataFrame
"""
params = clean_locals(locals())
url = '%s%s' % (self.client.stats_url, 'wnbaseasonsortableplayerstats')
r = self.request(url, self.client.stats_headers, params=params)
df = self.process_response(r, 0, 'resultSets')
return df
def all_season_stats(self, last_n_days=enums.LastNGames.Default, last_n_games=enums.LastNGames.Default,
league_id=enums.LeagueID.Default, location=enums.Location.Default, month=enums.Month.Default,
outcome=enums.Outcome.Default, per_mode=enums.PerMode.Totals, season=enums.Season.Default,
season_segment=enums.SeasonSegment.Default, season_type=enums.SeasonType.Default,
vs_team_id='', player_experience=enums.PlayerExperience.Default, days_rest='',
conference=enums.Conference.Default, starter_bench=enums.StarterBench.Default,
player_position=enums.PlayerPosition.Default, top_x='',
stat_category=enums.Stat.Default):
"""
Player stats breakdown.
:param league_id: ID of the league to get data for. Default 00. Required.
:type league_id: wnba.enums.LeagueID
:param season: Season to get players from. Required.
:type season: wnba.enums.Season
:param season_type: part of season to pull data from. Required.
:type season_type: wnba.enums.SeasonType
:param per_mode: grouping of stat data. Totals or PerGame accepted. Required.
:type per_mode: wnba.enums.PerMode
:param outcome: Filter to only include stats for won or lost games. Default '' returns all. Required.
:type outcome: wnba.enums.Outcome
:param location: Filter for home or road games only. Default '' returns all. Required.
:type location: wnba.enums.Location
:param month: Filter for games occurring in a specific month (relative to season start). Default 0 returns all. Required.
:type month: wnba.enums.Month
:param season_segment: Filter to only include stats from Post/Pre all star break. Default '' returns all. Required
:type season_segment: wnba.enums.SeasonSegment
:param last_n_days: Filter stats for only those occurring in the last n days. Default '' includes all games. Required.
:type last_n_days: wnba.enums.LastNGames
:param last_n_games: Filter stats for only those occurring in the last n games. Default '' includes entire games. Required.
:type last_n_games: wnba.enums.LastNGames
:param vs_team_id: filter for stats vs a specific team.
:type vs_team_id: int
:param days_rest: minimum number of days rest prior to game for stats to be included.
:type days_rest: int
:param player_experience: Filter to only include players of specific experience level. Default '' returns all.
:type player_experience: wnba.enums.PlayerExperience
:param player_position: Filter to only include players of certain position. Default '' returns all.
:type player_position: wnba.enums.PlayerPosition
:param starter_bench: Filter to only include starts or bench. Default '' returns all.
:type starter_bench: wnba.enums.StarterBench
:param conference: Filter for teams from specific conference. Default '' returns all.
:type conference: wnba.enums.Conference
:param top_x: only return top x players.
:type top_x: int
:param stat_category: stat by which to sort players.
:type stat_category: wnba.enums.Stat
        :returns: Player stats after applying all filters.
:rtype: DataFrame
"""
params = clean_locals(locals())
url = '%s%s' % (self.client.stats_url, 'wnbaseasonstats')
r = self.request(url, self.client.stats_headers, params=params)
df = self.process_response(r, 0, 'resultSets')
return df
| 63.692308 | 131 | 0.687198 | 1,377 | 10,764 | 5.262164 | 0.091503 | 0.053409 | 0.049269 | 0.039332 | 0.949489 | 0.949489 | 0.949489 | 0.949489 | 0.949489 | 0.949489 | 0 | 0.001461 | 0.236808 | 10,764 | 168 | 132 | 64.071429 | 0.880584 | 0.616314 | 0 | 0.707317 | 0 | 0 | 0.036783 | 0.019015 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073171 | false | 0 | 0.073171 | 0 | 0.243902 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
431d0818ff85ab47cff98db42e1db7f013a3c9df | 11,040 | py | Python | onion.py | Yearning9/onion-bot | 81a7c0b823c58514ec779cd984c205d04cf32d7f | [
"MIT"
] | null | null | null | onion.py | Yearning9/onion-bot | 81a7c0b823c58514ec779cd984c205d04cf32d7f | [
"MIT"
] | null | null | null | onion.py | Yearning9/onion-bot | 81a7c0b823c58514ec779cd984c205d04cf32d7f | [
"MIT"
] | null | null | null | import io
import random
import urllib
import discord
from discord.ext import commands
with open('token.txt', 'r') as j:
    privdata = j.read().strip()  # strip the trailing newline from the token file
intents = discord.Intents.all()
bot = commands.Bot(command_prefix='.', intents=intents)
TOKEN = privdata
bot.remove_command("help")
@bot.event
async def on_ready():
await bot.change_presence(activity=discord.Game('with the garlic'))
    print('h')
@bot.command(aliases=['inspiro'])
async def inspire(ctx):
with ctx.typing():
user_agent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:59.0) Gecko/20100101 Firefox/59.0"
api_url = "http://inspirobot.me/api?generate=true"
path_response = urllib.request.urlopen(urllib.request.Request(api_url, headers={"User-Agent": user_agent}))
image = urllib.request.urlopen(
urllib.request.Request(path_response.read().decode("utf-8"), headers={"User-Agent": user_agent}))
data = io.BytesIO(image.read())
await ctx.send(file=discord.File(data, "inspirobot.jpg"))
@bot.command()
async def sad(ctx, *, message):
sad1 = message.replace('o', '<:sad:562509148239953940>')
sad2 = sad1.replace('O', '<:sad:562509148239953940>')
try:
await ctx.send(sad2)
except discord.errors.HTTPException:
await ctx.send('Message was t<:sad:562509148239953940><:sad:562509148239953940> l<:sad:562509148239953940>ng to send <:sad:562509148239953940>')
@bot.event
async def on_member_join(member):
channel = bot.get_channel(294481211030765568)
ment = member.mention
discrim = member.discriminator
name = member.name
variable = [
f"Cover yourself in oil.",
f"There are no faces.",
f"You do not recognize the bodies in the water.",
f"Your vertices will now be confiscated.",
f"Here is your complementary 55 gallon drum of E.",
f"The One in Pain is not to be disobeyed.",
f"The Great Cosmic Prism sees all.",
f"The number 7 is not real in Universe #7777777.",
f"Praise Tim the Great and Powerful.",
f"Show us your `[Error: Undefined]` certificate.",
f"Please fill out your Life Permission Form before continuing.",
f"The Noided One will screech at you while you sleep.",
f"The Dave is cool and good.",
f"Do not forget to widen your lemg.",
f"The Hive Assimilator will now convert you into bees.",
f"Undulate with style and grace.",
f"You are now cake.",
f"Garlic is the only ingredient you need.",
f"The Void will consume you someday.",
f"It has been well documented that the sky is actually paper and crayon.",
f"Don't forget to head on down to your local VoidMart and peruse their wide selection of items.",
f"Please know that you are loved.",
f"Reginald is god.",
f"Sooch is not for eating.",
f"Limescale, rust, ground-in dirt.",
f"W I G G L E .",
f"actual content of the church in the fool. the worms that is a book of the bath soaps and",
f"Lumien summoning ritual attempt beginning soon.",
f"September 18th we all become empty husks.",
f"Prepare for eternal gregation.",
f"This is definitely not hell.",
f"Gwa gwa.",
f"No way the rock.",
f"h.",
f"2008 is inevitable.",
f"We have been trying to reach you about your car's extended warranty.",
f"BEES BEES BEES BEES BEES BEES BEES BEES.",
f"This communication channel may or may not contain whalenuts.",
f"This server consists of approximately 50% cool-and-goodness, 33% despair, 12% `[ERROR: FORBIDDEN]`, 3% Bees, 2% Garlic, and an old sock.",
f"Your silverware has been relocated to Frog Balls, Arkansas.",
f"Here is your complementary jar of soil.",
f"The Azure Cold draws ever closer.",
f"Garlic bread is love, garlic bread is life.",
f"Care for a slice from the meat cylinder?",
f"The worms shall eat into your brain... That is just a step.",
f"This unit is not responsible for any loss of appetite, memory, hope, `[UNDEFINED]`, or Sedgewickness that may occur during your stay.",
f"Boobis blomp. What even is that anymore...",
f"Why settle for 3 dimensions when you can have all of them?",
f"You seem to be warm currently. Don't worry, that can be fixed.",
f"The Harvesters wish to speak to you about donating your organs.",
f"Please enter your soul to continue...",
f"Cron Cube is better than Corn Cube.",
f"The current year is 2007.",
f"Invest in GarlicCoin today!",
f"Come rest your weary elbows.",
f"Be very careful; The Hand roams this dimension.",
f"I have successfully predicted your arrival.",
f"If you are in need of assistence, then please visit Brogle Garden Centre Inc. for all your gardening, landscaping, and home care needs.",
f"Don't forget your towel.",
f"If you require sustenance, then head to your local Potato's Chip for quality eatings.",
f"Your lucky numbers are 666 and 2008.",
f"Thank you for allowing us to feast on your mind.",
f"Beware the Lemons.",
f"Please deposit any vegetals you may have into the incinerator before continuing.",
f"Norm, Norm, no form.",
f"The Hand has chosen you.",
f"actual content of the bath soaps and",
f"fool in the fool. The worms that is a good insult",
f"Fun Fact: There are 48 Regular Polyhedra.",
f"It is my displeasure to inform you that you are no longer real.",
f"Congratulations! Your prize is: A Sad Feeling.",
f"Post-Awareness Stage 6 is without description.",
f"Click this link for a free pigeon. ---> <https://youtu.be/9bFioQgjmck>",
f"Come and bask in my wisdom.",
f"I hear that Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch is quite nice this time of year.",
f"The Trout awaits you.",
f"This is not a cult.", ]
embed = discord.Embed(title=f'Welcome to the Realm of Madness, {name}#{discrim}.', description=random.choice(variable), color=0xff005a)
await channel.send(embed=embed)
@bot.event
async def on_message(message):
    # Ignore the bot's own messages before doing any processing,
    # so its responses can never trigger further commands.
    if message.author == bot.user:
        return
    await bot.process_commands(message)
variable = [
f"Cover yourself in oil.",
f"There are no faces.",
f"You do not recognize the bodies in the water.",
f"Your vertices will now be confiscated.",
f"Here is your complementary 55 gallon drum of E.",
f"The One in Pain is not to be disobeyed.",
f"The Great Cosmic Prism sees all.",
f"The number 7 is not real in Universe #7777777.",
f"Praise Tim the Great and Powerful.",
f"Show us your `[Error: Undefined]` certificate.",
f"Please fill out your Life Permission Form.",
f"The Noided One will screech at you while you sleep.",
f"The Dave is cool and good.",
f"Do not forget to widen your lemg.",
f"The Hive Assimilator will now convert you into bees.",
f"Undulate with style and grace.",
f"You are now cake.",
f"Garlic is the only ingredient you need.",
f"The Void will consume you someday.",
f"It has been well documented that the sky is actually paper and crayon.",
f"Don't forget to head on down to your local VoidMart and peruse their wide selection of items.",
f"Please know that you are loved.",
f"Reginald is god.",
f"Sooch is not for eating.",
f"Limescale, rust, ground-in dirt.",
f"W I G G L E .",
f"actual content of the church in the fool. the worms that is a book of the bath soaps and",
f"Lumien summoning ritual attempt beginning soon.",
f"September 18th we all become empty husks.",
f"Prepare for eternal gregation.",
f"Gwa gwa.",
f"No way the rock.",
f"h.",
f"2008 is inevitable.",
f"We have been trying to reach you about your car's extended warranty.",
f"BEES BEES BEES BEES BEES BEES BEES BEES.",
f"This communication channel may or may not contain whalenuts.",
f"This server consists of approximately 50% cool-and-goodness, 33% despair, 12% `[ERROR: FORBIDDEN]`, 3% Bees, 2% Garlic, and an old sock.",
f"Cron Cube is better than Corn Cube.",
f"Your silverware has been relocated to Frog Balls, Arkansas.",
f"Here is your complementary jar of soil.",
f"The Azure Cold draws ever closer.",
f"Garlic bread is love, garlic bread is life.",
f"Care for a slice from the meat cylinder?",
f"The worms shall eat into your brain... That is just a step.",
f"This unit is not responsible for any loss of appetite, memory, hope, [Undefined], or Sedgewickness that may occur during your stay.",
f"Boobis blomp. What even is that anymore...",
f"Why settle for 3 dimensions when you can have all of them?",
f"You seem to be warm currently. Don't worry, that can be fixed.",
f"The Harvesters wish to speak to you about donating your organs.",
f"Please enter your soul to continue...",
f"The current year is 2007.",
f"Invest in GarlicCoin today!",
f"Come rest your weary elbows.",
f"Be very careful; The Hand roams this dimension.",
f"I have successfully predicted this ping.",
f"If you are in need of assistence, then please visit Brogle Garden Centre Inc. for all your gardening, landscaping, and home care needs.",
f"Don't forget your towel.",
f"If you require sustenance, then head to your local Potato's Chip for quality eatings.",
f"Your lucky numbers are 666 and 2008.",
f"Thank you for allowing us to feast on your mind.",
f"Beware the Lemons.",
f"Please deposit any vegetals you may have into the incinerator before continuing.",
f"Norm, Norm, no form.",
f"The Hand has chosen you.",
f"The Trout awaits you.",
f"actual content of the bath soaps and",
f"fool in the fool. The worms that is a good insult",
f"Fun Fact: There are 48 Regular Polyhedra.",
f"It is my displeasure to inform you that you are no longer real.",
f"Congratulations! Your prize is: A Sad Feeling.",
f"Post-Awareness Stage 6 is without description.",
f"Click this link for a free pigeon. ---> <https://youtu.be/9bFioQgjmck>",
f"Come and bask in my wisdom.",
f"I hear that Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch is quite nice this time of year.",
f"This is not a cult.", ]
if bot.user.mentioned_in(message):
await message.channel.send("{}".format(random.choice(variable)))
if message.content.lower() == 'onion':
await message.channel.send("onion")
bot.load_extension('levels')
print('Loaded Levels cog')
bot.run(TOKEN)
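The `sad` command's vowel substitution above is easy to exercise without a running bot; a standalone sketch of the same logic:

```python
SAD = '<:sad:562509148239953940>'


def saddify(message: str) -> str:
    """Replace every 'o' and 'O' with the sad emoji tag, as the .sad command does."""
    return message.replace('o', SAD).replace('O', SAD)


result = saddify('Onion')  # SAD + 'ni' + SAD + 'n'
```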
# examples/helloworld_pb2_grpc.py (Ed-XCF/bali, MIT)
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
import helloworld_pb2 as helloworld__pb2
class GreeterStub(object):
"""The greeting service definition.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.SayHello = channel.unary_unary(
'/helloworld.Greeter/SayHello',
request_serializer=helloworld__pb2.HelloRequest.SerializeToString,
response_deserializer=helloworld__pb2.HelloReply.FromString,
)
self.GetGreeter = channel.unary_unary(
'/helloworld.Greeter/GetGreeter',
request_serializer=helloworld__pb2.GetRequest.SerializeToString,
response_deserializer=helloworld__pb2.ItemResponse.FromString,
)
self.ListGreeter = channel.unary_unary(
'/helloworld.Greeter/ListGreeter',
request_serializer=helloworld__pb2.ListRequest.SerializeToString,
response_deserializer=helloworld__pb2.ListResponse.FromString,
)
self.CreateGreeter = channel.unary_unary(
'/helloworld.Greeter/CreateGreeter',
request_serializer=helloworld__pb2.CreateRequest.SerializeToString,
response_deserializer=helloworld__pb2.ItemResponse.FromString,
)
self.GetItem = channel.unary_unary(
'/helloworld.Greeter/GetItem',
request_serializer=helloworld__pb2.GetRequest.SerializeToString,
response_deserializer=helloworld__pb2.ItemResponse.FromString,
)
self.ListItems = channel.unary_unary(
'/helloworld.Greeter/ListItems',
request_serializer=helloworld__pb2.ListRequest.SerializeToString,
response_deserializer=helloworld__pb2.ListResponse.FromString,
)
self.CreateItem = channel.unary_unary(
'/helloworld.Greeter/CreateItem',
request_serializer=helloworld__pb2.CreateRequest.SerializeToString,
response_deserializer=helloworld__pb2.ItemResponse.FromString,
)
class GreeterServicer(object):
"""The greeting service definition.
"""
def SayHello(self, request, context):
"""Sends a greeting
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetGreeter(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListGreeter(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateGreeter(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetItem(self, request, context):
"""item resource
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListItems(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateItem(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_GreeterServicer_to_server(servicer, server):
rpc_method_handlers = {
'SayHello': grpc.unary_unary_rpc_method_handler(
servicer.SayHello,
request_deserializer=helloworld__pb2.HelloRequest.FromString,
response_serializer=helloworld__pb2.HelloReply.SerializeToString,
),
'GetGreeter': grpc.unary_unary_rpc_method_handler(
servicer.GetGreeter,
request_deserializer=helloworld__pb2.GetRequest.FromString,
response_serializer=helloworld__pb2.ItemResponse.SerializeToString,
),
'ListGreeter': grpc.unary_unary_rpc_method_handler(
servicer.ListGreeter,
request_deserializer=helloworld__pb2.ListRequest.FromString,
response_serializer=helloworld__pb2.ListResponse.SerializeToString,
),
'CreateGreeter': grpc.unary_unary_rpc_method_handler(
servicer.CreateGreeter,
request_deserializer=helloworld__pb2.CreateRequest.FromString,
response_serializer=helloworld__pb2.ItemResponse.SerializeToString,
),
'GetItem': grpc.unary_unary_rpc_method_handler(
servicer.GetItem,
request_deserializer=helloworld__pb2.GetRequest.FromString,
response_serializer=helloworld__pb2.ItemResponse.SerializeToString,
),
'ListItems': grpc.unary_unary_rpc_method_handler(
servicer.ListItems,
request_deserializer=helloworld__pb2.ListRequest.FromString,
response_serializer=helloworld__pb2.ListResponse.SerializeToString,
),
'CreateItem': grpc.unary_unary_rpc_method_handler(
servicer.CreateItem,
request_deserializer=helloworld__pb2.CreateRequest.FromString,
response_serializer=helloworld__pb2.ItemResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'helloworld.Greeter', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Greeter(object):
"""The greeting service definition.
"""
@staticmethod
def SayHello(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/SayHello',
helloworld__pb2.HelloRequest.SerializeToString,
helloworld__pb2.HelloReply.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetGreeter(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/GetGreeter',
helloworld__pb2.GetRequest.SerializeToString,
helloworld__pb2.ItemResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListGreeter(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/ListGreeter',
helloworld__pb2.ListRequest.SerializeToString,
helloworld__pb2.ListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def CreateGreeter(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/CreateGreeter',
helloworld__pb2.CreateRequest.SerializeToString,
helloworld__pb2.ItemResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetItem(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/GetItem',
helloworld__pb2.GetRequest.SerializeToString,
helloworld__pb2.ItemResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListItems(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/ListItems',
helloworld__pb2.ListRequest.SerializeToString,
helloworld__pb2.ListResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def CreateItem(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/helloworld.Greeter/CreateItem',
helloworld__pb2.CreateRequest.SerializeToString,
helloworld__pb2.ItemResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
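`add_GreeterServicer_to_server` above is essentially a dispatch table: RPC names mapped to handlers that wrap bound servicer methods. The same pattern, reduced to plain Python (the `EchoServicer` name here is illustrative, not part of the generated API):

```python
class EchoServicer:
    def SayHello(self, request):
        return 'Hello, ' + request + '!'


def build_handlers(servicer):
    """Mirror rpc_method_handlers: map each RPC name to a bound method."""
    return {'SayHello': servicer.SayHello}


handlers = build_handlers(EchoServicer())
reply = handlers['SayHello']('world')  # dispatch by RPC name
```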
# db_api/serializers.py (connorrunyan1/ragnarok, MIT)
from rest_framework import serializers
from rest_framework import exceptions
from . import models
# tensorflow/_api/v2/compat/v2/random/__init__.py (Lube-Project/ProgettoLube, MIT)
# This file is MACHINE GENERATED! Do not edit.
# Generated by: tensorflow/python/tools/api/generator/create_python_api.py script.
"""Public API for tf.random namespace.
"""
from __future__ import print_function as _print_function
import sys as _sys
from . import experimental
from tensorflow.python.framework.random_seed import set_seed
from tensorflow.python.ops.candidate_sampling_ops import all_candidate_sampler
from tensorflow.python.ops.candidate_sampling_ops import fixed_unigram_candidate_sampler
from tensorflow.python.ops.candidate_sampling_ops import learned_unigram_candidate_sampler
from tensorflow.python.ops.candidate_sampling_ops import log_uniform_candidate_sampler
from tensorflow.python.ops.candidate_sampling_ops import uniform_candidate_sampler
from tensorflow.python.ops.random_ops import categorical
from tensorflow.python.ops.random_ops import random_gamma as gamma
from tensorflow.python.ops.random_ops import random_normal as normal
from tensorflow.python.ops.random_ops import random_poisson_v2 as poisson
from tensorflow.python.ops.random_ops import random_shuffle as shuffle
from tensorflow.python.ops.random_ops import random_uniform as uniform
from tensorflow.python.ops.random_ops import truncated_normal
from tensorflow.python.ops.stateful_random_ops import Algorithm
from tensorflow.python.ops.stateful_random_ops import Generator
from tensorflow.python.ops.stateful_random_ops import create_rng_state
from tensorflow.python.ops.stateful_random_ops import get_global_generator
from tensorflow.python.ops.stateful_random_ops import set_global_generator
from tensorflow.python.ops.stateless_random_ops import stateless_categorical
from tensorflow.python.ops.stateless_random_ops import stateless_parameterized_truncated_normal
from tensorflow.python.ops.stateless_random_ops import stateless_random_binomial as stateless_binomial
from tensorflow.python.ops.stateless_random_ops import stateless_random_gamma as stateless_gamma
from tensorflow.python.ops.stateless_random_ops import stateless_random_normal as stateless_normal
from tensorflow.python.ops.stateless_random_ops import stateless_random_poisson as stateless_poisson
from tensorflow.python.ops.stateless_random_ops import stateless_random_uniform as stateless_uniform
from tensorflow.python.ops.stateless_random_ops import stateless_truncated_normal
del _print_function
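The file above is a pure re-export shim: it aliases internal implementation symbols (e.g. `random_normal as normal`) into the public `tf.random` namespace. The same aliasing pattern works with any module; a minimal stdlib sketch:

```python
# Re-export an implementation symbol under a friendlier public name,
# as the generated shim above does with tensorflow internals.
from random import gauss as normal

sample = normal(0.0, 1.0)  # callers see only the public alias
```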
# dbas/views/review/tests/test_json.py (tbsschroeder/dbas, MIT)
import unittest
import transaction
from Levenshtein import distance
from pyramid import testing
from dbas.database.discussion_model import ReviewMerge, DBDiscussionSession, ReviewSplit, PremiseGroup, \
LastReviewerMerge, Argument, PremiseGroupSplitted, ReviewSplitValues, LastReviewerSplit, ReviewMergeValues, \
PremiseGroupMerged, ArgumentsAddedByPremiseGroupSplit, LastReviewerDelete, LastReviewerDuplicate, \
LastReviewerEdit, LastReviewerOptimization, ReputationHistory, ReviewCanceled, ReviewDelete, ReviewDuplicate, \
ReviewEdit, ReviewEditValue, ReviewOptimization, RevokedContentHistory, Statement
from dbas.lib import get_text_for_argument_uid, nick_of_anonymous_user
from dbas.review import ReviewDeleteReasons
from dbas.review.queue import key_delete, key_duplicate, key_merge, key_split
from dbas.tests.utils import TestCaseWithConfig, construct_dummy_request
from dbas.views import review_delete_argument, revoke_statement_content, flag_argument_or_statement, \
split_or_merge_statement, split_or_merge_premisegroup, review_edit_argument, review_splitted_premisegroup, \
review_duplicate_statement, review_optimization_argument, undo_review, cancel_review, review_lock, \
review_merged_premisegroup
class AjaxReviewTest(unittest.TestCase):
def setUp(self):
super().setUp()
self.config = testing.setUp()
self.config.include('pyramid_chameleon')
DBDiscussionSession.add(ReviewOptimization(detector=2, statement=10))
DBDiscussionSession.flush()
    # Tests every ajax view that is not covered by other test classes
def tearDown(self):
DBDiscussionSession.query(ReviewOptimization).filter_by(detector_uid=2, statement_uid=10).delete()
DBDiscussionSession.flush()
testing.tearDown()
super().tearDown()
def test_flag_argument_or_statement(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
DBDiscussionSession.query(ReviewDelete).filter_by(statement_uid=2).delete()
DBDiscussionSession.query(ReviewOptimization).filter_by(statement_uid=2).delete()
transaction.commit()
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 20,
'reason': ReviewDeleteReasons.offtopic.value,
'is_argument': False,
})
response = flag_argument_or_statement(request)
self.assertIsNotNone(response)
self.assertNotEqual(0, len(response['success']))
self.assertEqual(0, len(response['info']))
def test_flag_argument_or_statement_twice(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 20,
'reason': ReviewDeleteReasons.offtopic.value,
'is_argument': False,
})
response = flag_argument_or_statement(request)
self.assertIsNotNone(response)
self.assertEqual(0, len(response['success']))
self.assertNotEqual(0, len(response['info']))
DBDiscussionSession.query(ReviewDelete).filter_by(statement_uid=2).delete()
DBDiscussionSession.query(ReviewOptimization).filter_by(statement_uid=2).delete()
transaction.commit()
def test_flag_argument_or_statement_error_user(self):
request = construct_dummy_request(json_body={})
response = flag_argument_or_statement(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_flag_argument_or_statement_error_reason(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 20,
'reason': 'some_fake_reason',
'is_argument': False,
})
response = flag_argument_or_statement(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_flag_argument_or_statement_error_uid(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 'a',
'reason': ReviewDeleteReasons.offtopic.value,
'is_argument': False,
})
response = flag_argument_or_statement(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def __exec_request_and_check_reviewes(self, db_review, ajax, keyword, boolean, nickname, reviewer_type):
self.config.testing_securitypolicy(userid=nickname, permissive=True)
db_reviews1 = DBDiscussionSession.query(reviewer_type).filter_by(review_uid=db_review.uid).count()
request = construct_dummy_request(json_body={
keyword: boolean,
'review_uid': db_review.uid
})
response = ajax(request)
db_reviews2 = DBDiscussionSession.query(reviewer_type).filter_by(review_uid=db_review.uid).count()
self.assertIsNotNone(response)
self.assertTrue(response)
        self.assertEqual(db_reviews1 + 1, db_reviews2)
def test_review_delete_argument(self):
        db_review = DBDiscussionSession.query(ReviewDelete).filter(ReviewDelete.statement_uid.isnot(None),
                                                                   ReviewDelete.is_executed == False).first()
# 1:0
self.__exec_request_and_check_reviewes(db_review, review_delete_argument, 'should_delete', True, 'Pascal',
LastReviewerDelete)
# 1:1
self.__exec_request_and_check_reviewes(db_review, review_delete_argument, 'should_delete', False, 'Kurt',
LastReviewerDelete)
# 2:1
self.__exec_request_and_check_reviewes(db_review, review_delete_argument, 'should_delete', True, 'Torben',
LastReviewerDelete)
# 3:1
self.__exec_request_and_check_reviewes(db_review, review_delete_argument, 'should_delete', True, 'Friedrich',
LastReviewerDelete)
# 4:1
db_reputation1 = DBDiscussionSession.query(ReputationHistory).count()
self.__exec_request_and_check_reviewes(db_review, review_delete_argument, 'should_delete', True, 'Thorsten',
LastReviewerDelete)
transaction.commit()
db_reputation2 = DBDiscussionSession.query(ReputationHistory).count()
db_statement = DBDiscussionSession.query(Statement).get(db_review.statement_uid)
self.assertTrue(db_statement.is_disabled)
        self.assertEqual(db_reputation1 + 1, db_reputation2)
def test_review_delete_argument_uid_error(self):
self.config.testing_securitypolicy(userid='Pascal', permissive=True)
request = construct_dummy_request(json_body={
'should_delete': True,
'review_uid': 'a'
})
response = review_delete_argument(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_review_delete_argument_author_error(self):
        db_review = DBDiscussionSession.query(ReviewDelete).filter(ReviewDelete.statement_uid.isnot(None),
                                                                   ReviewDelete.is_executed == False).first()
self.__exec_request_and_check_reviewes(db_review, review_delete_argument, 'should_delete', True, 'Pascal',
LastReviewerDelete)
def test_review_optimization_argument(self):
        db_review = DBDiscussionSession.query(ReviewOptimization).filter(ReviewOptimization.statement_uid.isnot(None),
                                                                         ReviewOptimization.is_executed == False).first()
# 0:1
self.__exec_request_and_check_reviewes(db_review, review_optimization_argument, 'should_optimized', False,
'Kurt',
LastReviewerOptimization)
# 0:2
self.__exec_request_and_check_reviewes(db_review, review_optimization_argument, 'should_optimized', False,
'Pascal',
LastReviewerOptimization)
# 0:3
db_reputation1 = DBDiscussionSession.query(ReputationHistory).count()
self.__exec_request_and_check_reviewes(db_review, review_optimization_argument, 'should_optimized', False,
'Torben',
LastReviewerOptimization)
transaction.commit()
db_reputation2 = DBDiscussionSession.query(ReputationHistory).count()
self.assertEqual(db_reputation1, db_reputation2)
def test_review_optimization_argument_for_edit(self):
        db_review = DBDiscussionSession.query(ReviewOptimization).filter(ReviewOptimization.statement_uid.isnot(None),
                                                                         ReviewOptimization.is_executed == False).first()
self.config.testing_securitypolicy(userid='Kurt', permissive=True)
db_reviews1 = DBDiscussionSession.query(LastReviewerEdit).filter_by(review_uid=db_review.uid).count()
db_edits1 = DBDiscussionSession.query(ReviewEdit).count()
db_values1 = DBDiscussionSession.query(ReviewEditValue).count()
request = construct_dummy_request(json_body={
'should_optimized': True,
'review_uid': db_review.uid,
'new_data': [{
'statement': 22,
'type': 'statement',
'argument': 0,
'val': 'The purpose of a pet is to have something to take care of and to cuddle with'
}]
})
response = review_optimization_argument(request)
transaction.commit()
db_reviews2 = DBDiscussionSession.query(LastReviewerEdit).filter_by(review_uid=db_review.uid).count()
db_edits2 = DBDiscussionSession.query(ReviewEdit).count()
db_values2 = DBDiscussionSession.query(ReviewEditValue).count()
self.assertIsNotNone(response)
self.assertTrue(response)
self.assertEqual(db_reviews1 + 1, db_reviews2)
self.assertNotEqual(db_edits1, db_edits2)
self.assertNotEqual(db_values1, db_values2)
def test_review_optimization_argument_author_error(self):
db_review = DBDiscussionSession.query(ReviewOptimization).first()
self.config.testing_securitypolicy(userid='', permissive=True)
request = construct_dummy_request(json_body={
'is_edit_okay': True,
'review_uid': db_review.uid
})
response = review_optimization_argument(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_review_optimization_argument_uid_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'is_edit_okay': True,
'review_uid': 'a'
})
response = review_optimization_argument(request)
self.assertEqual(response.status_code, 400)
def test_review_edit_argument(self):
db_review = DBDiscussionSession.query(ReviewEdit).filter_by(is_executed=False).first()
user_ids = ['Torben', 'Pascal']
for user in user_ids:
self.config.testing_securitypolicy(userid=user, permissive=True)
db_reviews1 = DBDiscussionSession.query(LastReviewerDelete).filter_by(review_uid=db_review.uid).count()
request = construct_dummy_request(json_body={
'is_edit_okay': True,
'review_uid': db_review.uid
})
response = review_edit_argument(request)
transaction.commit()
db_reviews2 = DBDiscussionSession.query(LastReviewerDelete).filter_by(review_uid=db_review.uid).count()
self.assertIsNotNone(response)
self.assertTrue(response)
self.assertEqual(db_reviews1 + 1, db_reviews2)
self.config.testing_securitypolicy(userid='Hermann', permissive=True)
db_reviews1 = DBDiscussionSession.query(LastReviewerDelete).filter_by(review_uid=db_review.uid).count()
db_reputation1 = DBDiscussionSession.query(ReputationHistory).count()
request = construct_dummy_request(json_body={
'is_edit_okay': True,
'review_uid': db_review.uid
})
response = review_edit_argument(request)
transaction.commit()
db_reviews2 = DBDiscussionSession.query(LastReviewerDelete).filter_by(review_uid=db_review.uid).count()
db_reputation2 = DBDiscussionSession.query(ReputationHistory).count()
self.assertIsNotNone(response)
self.assertTrue(response)
self.assertEqual(db_reviews1 + 1, db_reviews2)
self.assertNotEqual(db_reputation1, db_reputation2)
self.config.testing_securitypolicy(userid='Torben', permissive=True)
db_reviews1 = DBDiscussionSession.query(LastReviewerDelete).filter_by(review_uid=db_review.uid).count()
request = construct_dummy_request(json_body={
'should_optimized': False,
'review_uid': db_review.uid
})
response = review_edit_argument(request)
transaction.commit()
db_reviews2 = DBDiscussionSession.query(LastReviewerDelete).filter_by(review_uid=db_review.uid).count()
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
self.assertEqual(db_reviews1, db_reviews2)
def test_review_edit_argument_author_error(self):
db_review = DBDiscussionSession.query(ReviewEdit).first()
self.config.testing_securitypolicy(userid='', permissive=True)
request = construct_dummy_request(json_body={
'should_optimized': True,
'review_uid': db_review.uid,
'new_data': 'new data for some statement'
})
response = review_edit_argument(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_review_edit_argument_uid_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'should_optimized': True,
'review_uid': 'a',
'new_data': 'new data for some statement'
})
response = review_edit_argument(request)
self.assertEqual(response.status_code, 400)
def __exec_request_and_check_duplicates(self, db_review, ajax, boolean, nickname):
self.config.testing_securitypolicy(userid=nickname, permissive=True)
db_reviews1 = DBDiscussionSession.query(LastReviewerDuplicate).filter_by(review_uid=db_review.uid).count()
request = construct_dummy_request(json_body={
'is_duplicate': boolean,
'review_uid': db_review.uid
})
response = ajax(request)
transaction.commit()
db_reviews2 = DBDiscussionSession.query(LastReviewerDuplicate).filter_by(review_uid=db_review.uid).count()
self.assertIsNotNone(response)
self.assertTrue(response)
self.assertEqual(db_reviews1 + 1, db_reviews2)
def test_review_duplicate_statement(self):
db_review = DBDiscussionSession.query(ReviewDuplicate).filter_by(is_executed=False).first()
# 1:1
self.__exec_request_and_check_duplicates(db_review, review_duplicate_statement, True, 'Pascal')
# 1:2
self.__exec_request_and_check_duplicates(db_review, review_duplicate_statement, True, 'Kurt')
# 1:3
self.__exec_request_and_check_duplicates(db_review, review_duplicate_statement, True, 'Torben')
# 1:4
self.__exec_request_and_check_duplicates(db_review, review_duplicate_statement, True, 'Thorsten')
self.config.testing_securitypolicy(userid='Friedrich', permissive=True)
db_reviews1 = DBDiscussionSession.query(LastReviewerDuplicate).filter_by(review_uid=db_review.uid).count()
request = construct_dummy_request(json_body={
'should_optimized': False,
'review_uid': db_review.uid
})
response = review_duplicate_statement(request)
transaction.commit()
db_reviews2 = DBDiscussionSession.query(LastReviewerDuplicate).filter_by(review_uid=db_review.uid).count()
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
self.assertEqual(db_reviews1, db_reviews2)
def test_review_duplicate_statement_author_error(self):
db_review = DBDiscussionSession.query(ReviewDuplicate).first()
self.config.testing_securitypolicy(userid='', permissive=True)
request = construct_dummy_request(json_body={
'is_duplicate': True,
'review_uid': db_review.uid,
})
response = review_duplicate_statement(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_review_duplicate_uid_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'is_duplicate': True,
'review_uid': 'a'
})
response = review_duplicate_statement(request)
self.assertEqual(response.status_code, 400)
def test_undo_review(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
db_canceled1 = DBDiscussionSession.query(ReviewCanceled).count()
request = construct_dummy_request(json_body={
'queue': key_delete,
'uid': 5
})
self.assertTrue(undo_review(request))
db_canceled2 = DBDiscussionSession.query(ReviewCanceled).count()
self.assertNotEqual(db_canceled1, db_canceled2)
def test_undo_review_author_error(self):
self.config.testing_securitypolicy(userid='', permissive=True)
db_canceled1 = DBDiscussionSession.query(ReviewCanceled).count()
request = construct_dummy_request(json_body={
'queue': key_delete,
'uid': 5
})
response = undo_review(request)
db_canceled2 = DBDiscussionSession.query(ReviewCanceled).count()
self.assertEqual(response.status_code, 400)
self.assertEqual(db_canceled1, db_canceled2)
def test_cancel_review(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
db_canceled1 = DBDiscussionSession.query(ReviewCanceled).count()
request = construct_dummy_request(json_body={
'queue': key_delete,
'uid': 4
})
response = cancel_review(request)
db_canceled2 = DBDiscussionSession.query(ReviewCanceled).count()
self.assertTrue(response)
self.assertNotEqual(db_canceled1, db_canceled2)
def test_cancel_review_author_error(self):
self.config.testing_securitypolicy(userid='', permissive=True)
db_canceled1 = DBDiscussionSession.query(ReviewCanceled).count()
request = construct_dummy_request(json_body={
'queue': key_delete,
'uid': 4
})
response = cancel_review(request)
db_canceled2 = DBDiscussionSession.query(ReviewCanceled).count()
self.assertEqual(response.status_code, 400)
self.assertEqual(db_canceled1, db_canceled2)
def test_cancel_review_queue_error(self):
self.config.testing_securitypolicy(userid='', permissive=True)
db_canceled1 = DBDiscussionSession.query(ReviewCanceled).count()
request = construct_dummy_request(json_body={
'queue': 'some_queue',
'uid': 4
})
response = cancel_review(request)
db_canceled2 = DBDiscussionSession.query(ReviewCanceled).count()
self.assertEqual(response.status_code, 400)
self.assertEqual(db_canceled1, db_canceled2)
def test_review_lock(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
db_review = DBDiscussionSession.query(ReviewOptimization).first()
request = construct_dummy_request(json_body={
'review_uid': db_review.uid,
'lock': True
})
response = review_lock(request)
self.assertIsNotNone(response)
self.assertTrue(len(response['success']) != 0)
self.assertTrue(len(response['info']) == 0)
self.assertTrue(response['is_locked'])
def test_review_lock_twice(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
db_review = DBDiscussionSession.query(ReviewOptimization).first()
request = construct_dummy_request(json_body={
'review_uid': db_review.uid,
'lock': True
})
response = review_lock(request)
self.assertIsNotNone(response)
self.assertTrue(len(response['success']) == 0)
self.assertTrue(len(response['info']) != 0)
self.assertTrue(response['is_locked'])
def test_review_lock_author_error(self):
self.config.testing_securitypolicy(userid='', permissive=True)
db_review = DBDiscussionSession.query(ReviewOptimization).first()
request = construct_dummy_request(json_body={
'review_uid': db_review.uid,
'lock': True
})
response = review_lock(request)
self.assertIsNotNone(response)
self.assertEqual(response.status_code, 400)
def test_review_lock_id_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': 100,
'lock': 'true'
})
response = review_lock(request)
self.assertEqual(response.status_code, 400)
def test_review_unlock(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
db_review = DBDiscussionSession.query(ReviewOptimization).first()
request = construct_dummy_request(json_body={
'review_uid': db_review.uid,
'lock': False
})
response = review_lock(request)
self.assertIsNotNone(response)
self.assertTrue(len(response['success']) != 0)
self.assertTrue(len(response['info']) == 0)
self.assertFalse(response['is_locked'])
def test_revoke_content_author_error2(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
db_content1 = DBDiscussionSession.query(RevokedContentHistory).count()
request = construct_dummy_request(json_body={
'statement_id': 3,
})
response = revoke_statement_content(request)
db_content2 = DBDiscussionSession.query(RevokedContentHistory).count()
self.assertEqual(response.status_code, 400)
self.assertEqual(db_content1, db_content2)
def test_duplicate_statement_review(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
uid = 15
from dbas.lib import get_text_for_argument_uid, get_all_arguments_by_statement
argument_uid = get_all_arguments_by_statement(uid)[0].uid
db_review1 = DBDiscussionSession.query(ReviewDuplicate).count()
request = construct_dummy_request(json_body={
'uid': uid,  # 'cats are very independent'
'reason': 'duplicate',
'extra_uid': 1,  # 'Cats are fucking stupid and bloody fuzzy critters!'
'is_argument': False
})
response = flag_argument_or_statement(request)
db_review2 = DBDiscussionSession.query(ReviewDuplicate).count()
self.assertIsNotNone(response)
self.assertEqual(response['info'], '')
self.assertNotEqual(len(response['success']), 0)
self.assertLess(db_review1, db_review2)
db_review = DBDiscussionSession.query(ReviewDuplicate).filter_by(duplicate_statement_uid=uid,
original_statement_uid=1).first()
self.assertFalse(db_review.is_executed)
# vote for duplicate
oem_text = get_text_for_argument_uid(argument_uid)
for name in ['Marga', 'Emmi', 'Rupert']:
self.config.testing_securitypolicy(userid=name, permissive=True)
db_review1 = DBDiscussionSession.query(LastReviewerDuplicate).count()
request = construct_dummy_request(json_body={
'is_duplicate': True,
'review_uid': db_review.uid
})
response = review_duplicate_statement(request)
db_review2 = DBDiscussionSession.query(LastReviewerDuplicate).count()
self.assertIsNotNone(response)
self.assertTrue(response)
self.assertLess(db_review1, db_review2)
new_text = get_text_for_argument_uid(argument_uid)
self.assertNotEqual(oem_text, new_text)
self.assertIn('fucking', new_text)
# we only can revoke decisions, which are executed (refresh the object)
db_review = DBDiscussionSession.query(ReviewDuplicate).filter_by(duplicate_statement_uid=uid,
original_statement_uid=1).first()
self.assertTrue(db_review.is_executed)
self.assertIsNotNone(DBDiscussionSession.query(ReviewDuplicate).get(db_review.uid))
# revoke the decision
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': db_review.uid,
'queue': key_duplicate,
})
self.assertTrue(undo_review(request))
new_oem_text = get_text_for_argument_uid(argument_uid)
self.assertTrue(
oem_text == new_oem_text or distance(oem_text.strip().lower(), new_oem_text.strip().lower()) == 12)
self.assertNotIn('fucking', new_oem_text)
def test_split_or_merge_statement_key_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uidd': 1,
'key': 'it crashes',
'text_values': []
})
self.assertEqual(400, split_or_merge_statement(request).status_code)
def test_split_statement_pgroup_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 0,
'key': 'split',
'text_values': ['']
})
self.assertEqual(400, split_or_merge_statement(request).status_code)
def test_split_statement_textvalue_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 2,
'key': 'split',
'text_values': ['']
})
self.assertEqual(400, split_or_merge_statement(request).status_code)
def test_split_statement_user_error(self):
self.config.testing_securitypolicy(userid='nobody', permissive=True)
request = construct_dummy_request(json_body={
'uid': 1,
'key': 'split',
'text_values': []
})
self.assertEqual(400, split_or_merge_statement(request).status_code)
def test_split_statement(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 20,
'key': 'split',
'text_values': ['it is based on the cats race', 'not every cat is capricious']
})
# oem of 20 is: 'the fact, that cats are capricious, is based on the cats race'
db_review1 = DBDiscussionSession.query(ReviewSplit).count()
db_values1 = DBDiscussionSession.query(ReviewSplitValues).count()
response = split_or_merge_statement(request)
db_review2 = DBDiscussionSession.query(ReviewSplit).count()
db_values2 = DBDiscussionSession.query(ReviewSplitValues).count()
self.assertEqual(len(response['info']), 0)
self.assertNotEqual(len(response['success']), 0)
self.assertEqual(db_review1 + 1, db_review2)
self.assertEqual(db_values1 + 2, db_values2)
tmp = DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=16).first()
DBDiscussionSession.query(ReviewSplitValues).filter_by(review_uid=tmp.uid).delete()
DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=16).delete()
DBDiscussionSession.flush()
transaction.commit()
def test_merge_statement_pgroup_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 0,
'key': 'merge',
'text_values': ['']
})
response = split_or_merge_statement(request)
self.assertEqual(response.status_code, 400)
def test_merge_statement_textvalue_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 2,
'key': 'merge',
'text_values': ['']
})
self.assertEqual(400, split_or_merge_statement(request).status_code)
def test_merge_statement_user_error(self):
self.config.testing_securitypolicy(userid='nobody', permissive=True)
request = construct_dummy_request(json_body={
'uid': 1,
'key': 'merge',
'text_values': []
})
self.assertEqual(400, split_or_merge_statement(request).status_code)
def test_merge_statement(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 24,
'key': 'merge',
'text_values': ['it is based on the cats race', 'not every cat is capricious']
})
# oem of 20 is: 'the fact, that cats are capricious, is based on the cats race'
db_review1 = DBDiscussionSession.query(ReviewMerge).count()
db_values1 = DBDiscussionSession.query(ReviewMergeValues).count()
response = split_or_merge_statement(request)
db_review2 = DBDiscussionSession.query(ReviewMerge).count()
db_values2 = DBDiscussionSession.query(ReviewMergeValues).count()
self.assertEqual(len(response['info']), 0)
self.assertNotEqual(len(response['success']), 0)
self.assertEqual(db_review1 + 1, db_review2)
self.assertEqual(db_values1 + 2, db_values2)
tmp = DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=20).first()
DBDiscussionSession.query(ReviewMergeValues).filter_by(review_uid=tmp.uid).delete()
DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=20).delete()
DBDiscussionSession.flush()
transaction.commit()
def test_split_premisegroup_key_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'pgroup_uidd': 1,
'key': 'split',
})
self.assertEqual(400, split_or_merge_premisegroup(request).status_code)
def test_split_premisegroup_pgroup_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'pgroup_uid': 0,
'key': 'split',
})
self.assertEqual(400, split_or_merge_premisegroup(request).status_code)
def test_split_premisegroup_user_error(self):
self.config.testing_securitypolicy(userid='some_user', permissive=True)
request = construct_dummy_request(json_body={
'pgroup_uid': 0,
'key': 'split',
})
self.assertEqual(400, split_or_merge_premisegroup(request).status_code)
def test_split_premisegroup(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 21,
'key': 'split',
})
db_review1 = DBDiscussionSession.query(ReviewSplit).count()
db_values1 = DBDiscussionSession.query(ReviewSplitValues).count()
response = split_or_merge_premisegroup(request)
db_review2 = DBDiscussionSession.query(ReviewSplit).count()
db_values2 = DBDiscussionSession.query(ReviewSplitValues).count()
self.assertEqual(len(response['info']), 0)
self.assertNotEqual(len(response['success']), 0)
self.assertEqual(db_review1 + 1, db_review2)
self.assertEqual(db_values1, db_values2)
tmp = DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=21).first()
DBDiscussionSession.query(ReviewSplitValues).filter_by(review_uid=tmp.uid).delete()
DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=tmp.uid).delete()
DBDiscussionSession.query(PremiseGroupSplitted).filter_by(review_uid=tmp.uid).delete()
DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=21).delete()
DBDiscussionSession.flush()
transaction.commit()
def test_merge_premisegroup_pgroup_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 0,
'key': 'merge',
})
self.assertEqual(400, split_or_merge_premisegroup(request).status_code)
def test_merge_premisegroup_user_error(self):
self.config.testing_securitypolicy(userid='some_user', permissive=True)
request = construct_dummy_request(json_body={
'uid': 0,
'key': 'merge',
})
self.assertEqual(400, split_or_merge_premisegroup(request).status_code)
def test_merge_premisegroup(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'uid': 21,
'key': 'merge',
})
db_review1 = DBDiscussionSession.query(ReviewMerge).count()
db_values1 = DBDiscussionSession.query(ReviewMergeValues).count()
response = split_or_merge_premisegroup(request)
db_review2 = DBDiscussionSession.query(ReviewMerge).count()
db_values2 = DBDiscussionSession.query(ReviewMergeValues).count()
self.assertEqual(0, len(response['info']))
self.assertNotEqual(0, len(response['success']))
self.assertEqual(db_review1 + 1, db_review2)
self.assertEqual(db_values1, db_values2)
tmp = DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=21).first()
DBDiscussionSession.query(ReviewMergeValues).filter_by(review_uid=tmp.uid).delete()
DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=21).delete()
DBDiscussionSession.flush()
transaction.commit()
def test_review_splitted_premisegroup_uid_error(self):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': 0,
'should_split': True,
})
response = review_splitted_premisegroup(request)
self.assertEqual(response.status_code, 400)
def test_review_splitted_premisegroup_user_error(self):
self.config.testing_securitypolicy(userid='some_child', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': 1,
'should_split': True,
})
response = review_splitted_premisegroup(request)
self.assertEqual(response.status_code, 400)
def test_review_splitted_premisegroup(self, statement_uid: int = 25, pgroup_uid: int = 21, resetdb: bool = True):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
# add something for a review
request = construct_dummy_request(json_body={
'uid': statement_uid,
'key': 'split',
'text_values': ['it is based on the cats race', 'not every cat is capricious']
})
# oem of pgroup pgroup_uid is: 'the fact, that cats are capricious, is based on the cats race'
split_or_merge_statement(request)
db_review_split = DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=pgroup_uid).first()
db_arguments_with_pgroup = DBDiscussionSession.query(Argument).filter_by(premisegroup_uid=pgroup_uid).all()
# vote 1:0
request = construct_dummy_request(json_body={
'review_uid': db_review_split.uid,
'should_split': True
})
response = review_splitted_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=db_review_split.uid)
pro = db_review.filter_by(should_split=True).count()
con = db_review.filter_by(should_split=False).count()
self.assertTrue(response)
self.assertEqual(pro, 1)
self.assertEqual(con, 0)
# vote 2:0
self.config.testing_securitypolicy(userid='Christian', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_split.uid,
'should_split': True,
})
response = review_splitted_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=db_review_split.uid)
pro = db_review.filter_by(should_split=True).count()
con = db_review.filter_by(should_split=False).count()
self.assertTrue(response)
self.assertEqual(pro, 2)
self.assertEqual(con, 0)
# vote 2:1
self.config.testing_securitypolicy(userid='Bob', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_split.uid,
'should_split': False,
})
response = review_splitted_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=db_review_split.uid)
pro = db_review.filter_by(should_split=True).count()
con = db_review.filter_by(should_split=False).count()
self.assertTrue(response)
self.assertEqual(pro, 2)
self.assertEqual(con, 1)
# vote 3:1
self.config.testing_securitypolicy(userid='Pascal', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_split.uid,
'should_split': True,
})
response = review_splitted_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=db_review_split.uid)
pro = db_review.filter_by(should_split=True).count()
con = db_review.filter_by(should_split=False).count()
self.assertTrue(response)
self.assertEqual(pro, 3)
self.assertEqual(con, 1)
# vote 4:1
self.config.testing_securitypolicy(userid='Kurt', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_split.uid,
'should_split': True,
})
arg_old_len = DBDiscussionSession.query(Argument).count()
response = review_splitted_premisegroup(request)
arg_new_len = DBDiscussionSession.query(Argument).count()
db_review = DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=db_review_split.uid)
pro = db_review.filter_by(should_split=True).count()
con = db_review.filter_by(should_split=False).count()
self.assertTrue(response)
self.assertEqual(pro, 4)
self.assertEqual(con, 1)
self.assertEqual(arg_old_len + 1, arg_new_len)
# now the vote is executed
self.assertTrue(DBDiscussionSession.query(ReviewSplit).get(db_review_split.uid).is_executed)
# check the new premisegroups
for arg in db_arguments_with_pgroup:
tmp_arg = DBDiscussionSession.query(Argument).get(arg.uid)
self.assertNotEqual(pgroup_uid, tmp_arg.premisegroup_uid)
self.assertNotEqual(arg.premisegroup_uid, tmp_arg.premisegroup_uid)
# remove added args
add_args = DBDiscussionSession.query(ArgumentsAddedByPremiseGroupSplit).filter_by(
review_uid=db_review_split.uid).all()
self.assertEqual(len(add_args), 1) # one argument was added and one was modified
# map() is lazy in Python 3 and the deletes were never executed; filter_by also needs a keyword argument
for arg in add_args:
DBDiscussionSession.query(Argument).filter_by(uid=arg.uid).delete()
if resetdb:
DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=db_review_split.uid).delete()
DBDiscussionSession.query(ReviewSplitValues).filter_by(review_uid=db_review_split.uid).delete()
DBDiscussionSession.query(PremiseGroupSplitted).filter_by(review_uid=db_review_split.uid).delete()
DBDiscussionSession.query(ArgumentsAddedByPremiseGroupSplit).filter_by(
review_uid=db_review_split.uid).delete()
DBDiscussionSession.flush()
DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=pgroup_uid).delete()
DBDiscussionSession.flush()
transaction.commit()
def test_review_merged_premisegroup(self, statement_uid=31, pgroup_uid=27, resetdb=True):
self.config.testing_securitypolicy(userid='Tobias', permissive=True)
# add something for a review
request = construct_dummy_request(json_body={
'uid': statement_uid,
'key': 'merge',
'text_values': ['cats are small and fluffy']
})
split_or_merge_statement(request)
db_review_merge = DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=pgroup_uid).first()
# vote 1:0
request = construct_dummy_request(json_body={
'review_uid': db_review_merge.uid,
'should_merge': True
})
response = review_merged_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=db_review_merge.uid)
pro = db_review.filter_by(should_merge=True).count()
con = db_review.filter_by(should_merge=False).count()
self.assertTrue(response)
self.assertEqual(pro, 1)
self.assertEqual(con, 0)
# vote 2:0
self.config.testing_securitypolicy(userid='Christian', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_merge.uid,
'should_merge': True,
})
response = review_merged_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=db_review_merge.uid)
pro = db_review.filter_by(should_merge=True).count()
con = db_review.filter_by(should_merge=False).count()
self.assertTrue(response)
self.assertEqual(pro, 2)
self.assertEqual(con, 0)
# vote 2:1
self.config.testing_securitypolicy(userid='Bob', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_merge.uid,
'should_merge': False,
})
response = review_merged_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=db_review_merge.uid)
pro = db_review.filter_by(should_merge=True).count()
con = db_review.filter_by(should_merge=False).count()
self.assertTrue(response)
self.assertEqual(pro, 2)
self.assertEqual(con, 1)
# vote 3:1
self.config.testing_securitypolicy(userid='Pascal', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_merge.uid,
'should_merge': True,
})
response = review_merged_premisegroup(request)
db_review = DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=db_review_merge.uid)
pro = db_review.filter_by(should_merge=True).count()
con = db_review.filter_by(should_merge=False).count()
self.assertTrue(response)
self.assertEqual(pro, 3)
self.assertEqual(con, 1)
# vote 4:1
self.config.testing_securitypolicy(userid='Kurt', permissive=True)
request = construct_dummy_request(json_body={
'review_uid': db_review_merge.uid,
'should_merge': True,
})
pgroup_old_len = DBDiscussionSession.query(PremiseGroup).count()
db_arguments_with_pgroup = DBDiscussionSession.query(Argument).filter_by(premisegroup_uid=pgroup_uid).all()
old_text = DBDiscussionSession.query(PremiseGroup).order_by(PremiseGroup.uid.desc()).first().get_text()
old_argument_text = get_text_for_argument_uid(db_arguments_with_pgroup[0].uid)
pgroup_merged_old_len = DBDiscussionSession.query(PremiseGroupMerged).count()
response = review_merged_premisegroup(request)
db_new_pgroup = DBDiscussionSession.query(PremiseGroup).order_by(PremiseGroup.uid.desc()).first()
new_text = db_new_pgroup.get_text()
new_argument_text = get_text_for_argument_uid(db_arguments_with_pgroup[0].uid)
pgroup_new_len = DBDiscussionSession.query(PremiseGroup).count()
pgroup_merged_new_len = DBDiscussionSession.query(PremiseGroupMerged).count()
db_review = DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=db_review_merge.uid)
pro = db_review.filter_by(should_merge=True).count()
con = db_review.filter_by(should_merge=False).count()
self.assertTrue(response)
self.assertEqual(pro, 4)
self.assertEqual(con, 1)
self.assertEqual(pgroup_old_len + 1, pgroup_new_len)
self.assertEqual(pgroup_merged_old_len + 1, pgroup_merged_new_len)
# now the vote is executed
self.assertTrue(DBDiscussionSession.query(ReviewMerge).get(db_review_merge.uid).is_executed)
# check the new premisegroups in every argument
# map() is lazy in Python 3, so the assertions were never executed; iterate explicitly
for arg in db_arguments_with_pgroup:
self.assertEqual(arg.premisegroup_uid, db_new_pgroup.uid)
# check text change
self.assertNotEqual(old_text, new_text)
self.assertNotEqual(old_argument_text, new_argument_text)
if resetdb:
DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=db_review_merge.uid).delete()
DBDiscussionSession.query(ReviewMergeValues).filter_by(review_uid=db_review_merge.uid).delete()
DBDiscussionSession.query(PremiseGroupMerged).filter_by(review_uid=db_review_merge.uid).delete()
DBDiscussionSession.flush()
DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=pgroup_uid).delete()
DBDiscussionSession.flush()
transaction.commit()
    def test_cancel_review_splitted_merged_premisegroup_errors(self):
        self.config.testing_securitypolicy(userid='someone', permissive=True)
        # user error
        request = construct_dummy_request(json_body={
            'queue': key_split,
            'uid': 1
        })
        response = cancel_review(request)
        self.assertEqual(response.status_code, 400)

        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        # uid error
        request = construct_dummy_request(json_body={
            'queue': key_split,
            'uid': 'a'
        })
        response = cancel_review(request)
        self.assertEqual(response.status_code, 400)

        # queue error
        request = construct_dummy_request(json_body={
            'queue': 'asd',
            'uid': 1
        })
        response = cancel_review(request)
        self.assertEqual(response.status_code, 400)

        # no review error
        request = construct_dummy_request(json_body={
            'queue': key_split,
            'uid': 1000
        })
        response = cancel_review(request)
        self.assertEqual(response.status_code, 400)
    def test_cancel_review_split_premisegroup(self):
        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        # add something for a review
        request = construct_dummy_request(json_body={
            'uid': 41,
            'key': key_split,
            'text_values': ['cats are small and fluffy', 'split it up, dude']
        })
        len1 = DBDiscussionSession.query(ReviewSplit).count()
        split_or_merge_statement(request)
        len2 = DBDiscussionSession.query(ReviewSplit).count()
        self.assertEqual(len1 + 1, len2)

        db_review = DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=33).first()
        request = construct_dummy_request(json_body={
            'queue': key_merge,
            'uid': db_review.uid,
        })
        response = split_or_merge_statement(request)
        len3 = DBDiscussionSession.query(ReviewSplit).count()
        self.assertTrue(response)
        self.assertEqual(len2, len3)
    def test_cancel_review_merged_premisegroup(self):
        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        # add something for a review
        request = construct_dummy_request(json_body={
            'uid': 19,
            'key': key_merge,
            'text_values': ['cats are small and fluffy']
        })
        len1 = DBDiscussionSession.query(ReviewMerge).count()
        split_or_merge_statement(request)
        len2 = DBDiscussionSession.query(ReviewMerge).count()
        self.assertEqual(len1 + 1, len2)

        db_review = DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=15).first()
        request2 = construct_dummy_request(json_body={
            'queue': key_merge,
            'uid': db_review.uid,
        })
        response = cancel_review(request2)
        len3 = DBDiscussionSession.query(ReviewMerge).count()
        self.assertTrue(response)
        self.assertEqual(len2, len3)
    def test_undo_merge_split_review_errors(self):
        # uid
        self.config.testing_securitypolicy(userid='peter', permissive=True)
        request = construct_dummy_request(json_body={
            'queue': key_merge,
            'uid': 'a',
        })
        response = undo_review(request)
        self.assertEqual(response.status_code, 400)

        # no admin
        request = construct_dummy_request(json_body={
            'queue': key_merge,
            'uid': 2,
        })
        response = undo_review(request)
        self.assertEqual(response.status_code, 400)

        # queue
        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        request = construct_dummy_request(json_body={
            'queue': 'HAHA',
            'uid': 2,
        })
        response = undo_review(request)
        self.assertEqual(response.status_code, 400)

        # no uid
        request = construct_dummy_request(json_body={
            'queue': key_merge,
            'uid': 5,
        })
        response = undo_review(request)
        self.assertEqual(response.status_code, 400)
    def test_undo_review_splitted_premisegroup(self):
        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        uid = 22
        # get one argument with the old premisegroup
        db_argument_old = DBDiscussionSession.query(Argument).filter_by(premisegroup_uid=uid).first()
        old_text = get_text_for_argument_uid(db_argument_old.uid)
        self.test_review_splitted_premisegroup(statement_uid=26, pgroup_uid=uid, resetdb=False)

        # get one argument with the new premisegroup
        db_new_pgroup = DBDiscussionSession.query(PremiseGroupSplitted).filter_by(old_premisegroup_uid=uid).first()
        db_argument_new1 = DBDiscussionSession.query(Argument).get(db_argument_old.uid)
        db_argument_new2 = DBDiscussionSession.query(Argument).filter_by(
            premisegroup_uid=db_new_pgroup.new_premisegroup_uid).first()
        new_text1 = get_text_for_argument_uid(db_argument_new1.uid)
        new_text2 = get_text_for_argument_uid(db_argument_new2.uid)
        # the text has to differ
        self.assertNotEqual(old_text, new_text1)
        self.assertNotEqual(old_text, new_text2)

        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        db_review = DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=uid).first()
        request = construct_dummy_request(json_body={
            'queue': key_split,
            'uid': db_review.uid
        })
        self.assertTrue(undo_review(request))
        resetted_text = get_text_for_argument_uid(db_argument_old.uid)
        self.assertEqual(len(old_text), len(resetted_text))
        for i in range(len(old_text)):
            self.assertEqual(old_text[i], resetted_text[i])
        self.assertEqual(old_text, resetted_text)

        tmp = DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=uid).first()
        DBDiscussionSession.query(LastReviewerSplit).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.query(ReviewSplitValues).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.query(PremiseGroupSplitted).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.query(ReviewCanceled).filter_by(review_split_uid=tmp.uid).delete()
        DBDiscussionSession.query(ArgumentsAddedByPremiseGroupSplit).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.flush()
        DBDiscussionSession.query(ReviewSplit).filter_by(premisegroup_uid=uid).delete()
        DBDiscussionSession.flush()
        transaction.commit()
    def test_undo_review_merged_premisegroup(self):
        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        uid = 14
        # get one argument with the old premisegroup
        db_argument_old = DBDiscussionSession.query(Argument).filter_by(premisegroup_uid=uid).first()
        old_text = get_text_for_argument_uid(db_argument_old.uid)
        self.test_review_merged_premisegroup(statement_uid=18, pgroup_uid=uid, resetdb=False)

        # get one argument with the new premisegroup
        db_new_pgroup = DBDiscussionSession.query(PremiseGroupMerged).filter_by(old_premisegroup_uid=uid).first()
        db_argument_new1 = DBDiscussionSession.query(Argument).get(db_argument_old.uid)
        db_argument_new2 = DBDiscussionSession.query(Argument).filter_by(
            premisegroup_uid=db_new_pgroup.new_premisegroup_uid).first()
        new_text1 = get_text_for_argument_uid(db_argument_new1.uid)
        new_text2 = get_text_for_argument_uid(db_argument_new2.uid)
        # the text has to differ
        self.assertNotEqual(old_text, new_text1)
        self.assertNotEqual(old_text, new_text2)

        self.config.testing_securitypolicy(userid='Tobias', permissive=True)
        db_review = DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=uid).first()
        request = construct_dummy_request(json_body={
            'queue': key_merge,
            'uid': db_review.uid
        })
        self.assertTrue(undo_review(request))
        resetted_text = get_text_for_argument_uid(db_argument_old.uid)
        self.assertEqual(len(old_text), len(resetted_text))
        for i in range(len(old_text)):
            self.assertEqual(old_text[i], resetted_text[i])
        self.assertEqual(old_text, resetted_text)

        tmp = DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=uid).first()
        DBDiscussionSession.query(LastReviewerMerge).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.query(ReviewMergeValues).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.query(PremiseGroupMerged).filter_by(review_uid=tmp.uid).delete()
        DBDiscussionSession.query(ReviewCanceled).filter_by(review_merge_uid=tmp.uid).delete()
        DBDiscussionSession.flush()
        DBDiscussionSession.query(ReviewMerge).filter_by(premisegroup_uid=uid).delete()
        DBDiscussionSession.flush()
        transaction.commit()
class TestRevokeStatementContent(TestCaseWithConfig):
    def test_revoke_content(self):
        self.config.testing_securitypolicy(userid=nick_of_anonymous_user, permissive=True)
        db_content1 = DBDiscussionSession.query(RevokedContentHistory).count()
        request = construct_dummy_request(json_body={
            'statement_id': 2,
        })
        self.assertTrue(revoke_statement_content(request))
        db_content2 = DBDiscussionSession.query(RevokedContentHistory).count()
        self.assertNotEqual(db_content1, db_content2)

    def test_revoke_content_uid_error1(self):
        self.config.testing_securitypolicy(userid=nick_of_anonymous_user, permissive=True)
        db_content1 = DBDiscussionSession.query(RevokedContentHistory).count()
        request = construct_dummy_request(json_body={
            'statement_id': 'a',
        })
        response = revoke_statement_content(request)
        db_content2 = DBDiscussionSession.query(RevokedContentHistory).count()
        self.assertEqual(response.status_code, 400)
        self.assertEqual(db_content1, db_content2)

    def test_revoke_content_statement_id_error2(self):
        self.config.testing_securitypolicy(userid=nick_of_anonymous_user, permissive=True)
        db_content1 = DBDiscussionSession.query(RevokedContentHistory).count()
        request = construct_dummy_request(json_body={
            'statement_id': 150,
        })
        response = revoke_statement_content(request)
        db_content2 = DBDiscussionSession.query(RevokedContentHistory).count()
        self.assertEqual(response.status_code, 400)
        self.assertEqual(db_content1, db_content2)

    def test_revoke_content_author_error1(self):
        self.config.testing_securitypolicy(userid='', permissive=True)
        db_content1 = DBDiscussionSession.query(RevokedContentHistory).count()
        request = construct_dummy_request(json_body={
            'statement_id': 3,
        })
        response = revoke_statement_content(request)
        db_content2 = DBDiscussionSession.query(RevokedContentHistory).count()
        self.assertEqual(response.status_code, 400)
        self.assertEqual(db_content1, db_content2)
# Empyrial/trafalgar.py (keshabb/trafalgar, MIT license)
import numpy as np
import pandas as pd
import statsmodels
import matplotlib.pyplot as plt
import seaborn
from scipy.stats import norm
from pandas_datareader import data as web
from datetime import datetime
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint, adfuller
from statsmodels import regression
from sklearn.linear_model import LinearRegression
from pykalman import KalmanFilter
from pypfopt import EfficientFrontier, risk_models, expected_returns
import math
import yfinance as yf

today = datetime.today().strftime('%Y-%m-%d')
# ------------------------------------------------------------------------------------------
def graph_close(stock, start_date, end_date=today, x=20, y=10):
    """
    Source and plot Close prices from Yahoo for any given stock(s) and period.

    Parameters
    ----------
    stock : str, list
        Either a single stock ticker or a list of tickers.
    start_date : str
        Date in yyyy-mm-dd format.
    end_date : str
        Date in yyyy-mm-dd format.
    """
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Close']
    df = pd.DataFrame(df)
    df.plot(figsize=(x, y))
def graph_open(stock, start_date, end_date=today, x=20, y=10):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Open']
    df = pd.DataFrame(df)
    df.plot(figsize=(x, y))
# ------------------------------------------------------------------------------------------
def graph_volume(stock, start_date, end_date=today, x=20, y=10):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Volume']
    df = pd.DataFrame(df)
    df.plot(figsize=(x, y))
# ------------------------------------------------------------------------------------------
def graph_adj_close(stock, start_date, end_date=today, x=20, y=10):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    df = pd.DataFrame(df)
    df.plot(figsize=(x, y))
# ------------------------------------------------------------------------------------------
def close(stock, start_date, end_date=today):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Close']
    df = pd.DataFrame(df)
    return df
# ------------------------------------------------------------------------------------------
def open(stock, start_date, end_date=today):
    # NOTE: this helper shadows the built-in open() within this module
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Open']
    df = pd.DataFrame(df)
    return df
# ------------------------------------------------------------------------------------------
def adj_close(stock, start_date, end_date=today):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    df = pd.DataFrame(df)
    return df
# ------------------------------------------------------------------------------------------
def volume(stock, start_date, end_date=today):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Volume']
    df = pd.DataFrame(df)
    return df
# ------------------------------------------------------------------------------------------
def returns(stocks, start_date, end_date=today, wts=1):
    if len(stocks) > 1:
        assets = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        ret_data = assets.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        ret_data['Portfolio returns'] = port_ret
        ret_data = pd.DataFrame(ret_data)
        return ret_data
    else:
        df = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        df = pd.DataFrame(df)
        returns = df.pct_change()
        returns = pd.DataFrame(returns)
        return returns
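A quick offline sanity check of the weighted-sum logic above, using a tiny synthetic price table instead of a Yahoo download (the tickers and prices below are made up):

```python
import pandas as pd

# two hypothetical assets over four days
prices = pd.DataFrame({'AAA': [100.0, 110.0, 121.0, 133.1],
                       'BBB': [50.0, 50.0, 55.0, 55.0]})
wts = [0.5, 0.5]

ret_data = prices.pct_change()[1:]       # simple daily returns, NaN row dropped
port_ret = (ret_data * wts).sum(axis=1)  # weighted portfolio return per day
# day 1: 0.5 * 0.10 + 0.5 * 0.00 = 0.05
```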
#---------------------------------------------------------------------------------------------
def graph_returns(stock, start_date, end_date, wts=1, x=20, y=10):
    if len(stock) > 1:
        assets = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        ret_data = assets.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        ret_data['Portfolio'] = port_ret
        ret_data.plot(figsize=(x, y))
        plt.xlabel('Date')
        plt.ylabel('Returns')
        plt.title('Portfolio returns')
    else:
        df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        df = pd.DataFrame(df)
        returns = df.pct_change()
        returns.columns = ['Adj Close']
        plt.figure(figsize=(x, y))
        plt.plot(returns.index, returns['Adj Close'])
        plt.xlabel("Date")
        plt.ylabel("Returns")
        plt.title("Returns from " + start_date + " to " + end_date)
# ------------------------------------------------------------------------------------------
def covariance(stocks, start_date, end_date=today, days=252):
    df = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    df = pd.DataFrame(df)
    returns = df.pct_change()
    cov_matrix_annual = returns.cov() * days
    return cov_matrix_annual
# ------------------------------------------------------------------------------------------
def graph_correlation(stocks, start_date, end_date=today):
    corr_mat = correlation(stocks, start_date, end_date)
    seaborn.heatmap(corr_mat, annot=True)
    plt.show()
# ------------------------------------------------------------------------------------------
def ohlcv(stock, start_date, end_date=today):
    df = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
    df = pd.DataFrame(df)
    df = df.drop(['Adj Close'], axis=1)
    df = df[["Open", "High", "Low", "Close", "Volume"]]
    return df
# ------------------------------------------------------------------------------------------
def graph_creturns(stock, start_date, end_date=today, wts=1, x=20, y=10):
    if len(stock) > 1:
        price_data = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        cumulative_ret_df1 = (port_ret + 1).cumprod()
        stock_raw = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        stock = stock_raw['Adj Close']
        stock_normed = stock / stock.iloc[0]
        stock_normed['Portfolio'] = cumulative_ret_df1
        stock_normed.plot(figsize=(x, y))
    else:
        price_data = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        port_ret = ret_data.sum(axis=1)
        cumulative_ret = (port_ret + 1).cumprod()
        cumulative_ret = pd.DataFrame(cumulative_ret)
        cumulative_ret.columns = ['Cumulative returns']
        fig = plt.figure(figsize=(x, y))
        ax1 = fig.add_axes([0.1, 0.1, 0.8, 0.8])
        ax1.plot(cumulative_ret)
        ax1.set_xlabel('Date')
        ax1.set_ylabel("Cumulative Returns")
        ax1.set_title("Portfolio Cumulative Returns")
        plt.show()
# ------------------------------------------------------------------------------------------
def creturns(stock, start_date, end_date=today, wts=1):
    if len(stock) > 1:
        price_data = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        cumulative_ret_df1 = (port_ret + 1).cumprod()
        stock_raw = web.DataReader(stock, 'yahoo', start=start_date, end=end_date)
        stock = stock_raw['Adj Close']
        stock_normed = stock / stock.iloc[0]
        stock_normed['Portfolio'] = cumulative_ret_df1
        return stock_normed
    else:
        price_data = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        port_ret = ret_data.sum(axis=1)
        cumulative_ret = (port_ret + 1).cumprod()
        cumulative_ret = pd.DataFrame(cumulative_ret)
        cumulative_ret.columns = ['Cumulative returns']
        return cumulative_ret
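The `(port_ret + 1).cumprod()` step compounds simple daily returns into a growth-of-one-unit curve. A minimal sketch with made-up daily returns:

```python
import pandas as pd

daily = pd.Series([0.10, -0.05, 0.02])  # hypothetical daily portfolio returns
cumulative = (daily + 1).cumprod()      # growth of 1 unit invested
# 1.10, then 1.10 * 0.95 = 1.045, then 1.045 * 1.02 = 1.0659
```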
# ------------------------------------------------------------------------------------------
def annual_volatility(stocks, start_date, end_date=today, wts=1):
    if len(stocks) > 1:
        price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        annual_std = np.std(port_ret) * np.sqrt(252)
        return annual_std
    else:
        price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        annual_std = np.std(ret_data) * np.sqrt(252)
        return annual_std
# ------------------------------------------------------------------------------------------
def sharpe_ratio(stocks, start_date, end_date=today, wts=1):
    if len(stocks) > 1:
        price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        geometric_port_return = np.prod(port_ret + 1) ** (252 / port_ret.shape[0]) - 1
        annual_std = np.std(port_ret) * np.sqrt(252)
        port_sharpe_ratio = geometric_port_return / annual_std
        return port_sharpe_ratio
    else:
        price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        ret_data = price_data.pct_change()[1:]
        geometric_port_return = np.prod(ret_data + 1) ** (252 / ret_data.shape[0]) - 1
        annual_std = np.std(ret_data) * np.sqrt(252)
        port_sharpe_ratio = geometric_port_return / annual_std
        return port_sharpe_ratio
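The ratio above is geometric annualised return over annualised volatility (no risk-free adjustment). The same arithmetic on a short synthetic return series (the values are illustrative only):

```python
import numpy as np

daily_ret = np.array([0.01, -0.005, 0.007, 0.002])  # hypothetical daily returns

# geometric return annualised over 252 trading days
geometric = np.prod(daily_ret + 1) ** (252 / daily_ret.shape[0]) - 1
annual_std = np.std(daily_ret) * np.sqrt(252)       # population std, as above
sharpe = geometric / annual_std
```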
# ------------------------------------------------------------------------------------------
def graph_rbenchmark(stocks, wts, benchmark, start_date, end_date=today, x=20, y=10):
    if len(stocks) > 1 and len(wts) > 1:
        price_data = web.get_data_yahoo(stocks, start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.get_data_yahoo(benchmark, start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        plt.figure(figsize=(x, y))
        port_ret.plot()
        return_df2.plot()
        plt.ylabel("Daily return comparison")
        plt.title('Comparison')
        plt.show()
    else:
        price_data = web.get_data_yahoo(stocks, start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.get_data_yahoo(benchmark, start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        ret_data["benchmark"] = return_df2
        ret_data.plot(figsize=(x, y))
#----------------------------------------------------------------------------------------
def rbenchmark(stocks, wts, benchmark, start_date, end_date=today):
    if len(stocks) > 1 and len(wts) > 1:
        price_data = web.get_data_yahoo(stocks, start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.get_data_yahoo(benchmark, start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        ret_data['benchmark'] = return_df2
        ret_data['portfolio'] = port_ret
        ret_data = pd.DataFrame(ret_data)
        return ret_data
    else:
        price_data = web.get_data_yahoo(stocks, start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.get_data_yahoo(benchmark, start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        ret_data["benchmark"] = return_df2
        ret_data = pd.DataFrame(ret_data)
        return ret_data
# ------------------------------------------------------------------------------------------
def cbenchmark(stocks, wts, benchmark, start_date, end_date=today):
    if len(stocks) > 1 and len(wts) > 1:
        price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        cumulative_ret_df1 = (port_ret + 1).cumprod()
        cumulative_ret_df2 = (return_df2 + 1).cumprod()
        df1 = pd.DataFrame(cumulative_ret_df1)
        df2 = pd.DataFrame(cumulative_ret_df2)
        df = pd.concat([df1, df2], axis=1)
        df = pd.DataFrame(df)
        df.columns = ['portfolio', 'benchmark']
        return df
    else:
        price_data = web.get_data_yahoo(stocks, start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.get_data_yahoo(benchmark, start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        cumulative_ret_df1 = (ret_data + 1).cumprod()
        cumulative_ret_df2 = (return_df2 + 1).cumprod()
        df = cumulative_ret_df1
        df['benchmark'] = cumulative_ret_df2
        df = pd.DataFrame(df)
        return df
# ------------------------------------------------------------------------------------------
def graph_cbenchmark(stocks, wts, benchmark, start_date, end_date=today, x=20, y=10):
    if len(stocks) > 1 and len(wts) > 1:
        price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        port_ret = (ret_data * wts).sum(axis=1)
        cumulative_ret_df1 = (port_ret + 1).cumprod()
        cumulative_ret_df2 = (return_df2 + 1).cumprod()
        df1 = pd.DataFrame(cumulative_ret_df1)
        df2 = pd.DataFrame(cumulative_ret_df2)
        df = pd.concat([df1, df2], axis=1)
        df = pd.DataFrame(df)
        df.columns = ['portfolio', 'benchmark']
        df.plot(figsize=(x, y))
    else:
        price_data = web.get_data_yahoo(stocks, start=start_date, end=end_date)
        price_data = price_data['Adj Close']
        df2 = web.get_data_yahoo(benchmark, start=start_date, end=end_date)
        ret_data = price_data.pct_change()[1:]
        return_df2 = df2.Close.pct_change()[1:]
        cumulative_ret_df1 = (ret_data + 1).cumprod()
        cumulative_ret_df2 = (return_df2 + 1).cumprod()
        df = cumulative_ret_df1
        df['benchmark'] = cumulative_ret_df2
        df = pd.DataFrame(df)
        df.plot(figsize=(x, y))
# ------------------------------------------------------------------------------------------
def efficient_frontier(stocks, period="max", pricing="Adj Close", trading_year_days=252):
    p = {"period": period}
    for stock in stocks:
        years = {
            '1mo': math.ceil(trading_year_days / 12),
            '3mo': math.ceil(trading_year_days / 4),
            '6mo': math.ceil(trading_year_days / 2),
            '1y': trading_year_days,
            '2y': 2 * trading_year_days,
            '5y': 5 * trading_year_days,
            '10y': 10 * trading_year_days,
            '20y': 20 * trading_year_days,
            'max': len(yf.Ticker(stock).history(**p)['Close'].pct_change())
        }
    df = pd.DataFrame()
    for stock in stocks:
        df[stock] = web.DataReader(stock, data_source='yahoo', start="1980-01-01", end=today)[pricing]
        df[stock] = df[stock].tail(years[period])
    mu = expected_returns.mean_historical_return(df)
    S = risk_models.sample_cov(df)
    # optimize for the maximum Sharpe ratio
    ef = EfficientFrontier(mu, S)
    weights = ef.max_sharpe()
    cleaned_weights = ef.clean_weights()
    print(cleaned_weights)
    ef.portfolio_performance(verbose=True)
# ------------------------------------------------------------------------------------------
def mean_daily_return(stocks, wts, start_date, end_date):
    stock_raw = web.DataReader(stocks, 'yahoo', start_date, end_date)
    stock = stock_raw['Adj Close']
    port_ret = (stock * wts).sum(axis=1)
    mean_return_port = port_ret.pct_change(1).mean()
    mean_daily_ret = stock.pct_change(1).mean()
    mean_daily_ret["Portfolio"] = mean_return_port
    mean_daily_ret = pd.DataFrame(mean_daily_ret)
    return mean_daily_ret
# ------------------------------------------------------------------------------------------
def var(value_invested, stocks, wts, alpha, start_date, end_date):
    price_data = web.DataReader(stocks, 'yahoo', start_date, end_date)
    price_data = price_data['Adj Close']
    ret_data = price_data.pct_change()[1:]
    weighted_returns = wts * ret_data
    port_ret = weighted_returns.sum(axis=1)
    port_ret = port_ret.fillna(0.0)
    # compute the correct percentile loss and multiply by the value invested
    return np.percentile(port_ret, 100 * (1 - alpha)) * value_invested
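`var` is plain historical-simulation Value at Risk: the empirical `(1 - alpha)` percentile of portfolio returns, scaled by the amount invested. A self-contained sketch on simulated returns (the normal parameters and the invested amount are arbitrary):

```python
import numpy as np

np.random.seed(0)
port_ret = np.random.normal(0.0005, 0.01, 1000)  # synthetic daily returns
value_invested = 100_000
alpha = 0.95

# the loss level exceeded on only (1 - alpha) of the historical days
var_estimate = np.percentile(port_ret, 100 * (1 - alpha)) * value_invested
```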
# ------------------------------------------------------------------------------------------
def alpha(stocks, wts, benchmark, start_date, end_date):
    def linreg(x, y):
        # add a constant column so the model also fits an intercept (alpha)
        x = sm.add_constant(x)
        model = regression.linear_model.OLS(y, x).fit()
        return model.params[0], model.params[1]

    benchmark_prices = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    r_b = benchmark_prices.pct_change()[1:]
    if len(stocks) > 1:
        assets = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        ret_data = assets.pct_change()[1:]
        r_a = (ret_data * wts).sum(axis=1)
    else:
        asset = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        r_a = asset.pct_change()[1:]
    # regress on the values only, ignoring the timestamps
    alpha, beta = linreg(r_b.values, r_a.values)
    return alpha
#-------------------------------------------------------------------------------------------------
def beta(stocks, wts, benchmark, start_date, end_date):
    def linreg(x, y):
        # add a constant column so the model also fits an intercept (alpha)
        x = sm.add_constant(x)
        model = regression.linear_model.OLS(y, x).fit()
        return model.params[0], model.params[1]

    benchmark_prices = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    r_b = benchmark_prices.pct_change()[1:]
    if len(stocks) > 1:
        assets = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        ret_data = assets.pct_change()[1:]
        r_a = (ret_data * wts).sum(axis=1)
    else:
        asset = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        r_a = asset.pct_change()[1:]
    # regress on the values only, ignoring the timestamps
    alpha, beta = linreg(r_b.values, r_a.values)
    return beta
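The nested `linreg` helper is ordinary least squares with an intercept, so on noise-free data the recovered alpha and beta are exact. `numpy.polyfit` with degree 1 fits the same model, which makes for a quick offline check (the return numbers below are made up):

```python
import numpy as np

r_b = np.array([0.01, -0.02, 0.015, 0.005, -0.01])  # hypothetical benchmark returns
r_a = 0.002 + 1.5 * r_b                             # asset built with alpha=0.002, beta=1.5

beta, alpha = np.polyfit(r_b, r_a, 1)  # polyfit returns slope first, intercept second
```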
#-------------------------------------------------------------------------------------------------------------------
def correlation(stocks, start_date, end_date):
    df = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    df = pd.DataFrame(df)
    returns = df.pct_change()
    corr_matrix = returns.corr('pearson')
    return corr_matrix
#-----------------------------------------------------------------------------------------------------
def graph_kalman(stocks, start_date, end_date, noise_value=0.01, axis=20, y=10):
    x = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    # construct a Kalman filter
    kf = KalmanFilter(transition_matrices=[1],
                      observation_matrices=[1],
                      initial_state_mean=x[stocks].iloc[0],
                      initial_state_covariance=1,
                      observation_covariance=1,
                      transition_covariance=noise_value)
    # use the observed prices to get a rolling mean estimate
    state_means, _ = kf.filter(x.values)
    state_means = pd.Series(state_means.flatten(), index=x.index)
    x = pd.DataFrame(x)
    plt.figure(figsize=(axis, y))
    plt.plot(state_means)
    plt.plot(x)
    plt.title('Kalman filter estimate of average')
    plt.legend(['Kalman Estimate', 'X'])
    plt.xlabel('Day')
    plt.ylabel('Price')
#------------------------------------------------------------------------------------------------------------
def kalman(stocks, start_date, end_date, noise_value=0.01):
    x = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    # construct a Kalman filter
    kf = KalmanFilter(transition_matrices=[1],
                      observation_matrices=[1],
                      initial_state_mean=x.iloc[0],
                      initial_state_covariance=1,
                      observation_covariance=1,
                      transition_covariance=noise_value)
    # use the observed prices to get a rolling mean estimate
    state_means, _ = kf.filter(x.values)
    state_means = pd.Series(state_means.flatten(), index=x.index)
    state_means = pd.DataFrame(state_means)
    return state_means
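For intuition, with scalar transition and observation matrices of 1 the filter above reduces to a simple predict/update recursion. A minimal pure-NumPy sketch of that recursion (not a drop-in replacement for pykalman; the observations are made up):

```python
import numpy as np

observations = np.array([10.0, 10.4, 10.2, 10.8, 11.0])  # hypothetical price series

def scalar_kalman(z, q=0.01, r=1.0):
    """Scalar Kalman filter: transition = observation = 1,
    observation variance r, transition variance q."""
    mean, var = z[0], 1.0  # initial state mean/covariance, as in the call above
    means = []
    for obs in z:
        var = var + q                   # predict step: state uncertainty grows
        k = var / (var + r)             # Kalman gain
        mean = mean + k * (obs - mean)  # update step: blend prediction with observation
        var = (1 - k) * var
        means.append(mean)
    return np.array(means)

smoothed = scalar_kalman(observations)
```

With a small transition variance `q` the estimate lags the raw prices, which is what makes it usable as a de-noised rolling average.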
#---------------------------------------------------------------------------------------------------------------
def capm(stocks, wts, start_date, end_date):
    assets = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    ret_data = assets.pct_change()[1:]
    R = (ret_data * wts).sum(axis=1)
    R_F = web.DataReader('BIL', data_source='yahoo', start=start_date, end=end_date)['Adj Close'].pct_change()[1:]
    # find the portfolio's beta against the market
    M = web.DataReader('SPY', data_source='yahoo', start=start_date, end=end_date)['Adj Close'].pct_change()[1:]
    results = regression.linear_model.OLS(R - R_F, sm.add_constant(M)).fit()
    beta = results.params[1]
    alpha = results.params[0]
    return results.summary()
#--------------------------------------------------------------------------------------------------------------
def cointegration(stock1, stock2, start_date, end_date):
    X1 = web.DataReader(stock1, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    X2 = web.DataReader(stock2, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    X1.name = str(stock1)
    X2.name = str(stock2)

    def check_for_stationarity(X, cutoff=0.01):
        # H_0 in adfuller is that a unit root exists (non-stationary);
        # a significant p-value is evidence that the series is stationary
        pvalue = adfuller(X)[1]
        if pvalue < cutoff:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely stationary.')
            return True
        else:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely non-stationary.')
            return False

    Z = X2 - X1
    Z.name = 'Z'
    plt.plot(Z)
    plt.xlabel('Time')
    plt.ylabel('Series Value')
    plt.legend(['Z'])
    check_for_stationarity(Z)
#------------------------------------------------------------------------------------------------------------------
def return_cointegration(stock1, stock2, start_date, end_date):
    X1 = web.DataReader(stock1, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    X2 = web.DataReader(stock2, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    X1 = X1.pct_change()[1:]
    X2 = X2.pct_change()[1:]
    X1.name = str(stock1)
    X2.name = str(stock2)
    def check_for_stationarity(X, cutoff=0.01):
        # H_0 in adfuller is that a unit root exists (non-stationary),
        # so a significant p-value is evidence that the series is stationary
        pvalue = adfuller(X)[1]
        if pvalue < cutoff:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely stationary.')
            return True
        else:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely non-stationary.')
            return False
    # Test the spread of the two return series for stationarity
    Z = X2 - X1
    Z.name = 'Z'
    plt.plot(Z)
    plt.xlabel('Time')
    plt.ylabel('Series Value')
    plt.legend(['Z'])
    check_for_stationarity(Z)
#--------------------------------------------------------------------------------------------------------------------------
def stationarity(stock, start_date, end_date):
    X = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    def check_for_stationarity(X, cutoff=0.01):
        # H_0 in adfuller is that a unit root exists (non-stationary),
        # so a significant p-value is evidence that the series is stationary
        pvalue = adfuller(X)[1]
        if pvalue < cutoff:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely stationary.')
        else:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely non-stationary.')
    plt.plot(X)
    plt.xlabel('Time')
    plt.ylabel('Series Value')
    plt.legend([str(stock)])
    return check_for_stationarity(X)
#---------------------------------------------------------------------------------------------------------------------
def return_stationarity(stock, start_date, end_date):
    X = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
    X = X.pct_change()[1:]
    def check_for_stationarity(X, cutoff=0.01):
        # H_0 in adfuller is that a unit root exists (non-stationary),
        # so a significant p-value is evidence that the series is stationary
        pvalue = adfuller(X)[1]
        if pvalue < cutoff:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely stationary.')
        else:
            print('p-value = ' + str(pvalue) + ' The series ' + X.name + ' is likely non-stationary.')
    plt.plot(X)
    plt.xlabel('Time')
    plt.ylabel('Series Value')
    plt.legend([str(stock)])
    return check_for_stationarity(X)
#-------------------------------------------------------------------------------------------------------------------------
def graph_rvolatility(stock, wts, start_date, end_date=today, window_time=180, x=20, y=10):
    if len(stock) == 1:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        # Compute the logarithmic returns from the adjusted closing price
        asset['Log_Ret'] = np.log(asset['Adj Close'] / asset['Adj Close'].shift(1))
        # Annualized volatility from a rolling standard deviation of log returns
        asset['Volatility'] = asset['Log_Ret'].rolling(window=window_time).std() * np.sqrt(252)
        asset = pd.DataFrame(asset)
        # Plot the rolling volatility
        asset[['Volatility']].plot(subplots=True, color='blue', figsize=(x, y))
    else:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        port_ret = (asset * wts).sum(axis=1)
        asset['Adj Close'] = port_ret
        # Compute the logarithmic returns from the weighted portfolio price
        asset['Log_Ret'] = np.log(asset['Adj Close'] / asset['Adj Close'].shift(1))
        # Annualized volatility from a rolling standard deviation of log returns
        asset['Volatility'] = asset['Log_Ret'].rolling(window=window_time).std() * np.sqrt(252)
        # Plot the rolling volatility
        asset[['Volatility']].plot(subplots=True, color='blue', figsize=(x, y))
#------------------------------------------------------------------------------------------------------------------------
def rvolatility(stock, wts, start_date, end_date, window_time):
    if len(stock) == 1:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        # Compute the logarithmic returns from the adjusted closing price
        asset['Log_Ret'] = np.log(asset['Adj Close'] / asset['Adj Close'].shift(1))
        # Annualized volatility from a rolling standard deviation of log returns
        asset['Volatility'] = asset['Log_Ret'].rolling(window=window_time).std() * np.sqrt(252)
        df = asset['Volatility']
        df = pd.DataFrame(df)
        return df
    else:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        port_ret = (asset * wts).sum(axis=1)
        asset['Adj Close'] = port_ret
        # Compute the logarithmic returns from the weighted portfolio price
        asset['Log_Ret'] = np.log(asset['Adj Close'] / asset['Adj Close'].shift(1))
        # Annualized volatility from a rolling standard deviation of log returns
        asset['Volatility'] = asset['Log_Ret'].rolling(window=window_time).std() * np.sqrt(252)
        df = asset['Volatility']
        df = pd.DataFrame(df)
        return df
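# The rolling-volatility computation is plain pandas; a self-contained sketch on
# a synthetic price series with constant daily growth, so the rolling standard
# deviation of log returns (and hence the annualized volatility) is zero:

```python
import numpy as np
import pandas as pd

# Synthetic prices growing a constant 1% per day
prices = pd.Series(100.0 * 1.01 ** np.arange(300))
log_ret = np.log(prices / prices.shift(1))
# Annualized rolling volatility over a 20-observation window
vol = log_ret.rolling(window=20).std() * np.sqrt(252)
```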
#------------------------------------------------------------------------------------------------------------------------------------------
def graph_ralpha(stock, wts, benchmark, start_date, end_date=today, window_time=180, x=20, y=10):
    if len(stock) == 1:
        # Get the closing prices of the stock
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        asset = pd.DataFrame(asset)
        asset['asset_return'] = asset['Adj Close'].pct_change()
        asset['asset_log_return'] = np.log(asset['Adj Close']) - np.log(asset['Adj Close'].shift(1))
        asset.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the stock
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the stock return is the response (Y)
        results = market_beta(bench.bench_return, asset.asset_return, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        plt.figure(figsize=(x, y))
        results.alpha.plot.line()
        plt.title("Market Alpha: Rolling Window of " + str(window_time) + " Days")
    else:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        asset['Adj Close'] = (asset * wts).sum(axis=1)
        df = returns(stock, wts, start_date, end_date)
        df['Adj Close'] = asset[['Adj Close']]
        df1 = df[['Adj Close', 'Portfolio returns']]
        df1.columns = ['Adj Close', 'returns']
        df1['log_return'] = np.log(df1['Adj Close']) - np.log(df1['Adj Close'].shift(1))
        df.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the portfolio
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the portfolio return is the response (Y)
        results = market_beta(bench.bench_return, df1.returns, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        plt.figure(figsize=(x, y))
        results.alpha.plot.line()
        plt.title("Market Alpha: Rolling Window of " + str(window_time) + " Days")
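# market_beta fits one OLS per window; an equivalent self-contained sketch using
# numpy's polyfit on noiseless synthetic data, where every window must recover
# the same slope (beta) and intercept (alpha):

```python
import numpy as np

def rolling_ols(x, y, n):
    """Rolling OLS of y on x over windows of n + 1 points.

    Aligned like market_beta above: the estimate for the window
    [i, i + n] is stored at position i + n; earlier positions are NaN.
    """
    obs = len(x)
    alphas = np.full(obs, np.nan)
    betas = np.full(obs, np.nan)
    for i in range(obs - n):
        slope, intercept = np.polyfit(x[i:i + n + 1], y[i:i + n + 1], 1)
        betas[i + n] = slope
        alphas[i + n] = intercept
    return alphas, betas

rng = np.random.default_rng(3)
x = rng.normal(0, 0.01, 100)   # synthetic market returns
y = 0.001 + 2.0 * x            # exact linear relation: alpha = 0.001, beta = 2
alphas, betas = rolling_ols(x, y, 20)
```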
#----------------------------------------------------------------------------------------------------------------------------------------------------------------
def graph_rbeta(stock, wts, benchmark, start_date, end_date=today, window_time=180, x=20, y=10):
    if len(stock) == 1:
        # Get the closing prices of the stock
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        asset = pd.DataFrame(asset)
        asset['asset_return'] = asset['Adj Close'].pct_change()
        asset['asset_log_return'] = np.log(asset['Adj Close']) - np.log(asset['Adj Close'].shift(1))
        asset.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the stock
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the stock return is the response (Y)
        results = market_beta(bench.bench_return, asset.asset_return, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        plt.figure(figsize=(x, y))
        results.beta.plot.line()
        plt.title("Market Beta: Rolling Window of " + str(window_time) + " Days")
    else:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        asset['Adj Close'] = (asset * wts).sum(axis=1)
        df = returns(stock, wts, start_date, end_date)
        df['Adj Close'] = asset[['Adj Close']]
        df1 = df[['Adj Close', 'Portfolio returns']]
        df1.columns = ['Adj Close', 'returns']
        df1['log_return'] = np.log(df1['Adj Close']) - np.log(df1['Adj Close'].shift(1))
        df.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the portfolio
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the portfolio return is the response (Y)
        results = market_beta(bench.bench_return, df1.returns, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        plt.figure(figsize=(x, y))
        results.beta.plot.line()
        plt.title("Market Beta: Rolling Window of " + str(window_time) + " Days")
#--------------------------------------------------------------------------------------------------------------------------------
def rbeta(stock, wts, benchmark, start_date, end_date, window_time):
    if len(stock) == 1:
        # Get the closing prices of the stock
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        asset = pd.DataFrame(asset)
        asset['asset_return'] = asset['Adj Close'].pct_change()
        asset['asset_log_return'] = np.log(asset['Adj Close']) - np.log(asset['Adj Close'].shift(1))
        asset.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the stock
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the stock return is the response (Y)
        results = market_beta(bench.bench_return, asset.asset_return, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        df = results['beta']
        df = pd.DataFrame(df)
        return df
    else:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        asset['Adj Close'] = (asset * wts).sum(axis=1)
        df = returns(stock, wts, start_date, end_date)
        df['Adj Close'] = asset[['Adj Close']]
        df1 = df[['Adj Close', 'Portfolio returns']]
        df1.columns = ['Adj Close', 'returns']
        df1['log_return'] = np.log(df1['Adj Close']) - np.log(df1['Adj Close'].shift(1))
        df.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the portfolio
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the portfolio return is the response (Y)
        results = market_beta(bench.bench_return, df1.returns, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        df = results['beta']
        df = pd.DataFrame(df)
        return df
#--------------------------------------------------------------------------------------------------------------------------------------
def ralpha(stock, wts, benchmark, start_date, end_date, window_time):
    if len(stock) == 1:
        # Get the closing prices of the stock
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)
        asset = pd.DataFrame(asset)
        asset['asset_return'] = asset['Adj Close'].pct_change()
        asset['asset_log_return'] = np.log(asset['Adj Close']) - np.log(asset['Adj Close'].shift(1))
        asset.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the stock
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the stock return is the response (Y)
        results = market_beta(bench.bench_return, asset.asset_return, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        df = results['alpha']
        df = pd.DataFrame(df)
        return df
    else:
        asset = web.DataReader(stock, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        asset['Adj Close'] = (asset * wts).sum(axis=1)
        df = returns(stock, wts, start_date, end_date)
        df['Adj Close'] = asset[['Adj Close']]
        df1 = df[['Adj Close', 'Portfolio returns']]
        df1.columns = ['Adj Close', 'returns']
        df1['log_return'] = np.log(df1['Adj Close']) - np.log(df1['Adj Close'].shift(1))
        df.dropna(inplace=True)
        bench = web.DataReader(benchmark, data_source='yahoo', start=start_date, end=end_date)['Adj Close']
        bench = pd.DataFrame(bench)
        bench['bench_return'] = bench['Adj Close'].pct_change()
        bench['bench_log_return'] = np.log(bench['Adj Close']) - np.log(bench['Adj Close'].shift(1))
        bench.dropna(inplace=True)
        def market_beta(X, Y, N):
            """
            X = The independent variable, which is the market
            Y = The dependent variable, which is the portfolio
            N = The length of the rolling window
            Returns the alphas and betas of the rolling regression.
            """
            obs = len(X)
            # Initialize the alphas and betas with null values
            betas = np.full(obs, np.nan)
            alphas = np.full(obs, np.nan)
            for i in range(obs - N):
                regressor = LinearRegression()
                regressor.fit(X.to_numpy()[i:i + N + 1].reshape(-1, 1), Y.to_numpy()[i:i + N + 1])
                betas[i + N] = regressor.coef_[0]
                alphas[i + N] = regressor.intercept_
            return alphas, betas
        # The market is the regressor (X); the portfolio return is the response (Y)
        results = market_beta(bench.bench_return, df1.returns, window_time)
        results = pd.DataFrame(list(zip(*results)), columns=['alpha', 'beta'])
        results.index = bench.index
        df = results['alpha']
        df = pd.DataFrame(df)
        return df
#-------------------------------------------------------------------------------------------------------------------
def bsm_price(option_type, sigma, s, k, r, T, q):
    # Black-Scholes-Merton price of a European call or put option
    sigma = float(sigma)
    d1 = (np.log(s / k) + (r - q + sigma ** 2 * 0.5) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if option_type == 'c':
        price = np.exp(-r * T) * (s * np.exp((r - q) * T) * norm.cdf(d1) - k * norm.cdf(d2))
        return price
    elif option_type == 'p':
        price = np.exp(-r * T) * (k * norm.cdf(-d2) - s * np.exp((r - q) * T) * norm.cdf(-d1))
        return price
    else:
        print('No such option type %s' % option_type)
# option_type: 'c' (call option) or 'p' (put option)
# option_price: observed market price of the option
# s: stock price
# k: strike price
# r: risk-free rate
# T: time to expiration
# q: dividend yield
def implied_vol(option_type, option_price, s, k, r, T, q):
    # Bisection method: solve the BSM formula for the implied volatility
    precision = 0.00001
    upper_vol = 500.0
    max_vol = 500.0
    lower_vol = 0.0001
    iteration = 0
    while True:
        iteration += 1
        mid_vol = (upper_vol + lower_vol) / 2.0
        price = bsm_price(option_type, mid_vol, s, k, r, T, q)
        if option_type == 'c':
            lower_price = bsm_price(option_type, lower_vol, s, k, r, T, q)
            if (lower_price - option_price) * (price - option_price) > 0:
                lower_vol = mid_vol
            else:
                upper_vol = mid_vol
            if abs(price - option_price) < precision:
                break
            if mid_vol > max_vol - 5:
                mid_vol = 0.000001
                break
        elif option_type == 'p':
            upper_price = bsm_price(option_type, upper_vol, s, k, r, T, q)
            if (upper_price - option_price) * (price - option_price) > 0:
                upper_vol = mid_vol
            else:
                lower_vol = mid_vol
            if abs(price - option_price) < precision:
                break
        if iteration > 50:
            break
    return mid_vol
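# As a sanity check of the bisection above, the round trip price -> implied
# volatility should recover the input sigma. A self-contained sketch using
# math.erf for the normal CDF instead of scipy's norm (otherwise the same
# call-pricing formula); all inputs below are illustrative:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bsm_call(sigma, s, k, r, T, q):
    d1 = (math.log(s / k) + (r - q + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return s * math.exp(-q * T) * norm_cdf(d1) - k * math.exp(-r * T) * norm_cdf(d2)

def implied_vol_call(price, s, k, r, T, q, tol=1e-8):
    # Plain bisection on sigma: the call price is increasing in volatility
    lo, hi = 1e-4, 5.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bsm_call(mid, s, k, r, T, q) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

price = bsm_call(0.25, s=100, k=105, r=0.01, T=0.5, q=0.0)
iv = implied_vol_call(price, 100, 105, 0.01, 0.5, 0.0)
```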
#--------------------------------------------------------------------------------------------------------------------
def backtest(stocks, wts, benchmark, start_date, end_date):
    price_data = web.DataReader(stocks, data_source='yahoo', start=start_date, end=end_date)
    price_data = price_data['Adj Close']
    ret_data = price_data.pct_change()[1:]
    port_ret = (ret_data * wts).sum(axis=1)
    cumulative_ret_df1 = (port_ret + 1).cumprod()
    total_return = (cumulative_ret_df1.iloc[-1] - 1) * 100
    total_return = str(round(total_return, 2)) + '%'
    volatility = annual_volatility(stocks, wts, start_date, end_date) * 100
    volatility = str(round(volatility, 2)) + '%'
    s_ratio = round(sharpe_ratio(stocks, wts, start_date, end_date), 2)
    alpha_port = round(alpha(stocks, wts, benchmark, start_date, end_date), 4)
    beta_port = round(beta(stocks, wts, benchmark, start_date, end_date), 2)
    data = {'Backtest': ['Return', 'Annual volatility', 'Sharpe ratio', 'Alpha', 'Beta'],
            'Portfolio': [total_return, volatility, s_ratio, alpha_port, beta_port]}
    df = pd.DataFrame(data)
    df2 = mean_daily_return(stocks, wts, start_date, end_date)
    df2 = pd.DataFrame(df2)
    print("Mean daily return of the portfolio")
    print(df2)
    graph_cbenchmark(stocks, wts, benchmark, start_date, end_date)
    graph_creturns(stocks, wts, start_date, end_date)
    graph_returns(stocks, wts, start_date, end_date)
    graph_rbenchmark(stocks, wts, benchmark, start_date, end_date)
    graph_rvolatility(stocks, wts, start_date, end_date, 180)
    graph_rbeta(stocks, wts, benchmark, start_date, end_date, 180)
    print(capm(stocks, wts, start_date, end_date))
    return df
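# The total-return line in backtest compounds weighted daily returns; a minimal
# sketch of that calculation with made-up tickers, weights, and returns:

```python
import pandas as pd

ret_data = pd.DataFrame({'A': [0.01, -0.02, 0.03],
                         'B': [0.00, 0.01, -0.01]})
wts = [0.6, 0.4]
port_ret = (ret_data * wts).sum(axis=1)   # daily weighted portfolio return
cumulative = (port_ret + 1).cumprod()     # growth of 1 unit of capital
total_return_pct = round((cumulative.iloc[-1] - 1) * 100, 2)
```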
#
# Equation counts
# Total E G L N X C B
# 274 1 0 273 0 0 0 0
#
# Variable counts
# x b i s1s s2s sc si
# Total cont binary integer sos1 sos2 scont sint
# 147 91 56 0 0 0 0 0
# FX 2 2 0 0 0 0 0 0
#
# Nonzero counts
# Total const NL DLL
# 1137 1121 16 0
#
# Reformulation has removed 1 variable and 1 equation
from pyomo.environ import *
model = m = ConcreteModel()
m.b1 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b2 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b3 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b4 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b5 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b6 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b7 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b8 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b9 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b10 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b11 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b12 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b13 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b14 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b15 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b16 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b17 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b18 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b19 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b20 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b21 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b22 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b23 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b24 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b25 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b26 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b27 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b28 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b29 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b30 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b31 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b32 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b33 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b34 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b35 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b36 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b37 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b38 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b39 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b40 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b41 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b42 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b43 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b44 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b45 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b46 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b47 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b48 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b49 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b50 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b51 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b52 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b53 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b54 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b55 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b56 = Var(within=Binary,bounds=(0,1),initialize=0)
m.x58 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x59 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x60 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x61 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x62 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x63 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x64 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x65 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x66 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x67 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x68 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x69 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x70 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x71 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x72 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x73 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x74 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x75 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x76 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x77 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x78 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x79 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x80 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x81 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x82 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x83 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x84 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x85 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x86 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x87 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x88 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x89 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x90 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x91 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x92 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x93 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x94 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x95 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x96 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x97 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x98 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x99 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x100 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x101 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x102 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x103 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x104 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x105 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x106 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x107 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x108 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x109 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x110 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x111 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x112 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x113 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x114 = Var(within=Reals,bounds=(2,8),initialize=2)
m.x115 = Var(within=Reals,bounds=(2,8),initialize=2)
m.x116 = Var(within=Reals,bounds=(2,8),initialize=2)
m.x117 = Var(within=Reals,bounds=(3,11.31),initialize=3)
m.x118 = Var(within=Reals,bounds=(3,11.31),initialize=3)
m.x119 = Var(within=Reals,bounds=(1.5,6),initialize=1.5)
m.x120 = Var(within=Reals,bounds=(1.5,6),initialize=1.5)
m.x121 = Var(within=Reals,bounds=(1.5,6),initialize=1.5)
m.x122 = Var(within=Reals,bounds=(11.31,11.31),initialize=11.31)
m.x123 = Var(within=Reals,bounds=(2,8),initialize=2)
m.x124 = Var(within=Reals,bounds=(2,8),initialize=2)
m.x125 = Var(within=Reals,bounds=(2,8),initialize=2)
m.x126 = Var(within=Reals,bounds=(3.183,12),initialize=3.183)
m.x127 = Var(within=Reals,bounds=(3.183,12),initialize=3.183)
m.x128 = Var(within=Reals,bounds=(1.5,6),initialize=1.5)
m.x129 = Var(within=Reals,bounds=(1.5,6),initialize=1.5)
m.x130 = Var(within=Reals,bounds=(1.5,6),initialize=1.5)
m.x131 = Var(within=Reals,bounds=(13,13),initialize=13)
m.x132 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x133 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x134 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x135 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x136 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x137 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x138 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x139 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x140 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x141 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x142 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x143 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x144 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x145 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x146 = Var(within=Reals,bounds=(None,None),initialize=0)
m.x147 = Var(within=Reals,bounds=(None,None),initialize=0)
m.obj = Objective(expr= m.x58 + m.x59 + m.x72 + m.x73 + m.x84 + m.x85 + m.x94 + m.x95 + m.x102 + m.x103 + m.x108
+ m.x109 + m.x112 + m.x113, sense=minimize)
m.c2 = Constraint(expr= m.x132 - m.x133 <= 0)
m.c3 = Constraint(expr= 0.5*m.x114 - m.x122 + m.x132 <= 0)
m.c4 = Constraint(expr= 0.5*m.x114 - m.x132 <= 0)
m.c5 = Constraint(expr= 0.5*m.x123 - m.x131 + m.x140 <= 0)
m.c6 = Constraint(expr= 0.5*m.x123 - m.x140 <= 0)
m.c7 = Constraint(expr= 0.5*m.x115 - m.x122 + m.x133 <= 0)
m.c8 = Constraint(expr= 0.5*m.x115 - m.x133 <= 0)
m.c9 = Constraint(expr= 0.5*m.x124 - m.x131 + m.x141 <= 0)
m.c10 = Constraint(expr= 0.5*m.x124 - m.x141 <= 0)
m.c11 = Constraint(expr= 0.5*m.x116 - m.x122 + m.x134 <= 0)
m.c12 = Constraint(expr= 0.5*m.x116 - m.x134 <= 0)
m.c13 = Constraint(expr= 0.5*m.x125 - m.x131 + m.x142 <= 0)
m.c14 = Constraint(expr= 0.5*m.x125 - m.x142 <= 0)
m.c15 = Constraint(expr= 0.5*m.x117 - m.x122 + m.x135 <= 0)
m.c16 = Constraint(expr= 0.5*m.x117 - m.x135 <= 0)
m.c17 = Constraint(expr= 0.5*m.x126 - m.x131 + m.x143 <= 0)
m.c18 = Constraint(expr= 0.5*m.x126 - m.x143 <= 0)
m.c19 = Constraint(expr= 0.5*m.x118 - m.x122 + m.x136 <= 0)
m.c20 = Constraint(expr= 0.5*m.x118 - m.x136 <= 0)
m.c21 = Constraint(expr= 0.5*m.x127 - m.x131 + m.x144 <= 0)
m.c22 = Constraint(expr= 0.5*m.x127 - m.x144 <= 0)
m.c23 = Constraint(expr= 0.5*m.x119 - m.x122 + m.x137 <= 0)
m.c24 = Constraint(expr= 0.5*m.x119 - m.x137 <= 0)
m.c25 = Constraint(expr= 0.5*m.x128 - m.x131 + m.x145 <= 0)
m.c26 = Constraint(expr= 0.5*m.x128 - m.x145 <= 0)
m.c27 = Constraint(expr= 0.5*m.x120 - m.x122 + m.x138 <= 0)
m.c28 = Constraint(expr= 0.5*m.x120 - m.x138 <= 0)
m.c29 = Constraint(expr= 0.5*m.x129 - m.x131 + m.x146 <= 0)
m.c30 = Constraint(expr= 0.5*m.x129 - m.x146 <= 0)
m.c31 = Constraint(expr= 0.5*m.x121 - m.x122 + m.x139 <= 0)
m.c32 = Constraint(expr= 0.5*m.x121 - m.x139 <= 0)
m.c33 = Constraint(expr= 0.5*m.x130 - m.x131 + m.x147 <= 0)
m.c34 = Constraint(expr= 0.5*m.x130 - m.x147 <= 0)
m.c35 = Constraint(expr= - m.x58 + m.x132 - m.x133 <= 0)
m.c36 = Constraint(expr= - m.x58 - m.x132 + m.x133 <= 0)
m.c37 = Constraint(expr= - m.x59 + m.x140 - m.x141 <= 0)
m.c38 = Constraint(expr= - m.x59 - m.x140 + m.x141 <= 0)
m.c39 = Constraint(expr= - 11.31*m.b1 - 11.31*m.b2 + 0.5*m.x114 + 0.5*m.x115 - m.x132 + m.x133 <= 0)
m.c40 = Constraint(expr= - 11.31*m.b1 + 11.31*m.b2 + 0.5*m.x114 + 0.5*m.x115 + m.x132 - m.x133 <= 11.31)
m.c41 = Constraint(expr= 13*m.b1 - 13*m.b2 + 0.5*m.x123 + 0.5*m.x124 - m.x140 + m.x141 <= 13)
m.c42 = Constraint(expr= 13*m.b1 + 13*m.b2 + 0.5*m.x123 + 0.5*m.x124 + m.x140 - m.x141 <= 26)
m.c43 = Constraint(expr= - m.x60 + m.x132 - m.x134 <= 0)
m.c44 = Constraint(expr= - m.x60 - m.x132 + m.x134 <= 0)
m.c45 = Constraint(expr= - m.x61 + m.x140 - m.x142 <= 0)
m.c46 = Constraint(expr= - m.x61 - m.x140 + m.x142 <= 0)
m.c47 = Constraint(expr= - 11.31*m.b3 - 11.31*m.b4 + 0.5*m.x114 + 0.5*m.x116 - m.x132 + m.x134 <= 0)
m.c48 = Constraint(expr= - 11.31*m.b3 + 11.31*m.b4 + 0.5*m.x114 + 0.5*m.x116 + m.x132 - m.x134 <= 11.31)
m.c49 = Constraint(expr= 13*m.b3 - 13*m.b4 + 0.5*m.x123 + 0.5*m.x125 - m.x140 + m.x142 <= 13)
m.c50 = Constraint(expr= 13*m.b3 + 13*m.b4 + 0.5*m.x123 + 0.5*m.x125 + m.x140 - m.x142 <= 26)
m.c51 = Constraint(expr= - m.x62 + m.x132 - m.x135 <= 0)
m.c52 = Constraint(expr= - m.x62 - m.x132 + m.x135 <= 0)
m.c53 = Constraint(expr= - m.x63 + m.x140 - m.x143 <= 0)
m.c54 = Constraint(expr= - m.x63 - m.x140 + m.x143 <= 0)
m.c55 = Constraint(expr= - 11.31*m.b5 - 11.31*m.b6 + 0.5*m.x114 + 0.5*m.x117 - m.x132 + m.x135 <= 0)
m.c56 = Constraint(expr= - 11.31*m.b5 + 11.31*m.b6 + 0.5*m.x114 + 0.5*m.x117 + m.x132 - m.x135 <= 11.31)
m.c57 = Constraint(expr= 13*m.b5 - 13*m.b6 + 0.5*m.x123 + 0.5*m.x126 - m.x140 + m.x143 <= 13)
m.c58 = Constraint(expr= 13*m.b5 + 13*m.b6 + 0.5*m.x123 + 0.5*m.x126 + m.x140 - m.x143 <= 26)
m.c59 = Constraint(expr= - m.x64 + m.x132 - m.x136 <= 0)
m.c60 = Constraint(expr= - m.x64 - m.x132 + m.x136 <= 0)
m.c61 = Constraint(expr= - m.x65 + m.x140 - m.x144 <= 0)
m.c62 = Constraint(expr= - m.x65 - m.x140 + m.x144 <= 0)
m.c63 = Constraint(expr= - 11.31*m.b7 - 11.31*m.b8 + 0.5*m.x114 + 0.5*m.x118 - m.x132 + m.x136 <= 0)
m.c64 = Constraint(expr= - 11.31*m.b7 + 11.31*m.b8 + 0.5*m.x114 + 0.5*m.x118 + m.x132 - m.x136 <= 11.31)
m.c65 = Constraint(expr= 13*m.b7 - 13*m.b8 + 0.5*m.x123 + 0.5*m.x127 - m.x140 + m.x144 <= 13)
m.c66 = Constraint(expr= 13*m.b7 + 13*m.b8 + 0.5*m.x123 + 0.5*m.x127 + m.x140 - m.x144 <= 26)
m.c67 = Constraint(expr= - m.x66 + m.x132 - m.x137 <= 0)
m.c68 = Constraint(expr= - m.x66 - m.x132 + m.x137 <= 0)
m.c69 = Constraint(expr= - m.x67 + m.x140 - m.x145 <= 0)
m.c70 = Constraint(expr= - m.x67 - m.x140 + m.x145 <= 0)
m.c71 = Constraint(expr= - 11.31*m.b9 - 11.31*m.b10 + 0.5*m.x114 + 0.5*m.x119 - m.x132 + m.x137 <= 0)
m.c72 = Constraint(expr= - 11.31*m.b9 + 11.31*m.b10 + 0.5*m.x114 + 0.5*m.x119 + m.x132 - m.x137 <= 11.31)
m.c73 = Constraint(expr= 13*m.b9 - 13*m.b10 + 0.5*m.x123 + 0.5*m.x128 - m.x140 + m.x145 <= 13)
m.c74 = Constraint(expr= 13*m.b9 + 13*m.b10 + 0.5*m.x123 + 0.5*m.x128 + m.x140 - m.x145 <= 26)
m.c75 = Constraint(expr= - m.x68 + m.x132 - m.x138 <= 0)
m.c76 = Constraint(expr= - m.x68 - m.x132 + m.x138 <= 0)
m.c77 = Constraint(expr= - m.x69 + m.x140 - m.x146 <= 0)
m.c78 = Constraint(expr= - m.x69 - m.x140 + m.x146 <= 0)
m.c79 = Constraint(expr= - 11.31*m.b11 - 11.31*m.b12 + 0.5*m.x114 + 0.5*m.x120 - m.x132 + m.x138 <= 0)
m.c80 = Constraint(expr= - 11.31*m.b11 + 11.31*m.b12 + 0.5*m.x114 + 0.5*m.x120 + m.x132 - m.x138 <= 11.31)
m.c81 = Constraint(expr= 13*m.b11 - 13*m.b12 + 0.5*m.x123 + 0.5*m.x129 - m.x140 + m.x146 <= 13)
m.c82 = Constraint(expr= 13*m.b11 + 13*m.b12 + 0.5*m.x123 + 0.5*m.x129 + m.x140 - m.x146 <= 26)
m.c83 = Constraint(expr= - m.x70 + m.x132 - m.x139 <= 0)
m.c84 = Constraint(expr= - m.x70 - m.x132 + m.x139 <= 0)
m.c85 = Constraint(expr= - m.x71 + m.x140 - m.x147 <= 0)
m.c86 = Constraint(expr= - m.x71 - m.x140 + m.x147 <= 0)
m.c87 = Constraint(expr= - 11.31*m.b13 - 11.31*m.b14 + 0.5*m.x114 + 0.5*m.x121 - m.x132 + m.x139 <= 0)
m.c88 = Constraint(expr= - 11.31*m.b13 + 11.31*m.b14 + 0.5*m.x114 + 0.5*m.x121 + m.x132 - m.x139 <= 11.31)
m.c89 = Constraint(expr= 13*m.b13 - 13*m.b14 + 0.5*m.x123 + 0.5*m.x130 - m.x140 + m.x147 <= 13)
m.c90 = Constraint(expr= 13*m.b13 + 13*m.b14 + 0.5*m.x123 + 0.5*m.x130 + m.x140 - m.x147 <= 26)
m.c91 = Constraint(expr= - m.x72 + m.x133 - m.x134 <= 0)
m.c92 = Constraint(expr= - m.x72 - m.x133 + m.x134 <= 0)
m.c93 = Constraint(expr= - m.x73 + m.x141 - m.x142 <= 0)
m.c94 = Constraint(expr= - m.x73 - m.x141 + m.x142 <= 0)
m.c95 = Constraint(expr= - 11.31*m.b15 - 11.31*m.b16 + 0.5*m.x115 + 0.5*m.x116 - m.x133 + m.x134 <= 0)
m.c96 = Constraint(expr= - 11.31*m.b15 + 11.31*m.b16 + 0.5*m.x115 + 0.5*m.x116 + m.x133 - m.x134 <= 11.31)
m.c97 = Constraint(expr= 13*m.b15 - 13*m.b16 + 0.5*m.x124 + 0.5*m.x125 - m.x141 + m.x142 <= 13)
m.c98 = Constraint(expr= 13*m.b15 + 13*m.b16 + 0.5*m.x124 + 0.5*m.x125 + m.x141 - m.x142 <= 26)
m.c99 = Constraint(expr= - m.x74 + m.x133 - m.x135 <= 0)
m.c100 = Constraint(expr= - m.x74 - m.x133 + m.x135 <= 0)
m.c101 = Constraint(expr= - m.x75 + m.x141 - m.x143 <= 0)
m.c102 = Constraint(expr= - m.x75 - m.x141 + m.x143 <= 0)
m.c103 = Constraint(expr= - 11.31*m.b17 - 11.31*m.b18 + 0.5*m.x115 + 0.5*m.x117 - m.x133 + m.x135 <= 0)
m.c104 = Constraint(expr= - 11.31*m.b17 + 11.31*m.b18 + 0.5*m.x115 + 0.5*m.x117 + m.x133 - m.x135 <= 11.31)
m.c105 = Constraint(expr= 13*m.b17 - 13*m.b18 + 0.5*m.x124 + 0.5*m.x126 - m.x141 + m.x143 <= 13)
m.c106 = Constraint(expr= 13*m.b17 + 13*m.b18 + 0.5*m.x124 + 0.5*m.x126 + m.x141 - m.x143 <= 26)
m.c107 = Constraint(expr= - m.x76 + m.x133 - m.x136 <= 0)
m.c108 = Constraint(expr= - m.x76 - m.x133 + m.x136 <= 0)
m.c109 = Constraint(expr= - m.x77 + m.x141 - m.x144 <= 0)
m.c110 = Constraint(expr= - m.x77 - m.x141 + m.x144 <= 0)
m.c111 = Constraint(expr= - 11.31*m.b19 - 11.31*m.b20 + 0.5*m.x115 + 0.5*m.x118 - m.x133 + m.x136 <= 0)
m.c112 = Constraint(expr= - 11.31*m.b19 + 11.31*m.b20 + 0.5*m.x115 + 0.5*m.x118 + m.x133 - m.x136 <= 11.31)
m.c113 = Constraint(expr= 13*m.b19 - 13*m.b20 + 0.5*m.x124 + 0.5*m.x127 - m.x141 + m.x144 <= 13)
m.c114 = Constraint(expr= 13*m.b19 + 13*m.b20 + 0.5*m.x124 + 0.5*m.x127 + m.x141 - m.x144 <= 26)
m.c115 = Constraint(expr= - m.x78 + m.x133 - m.x137 <= 0)
m.c116 = Constraint(expr= - m.x78 - m.x133 + m.x137 <= 0)
m.c117 = Constraint(expr= - m.x79 + m.x141 - m.x145 <= 0)
m.c118 = Constraint(expr= - m.x79 - m.x141 + m.x145 <= 0)
m.c119 = Constraint(expr= - 11.31*m.b21 - 11.31*m.b22 + 0.5*m.x115 + 0.5*m.x119 - m.x133 + m.x137 <= 0)
m.c120 = Constraint(expr= - 11.31*m.b21 + 11.31*m.b22 + 0.5*m.x115 + 0.5*m.x119 + m.x133 - m.x137 <= 11.31)
m.c121 = Constraint(expr= 13*m.b21 - 13*m.b22 + 0.5*m.x124 + 0.5*m.x128 - m.x141 + m.x145 <= 13)
m.c122 = Constraint(expr= 13*m.b21 + 13*m.b22 + 0.5*m.x124 + 0.5*m.x128 + m.x141 - m.x145 <= 26)
m.c123 = Constraint(expr= - m.x80 + m.x133 - m.x138 <= 0)
m.c124 = Constraint(expr= - m.x80 - m.x133 + m.x138 <= 0)
m.c125 = Constraint(expr= - m.x81 + m.x141 - m.x146 <= 0)
m.c126 = Constraint(expr= - m.x81 - m.x141 + m.x146 <= 0)
m.c127 = Constraint(expr= - 11.31*m.b23 - 11.31*m.b24 + 0.5*m.x115 + 0.5*m.x120 - m.x133 + m.x138 <= 0)
m.c128 = Constraint(expr= - 11.31*m.b23 + 11.31*m.b24 + 0.5*m.x115 + 0.5*m.x120 + m.x133 - m.x138 <= 11.31)
m.c129 = Constraint(expr= 13*m.b23 - 13*m.b24 + 0.5*m.x124 + 0.5*m.x129 - m.x141 + m.x146 <= 13)
m.c130 = Constraint(expr= 13*m.b23 + 13*m.b24 + 0.5*m.x124 + 0.5*m.x129 + m.x141 - m.x146 <= 26)
m.c131 = Constraint(expr= - m.x82 + m.x133 - m.x139 <= 0)
m.c132 = Constraint(expr= - m.x82 - m.x133 + m.x139 <= 0)
m.c133 = Constraint(expr= - m.x83 + m.x141 - m.x147 <= 0)
m.c134 = Constraint(expr= - m.x83 - m.x141 + m.x147 <= 0)
m.c135 = Constraint(expr= - 11.31*m.b25 - 11.31*m.b26 + 0.5*m.x115 + 0.5*m.x121 - m.x133 + m.x139 <= 0)
m.c136 = Constraint(expr= - 11.31*m.b25 + 11.31*m.b26 + 0.5*m.x115 + 0.5*m.x121 + m.x133 - m.x139 <= 11.31)
m.c137 = Constraint(expr= 13*m.b25 - 13*m.b26 + 0.5*m.x124 + 0.5*m.x130 - m.x141 + m.x147 <= 13)
m.c138 = Constraint(expr= 13*m.b25 + 13*m.b26 + 0.5*m.x124 + 0.5*m.x130 + m.x141 - m.x147 <= 26)
m.c139 = Constraint(expr= - m.x84 + m.x134 - m.x135 <= 0)
m.c140 = Constraint(expr= - m.x84 - m.x134 + m.x135 <= 0)
m.c141 = Constraint(expr= - m.x85 + m.x142 - m.x143 <= 0)
m.c142 = Constraint(expr= - m.x85 - m.x142 + m.x143 <= 0)
m.c143 = Constraint(expr= - 11.31*m.b27 - 11.31*m.b28 + 0.5*m.x116 + 0.5*m.x117 - m.x134 + m.x135 <= 0)
m.c144 = Constraint(expr= - 11.31*m.b27 + 11.31*m.b28 + 0.5*m.x116 + 0.5*m.x117 + m.x134 - m.x135 <= 11.31)
m.c145 = Constraint(expr= 13*m.b27 - 13*m.b28 + 0.5*m.x125 + 0.5*m.x126 - m.x142 + m.x143 <= 13)
m.c146 = Constraint(expr= 13*m.b27 + 13*m.b28 + 0.5*m.x125 + 0.5*m.x126 + m.x142 - m.x143 <= 26)
m.c147 = Constraint(expr= - m.x86 + m.x134 - m.x136 <= 0)
m.c148 = Constraint(expr= - m.x86 - m.x134 + m.x136 <= 0)
m.c149 = Constraint(expr= - m.x87 + m.x142 - m.x144 <= 0)
m.c150 = Constraint(expr= - m.x87 - m.x142 + m.x144 <= 0)
m.c151 = Constraint(expr= - 11.31*m.b29 - 11.31*m.b30 + 0.5*m.x116 + 0.5*m.x118 - m.x134 + m.x136 <= 0)
m.c152 = Constraint(expr= - 11.31*m.b29 + 11.31*m.b30 + 0.5*m.x116 + 0.5*m.x118 + m.x134 - m.x136 <= 11.31)
m.c153 = Constraint(expr= 13*m.b29 - 13*m.b30 + 0.5*m.x125 + 0.5*m.x127 - m.x142 + m.x144 <= 13)
m.c154 = Constraint(expr= 13*m.b29 + 13*m.b30 + 0.5*m.x125 + 0.5*m.x127 + m.x142 - m.x144 <= 26)
m.c155 = Constraint(expr= - m.x88 + m.x134 - m.x137 <= 0)
m.c156 = Constraint(expr= - m.x88 - m.x134 + m.x137 <= 0)
m.c157 = Constraint(expr= - m.x89 + m.x142 - m.x145 <= 0)
m.c158 = Constraint(expr= - m.x89 - m.x142 + m.x145 <= 0)
m.c159 = Constraint(expr= - 11.31*m.b31 - 11.31*m.b32 + 0.5*m.x116 + 0.5*m.x119 - m.x134 + m.x137 <= 0)
m.c160 = Constraint(expr= - 11.31*m.b31 + 11.31*m.b32 + 0.5*m.x116 + 0.5*m.x119 + m.x134 - m.x137 <= 11.31)
m.c161 = Constraint(expr= 13*m.b31 - 13*m.b32 + 0.5*m.x125 + 0.5*m.x128 - m.x142 + m.x145 <= 13)
m.c162 = Constraint(expr= 13*m.b31 + 13*m.b32 + 0.5*m.x125 + 0.5*m.x128 + m.x142 - m.x145 <= 26)
m.c163 = Constraint(expr= - m.x90 + m.x134 - m.x138 <= 0)
m.c164 = Constraint(expr= - m.x90 - m.x134 + m.x138 <= 0)
m.c165 = Constraint(expr= - m.x91 + m.x142 - m.x146 <= 0)
m.c166 = Constraint(expr= - m.x91 - m.x142 + m.x146 <= 0)
m.c167 = Constraint(expr= - 11.31*m.b33 - 11.31*m.b34 + 0.5*m.x116 + 0.5*m.x120 - m.x134 + m.x138 <= 0)
m.c168 = Constraint(expr= - 11.31*m.b33 + 11.31*m.b34 + 0.5*m.x116 + 0.5*m.x120 + m.x134 - m.x138 <= 11.31)
m.c169 = Constraint(expr= 13*m.b33 - 13*m.b34 + 0.5*m.x125 + 0.5*m.x129 - m.x142 + m.x146 <= 13)
m.c170 = Constraint(expr= 13*m.b33 + 13*m.b34 + 0.5*m.x125 + 0.5*m.x129 + m.x142 - m.x146 <= 26)
m.c171 = Constraint(expr= - m.x92 + m.x134 - m.x139 <= 0)
m.c172 = Constraint(expr= - m.x92 - m.x134 + m.x139 <= 0)
m.c173 = Constraint(expr= - m.x93 + m.x142 - m.x147 <= 0)
m.c174 = Constraint(expr= - m.x93 - m.x142 + m.x147 <= 0)
m.c175 = Constraint(expr= - 11.31*m.b35 - 11.31*m.b36 + 0.5*m.x116 + 0.5*m.x121 - m.x134 + m.x139 <= 0)
m.c176 = Constraint(expr= - 11.31*m.b35 + 11.31*m.b36 + 0.5*m.x116 + 0.5*m.x121 + m.x134 - m.x139 <= 11.31)
m.c177 = Constraint(expr= 13*m.b35 - 13*m.b36 + 0.5*m.x125 + 0.5*m.x130 - m.x142 + m.x147 <= 13)
m.c178 = Constraint(expr= 13*m.b35 + 13*m.b36 + 0.5*m.x125 + 0.5*m.x130 + m.x142 - m.x147 <= 26)
m.c179 = Constraint(expr= - m.x94 + m.x135 - m.x136 <= 0)
m.c180 = Constraint(expr= - m.x94 - m.x135 + m.x136 <= 0)
m.c181 = Constraint(expr= - m.x95 + m.x143 - m.x144 <= 0)
m.c182 = Constraint(expr= - m.x95 - m.x143 + m.x144 <= 0)
m.c183 = Constraint(expr= - 11.31*m.b37 - 11.31*m.b38 + 0.5*m.x117 + 0.5*m.x118 - m.x135 + m.x136 <= 0)
m.c184 = Constraint(expr= - 11.31*m.b37 + 11.31*m.b38 + 0.5*m.x117 + 0.5*m.x118 + m.x135 - m.x136 <= 11.31)
m.c185 = Constraint(expr= 13*m.b37 - 13*m.b38 + 0.5*m.x126 + 0.5*m.x127 - m.x143 + m.x144 <= 13)
m.c186 = Constraint(expr= 13*m.b37 + 13*m.b38 + 0.5*m.x126 + 0.5*m.x127 + m.x143 - m.x144 <= 26)
m.c187 = Constraint(expr= - m.x96 + m.x135 - m.x137 <= 0)
m.c188 = Constraint(expr= - m.x96 - m.x135 + m.x137 <= 0)
m.c189 = Constraint(expr= - m.x97 + m.x143 - m.x145 <= 0)
m.c190 = Constraint(expr= - m.x97 - m.x143 + m.x145 <= 0)
m.c191 = Constraint(expr= - 11.31*m.b39 - 11.31*m.b40 + 0.5*m.x117 + 0.5*m.x119 - m.x135 + m.x137 <= 0)
m.c192 = Constraint(expr= - 11.31*m.b39 + 11.31*m.b40 + 0.5*m.x117 + 0.5*m.x119 + m.x135 - m.x137 <= 11.31)
m.c193 = Constraint(expr= 13*m.b39 - 13*m.b40 + 0.5*m.x126 + 0.5*m.x128 - m.x143 + m.x145 <= 13)
m.c194 = Constraint(expr= 13*m.b39 + 13*m.b40 + 0.5*m.x126 + 0.5*m.x128 + m.x143 - m.x145 <= 26)
m.c195 = Constraint(expr= - m.x98 + m.x135 - m.x138 <= 0)
m.c196 = Constraint(expr= - m.x98 - m.x135 + m.x138 <= 0)
m.c197 = Constraint(expr= - m.x99 + m.x143 - m.x146 <= 0)
m.c198 = Constraint(expr= - m.x99 - m.x143 + m.x146 <= 0)
m.c199 = Constraint(expr= - 11.31*m.b41 - 11.31*m.b42 + 0.5*m.x117 + 0.5*m.x120 - m.x135 + m.x138 <= 0)
m.c200 = Constraint(expr= - 11.31*m.b41 + 11.31*m.b42 + 0.5*m.x117 + 0.5*m.x120 + m.x135 - m.x138 <= 11.31)
m.c201 = Constraint(expr= 13*m.b41 - 13*m.b42 + 0.5*m.x126 + 0.5*m.x129 - m.x143 + m.x146 <= 13)
m.c202 = Constraint(expr= 13*m.b41 + 13*m.b42 + 0.5*m.x126 + 0.5*m.x129 + m.x143 - m.x146 <= 26)
m.c203 = Constraint(expr= - m.x100 + m.x135 - m.x139 <= 0)
m.c204 = Constraint(expr= - m.x100 - m.x135 + m.x139 <= 0)
m.c205 = Constraint(expr= - m.x101 + m.x143 - m.x147 <= 0)
m.c206 = Constraint(expr= - m.x101 - m.x143 + m.x147 <= 0)
m.c207 = Constraint(expr= - 11.31*m.b43 - 11.31*m.b44 + 0.5*m.x117 + 0.5*m.x121 - m.x135 + m.x139 <= 0)
m.c208 = Constraint(expr= - 11.31*m.b43 + 11.31*m.b44 + 0.5*m.x117 + 0.5*m.x121 + m.x135 - m.x139 <= 11.31)
m.c209 = Constraint(expr= 13*m.b43 - 13*m.b44 + 0.5*m.x126 + 0.5*m.x130 - m.x143 + m.x147 <= 13)
m.c210 = Constraint(expr= 13*m.b43 + 13*m.b44 + 0.5*m.x126 + 0.5*m.x130 + m.x143 - m.x147 <= 26)
m.c211 = Constraint(expr= - m.x102 + m.x136 - m.x137 <= 0)
m.c212 = Constraint(expr= - m.x102 - m.x136 + m.x137 <= 0)
m.c213 = Constraint(expr= - m.x103 + m.x144 - m.x145 <= 0)
m.c214 = Constraint(expr= - m.x103 - m.x144 + m.x145 <= 0)
m.c215 = Constraint(expr= - 11.31*m.b45 - 11.31*m.b46 + 0.5*m.x118 + 0.5*m.x119 - m.x136 + m.x137 <= 0)
m.c216 = Constraint(expr= - 11.31*m.b45 + 11.31*m.b46 + 0.5*m.x118 + 0.5*m.x119 + m.x136 - m.x137 <= 11.31)
m.c217 = Constraint(expr= 13*m.b45 - 13*m.b46 + 0.5*m.x127 + 0.5*m.x128 - m.x144 + m.x145 <= 13)
m.c218 = Constraint(expr= 13*m.b45 + 13*m.b46 + 0.5*m.x127 + 0.5*m.x128 + m.x144 - m.x145 <= 26)
m.c219 = Constraint(expr= - m.x104 + m.x136 - m.x138 <= 0)
m.c220 = Constraint(expr= - m.x104 - m.x136 + m.x138 <= 0)
m.c221 = Constraint(expr= - m.x105 + m.x144 - m.x146 <= 0)
m.c222 = Constraint(expr= - m.x105 - m.x144 + m.x146 <= 0)
m.c223 = Constraint(expr= - 11.31*m.b47 - 11.31*m.b48 + 0.5*m.x118 + 0.5*m.x120 - m.x136 + m.x138 <= 0)
m.c224 = Constraint(expr= - 11.31*m.b47 + 11.31*m.b48 + 0.5*m.x118 + 0.5*m.x120 + m.x136 - m.x138 <= 11.31)
m.c225 = Constraint(expr= 13*m.b47 - 13*m.b48 + 0.5*m.x127 + 0.5*m.x129 - m.x144 + m.x146 <= 13)
m.c226 = Constraint(expr= 13*m.b47 + 13*m.b48 + 0.5*m.x127 + 0.5*m.x129 + m.x144 - m.x146 <= 26)
m.c227 = Constraint(expr= - m.x106 + m.x136 - m.x139 <= 0)
m.c228 = Constraint(expr= - m.x106 - m.x136 + m.x139 <= 0)
m.c229 = Constraint(expr= - m.x107 + m.x144 - m.x147 <= 0)
m.c230 = Constraint(expr= - m.x107 - m.x144 + m.x147 <= 0)
m.c231 = Constraint(expr= - 11.31*m.b49 - 11.31*m.b50 + 0.5*m.x118 + 0.5*m.x121 - m.x136 + m.x139 <= 0)
m.c232 = Constraint(expr= - 11.31*m.b49 + 11.31*m.b50 + 0.5*m.x118 + 0.5*m.x121 + m.x136 - m.x139 <= 11.31)
m.c233 = Constraint(expr= 13*m.b49 - 13*m.b50 + 0.5*m.x127 + 0.5*m.x130 - m.x144 + m.x147 <= 13)
m.c234 = Constraint(expr= 13*m.b49 + 13*m.b50 + 0.5*m.x127 + 0.5*m.x130 + m.x144 - m.x147 <= 26)
m.c235 = Constraint(expr= - m.x108 + m.x137 - m.x138 <= 0)
m.c236 = Constraint(expr= - m.x108 - m.x137 + m.x138 <= 0)
m.c237 = Constraint(expr= - m.x109 + m.x145 - m.x146 <= 0)
m.c238 = Constraint(expr= - m.x109 - m.x145 + m.x146 <= 0)
m.c239 = Constraint(expr= - 11.31*m.b51 - 11.31*m.b52 + 0.5*m.x119 + 0.5*m.x120 - m.x137 + m.x138 <= 0)
m.c240 = Constraint(expr= - 11.31*m.b51 + 11.31*m.b52 + 0.5*m.x119 + 0.5*m.x120 + m.x137 - m.x138 <= 11.31)
m.c241 = Constraint(expr= 13*m.b51 - 13*m.b52 + 0.5*m.x128 + 0.5*m.x129 - m.x145 + m.x146 <= 13)
m.c242 = Constraint(expr= 13*m.b51 + 13*m.b52 + 0.5*m.x128 + 0.5*m.x129 + m.x145 - m.x146 <= 26)
m.c243 = Constraint(expr= - m.x110 + m.x137 - m.x139 <= 0)
m.c244 = Constraint(expr= - m.x110 - m.x137 + m.x139 <= 0)
m.c245 = Constraint(expr= - m.x111 + m.x145 - m.x147 <= 0)
m.c246 = Constraint(expr= - m.x111 - m.x145 + m.x147 <= 0)
m.c247 = Constraint(expr= - 11.31*m.b53 - 11.31*m.b54 + 0.5*m.x119 + 0.5*m.x121 - m.x137 + m.x139 <= 0)
m.c248 = Constraint(expr= - 11.31*m.b53 + 11.31*m.b54 + 0.5*m.x119 + 0.5*m.x121 + m.x137 - m.x139 <= 11.31)
m.c249 = Constraint(expr= 13*m.b53 - 13*m.b54 + 0.5*m.x128 + 0.5*m.x130 - m.x145 + m.x147 <= 13)
m.c250 = Constraint(expr= 13*m.b53 + 13*m.b54 + 0.5*m.x128 + 0.5*m.x130 + m.x145 - m.x147 <= 26)
m.c251 = Constraint(expr= - m.x112 + m.x138 - m.x139 <= 0)
m.c252 = Constraint(expr= - m.x112 - m.x138 + m.x139 <= 0)
m.c253 = Constraint(expr= - m.x113 + m.x146 - m.x147 <= 0)
m.c254 = Constraint(expr= - m.x113 - m.x146 + m.x147 <= 0)
m.c255 = Constraint(expr= - 11.31*m.b55 - 11.31*m.b56 + 0.5*m.x120 + 0.5*m.x121 - m.x138 + m.x139 <= 0)
m.c256 = Constraint(expr= - 11.31*m.b55 + 11.31*m.b56 + 0.5*m.x120 + 0.5*m.x121 + m.x138 - m.x139 <= 11.31)
m.c257 = Constraint(expr= 13*m.b55 - 13*m.b56 + 0.5*m.x129 + 0.5*m.x130 - m.x146 + m.x147 <= 13)
m.c258 = Constraint(expr= 13*m.b55 + 13*m.b56 + 0.5*m.x129 + 0.5*m.x130 + m.x146 - m.x147 <= 26)
m.c259 = Constraint(expr=16/m.x114 - m.x123 <= 0)
m.c260 = Constraint(expr=16/m.x123 - m.x114 <= 0)
m.c261 = Constraint(expr=16/m.x115 - m.x124 <= 0)
m.c262 = Constraint(expr=16/m.x124 - m.x115 <= 0)
m.c263 = Constraint(expr=16/m.x116 - m.x125 <= 0)
m.c264 = Constraint(expr=16/m.x125 - m.x116 <= 0)
m.c265 = Constraint(expr=36/m.x117 - m.x126 <= 0)
m.c266 = Constraint(expr=36/m.x126 - m.x117 <= 0)
m.c267 = Constraint(expr=36/m.x118 - m.x127 <= 0)
m.c268 = Constraint(expr=36/m.x127 - m.x118 <= 0)
m.c269 = Constraint(expr=9/m.x119 - m.x128 <= 0)
m.c270 = Constraint(expr=9/m.x128 - m.x119 <= 0)
m.c271 = Constraint(expr=9/m.x120 - m.x129 <= 0)
m.c272 = Constraint(expr=9/m.x129 - m.x120 <= 0)
m.c273 = Constraint(expr=9/m.x121 - m.x130 <= 0)
m.c274 = Constraint(expr=9/m.x130 - m.x121 <= 0)
# Copyright 2016-2020 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""Functional tests for "openssh" module."""
import glob
import os
import pwd
import re
import shlex
import shutil
import stat
import subprocess
import sys
import unittest
import boto3
import requests
import ec2rlcore.prediag
import moduletests.src.openssh
class TestSSH(unittest.TestCase):
"""SSH tests."""
metadata_url = "http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key"
command = [sys.executable, os.path.join("moduletests", "src", "openssh.py")]
env_dict = {"PATH": os.environ["PATH"],
"EC2RL_CALLPATH": os.environ["EC2RL_CALLPATH"]}
possible_users = ["ec2-user", "ubuntu"]
uid = None
username = None
config_file_path = None
backup_dir_path = "/var/tmp/ec2rl-functional"
default_configuration_lines = ["HostKey /etc/ssh/ssh_host_rsa_key\n",
"HostKey /etc/ssh/ssh_host_ecdsa_key\n",
"HostKey /etc/ssh/ssh_host_ed25519_key\n",
"SyslogFacility AUTHPRIV\n",
"PermitRootLogin no\n",
"AuthorizedKeysFile .ssh/authorized_keys\n",
"PasswordAuthentication no\n",
"ChallengeResponseAuthentication no\n",
"UsePAM yes\n"]
def __init__(self, *args, **kwargs):
for username in self.possible_users:
try:
uid = pwd.getpwnam(username).pw_uid
self.uid = uid
self.username = username
break
except KeyError:
pass
if not self.username or not self.uid:
raise Exception("Missing both the ec2-user user and the ubuntu user. One must be present for these tests.")
self.config_file_path = moduletests.src.openssh.get_config_file_path()
if not self.config_file_path:
raise Exception("Failed to find the sshd configuration file.")
if os.path.exists(self.backup_dir_path):
shutil.rmtree(self.backup_dir_path)
super(TestSSH, self).__init__(*args, **kwargs)
def setUp(self):
print("In test: {}".format(self._testMethodName))
self.backed_files = dict()
self.verified_fixed = False
self.env_dict["remediate"] = "False"
self.env_dict["notaninstance"] = "True"
self.env_dict["injectkey"] = "False"
self.env_dict["injectkeyonly"] = "False"
self.env_dict["createnewkeys"] = "False"
self.env_dict["EC2RL_GATHEREDDIR"] = self.backup_dir_path
os.makedirs(self.backup_dir_path, 0o700)
def tearDown(self):
self.backed_files = None
del self.env_dict["remediate"]
del self.env_dict["notaninstance"]
del self.env_dict["injectkey"]
del self.env_dict["injectkeyonly"]
del self.env_dict["createnewkeys"]
# Remove files created by running ec2rl
shutil.rmtree(self.backup_dir_path)
def test_ssh_missing_sshd(self):
"""Backup the sshd executable, remove it, verify the issue is detected, and restore the backup."""
output_messages = list()
output_messages.append("-- FAILURE Missing sshd executable or it is not in $PATH: sshd")
output_messages.append("-- Unable to check 20 items due to dependent check failures")
sshd_path = ec2rlcore.prediag.which("sshd")
ec2rlcore.prediag.backup(sshd_path, self.backed_files, self.backup_dir_path)
try:
os.remove(sshd_path)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
finally:
ec2rlcore.prediag.restore(sshd_path, self.backed_files)
for message in output_messages:
self.assertTrue(message in process_output)
def test_ssh_etc_ssh_world_writable(self):
"""
Set the permission mode of /etc/ssh to 777 (world writable), verify the issue is detected, and undo the
permission mode change.
"""
test_path = "/etc/ssh"
output_messages = list()
output_messages.append("-- FAILURE Permission mode includes write for groups and/or other users: /etc/ssh")
try:
os.chmod(test_path, 0o777)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
finally:
os.chmod(test_path, 0o755)
for message in output_messages:
self.assertTrue(message in process_output)
def test_ssh_etc_ssh_world_writable_remediate(self):
"""
Set the permission mode of /etc/ssh to 777 (world writable), verify the issue is detected and remediated, and
undo the permission mode change, if needed.
"""
# Translated user args
self.env_dict["remediate"] = "True"
test_path = "/etc/ssh"
output_messages = list()
output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: /etc/ssh")
try:
os.chmod(test_path, 0o777)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
if os.stat(test_path).st_mode == (stat.S_IFDIR | 0o755):
self.verified_fixed = True
finally:
os.chmod(test_path, 0o755)
for message in output_messages:
self.assertTrue(message in process_output)
self.assertTrue(self.verified_fixed)
def test_ssh_config_world_writable(self):
"""
Set the permission mode of sshd_config to 777 (world writable), verify the issue is detected, and undo the
permission mode change.
"""
test_path = self.config_file_path
output_messages = list()
output_messages.append("-- FAILURE Permission mode includes write for groups and/or other users: {}".format(
test_path))
try:
os.chmod(test_path, 0o777)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
finally:
os.chmod(test_path, 0o655)
for message in output_messages:
self.assertTrue(message in process_output)
def test_ssh_config_world_writable_remediate(self):
"""
Set the permission mode of sshd_config to 777 (world writable), verify the issue is detected and remediated, and
undo the permission mode change, if needed.
"""
# Translated user args
self.env_dict["remediate"] = "True"
test_path = self.config_file_path
output_messages = list()
output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: {}".format(
test_path))
try:
os.chmod(test_path, 0o777)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
if os.stat(test_path).st_mode == (stat.S_IFREG | 0o655):
self.verified_fixed = True
finally:
os.chmod(test_path, 0o655)
for message in output_messages:
self.assertTrue(message in process_output)
self.assertTrue(self.verified_fixed)
def test_ssh_home_world_writable(self):
"""
Set the permission mode of /home to 777 (world writable), verify the issue is detected, and undo the
permission mode change.
"""
test_path = "/home"
output_messages = list()
output_messages.append("-- FAILURE Permission mode includes write for groups and/or other users: "
"/home")
try:
os.chmod(test_path, 0o777)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
finally:
os.chmod(test_path, 0o755)
for message in output_messages:
self.assertTrue(message in process_output)
def test_ssh_home_world_writable_remediate(self):
"""
        Set the permission mode of /home to 777 (world writable), verify the issue is detected and remediated, and
        undo the permission mode change, if needed.
"""
# Translated user args
self.env_dict["remediate"] = "True"
test_path = "/home"
output_messages = list()
output_messages.append(
"-- FIXED Permission mode includes write for groups and/or other users: /home")
try:
os.chmod(test_path, 0o777)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
if os.stat(test_path).st_mode == (stat.S_IFDIR | 0o755):
self.verified_fixed = True
finally:
os.chmod(test_path, 0o755)
for message in output_messages:
self.assertTrue(message in process_output)
self.assertTrue(self.verified_fixed)
def test_ssh_home_wrong_owner(self):
"""Set the uid of /home to 1337, verify the issue is detected, and undo the uid change."""
test_path = "/home"
output_messages = list()
output_messages.append("FAILURE Not owned by user root: /home")
try:
os.chown(test_path, 1337, -1)
process_output = subprocess.check_output(TestSSH.command,
stderr=subprocess.STDOUT,
env=self.env_dict,
universal_newlines=True)
finally:
os.chown(test_path, 0, -1)
for message in output_messages:
self.assertTrue(message in process_output)

    def test_ssh_home_wrong_owner_remediate(self):
        """
        Set the uid of /home to 1337, verify the issue is detected and remediated, and undo the uid change, if needed.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = "/home"
        output_messages = list()
        output_messages.append("-- FIXED Not owned by user root: /home")
        try:
            os.chown(test_path, 1337, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_uid == 0:
                self.verified_fixed = True
        finally:
            os.chown(test_path, 0, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_user_home_world_writable(self):
        """
        Set the permission mode of a user's home directory (ec2-user or ubuntu) to 777 (world writable),
        verify the issue is detected, and undo the permission mode change.
        """
        test_path = "/home/{}".format(self.username)
        output_messages = list()
        output_messages.append("-- FAILURE Permission mode includes write for groups and/or other users: {}".format(
            test_path))
        try:
            os.chmod(test_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.chmod(test_path, 0o755)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_user_home_world_writable_remediate(self):
        """
        Set the permission mode of a user's home directory (ec2-user or ubuntu) to 777 (world writable),
        verify the issue is detected and remediated, and undo the permission mode change, if needed.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = "/home/{}".format(self.username)
        output_messages = list()
        output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: {}".format(
            test_path))
        try:
            os.chmod(test_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_mode == (stat.S_IFDIR | 0o755):
                self.verified_fixed = True
        finally:
            os.chmod(test_path, 0o755)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_user_home_wrong_owner(self):
        """Set the uid of a user's home directory (ec2-user or ubuntu) to 0, verify the issue is detected,
        and undo the uid change."""
        test_path = "/home/{}".format(self.username)
        output_messages = list()
        output_messages.append("FAILURE Not owned by user {}: {}".format(self.username, test_path))
        try:
            os.chown(test_path, 0, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.chown(test_path, self.uid, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_user_home_wrong_owner_remediate(self):
        """
        Set the uid of a user's home directory (ec2-user or ubuntu) to 0, verify the issue is detected and remediated,
        and undo the uid change, if needed.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = "/home/{}".format(self.username)
        output_messages = list()
        output_messages.append("-- FIXED Not owned by user {}: {}".format(self.username, test_path))
        try:
            os.chown(test_path, 0, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_uid == self.uid:
                self.verified_fixed = True
        finally:
            os.chown(test_path, self.uid, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_missing_auth_keys(self):
        """
        Remove a user's (ec2-user or ubuntu) authorized keys file after backing it up, verify the issue is detected,
        and restore the backup copy.
        """
        output_messages = list()
        output_messages.append("-- FAILURE Missing authorized key file")
        output_messages.append("-- Unable to check 2 items due to dependent check failures")
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = os.path.join("/home",
                                 self.username,
                                 moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            os.remove(test_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_missing_auth_keys_remediate(self):
        """
        Remove a user's (ec2-user or ubuntu) authorized keys file after backing it up, verify the issue is detected
        and remediated, and restore the backup copy, if necessary.
        Note: this requires a new key to be specified or the environment to be an instance so the key can be pulled
        from the instance metadata. Without one of these, there is no way to know what key to add to the new
        authorized keys file. This test uses a key from the instance metadata and is, thus, only suitable to be run
        on an instance.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        self.env_dict["notaninstance"] = "False"
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = os.path.join("/home",
                                 self.username,
                                 moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        output_messages = list()
        output_messages.append("-- FIXED Missing authorized key file")
        try:
            ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
            os.remove(test_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.path.isfile(test_path):
                self.verified_fixed = True
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_auth_keys_wrong_owner(self):
        """
        Set the uid of a user's (ec2-user or ubuntu) authorized keys file to 0, verify the issue is detected,
        and undo the uid change.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = os.path.join("/home",
                                 self.username,
                                 moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        output_messages = list()
        output_messages.append("FAILURE Not owned by user {}: {}".format(self.username, test_path))
        try:
            os.chown(test_path, 0, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.chown(test_path, self.uid, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_auth_keys_wrong_owner_remediate(self):
        """
        Set the uid of a user's (ec2-user or ubuntu) authorized keys file to 0, verify the issue is detected and
        remediated, and undo the uid change, if necessary.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = os.path.join("/home",
                                 self.username,
                                 moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        output_messages = list()
        output_messages.append("-- FIXED Not owned by user {}: {}".format(self.username, test_path))
        try:
            os.chown(test_path, 0, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_uid == self.uid:
                self.verified_fixed = True
        finally:
            os.chown(test_path, self.uid, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_auth_keys_world_writable(self):
        """
        Set the permission mode of a user's (ec2-user or ubuntu) authorized keys file to 777 (world writable),
        verify the issue is detected, and undo the permission mode change.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = os.path.join("/home",
                                 self.username,
                                 moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        output_messages = list()
        output_messages.append("-- FAILURE Permission mode includes write for groups and/or other users: {}".format(
            test_path))
        try:
            os.chmod(test_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.chmod(test_path, 0o655)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_auth_keys_world_writable_remediate(self):
        """
        Set the permission mode of a user's (ec2-user or ubuntu) authorized keys file to 777 (world writable),
        verify the issue is detected and remediated, and undo the permission mode change, if needed.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = os.path.join("/home",
                                 self.username,
                                 moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        output_messages = list()
        output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: {}".format(
            test_path))
        try:
            os.chmod(test_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_mode == (stat.S_IFREG | 0o655):
                self.verified_fixed = True
        finally:
            os.chmod(test_path, 0o655)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_config_bad_options(self):
        """
        Back up the sshd configuration file, write a new configuration file with a set of options including one
        invalid option, test that the problem is detected, and restore the backup copy of the configuration file.
        """
        test_path = self.config_file_path
        output_messages = list()
        output_messages.append("-- FAILURE Bad lines in configuration file: {}".format(test_path))
        output_messages.append("-- Unable to check 18 items due to dependent check failures")
        bad_config_lines = ["bad option\n", "Port 22 # Good option \n", "# comment line\n"]
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            with open(test_path, "w") as ssh_cfg_file:
                ssh_cfg_file.writelines(bad_config_lines)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_config_bad_options_remediate(self):
        """
        Back up the sshd configuration file, write a new configuration file with a set of options including one
        invalid option, test that the problem is detected and remediated, and restore the backup copy of the
        configuration file.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = self.config_file_path
        output_messages = list()
        output_messages.append("-- FIXED Bad lines in configuration file: {}".format(test_path))
        bad_config_lines = ["bad option\n", "Port 22 # Good option \n", "# comment line\n"]
        fixed_config_lines = ["# bad option # commented out by ec2rl\n", "Port 22 # Good option \n", "# comment line\n"]
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            with open(test_path, "w") as ssh_cfg_file:
                ssh_cfg_file.writelines(bad_config_lines)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            with open(test_path, "r") as ssh_cfg_file:
                if fixed_config_lines == ssh_cfg_file.readlines():
                    self.verified_fixed = True
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_duplicate_authkey_lines(self):
        """
        Back up the sshd configuration file, write a new configuration file with multiple AuthorizedKeysFile lines,
        test that the problem is detected, and restore the backup copy of the configuration file.
        """
        test_path = self.config_file_path
        output_messages = list()
        output_messages.append(
            "-- FAILURE sshd configuration file contains duplicate AuthorizedKeysFile lines: {}".format(test_path))
        output_messages.append("-- Unable to check 12 items due to dependent check failures")
        bad_config_lines = ["AuthorizedKeysFile %h/.ssh/authorized_keys\n",
                            "AuthorizedKeysFile .ssh/authorized_keys\n",
                            "AuthorizedKeysFile /var/super/secret/location/authorized_keys\n"]
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            with open(test_path, "w") as ssh_cfg_file:
                ssh_cfg_file.writelines(bad_config_lines)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_duplicate_authkey_lines_remediate(self):
        """
        Back up the sshd configuration file, write a new configuration file with multiple AuthorizedKeysFile lines,
        test that the problem is detected and remediated, and restore the backup copy of the configuration file.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = self.config_file_path
        output_messages = list()
        output_messages.append(
            "-- FIXED sshd configuration file contains duplicate AuthorizedKeysFile lines: {}".format(test_path))
        bad_config_lines = ["AuthorizedKeysFile %h/.ssh/authorized_keys\n",
                            "AuthorizedKeysFile .ssh/authorized_keys\n",
                            "AuthorizedKeysFile /var/super/secret/location/authorized_keys\n"]
        fixed_config_lines = ["# AuthorizedKeysFile %h/.ssh/authorized_keys # commented out by ec2rl\n",
                              "# AuthorizedKeysFile .ssh/authorized_keys # commented out by ec2rl\n",
                              "# AuthorizedKeysFile /var/super/secret/location/authorized_keys # commented out by "
                              "ec2rl\n",
                              "AuthorizedKeysFile %h/.ssh/authorized_keys "
                              ".ssh/authorized_keys "
                              "/var/super/secret/location/authorized_keys\n"]
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            with open(test_path, "w") as ssh_cfg_file:
                ssh_cfg_file.writelines(bad_config_lines)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            with open(test_path, "r") as ssh_cfg_file:
                if fixed_config_lines == ssh_cfg_file.readlines():
                    self.verified_fixed = True
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_missing_priv_sep_dir(self):
        """
        Back up the user privilege separation directory, remove it, test that the problem is detected, and
        restore the backup copy of the directory.
        """
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FAILURE Missing privilege separation directory: {}".format(test_path))
        output_messages.append("-- Unable to check 4 items due to dependent check failures")
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            shutil.rmtree(test_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_missing_priv_sep_dir_remediate(self):
        """
        Back up the user privilege separation directory, remove it, test that the problem is detected and
        remediated, and restore the backup copy of the directory.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FIXED Missing privilege separation directory: {}".format(test_path))
        ec2rlcore.prediag.backup(test_path, self.backed_files, self.backup_dir_path)
        try:
            shutil.rmtree(test_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.path.isdir(test_path):
                self.verified_fixed = True
        finally:
            if os.path.exists(test_path):
                shutil.rmtree(test_path)
            ec2rlcore.prediag.restore(test_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_priv_sep_dir_wrong_owner(self):
        """
        Set the uid of the user privilege separation directory to 1337, verify the issue is detected, and
        undo the uid change.
        """
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FAILURE Not owned by user root: {}".format(test_path))
        try:
            os.chown(test_path, 1337, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.chown(test_path, 0, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_priv_sep_dir_wrong_owner_remediate(self):
        """
        Set the uid of the user privilege separation directory to 1337, verify the issue is detected and remediated,
        and undo the uid change, if needed.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FIXED Not owned by user root: {}".format(test_path))
        try:
            os.chown(test_path, 1337, -1)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_uid == 0:
                self.verified_fixed = True
        finally:
            os.chown(test_path, 0, -1)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_priv_sep_dir_world_writable(self):
        """
        Set the permission mode of the user privilege separation directory to 777 (world writable),
        verify the issue is detected, and undo the permission mode change.
        """
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append(
            "-- FAILURE Permission mode includes write for groups and/or other users: {}".format(test_path))
        try:
            os.chmod(test_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.chmod(test_path, 0o700)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_priv_sep_dir_world_writable_remediate(self):
        """
        Set the permission mode of the user privilege separation directory to 777 (world writable),
        verify the issue is detected and remediated, and undo the permission mode change, if needed.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: {}".format(
            test_path))
        try:
            os.chmod(test_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_mode == (stat.S_IFDIR | 0o755):
                self.verified_fixed = True
        finally:
            os.chmod(test_path, 0o755)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_hostkeys_world_writable(self):
        """
        Create a test host key, back up the sshd configuration, modify the sshd configuration to include the new key,
        set the permission mode on the key file to 777 (world writable), verify the issue is detected, and
        undo the sshd configuration change and remove the new host key file.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = "/root/test_key.dsa"
        output_messages = list()
        output_messages.append(
            "-- FAILURE Permission mode includes permissions for groups and/or other users: {}".format(test_path))
        ec2rlcore.prediag.backup(self.config_file_path, self.backed_files, self.backup_dir_path)
        try:
            subprocess.check_call(
                shlex.split("ssh-keygen -q -t dsa -f {} -N \"\" -C \"\"".format(test_path)), stderr=subprocess.STDOUT)
            os.chmod(test_path, 0o777)
            with open(self.config_file_path, mode="w") as ssh_cfg_file:
                ssh_cfg_file.write("HostKey {}\n".format(test_path))
                ssh_cfg_file.flush()
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            os.remove(test_path)
            ec2rlcore.prediag.restore(self.config_file_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_hostkeys_world_writable_remediate(self):
        """
        Create a test host key, back up the sshd configuration, modify the sshd configuration to include the new key,
        set the permission mode on the key file to 777 (world writable), verify the issue is detected and remediated,
        and undo the sshd configuration change and remove the new host key file.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        moduletests.src.openssh.Problem.setup_config_vars()
        test_path = "/root/test_key.dsa"
        output_messages = list()
        output_messages.append(
            "-- FIXED Permission mode includes permissions for groups and/or other users: {}".format(test_path))
        ec2rlcore.prediag.backup(self.config_file_path, self.backed_files, self.backup_dir_path)
        try:
            subprocess.check_call(
                shlex.split("ssh-keygen -q -t dsa -f {} -N \"\" -C \"\"".format(test_path)), stderr=subprocess.STDOUT)
            os.chmod(test_path, 0o777)
            with open(self.config_file_path, mode="w") as ssh_cfg_file:
                ssh_cfg_file.write("HostKey {}\n".format(test_path))
                ssh_cfg_file.flush()
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            if os.stat(test_path).st_mode == (stat.S_IFREG | 0o600):
                self.verified_fixed = True
        finally:
            os.remove(test_path)
            ec2rlcore.prediag.restore(self.config_file_path, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    def test_ssh_missing_priv_sep_user(self):
        """
        Remove the privilege separation user, test that the issue is detected, and recreate the privilege separation
        user. There is no programmatic way to obtain this user so the test will assume it is "sshd".
        """
        test_user = "sshd"
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FAILURE Missing privilege separation user: {}".format(test_user))
        try:
            # Open devnull writable so the child process can discard its stderr
            with open(os.devnull, "w") as devnull:
                subprocess.check_call(shlex.split("userdel sshd"), stderr=devnull)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            with open(os.devnull, "w") as devnull:
                subprocess.check_call(shlex.split("useradd -s /sbin/nologin "
                                                  "-r -m "
                                                  "-c 'Privilege-separated SSH' "
                                                  "-d {} {}".format(test_path, test_user)),
                                      stdout=devnull,
                                      stderr=devnull)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_missing_priv_sep_user_remediate(self):
        """
        Remove the privilege separation user, test that the issue is detected and remediated, and recreate the
        privilege separation user, if needed. There is no programmatic way to obtain this user so the test will
        assume it is "sshd".
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        # There is no programmatic way to obtain this user so the test will assume it is "sshd".
        test_user = "sshd"
        test_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FIXED Missing privilege separation user: {}".format(test_user))
        # Open devnull writable so the child processes can discard their output
        with open(os.devnull, "w") as devnull:
            subprocess.check_call(shlex.split("userdel sshd"), stderr=devnull)
        try:
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            try:
                pwd.getpwnam(test_user)
                self.verified_fixed = True
            except KeyError:
                with open(os.devnull, "w") as devnull:
                    subprocess.check_call(shlex.split("useradd -s /sbin/nologin "
                                                      "-r -m "
                                                      "-c 'Privilege-separated SSH' "
                                                      "-d {} {}".format(test_path, test_user)),
                                          stdout=devnull,
                                          stderr=devnull)
        except subprocess.CalledProcessError as cpe:
            with open(os.devnull, "w") as devnull:
                subprocess.check_call(shlex.split("useradd -s /sbin/nologin "
                                                  "-r -m "
                                                  "-c 'Privilege-separated SSH' "
                                                  "-d {} {}".format(test_path, test_user)),
                                      stdout=devnull,
                                      stderr=devnull)
            process_output = cpe.stdout
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertTrue(self.verified_fixed)

    @unittest.skip("Does not work with distros that have GSSAPI patches applied.")
    def test_ssh_missing_hostkeys(self):
        """
        Back up the host keys, remove them, test that the problem has been detected, and restore the backup
        host key copies.
        """
        output_messages = list()
        output_messages.append("-- FAILURE Missing hostkey files")
        for key_file in ["/etc/ssh/ssh_host_dsa_key",
                         "/etc/ssh/ssh_host_ecdsa_key",
                         "/etc/ssh/ssh_host_ed25519_key",
                         "/etc/ssh/ssh_host_rsa_key"]:
            if os.path.exists(key_file):
                ec2rlcore.prediag.backup(key_file, self.backed_files, self.backup_dir_path)
        try:
            for key_file in self.backed_files:
                os.remove(key_file)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
        finally:
            for file in self.backed_files:
                ec2rlcore.prediag.restore(file, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    @unittest.skip("Does not work with distros that have GSSAPI patches applied.")
    def test_ssh_missing_hostkeys_remediate(self):
        """
        Back up the host keys, remove them, test that the problem has been detected and remediated, and
        restore the backup host key copies.
        """
        # Translated user args
        self.env_dict["remediate"] = "True"
        output_messages = list()
        output_messages.append("-- FIXED Missing hostkey files")
        for key_file in ["/etc/ssh/ssh_host_dsa_key",
                         "/etc/ssh/ssh_host_ecdsa_key",
                         "/etc/ssh/ssh_host_ed25519_key",
                         "/etc/ssh/ssh_host_rsa_key"]:
            if os.path.exists(key_file):
                ec2rlcore.prediag.backup(key_file, self.backed_files, self.backup_dir_path)
        try:
            for key_file in self.backed_files:
                os.remove(key_file)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            for key_file in self.backed_files:
                self.assertTrue(os.path.isfile(key_file))
        finally:
            for key_file in self.backed_files:
                ec2rlcore.prediag.restore(key_file, self.backed_files)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_chained_problems(self):
        """
        Create a set of problems on the system, including problems that depend upon each other, test that they are
        all detected and remediated, and restore all of the backed-up files.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["remediate"] = "True"
        sshd_config_path = self.config_file_path
        auth_keys_path = os.path.join("/home",
                                      self.username,
                                      moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"][0])
        priv_sep_dir_path = moduletests.src.openssh.get_privilege_separation_dir()
        output_messages = list()
        output_messages.append("-- FIXED Bad lines in configuration file: {}".format(sshd_config_path))
        output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: {}".format(
            auth_keys_path))
        output_messages.append("-- FIXED Permission mode includes write for groups and/or other users: {}".format(
            priv_sep_dir_path))
        bad_config_lines = ["bad option\n", "Port 22 # Good option \n", "# comment line\n"]
        fixed_config_lines = ["# bad option # commented out by ec2rl\n", "Port 22 # Good option \n", "# comment line\n"]
        self.verified_fixed = list()
        ec2rlcore.prediag.backup(sshd_config_path, self.backed_files, self.backup_dir_path)
        try:
            with open(sshd_config_path, "w") as ssh_cfg_file:
                ssh_cfg_file.writelines(bad_config_lines)
            # In the DAG, the following two problems are addressed in vertices that are dependent upon
            # resolution of the config file problem
            os.chmod(auth_keys_path, 0o777)
            os.chmod(priv_sep_dir_path, 0o777)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            with open(sshd_config_path, "r") as ssh_cfg_file:
                if fixed_config_lines == ssh_cfg_file.readlines():
                    self.verified_fixed.append(True)
                else:
                    self.verified_fixed.append(False)
            if os.stat(auth_keys_path).st_mode == (stat.S_IFREG | 0o655):
                self.verified_fixed.append(True)
            else:
                self.verified_fixed.append(False)
            if os.stat(priv_sep_dir_path).st_mode == (stat.S_IFDIR | 0o755):
                self.verified_fixed.append(True)
            else:
                self.verified_fixed.append(False)
        finally:
            ec2rlcore.prediag.restore(sshd_config_path, self.backed_files)
            os.chmod(auth_keys_path, 0o655)
            os.chmod(priv_sep_dir_path, 0o755)
        for message in output_messages:
            self.assertTrue(message in process_output)
        self.assertEqual(self.verified_fixed, [True] * 3)

    def test_ssh_key_injection_only_arg_success(self):
        """Test standalone key injection using a new key value from the user args."""
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["remediate"] = "True"
        self.env_dict["injectkeyonly"] = "True"
        self.env_dict["newsshkey"] = "newkeyvalue"
        # Create test user, its .ssh directory, and its authorized_keys file
        test_user_name = "ec2rltestuser"
        subprocess.check_call(shlex.split("useradd --create-home {}".format(test_user_name)))
        test_user_home_ssh_dir = os.path.join(pwd.getpwnam("{}".format(test_user_name)).pw_dir, ".ssh")
        test_user_auth_keys = os.path.join(test_user_home_ssh_dir, "authorized_keys")
        os.makedirs(test_user_home_ssh_dir)
        os.chmod(test_user_home_ssh_dir, 0o0600)
        os.mknod(test_user_auth_keys)
        os.chmod(test_user_auth_keys, 0o0600)
        output_messages = list()
        output_messages.append("Key injection enabled. Will skip other functionality.")
        output_messages.append("Obtained new key from --new-ssh-key arg.")
        output_messages.append("[SUCCESS] OpenSSH public key injection operation completed successfully.")
        try:
            # Backup original authorized_keys files
            for file in glob.glob("/home/*"):
                for key_path in moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"]:
                    file = os.path.realpath(file)
                    full_path_auth_keys = os.path.join(file, key_path)
                    # Verify this file is a directory
                    if not os.path.isdir(file):
                        continue
                    # Verify the /home directory belongs to a user account
                    try:
                        pwd.getpwnam(os.path.basename(file)).pw_uid
                    except KeyError:
                        # pwd.getpwnam raises KeyError when the arg does not match a user in the password database
                        continue
                    # Verify the entire key path is valid and the keys file is actually a file
                    if not os.path.isfile(full_path_auth_keys):
                        continue
                    ec2rlcore.prediag.backup(full_path_auth_keys, self.backed_files, self.backup_dir_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            for message in output_messages:
                self.assertTrue(message in process_output)
            with open(test_user_auth_keys, "r") as fp:
                auth_keys_content = fp.read()
            self.assertEqual(auth_keys_content.strip(), self.env_dict["newsshkey"])
        finally:
            del self.env_dict["newsshkey"]
            for key_file in self.backed_files:
                ec2rlcore.prediag.restore(key_file, self.backed_files)
            subprocess.check_output(shlex.split("userdel --force --remove {}".format(test_user_name)),
                                    stderr=subprocess.STDOUT)

    @unittest.skipIf(not ec2rlcore.prediag.is_an_instance(), "Test must be run from an AWS EC2 instance.")
    def test_ssh_key_injection_only_metadata_success(self):
        """Test standalone key injection using a new key value obtained from the instance metadata."""
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["remediate"] = "True"
        self.env_dict["injectkeyonly"] = "True"
        self.env_dict["notaninstance"] = "False"
        # Get the expected key value from the IMDS
        expected_key_value = requests.get(moduletests.src.openssh.METADATA_KEY_URL).text.strip()
        # Create test user, its .ssh directory, and its authorized_keys file
        test_user_name = "ec2rltestuser"
        subprocess.check_call(shlex.split("useradd --create-home {}".format(test_user_name)))
        test_user_home_ssh_dir = os.path.join(pwd.getpwnam("{}".format(test_user_name)).pw_dir, ".ssh")
        test_user_auth_keys = os.path.join(test_user_home_ssh_dir, "authorized_keys")
        os.makedirs(test_user_home_ssh_dir)
        os.chmod(test_user_home_ssh_dir, 0o0600)
        os.mknod(test_user_auth_keys)
        os.chmod(test_user_auth_keys, 0o0600)
        output_messages = list()
        output_messages.append("Key injection enabled. Will skip other functionality.")
        output_messages.append("Obtained new key from instance metadata.")
        output_messages.append("[SUCCESS] OpenSSH public key injection operation completed successfully.")
        try:
            # Backup original authorized_keys files
            for file in glob.glob("/home/*"):
                for key_path in moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"]:
                    file = os.path.realpath(file)
                    full_path_auth_keys = os.path.join(file, key_path)
                    # Verify this file is a directory
                    if not os.path.isdir(file):
                        continue
                    # Verify the /home directory belongs to a user account
                    try:
                        pwd.getpwnam(os.path.basename(file)).pw_uid
                    except KeyError:
                        # pwd.getpwnam raises KeyError when the arg does not match a user in the password database
                        continue
                    # Verify the entire key path is valid and the keys file is actually a file
                    if not os.path.isfile(full_path_auth_keys):
                        continue
                    ec2rlcore.prediag.backup(full_path_auth_keys, self.backed_files, self.backup_dir_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            for message in output_messages:
                self.assertTrue(message in process_output)
            with open(test_user_auth_keys, "r") as fp:
                auth_keys_content = fp.read()
            self.assertEqual(auth_keys_content.strip(), expected_key_value)
        finally:
            for key_file in self.backed_files:
                ec2rlcore.prediag.restore(key_file, self.backed_files)
            subprocess.check_output(shlex.split("userdel --force --remove {}".format(test_user_name)),
                                    stderr=subprocess.STDOUT)

    @unittest.skipIf(not ec2rlcore.prediag.is_an_instance(), "Test must be run from an AWS EC2 instance.")
    def test_ssh_key_injection_only_generate_new_success(self):
        """
        Test standalone key injection when creating a new key pair. This test also requires configured
        AWS credentials and that the EC2 instance have sufficient permissions since the newly generated private
        key is stored as a SecureString Parameter and the test must retrieve the parameter for verification.
        Both put_parameter and get_parameter SSM permissions as well as access to the KMS key used to encrypt
        the SecureString value are required.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["remediate"] = "True"
        self.env_dict["injectkeyonly"] = "True"
        self.env_dict["createnewkeys"] = "True"
        self.env_dict["notaninstance"] = "False"
        # Expected SSM parameter name for verification purposes
        ssm_parameter_name = "/ec2rl/openssh/{}/key".format(ec2rlcore.awshelpers.get_instance_id())
        # Create test user, its .ssh directory, and its authorized_keys file
        test_user_name = "ec2rltestuser"
        subprocess.check_call(shlex.split("useradd --create-home {}".format(test_user_name)))
        test_user_home_ssh_dir = os.path.join(pwd.getpwnam(test_user_name).pw_dir, ".ssh")
        test_user_auth_keys = os.path.join(test_user_home_ssh_dir, "authorized_keys")
        os.makedirs(test_user_home_ssh_dir)
        os.chmod(test_user_home_ssh_dir, 0o0600)
        os.mknod(test_user_auth_keys)
        os.chmod(test_user_auth_keys, 0o0600)
        output_messages = list()
        output_messages.append("Key injection enabled. Will skip other functionality.")
        output_messages.append("New key pair generated.")
        output_messages.append("[SUCCESS] OpenSSH public key injection operation completed successfully.")
        try:
            # Backup original authorized_keys files
            for file in glob.glob("/home/*"):
                for key_path in moduletests.src.openssh.Problem.CONFIG_DICT["AUTH_KEYS"]["relative"]:
                    file = os.path.realpath(file)
                    full_path_auth_keys = os.path.join(file, key_path)
                    # Verify this file is a directory
                    if not os.path.isdir(file):
                        continue
                    # Verify the /home directory belongs to a user account
                    try:
                        pwd.getpwnam(os.path.basename(file)).pw_uid
                    except KeyError:
                        # pwd.getpwnam raises KeyError when the arg does not match a user in the password database
                        continue
                    # Verify the entire key path is valid and the keys file is actually a file
                    if not os.path.isfile(full_path_auth_keys):
                        continue
                    ec2rlcore.prediag.backup(full_path_auth_keys, self.backed_files, self.backup_dir_path)
            process_output = subprocess.check_output(TestSSH.command,
                                                     stderr=subprocess.STDOUT,
                                                     env=self.env_dict,
                                                     universal_newlines=True)
            for message in output_messages:
                self.assertTrue(message in process_output)
            with open(test_user_auth_keys, "r") as fp:
                auth_keys_content = fp.read()
            self.assertTrue(re.match(r"^ssh-rsa .+ Added by EC2 Rescue for Linux$", auth_keys_content.strip()))
            try:
                client = boto3.client("ssm", ec2rlcore.awshelpers.get_instance_region())
                private_key = client.get_parameter(Name=ssm_parameter_name, WithDecryption=True)["Parameter"]["Value"]
                self.assertTrue(re.match(r"^-----BEGIN RSA PRIVATE KEY-----.+-----END RSA PRIVATE KEY-----\n$",
                                         private_key,
                                         re.DOTALL))
            except Exception:
                self.fail("Failed to verify SSM SecureString Parameter value.")
        finally:
            for key_file in self.backed_files:
                ec2rlcore.prediag.restore(key_file, self.backed_files)
            subprocess.check_output(shlex.split("userdel --force --remove {}".format(test_user_name)),
                                    stderr=subprocess.STDOUT)
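The PEM-body check in the test above uses `re.DOTALL` so that `.+` spans the multi-line key material. A standalone demonstration of that exact pattern, using a clearly fake key body rather than real key material:

```python
import re

# Same pattern the test asserts against the retrieved SecureString value
pem_pattern = r"^-----BEGIN RSA PRIVATE KEY-----.+-----END RSA PRIVATE KEY-----\n$"

# Dummy body: DOTALL lets ".+" match across the embedded newlines
fake_key = "-----BEGIN RSA PRIVATE KEY-----\nnot-a-real-key\n-----END RSA PRIVATE KEY-----\n"

print(bool(re.match(pem_pattern, fake_key, re.DOTALL)))  # True
# Without DOTALL, "." stops at the first newline and the match fails
print(bool(re.match(pem_pattern, fake_key)))  # False
```

Note the pattern also requires the trailing newline that `put_parameter` stored, which is why the test does not `strip()` the parameter value before matching.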

    def test_ssh_key_injection_only_no_key_available_notaninstance(self):
        """
        Test that key injection fails when no key is provided, the create keys arg is not passed,
        and the environment is not an instance.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["remediate"] = "True"
        self.env_dict["injectkeyonly"] = "True"
        self.env_dict["notaninstance"] = "True"
        output_messages = list()
        output_messages.append("Key injection enabled. Will skip other functionality.")
        output_messages.append("Aborting because no new key available.")
        output_messages.append("[FAILURE] Failed to obtain and inject new OpenSSH public key.")
        process_output = subprocess.check_output(TestSSH.command,
                                                 stderr=subprocess.STDOUT,
                                                 env=self.env_dict,
                                                 universal_newlines=True)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_key_injection_only_missing_remediate(self):
        """Test that key injection fails when remediation is not enabled via user arg."""
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["injectkeyonly"] = "True"
        output_messages = list()
        output_messages.append("Key injection not enabled.")
        output_messages.append("[FAILURE] --inject-key-only requires --remediate which was not provided.")
        process_output = subprocess.check_output(TestSSH.command,
                                                 stderr=subprocess.STDOUT,
                                                 env=self.env_dict,
                                                 universal_newlines=True)
        for message in output_messages:
            self.assertTrue(message in process_output)

    def test_ssh_key_injection_missing_remediate(self):
        """
        Test that key injection does not execute when remediation is not enabled via user args, but
        the rest of the module does execute.
        """
        moduletests.src.openssh.Problem.setup_config_vars()
        # Translated user args
        self.env_dict["injectkey"] = "True"
        output_messages = list()
        output_messages.append("Key injection not enabled.")
        output_messages.append(" --inject-key requires --remediate which was not provided.")
        output_messages.append("[SUCCESS] All configuration checks passed or all detected problems fixed.")
        process_output = subprocess.check_output(TestSSH.command,
                                                 stderr=subprocess.STDOUT,
                                                 env=self.env_dict,
                                                 universal_newlines=True)
        for message in output_messages:
            self.assertTrue(message in process_output)
# encoding: utf-8
# module Rhino.Render.Fields calls itself Fields
# from RhinoCommon, Version=5.1.30000.16, Culture=neutral, PublicKeyToken=552281e97c755530
# by generator 1.145
""" NamespaceTracker represent a CLS namespace. """
# no imports
# no functions
# classes
class Field(object):
    """
    Generic data fields used to add publicly accessible properties to
    RenderContent.FieldDictionary. These should be created by calling a
    FieldDictionary.Add() method on a RenderContent object. These are
    allocated after the RenderContent object's C++ object is created and
    added to the underlying C++ object's content dictionary. Whoever
    allocates a field is responsible for deleting it, so these objects clean
    up the C++ pointers when they are disposed of.
    """

    def CreateCppPointer(self, *args):  # cannot find CLR method
        """
        CreateCppPointer(self: Field, content: RenderContent, attachToPointer: IntPtr)

        Create the RDK C++ field object and set its initial value. Fields are
        added to a RenderContent.FieldDictionary in the RenderContent
        constructor before the RenderContent C++ pointer is created. The
        RenderContent C++ pointer is required when creating a field in order
        for the field to get added to the RenderContent C++ field list, so
        this method is called by RenderContent when it is safe to create the
        Field C++ pointers.

        content: RenderContent.FieldDictionary that owns this Field.
        attachToPointer: Existing C++ pointer to attach to.
        """
        pass
    def ValueAsBool(self, *args):  # cannot find CLR method
        """
        ValueAsBool(self: Field) -> bool
        Return field value as a bool.
        Returns: Returns field value as a bool.
        """
        pass

    def ValueAsByteArray(self, *args):  # cannot find CLR method
        """
        ValueAsByteArray(self: Field) -> Array[Byte]
        Return field as a byte array.
        Returns: Returns field as a byte array.
        """
        pass

    def ValueAsColor4f(self, *args):  # cannot find CLR method
        """
        ValueAsColor4f(self: Field) -> Color4f
        Return field as a Rhino.Display.Color4f color value.
        Returns: Returns field as a Rhino.Display.Color4f color value.
        """
        pass

    def ValueAsDateTime(self, *args):  # cannot find CLR method
        """
        ValueAsDateTime(self: Field) -> DateTime
        Return field as a DateTime value.
        Returns: Returns field as a DateTime value.
        """
        pass

    def ValueAsDouble(self, *args):  # cannot find CLR method
        """
        ValueAsDouble(self: Field) -> float
        Return field value as a double precision number.
        Returns: Returns the field value as a double precision number.
        """
        pass

    def ValueAsFloat(self, *args):  # cannot find CLR method
        """
        ValueAsFloat(self: Field) -> Single
        Return field value as a floating point number.
        Returns: Returns the field value as a floating point number.
        """
        pass

    def ValueAsGuid(self, *args):  # cannot find CLR method
        """
        ValueAsGuid(self: Field) -> Guid
        Return field value as a Guid.
        Returns: Returns the field value as a Guid.
        """
        pass

    def ValueAsInt(self, *args):  # cannot find CLR method
        """
        ValueAsInt(self: Field) -> int
        Return field value as an integer.
        Returns: Returns the field value as an integer.
        """
        pass

    def ValueAsObject(self):
        """ ValueAsObject(self: Field) -> object """
        pass
    def ValueAsPoint2d(self, *args):  # cannot find CLR method
        """
        ValueAsPoint2d(self: Field) -> Point2d
        Return field as a Rhino.Geometry.Point2d value.
        Returns: Returns field as a Rhino.Geometry.Point2d value.
        """
        pass

    def ValueAsPoint3d(self, *args):  # cannot find CLR method
        """
        ValueAsPoint3d(self: Field) -> Point3d
        Return field as a Rhino.Geometry.Point3d value.
        Returns: Returns field as a Rhino.Geometry.Point3d value.
        """
        pass

    def ValueAsPoint4d(self, *args):  # cannot find CLR method
        """
        ValueAsPoint4d(self: Field) -> Point4d
        Return field as a Rhino.Geometry.Point4d value.
        Returns: Returns field as a Rhino.Geometry.Point4d value.
        """
        pass

    def ValueAsString(self, *args):  # cannot find CLR method
        """
        ValueAsString(self: Field) -> str
        Get field value as a string.
        Returns: Returns the field value as a string if possible.
        """
        pass

    def ValueAsTransform(self, *args):  # cannot find CLR method
        """
        ValueAsTransform(self: Field) -> Transform
        Return field as a Rhino.Geometry.Transform value.
        Returns: Returns field as a Rhino.Geometry.Transform value.
        """
        pass

    def ValueAsVector2d(self, *args):  # cannot find CLR method
        """
        ValueAsVector2d(self: Field) -> Vector2d
        Return field as a Rhino.Geometry.Vector2d value.
        Returns: Returns field as a Rhino.Geometry.Vector2d value.
        """
        pass

    def ValueAsVector3d(self, *args):  # cannot find CLR method
        """
        ValueAsVector3d(self: Field) -> Vector3d
        Return field as a Rhino.Geometry.Vector3d value.
        Returns: Returns field as a Rhino.Geometry.Vector3d value.
        """
        pass
    @staticmethod  # known case of __new__
    def __new__(self, *args):  # cannot find CLR constructor
        """ __new__(cls: type, renderContent: RenderContent, attachToPointer: IntPtr, key: str, prompt: str, initialValue: object, isTextured: bool) """
        pass

    IsTextured = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Get: IsTextured(self: Field) -> bool
    """

    Key = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Field key value string set by constructor
    Get: Key(self: Field) -> str
    """

    Prompt = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Optional UI prompt string set by constructor
    Get: Prompt(self: Field) -> str
    """

    Tag = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets an object that contains data to associate with the field.
    Get: Tag(self: Field) -> object
    Set: Tag(self: Field) = value
    """
class BoolField(Field):
    """ bool field value class """

    def ValueAsObject(self):
        """ ValueAsObject(self: BoolField) -> object """
        pass

    Value = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets the field value
    Get: Value(self: BoolField) -> bool
    Set: Value(self: BoolField) = value
    """


class ByteArrayField(Field):
    """ ByteArray field value class """

    def ValueAsObject(self):
        """ ValueAsObject(self: ByteArrayField) -> object """
        pass

    Value = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets the field value
    Get: Value(self: ByteArrayField) -> Array[Byte]
    Set: Value(self: ByteArrayField) = value
    """


class Color4fField(Field):
    """ Color4f field value class """

    def ValueAsObject(self):
        """ ValueAsObject(self: Color4fField) -> object """
        pass

    SystemColorValue = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets the field value
    Get: SystemColorValue(self: Color4fField) -> Color
    Set: SystemColorValue(self: Color4fField) = value
    """

    Value = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets the field value
    Get: Value(self: Color4fField) -> Color4f
    Set: Value(self: Color4fField) = value
    """


class DateTimeField(Field):
    """ DateTime field value class """

    def ValueAsObject(self):
        """ ValueAsObject(self: DateTimeField) -> object """
        pass

    Value = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets the field value
    Get: Value(self: DateTimeField) -> DateTime
    Set: Value(self: DateTimeField) = value
    """


class DoubleField(Field):
    """ double field value class """

    def ValueAsObject(self):
        """ ValueAsObject(self: DoubleField) -> object """
        pass

    Value = property(
        lambda self: object(), lambda self, v: None, lambda self: None
    )  # default
    """Gets or sets the field value
    Get: Value(self: DoubleField) -> float
    Set: Value(self: DoubleField) = value
    """
class FieldDictionary(object):
    """
    Dictionary containing RenderContent data fields. Add fields to this
    dictionary in your derived RenderContent class's constructor. Get field
    values using the TryGet[data type]() methods and set them using the Set()
    method.
    """

    def Add(self, key, value, prompt=None):
        """
        Add(self: FieldDictionary, key: str, value: Point3d) -> Point3dField
            Add a new Point3dField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Point3d, prompt: str) -> Point3dField
            Add a new Point3dField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Point4d) -> Point4dField
            Add a new Point4dField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Point2d, prompt: str) -> Point2dField
            Add a new Point2dField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Vector3d) -> Vector3dField
            Add a new Vector3dField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Vector3d, prompt: str) -> Vector3dField
            Add a new Vector3dField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Point2d) -> Point2dField
            Add a new Point2dField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Point4d, prompt: str) -> Point4dField
            Add a new Point4dField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: DateTime) -> DateTimeField
            Add a new DateTimeField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: DateTime, prompt: str) -> DateTimeField
            Add a new DateTimeField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Array[Byte]) -> ByteArrayField
            Add a new ByteArrayField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Transform, prompt: str) -> TransformField
            Add a new TransformField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Guid) -> GuidField
            Add a new GuidField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Guid, prompt: str) -> GuidField
            Add a new GuidField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Transform) -> TransformField
            Add a new TransformField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Vector2d, prompt: str) -> Vector2dField
            Add a new Vector2dField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: int) -> IntField
            Add a new IntField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: int, prompt: str) -> IntField
            Add a new IntField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Single) -> FloatField
            Add a new FloatField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: bool, prompt: str) -> BoolField
            Add a new BoolField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: str) -> StringField
            Add a new StringField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: str, prompt: str) -> StringField
            Add a new StringField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: bool) -> BoolField
            Add a new BoolField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Single, prompt: str) -> FloatField
            Add a new FloatField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Color) -> Color4fField
            Add a new Color4fField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Color, prompt: str) -> Color4fField
            Add a new Color4fField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Vector2d) -> Vector2dField
            Add a new Vector2dField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: Color4f, prompt: str) -> Color4fField
            Add a new Color4fField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: float) -> DoubleField
            Add a new DoubleField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.

        Add(self: FieldDictionary, key: str, value: float, prompt: str) -> DoubleField
            Add a new DoubleField to the dictionary.
            key: Key name for the field value to change.
            value: Initial value for this field.
            prompt: Prompt to display in the user interface (Content Browsers). If this
                is null or an empty string then this field is a data-only field and will
                not appear in the user interface.

        Add(self: FieldDictionary, key: str, value: Color4f) -> Color4fField
            Add a new Color4fField to the dictionary. This will be a data-only
            field and will not show up in the content browsers.
            key: Key name for the field value to change.
            value: Initial value for this field.
        """
        pass
def AddTextured(self, key, value, prompt):
"""
AddTextured(self: FieldDictionary, key: str, value: Point3d, prompt: str) -> Point3dField
Add a new Point3dField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: Point2d, prompt: str) -> Point2dField
Add a new Point2dField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: Vector3d, prompt: str) -> Vector3dField
Add a new Vector3dField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: Point4d, prompt: str) -> Point4dField
Add a new Point4dField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: DateTime, prompt: str) -> DateTimeField
Add a new DateTimeField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: Transform, prompt: str) -> TransformField
Add a new TransformField to the dictionary. This overload will cause
the field to
be tagged as "textured" so that the texturing UI will
appear in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: Guid, prompt: str) -> GuidField
Add a new GuidField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: Vector2d, prompt: str) -> Vector2dField
Add a new Vector2dField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: int, prompt: str) -> IntField
Add a new IntField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: bool, prompt: str) -> BoolField
Add a new BoolField to the dictionary. This overload will cause the
field to be
tagged as "textured" so that the texturing UI will appear
in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers) if this
is null or an
empty string the this field is a data only field and will
not appear in the user
interface.
AddTextured(self: FieldDictionary, key: str, value: str, prompt: str) -> StringField
Add a new StringField to the dictionary. This overload will cause the field
to be tagged as "textured" so that the texturing UI will appear in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers). If this is
null or an empty string then this field is a data-only field and will not
appear in the user interface.
AddTextured(self: FieldDictionary, key: str, value: Single, prompt: str) -> FloatField
Add a new FloatField to the dictionary. This overload will cause the field
to be tagged as "textured" so that the texturing UI will appear in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers). If this is
null or an empty string then this field is a data-only field and will not
appear in the user interface.
AddTextured(self: FieldDictionary, key: str, value: Color, prompt: str) -> Color4fField
Add a new Color4fField to the dictionary. This overload will cause the field
to be tagged as "textured" so that the texturing UI will appear in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers). If this is
null or an empty string then this field is a data-only field and will not
appear in the user interface.
AddTextured(self: FieldDictionary, key: str, value: Color4f, prompt: str) -> Color4fField
Add a new Color4fField to the dictionary. This overload will cause the field
to be tagged as "textured" so that the texturing UI will appear in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers). If this is
null or an empty string then this field is a data-only field and will not
appear in the user interface.
AddTextured(self: FieldDictionary, key: str, value: float, prompt: str) -> DoubleField
Add a new DoubleField to the dictionary. This overload will cause the field
to be tagged as "textured" so that the texturing UI will appear in automatic UIs.
key: Key name for the field value to change.
value: Initial value for this field.
prompt: Prompt to display in the user interface (Content Browsers). If this is
null or an empty string then this field is a data-only field and will not
appear in the user interface.
"""
pass
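The AddTextured overloads above are resolved by the CLR type of `value`. A minimal pure-Python sketch of that dispatch pattern, using hypothetical field classes rather than the real RhinoCommon types:

```python
# Sketch of type-based overload dispatch, as the AddTextured overloads above
# do via the CLR. The field classes and helper here are hypothetical.
class IntField:
    def __init__(self, key, value):
        self.key, self.value = key, value

class FloatField:
    def __init__(self, key, value):
        self.key, self.value = key, value

class StringField:
    def __init__(self, key, value):
        self.key, self.value = key, value

_FIELD_TYPES = {int: IntField, float: FloatField, str: StringField}

def add_textured(fields, key, value, prompt=None):
    """Create a field for value's type, tag it "textured", and store it."""
    try:
        field_cls = _FIELD_TYPES[type(value)]
    except KeyError:
        raise TypeError("no AddTextured overload for %r" % type(value))
    field = field_cls(key, value)
    field.textured = True         # tells automatic UIs to show the texturing UI
    field.data_only = not prompt  # null/empty prompt => data-only field
    fields[key] = field
    return field

fields = {}
f = add_textured(fields, "gamma", 2.2, "Gamma")
print(type(f).__name__, f.value, f.data_only)  # FloatField 2.2 False
```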
def ContainsField(self, fieldName):
"""
ContainsField(self: FieldDictionary, fieldName: str) -> bool
Call this method to determine if this FieldDictionary contains a field with
the specified field name.
fieldName: Field name to search for.
Returns: True if a field matching fieldName is found; otherwise false.
"""
pass
def GetField(self, fieldName):
"""
GetField(self: FieldDictionary, fieldName: str) -> Field
Call this method to get the field with the matching name.
fieldName: Field name to search for.
Returns: The field if it exists in the dictionary; otherwise null.
"""
pass
def Set(self, key, value, changeContext=None):
"""
Set(self: FieldDictionary, key: str, value: bool)
Set(self: FieldDictionary, key: str, value: bool, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: int)
Set(self: FieldDictionary, key: str, value: int, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: float)
Set(self: FieldDictionary, key: str, value: float, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Single)
Set(self: FieldDictionary, key: str, value: Single, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: str)
Set(self: FieldDictionary, key: str, value: str, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Point2d)
Set(self: FieldDictionary, key: str, value: Point2d, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Point3d)
Set(self: FieldDictionary, key: str, value: Point3d, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Point4d)
Set(self: FieldDictionary, key: str, value: Point4d, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Vector2d)
Set(self: FieldDictionary, key: str, value: Vector2d, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Vector3d)
Set(self: FieldDictionary, key: str, value: Vector3d, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Color)
Set(self: FieldDictionary, key: str, value: Color, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Color4f)
Set(self: FieldDictionary, key: str, value: Color4f, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Transform)
Set(self: FieldDictionary, key: str, value: Transform, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Guid)
Set(self: FieldDictionary, key: str, value: Guid, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: DateTime)
Set(self: FieldDictionary, key: str, value: DateTime, changeContext: ChangeContexts)
Set(self: FieldDictionary, key: str, value: Array[Byte])
Set(self: FieldDictionary, key: str, value: Array[Byte], changeContext: ChangeContexts)
All overloads set the field value and send the appropriate change notification
to the render SDK. Throws an InvalidOperationException if the key name is not
valid.
key: Key name for the field value to change.
value: New value for this field.
"""
pass
def SetTag(self, key, tag):
"""
SetTag(self: FieldDictionary, key: str, tag: object) -> bool
Sets an object that contains data to associate with the field.
key: Key name for the field to tag.
tag: Data to associate with the field.
Returns: True if the field is found and the tag was set; otherwise false.
"""
pass
def TryGetTag(self, key, tag):
"""
TryGetTag(self: FieldDictionary, key: str) -> (bool, object)
Gets the object that contains data associated with a field.
key: Key name of the field to get.
Returns: True if the field is found and its tag was retrieved; otherwise
false.
"""
pass
def TryGetValue(self, key, value):
"""
TryGetValue(self: FieldDictionary, key: str) -> (bool, bool)
TryGetValue(self: FieldDictionary, key: str) -> (bool, int)
TryGetValue(self: FieldDictionary, key: str) -> (bool, float)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Single)
TryGetValue(self: FieldDictionary, key: str) -> (bool, str)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Point2d)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Point3d)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Point4d)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Vector2d)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Vector3d)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Color)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Color4f)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Transform)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Guid)
TryGetValue(self: FieldDictionary, key: str) -> (bool, DateTime)
TryGetValue(self: FieldDictionary, key: str) -> (bool, Array[Byte])
All overloads find a field with the specified key and get its value if found.
key: Key name of the field to get a value for.
Returns: True if the key is found, with the value element set to the field
value; false if the field was not found.
"""
pass
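IronPython surfaces the .NET out-parameter of TryGetValue as a `(bool, value)` tuple, as the signatures above show. A small illustrative sketch of that calling convention using a plain dict (not the RhinoCommon class):

```python
# Sketch of the (success, value) tuple convention that IronPython uses to
# surface .NET out-parameters such as TryGetValue's. Illustrative only.
def try_get_value(fields, key):
    """Return (True, value) if key exists, otherwise (False, None)."""
    if key in fields:
        return True, fields[key]
    return False, None

fields = {"gamma": 2.2}
found, value = try_get_value(fields, "gamma")
print(found, value)  # True 2.2
found, value = try_get_value(fields, "missing")
print(found, value)  # False None
```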
def __add__(self, *args): # cannot find CLR method
""" x.__add__(y) <==> x+y """
pass
class FloatField(Field):
""" Float field value class """
def ValueAsObject(self):
""" ValueAsObject(self: FloatField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: FloatField) -> Single
Set: Value(self: FloatField) = value
"""
class GuidField(Field):
""" Guid field value class """
def ValueAsObject(self):
""" ValueAsObject(self: GuidField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: GuidField) -> Guid
Set: Value(self: GuidField) = value
"""
class IntField(Field):
""" Integer field value class """
def ValueAsObject(self):
""" ValueAsObject(self: IntField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: IntField) -> int
Set: Value(self: IntField) = value
"""
class Point2dField(Field):
""" Point2d field value class """
def ValueAsObject(self):
""" ValueAsObject(self: Point2dField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: Point2dField) -> Point2d
Set: Value(self: Point2dField) = value
"""
class Point3dField(Field):
""" Point3d field value class """
def ValueAsObject(self):
""" ValueAsObject(self: Point3dField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: Point3dField) -> Point3d
Set: Value(self: Point3dField) = value
"""
class Point4dField(Field):
""" Point4d field value class """
def ValueAsObject(self):
""" ValueAsObject(self: Point4dField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: Point4dField) -> Point4d
Set: Value(self: Point4dField) = value
"""
class StringField(Field):
""" String field value class """
def ValueAsObject(self):
""" ValueAsObject(self: StringField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: StringField) -> str
Set: Value(self: StringField) = value
"""
class TransformField(Field):
""" Transform field value class """
def ValueAsObject(self):
""" ValueAsObject(self: TransformField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: TransformField) -> Transform
Set: Value(self: TransformField) = value
"""
class Vector2dField(Field):
""" Vector2d field value class """
def ValueAsObject(self):
""" ValueAsObject(self: Vector2dField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: Vector2dField) -> Vector2d
Set: Value(self: Vector2dField) = value
"""
class Vector3dField(Field):
""" Vector3d field value class """
def ValueAsObject(self):
""" ValueAsObject(self: Vector3dField) -> object """
pass
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the field value
Get: Value(self: Vector3dField) -> Vector3d
Set: Value(self: Vector3dField) = value
"""
| 23.971003 | 668 | 0.539886 | 6,726 | 59,520 | 4.756765 | 0.039697 | 0.043508 | 0.046728 | 0.075014 | 0.846534 | 0.836063 | 0.81484 | 0.797962 | 0.741952 | 0.708883 | 0 | 0.00491 | 0.404603 | 59,520 | 2,482 | 669 | 23.980661 | 0.897906 | 0.691347 | 0 | 0.574074 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.259259 | false | 0.259259 | 0 | 0 | 0.487654 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
600040f523bf6668ec90232faf72626cb0950c74 | 11,040 | py | Python | tests/unit/states/mac_keychain_test.py | pass-by-value/salt | 2ede44fe54516242e10fe428629d5f5a18e5f7ea | [
"Apache-2.0",
"MIT"
] | null | null | null | tests/unit/states/mac_keychain_test.py | pass-by-value/salt | 2ede44fe54516242e10fe428629d5f5a18e5f7ea | [
"Apache-2.0",
"MIT"
] | 1 | 2019-09-06T13:57:28.000Z | 2019-09-06T13:57:28.000Z | tests/unit/states/mac_keychain_test.py | pass-by-value/salt | 2ede44fe54516242e10fe428629d5f5a18e5f7ea | [
"Apache-2.0",
"MIT"
] | 1 | 2020-09-30T16:09:48.000Z | 2020-09-30T16:09:48.000Z | # -*- coding: utf-8 -*-
# Import Python libs
from __future__ import absolute_import
# Import Salt Libs
from salt.states import mac_keychain as keychain
# Import Salt Testing Libs
from salttesting import TestCase
from salttesting.helpers import ensure_in_syspath
from salttesting.mock import (
MagicMock,
patch,
call
)
ensure_in_syspath('../../')
keychain.__salt__ = {}
class KeychainTestCase(TestCase):
def test_install_cert(self):
'''
Test installing a certificate into the macOS keychain
'''
expected = {
'changes': {'installed': 'Friendly Name'},
'comment': '',
'name': '/path/to/cert.p12',
'result': True
}
list_mock = MagicMock(return_value=['Cert1'])
friendly_mock = MagicMock(return_value='Friendly Name')
install_mock = MagicMock(return_value='1 identity imported.')
with patch.dict(keychain.__salt__, {'keychain.list_certs': list_mock,
'keychain.get_friendly_name': friendly_mock,
'keychain.install': install_mock}):
out = keychain.installed('/path/to/cert.p12', 'passw0rd')
list_mock.assert_called_once_with('/Library/Keychains/System.keychain')
friendly_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd')
install_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd', '/Library/Keychains/System.keychain')
self.assertEqual(out, expected)
def test_installed_cert(self):
'''
Test installing a certificate into the macOS keychain when it's
already installed
'''
expected = {
'changes': {},
'comment': 'Friendly Name already installed.',
'name': '/path/to/cert.p12',
'result': True
}
list_mock = MagicMock(return_value=['Friendly Name'])
friendly_mock = MagicMock(return_value='Friendly Name')
install_mock = MagicMock(return_value='1 identity imported.')
hash_mock = MagicMock(return_value='ABCD')
with patch.dict(keychain.__salt__, {'keychain.list_certs': list_mock,
'keychain.get_friendly_name': friendly_mock,
'keychain.install': install_mock,
'keychain.get_hash': hash_mock}):
out = keychain.installed('/path/to/cert.p12', 'passw0rd')
list_mock.assert_called_once_with('/Library/Keychains/System.keychain')
friendly_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd')
assert not install_mock.called
self.assertEqual(out, expected)
def test_uninstall_cert(self):
'''
Test uninstalling a certificate from the macOS keychain when it is
installed
'''
expected = {
'changes': {'uninstalled': 'Friendly Name'},
'comment': '',
'name': '/path/to/cert.p12',
'result': True
}
list_mock = MagicMock(return_value=['Friendly Name'])
friendly_mock = MagicMock(return_value='Friendly Name')
uninstall_mock = MagicMock(return_value='1 identity imported.')
with patch.dict(keychain.__salt__, {'keychain.list_certs': list_mock,
'keychain.get_friendly_name': friendly_mock,
'keychain.uninstall': uninstall_mock}):
out = keychain.uninstalled('/path/to/cert.p12', 'passw0rd')
list_mock.assert_called_once_with('/Library/Keychains/System.keychain')
friendly_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd')
uninstall_mock.assert_called_once_with('Friendly Name', '/Library/Keychains/System.keychain', None)
self.assertEqual(out, expected)
def test_uninstalled_cert(self):
'''
Test uninstalling a certificate from the macOS keychain when it is
not installed
'''
expected = {
'changes': {},
'comment': 'Friendly Name already uninstalled.',
'name': '/path/to/cert.p12',
'result': True
}
list_mock = MagicMock(return_value=['Cert2'])
friendly_mock = MagicMock(return_value='Friendly Name')
uninstall_mock = MagicMock(return_value='1 identity imported.')
with patch.dict(keychain.__salt__, {'keychain.list_certs': list_mock,
'keychain.get_friendly_name': friendly_mock,
'keychain.uninstall': uninstall_mock}):
out = keychain.uninstalled('/path/to/cert.p12', 'passw0rd')
list_mock.assert_called_once_with('/Library/Keychains/System.keychain')
friendly_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd')
assert not uninstall_mock.called
self.assertEqual(out, expected)
@patch('os.path.exists')
def test_default_keychain(self, exists_mock):
'''
Test setting the default keychain
'''
expected = {
'changes': {'default': '/path/to/chain.keychain'},
'comment': '',
'name': '/path/to/chain.keychain',
'result': True
}
exists_mock.return_value = True
get_default_mock = MagicMock(return_value='/path/to/other.keychain')
set_mock = MagicMock(return_value='')
with patch.dict(keychain.__salt__, {'keychain.get_default_keychain': get_default_mock,
'keychain.set_default_keychain': set_mock}):
out = keychain.default_keychain('/path/to/chain.keychain', 'system', 'frank')
get_default_mock.assert_called_once_with('frank', 'system')
set_mock.assert_called_once_with('/path/to/chain.keychain', 'system', 'frank')
self.assertEqual(out, expected)
@patch('os.path.exists')
def test_default_keychain_set_already(self, exists_mock):
'''
Test setting the default keychain when it's already set
'''
expected = {
'changes': {},
'comment': '/path/to/chain.keychain was already the default keychain.',
'name': '/path/to/chain.keychain',
'result': True
}
exists_mock.return_value = True
get_default_mock = MagicMock(return_value='/path/to/chain.keychain')
set_mock = MagicMock(return_value='')
with patch.dict(keychain.__salt__, {'keychain.get_default_keychain': get_default_mock,
'keychain.set_default_keychain': set_mock}):
out = keychain.default_keychain('/path/to/chain.keychain', 'system', 'frank')
get_default_mock.assert_called_once_with('frank', 'system')
assert not set_mock.called
self.assertEqual(out, expected)
@patch('os.path.exists')
def test_default_keychain_missing(self, exists_mock):
'''
Test setting the default keychain when the keychain is missing
'''
expected = {
'changes': {},
'comment': 'Keychain not found at /path/to/cert.p12',
'name': '/path/to/cert.p12',
'result': False
}
exists_mock.return_value = False
out = keychain.default_keychain('/path/to/cert.p12', 'system', 'frank')
self.assertEqual(out, expected)
def test_install_cert_salt_fileserver(self):
'''
Test installing a certificate into the macOS keychain from the salt
fileserver
'''
expected = {
'changes': {'installed': 'Friendly Name'},
'comment': '',
'name': 'salt://path/to/cert.p12',
'result': True
}
list_mock = MagicMock(return_value=['Cert1'])
friendly_mock = MagicMock(return_value='Friendly Name')
install_mock = MagicMock(return_value='1 identity imported.')
cp_cache_mock = MagicMock(return_value='/tmp/path/to/cert.p12')
with patch.dict(keychain.__salt__, {'keychain.list_certs': list_mock,
'keychain.get_friendly_name': friendly_mock,
'keychain.install': install_mock,
'cp.cache_file': cp_cache_mock}):
out = keychain.installed('salt://path/to/cert.p12', 'passw0rd')
list_mock.assert_called_once_with('/Library/Keychains/System.keychain')
friendly_mock.assert_called_once_with('/tmp/path/to/cert.p12', 'passw0rd')
install_mock.assert_called_once_with('/tmp/path/to/cert.p12', 'passw0rd', '/Library/Keychains/System.keychain')
self.assertEqual(out, expected)
def test_installed_cert_hash_different(self):
'''
Test installing a certificate into the macOS keychain when it's
already installed but the certificate has changed
'''
expected = {
'changes': {'installed': 'Friendly Name', 'uninstalled': 'Friendly Name'},
'comment': 'Found a certificate with the same name but different hash, removing it.\n',
'name': '/path/to/cert.p12',
'result': True
}
list_mock = MagicMock(side_effect=[['Friendly Name'], []])
friendly_mock = MagicMock(return_value='Friendly Name')
install_mock = MagicMock(return_value='1 identity imported.')
uninstall_mock = MagicMock(return_value='removed.')
hash_mock = MagicMock(side_effect=['ABCD', 'XYZ'])
with patch.dict(keychain.__salt__, {'keychain.list_certs': list_mock,
'keychain.get_friendly_name': friendly_mock,
'keychain.install': install_mock,
'keychain.uninstall': uninstall_mock,
'keychain.get_hash': hash_mock}):
out = keychain.installed('/path/to/cert.p12', 'passw0rd')
list_mock.assert_has_calls(calls=[call('/Library/Keychains/System.keychain'),
call('/Library/Keychains/System.keychain')])
friendly_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd')
install_mock.assert_called_once_with('/path/to/cert.p12', 'passw0rd', '/Library/Keychains/System.keychain')
uninstall_mock.assert_called_once_with('Friendly Name', '/Library/Keychains/System.keychain',
keychain_password=None)
self.assertEqual(out, expected)
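    # The hash-mismatch test above relies on two unittest.mock features:
    # patch.dict, which temporarily injects entries into a dictionary such as
    # keychain.__salt__, and side_effect, which makes a mock return a
    # different value on each call. A standalone illustration with a toy
    # dictionary (names here are made up, unrelated to Salt):

```python
# Standalone illustration of patch.dict (temporary dict entries) and
# side_effect (per-call return values). Toy names, unrelated to Salt.
from unittest.mock import MagicMock, patch

registry = {"existing": "kept"}
hash_mock = MagicMock(side_effect=["ABCD", "XYZ"])  # first call, then second

with patch.dict(registry, {"get_hash": hash_mock}):
    assert registry["get_hash"]() == "ABCD"  # first call
    assert registry["get_hash"]() == "XYZ"   # second call

# Outside the with-block the patched key is gone; originals are untouched.
print("get_hash" in registry, registry["existing"])  # False kept
```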
if __name__ == '__main__':
from integration import run_tests
run_tests(KeychainTestCase, needs_daemon=False)
# === sdk/python/pulumi_gcp/networkmanagement/outputs.py (repo: dimpu47/pulumi-gcp, licenses: ECL-2.0, Apache-2.0) ===

# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Dict, List, Mapping, Optional, Tuple, Union
from .. import _utilities, _tables
__all__ = [
'ConnectivityTestDestination',
'ConnectivityTestSource',
]
@pulumi.output_type
class ConnectivityTestDestination(dict):
def __init__(__self__, *,
instance: Optional[str] = None,
ip_address: Optional[str] = None,
network: Optional[str] = None,
port: Optional[float] = None,
project_id: Optional[str] = None):
"""
:param str instance: A Compute Engine instance URI.
:param str ip_address: The IP address of the endpoint, which can be an external or
internal IP. An IPv6 address is only allowed when the test's
destination is a global load balancer VIP.
:param str network: A Compute Engine network URI.
:param float port: The IP protocol port of the endpoint. Only applicable when
protocol is TCP or UDP.
:param str project_id: Project ID where the endpoint is located. The Project ID can be
derived from the URI if you provide a VM instance or network URI.
The following are two cases where you must provide the project ID:
1. Only the IP address is specified, and the IP address is within
a GCP project. 2. When you are using Shared VPC and the IP address
that you provide is from the service project. In this case, the
network that the IP address resides in is defined in the host
project.
"""
if instance is not None:
pulumi.set(__self__, "instance", instance)
if ip_address is not None:
pulumi.set(__self__, "ip_address", ip_address)
if network is not None:
pulumi.set(__self__, "network", network)
if port is not None:
pulumi.set(__self__, "port", port)
if project_id is not None:
pulumi.set(__self__, "project_id", project_id)
@property
@pulumi.getter
def instance(self) -> Optional[str]:
"""
A Compute Engine instance URI.
"""
return pulumi.get(self, "instance")
@property
@pulumi.getter(name="ipAddress")
def ip_address(self) -> Optional[str]:
"""
The IP address of the endpoint, which can be an external or
internal IP. An IPv6 address is only allowed when the test's
destination is a global load balancer VIP.
"""
return pulumi.get(self, "ip_address")
@property
@pulumi.getter
def network(self) -> Optional[str]:
"""
A Compute Engine network URI.
"""
return pulumi.get(self, "network")
@property
@pulumi.getter
def port(self) -> Optional[float]:
"""
The IP protocol port of the endpoint. Only applicable when
protocol is TCP or UDP.
"""
return pulumi.get(self, "port")
@property
@pulumi.getter(name="projectId")
def project_id(self) -> Optional[str]:
"""
Project ID where the endpoint is located. The Project ID can be
derived from the URI if you provide a VM instance or network URI.
The following are two cases where you must provide the project ID:
1. Only the IP address is specified, and the IP address is within
a GCP project. 2. When you are using Shared VPC and the IP address
that you provide is from the service project. In this case, the
network that the IP address resides in is defined in the host
project.
"""
return pulumi.get(self, "project_id")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
@pulumi.output_type
class ConnectivityTestSource(dict):
def __init__(__self__, *,
instance: Optional[str] = None,
ip_address: Optional[str] = None,
network: Optional[str] = None,
network_type: Optional[str] = None,
port: Optional[float] = None,
project_id: Optional[str] = None):
"""
:param str instance: A Compute Engine instance URI.
:param str ip_address: The IP address of the endpoint, which can be an external or
internal IP. An IPv6 address is only allowed when the test's
destination is a global load balancer VIP.
:param str network: A Compute Engine network URI.
:param str network_type: Type of the network where the endpoint is located.
Possible values are `GCP_NETWORK` and `NON_GCP_NETWORK`.
:param float port: The IP protocol port of the endpoint. Only applicable when
protocol is TCP or UDP.
:param str project_id: Project ID where the endpoint is located. The Project ID can be
derived from the URI if you provide a VM instance or network URI.
The following are two cases where you must provide the project ID:
1. Only the IP address is specified, and the IP address is within
a GCP project. 2. When you are using Shared VPC and the IP address
that you provide is from the service project. In this case, the
network that the IP address resides in is defined in the host
project.
"""
if instance is not None:
pulumi.set(__self__, "instance", instance)
if ip_address is not None:
pulumi.set(__self__, "ip_address", ip_address)
if network is not None:
pulumi.set(__self__, "network", network)
if network_type is not None:
pulumi.set(__self__, "network_type", network_type)
if port is not None:
pulumi.set(__self__, "port", port)
if project_id is not None:
pulumi.set(__self__, "project_id", project_id)
@property
@pulumi.getter
def instance(self) -> Optional[str]:
"""
A Compute Engine instance URI.
"""
return pulumi.get(self, "instance")
@property
@pulumi.getter(name="ipAddress")
def ip_address(self) -> Optional[str]:
"""
The IP address of the endpoint, which can be an external or
internal IP. An IPv6 address is only allowed when the test's
destination is a global load balancer VIP.
"""
return pulumi.get(self, "ip_address")
@property
@pulumi.getter
def network(self) -> Optional[str]:
"""
A Compute Engine network URI.
"""
return pulumi.get(self, "network")
@property
@pulumi.getter(name="networkType")
def network_type(self) -> Optional[str]:
"""
Type of the network where the endpoint is located.
Possible values are `GCP_NETWORK` and `NON_GCP_NETWORK`.
"""
return pulumi.get(self, "network_type")
@property
@pulumi.getter
def port(self) -> Optional[float]:
"""
The IP protocol port of the endpoint. Only applicable when
protocol is TCP or UDP.
"""
return pulumi.get(self, "port")
@property
@pulumi.getter(name="projectId")
def project_id(self) -> Optional[str]:
"""
Project ID where the endpoint is located. The Project ID can be
derived from the URI if you provide a VM instance or network URI.
The following are two cases where you must provide the project ID:
1. Only the IP address is specified, and the IP address is within
a GCP project. 2. When you are using Shared VPC and the IP address
that you provide is from the service project. In this case, the
network that the IP address resides in is defined in the host
project.
"""
return pulumi.get(self, "project_id")
def _translate_property(self, prop):
return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
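The `_translate_property` hooks above resolve Pulumi's camelCase wire names (e.g. `ipAddress`, `projectId`) back to the snake_case Python attribute names via a generated lookup table. A minimal standalone sketch of the same translation, assuming no access to the generated `_tables` module:

```python
import re

def camel_to_snake(name: str) -> str:
    """Convert a camelCase property name to snake_case."""
    # Insert an underscore before every uppercase letter (except a leading one),
    # then lowercase the result.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

print(camel_to_snake("ipAddress"))    # ip_address
print(camel_to_snake("projectId"))    # project_id
print(camel_to_snake("networkType"))  # network_type
```

The generated `CAMEL_TO_SNAKE_CASE_TABLE` amounts to a precomputed version of this mapping for every known property name, which is why the hook can fall back to returning `prop` unchanged when no entry exists.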
# material definition table: silky
#
silky = [
[ # 0
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.0347538, 0.0335554, 0.693878, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 1
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.425195, 0.0335554, 0.693878, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 2
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.964441, 0.641609, 0.979592, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 3
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.979592, 0.641609, 0.798946, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 4
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.979592, 0.641609, 0.641609, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 5
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.979592, 0.805939, 0.641609, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 6
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.946959, 0.979592, 0.641609, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 7
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.641609, 0.979592, 0.657925, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 8
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.641609, 0.976208, 0.979592, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 9
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.641609, 0.748831, 0.979592, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 10
[[0.0329288, 0.0335907, 0.0445259, 1.0]], # ambient
[[0.135374, 0.138095, 0.183051, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.362217, 0.369498, 0.489785, 1.0]], # specular
[0.0702703], # shininess
[1.0], # opacity
],
[ # 11
[[0.0697433, 0.0215254, 0.070936, 1.0]], # ambient
[[0.280113, 0.0864535, 0.284904, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.588517, 0.248038, 0.596939, 1.0]], # specular
[0.0204082], # shininess
[1.0], # opacity
],
[ # 12
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.693878, 0.0335554, 0.313623, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 13
[[0.070936, 0.0215254, 0.0470826, 1.0]], # ambient
[[0.284904, 0.0864535, 0.1891, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.765306, 0.33779, 0.572187, 1.0]], # specular
[0.0408163], # shininess
[1.0], # opacity
],
[ # 14
[[0.070936, 0.0, 0.0, 1.0]], # ambient
[[0.284904, 0.0, 0.0, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.831633, 0.403851, 0.367065, 1.0]], # specular
[0.0408163], # shininess
[1.0], # opacity
],
[ # 15
[[0.0714286, 0.0301012, 0.0, 1.0]], # ambient
[[0.286882, 0.120897, 0.0, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.831633, 0.482277, 0.317353, 1.0]], # specular
[0.0408163], # shininess
[1.0], # opacity
],
[ # 16
[[0.0714286, 0.0682211, 0.0, 1.0]], # ambient
[[0.286882, 0.274, 0.0, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.631735, 0.600752, 0.241071, 1.0]], # specular
[0.0408163], # shininess
[1.0], # opacity
],
[ # 17
[[0.0, 0.0714286, 0.0150246, 1.0]], # ambient
[[0.0, 0.286882, 0.0603441, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.428943, 0.632653, 0.241422, 1.0]], # specular
[0.0255102], # shininess
[1.0], # opacity
],
[ # 18
[[0.0, 0.0714286, 0.0652005, 1.0]], # ambient
[[0.0, 0.286882, 0.261868, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.243549, 0.556863, 0.607143, 1.0]], # specular
[0.0663265], # shininess
[1.0], # opacity
],
[ # 19
[[0.00176873, 0.0335907, 0.0445259, 1.0]], # ambient
[[0.00727146, 0.138095, 0.183051, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.362217, 0.369498, 0.489785, 1.0]], # specular
[0.0702703], # shininess
[1.0], # opacity
],
[ # 20
[[0.0519579, 0.0631746, 0.081228, 1.0]], # ambient
[[0.211415, 0.257055, 0.330514, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.183673, 0.165545, 0.163419, 1.0]], # specular
[0.0255102], # shininess
[1.0], # opacity
],
[ # 21
[[0.0826531, 0.0609662, 0.0681122, 1.0]], # ambient
[[0.336312, 0.24807, 0.277146, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.221121, 0.199297, 0.196736, 1.0]], # specular
[0.0357143], # shininess
[1.0], # opacity
],
[ # 22
[[0.081228, 0.0634641, 0.0519579, 1.0]], # ambient
[[0.330514, 0.258233, 0.211415, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.183673, 0.165545, 0.163419, 1.0]], # specular
[0.0255102], # shininess
[1.0], # opacity
],
[ # 23
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.693878, 0.0335554, 0.0335554, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 24
[[0.0826531, 0.0609662, 0.04943, 1.0]], # ambient
[[0.336312, 0.24807, 0.201129, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.108805, 0.0980667, 0.0968068, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 25
[[0.0656656, 0.0518188, 0.0228008, 1.0]], # ambient
[[0.267191, 0.210849, 0.0927759, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.183673, 0.165545, 0.163419, 1.0]], # specular
[0.0255102], # shininess
[1.0], # opacity
],
[ # 26
[[0.144808, 0.142894, 0.120918, 1.0]], # ambient
[[0.589217, 0.581431, 0.492013, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.0, 0.0, 0.0, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 27
[[0.0699071, 0.0689834, 0.0583744, 1.0]], # ambient
[[0.280771, 0.277061, 0.234452, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.22814, 0.205624, 0.202982, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 28
[[0.071021, 0.0811225, 0.0424735, 1.0]], # ambient
[[0.288982, 0.330084, 0.172823, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.108805, 0.0980667, 0.0968068, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
[ # 29
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.693878, 0.29313, 0.0335554, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 30
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.535714, 0.500555, 0.0259067, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 31
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.182327, 0.540816, 0.0261535, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 32
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.0261535, 0.540816, 0.439658, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 33
[[0.171143, 0.170799, 0.171534, 1.0]], # ambient
[[0.684572, 0.683198, 0.686136, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.0261535, 0.242667, 0.540816, 1.0]], # specular
[0.0714286], # shininess
[1.0], # opacity
],
[ # 34
[[0.00746438, 0.00673081, 0.00690282, 1.0]], # ambient
[[0.0291577, 0.0262922, 0.0269642, 1.0]], # diffuse
[[0.0, 0.0, 0.0, 1.0]], # emission
[[0.670745, 0.641609, 0.979592, 1.0]], # specular
[0.0612245], # shininess
[1.0], # opacity
],
]
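Each entry of the table above bundles six OpenGL-style material components in a fixed order: ambient, diffuse, emission, and specular as single RGBA tuples (each wrapped in an extra list), followed by one-element shininess and opacity lists. A small helper (the name `unpack_material` is a stand-in, not part of DejaVu) that reads one entry:

```python
def unpack_material(entry):
    """Unpack one DejaVu material-table entry into a flat dict.

    Assumes the layout used above:
    [ambient, diffuse, emission, specular, shininess, opacity],
    where the first four hold a single RGBA list and the last two
    hold a single scalar.
    """
    ambient, diffuse, emission, specular, shininess, opacity = entry
    return {
        "ambient": ambient[0],
        "diffuse": diffuse[0],
        "emission": emission[0],
        "specular": specular[0],
        "shininess": shininess[0],
        "opacity": opacity[0],
    }

# Example using entry 0 of the "silky" table:
entry0 = [
    [[0.171143, 0.170799, 0.171534, 1.0]],  # ambient
    [[0.684572, 0.683198, 0.686136, 1.0]],  # diffuse
    [[0.0, 0.0, 0.0, 1.0]],                 # emission
    [[0.0347538, 0.0335554, 0.693878, 1.0]],  # specular
    [0.0714286],                            # shininess
    [1.0],                                  # opacity
]
mat = unpack_material(entry0)
print(mat["shininess"])  # 0.0714286
```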
from .asc_text_file_handler import *
from .azure_input import *
from .choose_file_handler import ChooseFileHandler
from .get_data import *
from .las_handler import *
from .pdf_file_handler import *
from .tif_handler import *
from .rman_ui_base import _RManPanelHeader
from .rman_ui_base import CollectionPanel
from .rman_ui_base import PRManButtonsPanel
from ..rman_utils.draw_utils import _draw_ui_from_rman_config
from ..rman_utils.draw_utils import _draw_props
from ..rman_constants import NODE_LAYOUT_SPLIT
from ..rman_render import RmanRender
from ..icons.icons import load_icons
from ..rman_utils import object_utils
from bpy.types import Panel
import bpy
class OBJECT_PT_renderman_object_render(CollectionPanel, Panel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Shading and Visibility"
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
rd = context.scene.render
if context.object.type == 'CAMERA':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
ob = context.object
rm = bpy.data.objects[ob.name].renderman
ll = rm.light_linking
index = rm.light_linking_index
col = layout.column()
col.prop(item, "group")
col.prop(item, "mode")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
col = layout.column()
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_render', context, layout, rm)
class OBJECT_PT_renderman_object_raytracing(CollectionPanel, Panel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Ray Tracing"
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
rd = context.scene.render
if context.object.type == 'CAMERA':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "group")
col.prop(item, "mode")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
col = layout.column()
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_raytracing', context, layout, rm)
class OBJECT_PT_renderman_object_geometry(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "RenderMan Geometry"
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
rd = context.scene.render
if context.object.type in ['LIGHT']:
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw_props(self, layout, context):
ob = context.object
rm = ob.renderman
anim = rm.archive_anim_settings
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry', context, layout, rm)
rman_archive = load_icons().get("archive_RIB")
col = layout.column()
col.enabled = not rman_interactive_running
col.operator("export.export_rib_archive",
text="Export Object as RIB Archive.", icon_value=rman_archive.icon_id)
col = layout.column()
def draw_camera_props(self, layout, context):
ob = context.object
rm = ob.renderman
col = layout.column()
col.prop(rm, "motion_segments_override")
col.active = rm.motion_segments_override
col.prop(rm, "motion_segments")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
if context.object.type == 'CAMERA':
self.draw_camera_props(layout, context)
else:
self.draw_props(layout, context)
class OBJECT_PT_renderman_object_geometry_quadric(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Quadric"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'QUADRIC':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_quadric', context, layout, rm)
class OBJECT_PT_renderman_object_geometry_runprogram(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Run Program"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'PROCEDURAL_RUN_PROGRAM':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_runprogram', context, layout, rm)
class OBJECT_PT_renderman_object_geometry_dynamic_load_dso(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Dynamic Load DSO"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'DYNAMIC_LOAD_DSO':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_dynamic_load_dso', context, layout, rm)
class OBJECT_PT_renderman_object_geometry_rib_archive(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "RIB Archive"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'DELAYED_LOAD_ARCHIVE':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
anim = rm.archive_anim_settings
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_rib_archive', context, layout, rm)
col.prop(anim, "animated_sequence")
if anim.animated_sequence:
col = layout.column(align = True)
col.prop(anim, "blender_start")
col.prop(anim, "sequence_in")
col.prop(anim, "sequence_out")
class OBJECT_PT_renderman_object_geometry_openvdb(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "OpenVDB"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'OPENVDB':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_openvdb', context, layout, rm)
self._draw_collection(context, layout, rm, "",
"collection.add_remove", "object.renderman",
"openvdb_channels", "openvdb_channel_index")
class OBJECT_PT_renderman_object_geometry_points(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Points"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'POINTS':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_points', context, layout, rm)
class OBJECT_PT_renderman_object_geometry_volume(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Volume"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
@classmethod
def poll(cls, context):
rd = context.scene.render
rm = context.object.renderman
if context.object.type in ['LIGHT']:
return False
if rm.primitive != 'RI_VOLUME':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_volume', context, layout, rm)
class OBJECT_PT_renderman_object_geometry_attributes(Panel, CollectionPanel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Attributes"
bl_parent_id = "OBJECT_PT_renderman_object_geometry"
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
rd = context.scene.render
if context.object.type in ['LIGHT']:
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw_item(self, layout, context, item):
col = layout.column()
col.prop(item, "name")
col.prop(item, "type")
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
layout = self.layout
ob = context.object
rm = ob.renderman
anim = rm.archive_anim_settings
active = context.active_object
rman_type = object_utils._detect_primitive_(active)
rman_render = RmanRender.get_rman_render()
rman_interactive_running = rman_render.rman_interactive_running
col = layout.column()
col.enabled = not rman_interactive_running
col = layout.column(align = True)
_draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_geometry_attributes', context, layout, rm)
class OBJECT_PT_renderman_object_custom_primvars(CollectionPanel, Panel):
bl_space_type = 'PROPERTIES'
bl_region_type = 'WINDOW'
bl_context = "object"
bl_label = "Custom Primvars"
bl_options = {'DEFAULT_CLOSED'}
@classmethod
def poll(cls, context):
rd = context.scene.render
if context.object.type == 'CAMERA':
return False
return (context.object and rd.engine in {'PRMAN_RENDER'})
def draw(self, context):
self.layout.use_property_split = True
self.layout.use_property_decorate = False
        layout = self.layout
        ob = context.object
        rm = ob.renderman
        col = layout.column()
        row = col.row()
        _draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_custom_primvars', context, layout, rm)


class OBJECT_PT_renderman_object_custom_attributes(CollectionPanel, Panel):
    bl_space_type = 'PROPERTIES'
    bl_region_type = 'WINDOW'
    bl_context = "object"
    bl_label = "Custom Attributes"
    bl_options = {'DEFAULT_CLOSED'}

    @classmethod
    def poll(cls, context):
        rd = context.scene.render
        if context.object.type == 'CAMERA':
            return False
        return (context.object and rd.engine in {'PRMAN_RENDER'})

    def draw(self, context):
        self.layout.use_property_split = True
        self.layout.use_property_decorate = False
        layout = self.layout
        ob = context.object
        rm = ob.renderman
        col = layout.column()
        _draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_custom_attributes', context, layout, rm)


class OBJECT_PT_renderman_object_matteid(Panel, _RManPanelHeader):
    bl_space_type = 'PROPERTIES'
    bl_region_type = 'WINDOW'
    bl_context = "object"
    bl_label = "Matte ID"
    bl_options = {'DEFAULT_CLOSED'}

    @classmethod
    def poll(cls, context):
        rd = context.scene.render
        if context.object.type == 'CAMERA':
            return False
        return (context.object and rd.engine in {'PRMAN_RENDER'})

    def draw(self, context):
        self.layout.use_property_split = True
        self.layout.use_property_decorate = False
        layout = self.layout
        ob = context.object
        rm = ob.renderman
        col = layout.column()
        _draw_ui_from_rman_config('rman_properties_object', 'OBJECT_PT_renderman_object_matteid', context, layout, rm)


classes = [
    OBJECT_PT_renderman_object_geometry,
    OBJECT_PT_renderman_object_geometry_quadric,
    OBJECT_PT_renderman_object_geometry_runprogram,
    OBJECT_PT_renderman_object_geometry_dynamic_load_dso,
    OBJECT_PT_renderman_object_geometry_rib_archive,
    OBJECT_PT_renderman_object_geometry_openvdb,
    OBJECT_PT_renderman_object_geometry_points,
    OBJECT_PT_renderman_object_geometry_volume,
    OBJECT_PT_renderman_object_geometry_attributes,
    OBJECT_PT_renderman_object_render,
    OBJECT_PT_renderman_object_raytracing,
    OBJECT_PT_renderman_object_custom_primvars,
    OBJECT_PT_renderman_object_custom_attributes,
    OBJECT_PT_renderman_object_matteid
]


def register():
    for cls in classes:
        bpy.utils.register_class(cls)


def unregister():
    for cls in classes:
        bpy.utils.unregister_class(cls)


# --- File: src/grafana_api/alerting_notifications.py (ZPascal/grafana_api_sdk, Apache-2.0) ---
import json
import logging

from .model import APIModel, APIEndpoints, RequestsMethods
from .api import Api


class AlertingNotifications:
    """The class includes all necessary methods to access the Grafana alerting notifications API endpoints

    Args:
        grafana_api_model (APIModel): Inject a Grafana API model object that includes all necessary values and information

    Attributes:
        grafana_api_model (APIModel): This is where we store the grafana_api_model
    """

    def __init__(self, grafana_api_model: APIModel):
        self.grafana_api_model = grafana_api_model

    def get_all_notification_channels(self) -> list:
        """The method includes a functionality to get all alerting notification channels

        Raises:
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (list): Returns all notification channels
        """

        api_call: list = (
            Api(self.grafana_api_model)
            .call_the_api(
                APIEndpoints.ALERT_NOTIFICATIONS.value,
                RequestsMethods.GET,
            )
            .json()
        )

        if api_call == list() or api_call[0].get("id") is None:
            logging.error(f"Check the error: {api_call}.")
            raise Exception
        else:
            return api_call
    def get_all_notification_channels_lookup(self) -> list:
        """The method includes a functionality to lookup and get reduced information of all alerting notification channels

        Raises:
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (list): Returns all notification channels as reduced information
        """

        api_call: list = (
            Api(self.grafana_api_model)
            .call_the_api(
                f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/lookup",
                RequestsMethods.GET,
            )
            .json()
        )

        if api_call == list() or api_call[0].get("id") is None:
            logging.error(f"Check the error: {api_call}.")
            raise Exception
        else:
            return api_call

    def get_notification_channel_by_uid(self, uid: str) -> dict:
        """The method includes a functionality to get an alerting notification channel specified by the uid

        Args:
            uid (str): Specify the uid of the notification channel

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (dict): Returns the specified notification channel
        """

        if len(uid) != 0:
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/uid/{uid}",
                    RequestsMethods.GET,
                )
                .json()
            )

            if api_call == dict() or api_call.get("id") is None:
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                return api_call
        else:
            logging.error("There is no uid defined.")
            raise ValueError
    def get_notification_channel_by_id(self, id: int) -> dict:
        """The method includes a functionality to get an alerting notification channel specified by the id

        Args:
            id (int): Specify the id of the notification channel

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (dict): Returns the specified notification channel
        """

        if id != 0:
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/{id}",
                    RequestsMethods.GET,
                )
                .json()
            )

            if api_call == dict() or api_call.get("id") is None:
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                return api_call
        else:
            logging.error("There is no id defined.")
            raise ValueError

    def create_notification_channel(self, notification_channel: dict) -> dict:
        """The method includes a functionality to create an alerting notification channel specified by the notification channel dict

        Args:
            notification_channel (dict): Specify the channel of the notification

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (dict): Returns the newly created notification channel
        """

        if notification_channel != dict():
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    APIEndpoints.ALERT_NOTIFICATIONS.value,
                    RequestsMethods.POST,
                    json.dumps(notification_channel),
                )
                .json()
            )

            if api_call == dict() or api_call.get("id") is None:
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                return api_call
        else:
            logging.error("There is no notification_channel defined.")
            raise ValueError
    def update_notification_channel_by_uid(
        self, uid: str, notification_channel: dict
    ) -> dict:
        """The method includes a functionality to update an alerting notification channel specified by the notification channel dict and the uid

        Args:
            uid (str): Specify the uid of the notification channel
            notification_channel (dict): Specify the channel of the notification

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (dict): Returns the updated notification channel
        """

        if len(uid) != 0 and notification_channel != dict():
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/uid/{uid}",
                    RequestsMethods.PUT,
                    json.dumps(notification_channel),
                )
                .json()
            )

            if api_call == dict() or api_call.get("id") is None:
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                return api_call
        else:
            logging.error("There is no uid or notification_channel defined.")
            raise ValueError

    def update_notification_channel_by_id(
        self, id: int, notification_channel: dict
    ) -> dict:
        """The method includes a functionality to update an alerting notification channel specified by the notification channel dict and the id

        Args:
            id (int): Specify the id of the notification channel
            notification_channel (dict): Specify the channel of the notification

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            api_call (dict): Returns the updated notification channel
        """

        if id != 0 and notification_channel != dict():
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/{id}",
                    RequestsMethods.PUT,
                    json.dumps(notification_channel),
                )
                .json()
            )

            if api_call == dict() or api_call.get("id") is None:
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                return api_call
        else:
            logging.error("There is no id or notification_channel defined.")
            raise ValueError
    def delete_notification_channel_by_uid(self, uid: str):
        """The method includes a functionality to delete an alerting notification channel specified by the uid

        Args:
            uid (str): Specify the uid of the notification channel

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            None
        """

        if len(uid) != 0:
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/uid/{uid}",
                    RequestsMethods.DELETE,
                )
                .json()
            )

            if api_call.get("message") != "Notification deleted":
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                logging.info("You successfully destroyed the notification channel.")
        else:
            logging.error("There is no uid defined.")
            raise ValueError

    def delete_notification_channel_by_id(self, id: int):
        """The method includes a functionality to delete an alerting notification channel specified by the id

        Args:
            id (int): Specify the id of the notification channel

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            None
        """

        if id != 0:
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/{id}",
                    RequestsMethods.DELETE,
                )
                .json()
            )

            if api_call.get("message") != "Notification deleted":
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                logging.info("You successfully destroyed the notification channel.")
        else:
            logging.error("There is no id defined.")
            raise ValueError

    def test_notification_channel(self, notification_channel: dict):
        """The method includes a functionality to test an alerting notification channel specified by the notification_channel

        Args:
            notification_channel (dict): Specify the channel of the notification

        Raises:
            ValueError: Missed specifying a necessary value
            Exception: Unspecified error by executing the API call

        Returns:
            None
        """

        if notification_channel != dict():
            api_call: dict = (
                Api(self.grafana_api_model)
                .call_the_api(
                    f"{APIEndpoints.ALERT_NOTIFICATIONS.value}/test",
                    RequestsMethods.POST,
                    json.dumps(notification_channel),
                )
                .json()
            )

            if api_call.get("message") != "Test notification sent":
                logging.error(f"Check the error: {api_call}.")
                raise Exception
            else:
                logging.info("You successfully tested the notification channel.")
        else:
            logging.error("There is no notification_channel defined.")
            raise ValueError
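# All of the SDK methods above share the same guard-and-raise shape: validate the
# input first, call the API, then treat an empty payload or a payload without an
# "id" as a failure. A minimal self-contained sketch of that pattern follows; the
# `fake_api` default is a stand-in for the real HTTP call and is not part of the SDK.

```python
import logging


def get_channel_by_uid(uid: str, fake_api=lambda uid: {"id": 1, "uid": uid}) -> dict:
    # Guard the input first, mirroring the ValueError branch used in the SDK.
    if len(uid) == 0:
        logging.error("There is no uid defined.")
        raise ValueError

    api_call: dict = fake_api(uid)

    # Treat an empty dict or a payload without an "id" as an API failure.
    if api_call == dict() or api_call.get("id") is None:
        logging.error(f"Check the error: {api_call}.")
        raise Exception
    return api_call


print(get_channel_by_uid("abc123"))  # {'id': 1, 'uid': 'abc123'}
```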
# --- File: webservice/models/__init__.py (leonardodalinky/device-rental-platform-backend, MIT) ---
from .create_apply import *
from .device import *
from .device_apply import *
from .perm_apply import *
from .user import *
| 20.666667 | 27 | 0.758065 | 18 | 124 | 5.055556 | 0.388889 | 0.43956 | 0.494505 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16129 | 124 | 5 | 28 | 24.8 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# --- File: icedata/datasets/fridge/__init__.py (icedata, Apache-2.0) ---
from icedata.datasets.fridge.data import *
from icedata.datasets.fridge.parser import *
from icedata.datasets.fridge.dataset import *
from icedata.datasets.fridge import trained_models
| 37 | 50 | 0.837838 | 25 | 185 | 6.16 | 0.4 | 0.285714 | 0.493506 | 0.649351 | 0.603896 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086486 | 185 | 4 | 51 | 46.25 | 0.911243 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
71a3bf235eb9a25b9003a8994a4a07e14791a2cc | 3,578 | py | Python | tests/test_perceptron_sigmoid.py | Guilyx/neural-network-logicgates | bdf4b69ec7d0a424637556d3ff0f85072dd17555 | [
"MIT"
] | null | null | null | tests/test_perceptron_sigmoid.py | Guilyx/neural-network-logicgates | bdf4b69ec7d0a424637556d3ff0f85072dd17555 | [
"MIT"
] | 2 | 2020-01-22T01:33:40.000Z | 2020-01-28T00:00:52.000Z | tests/test_perceptron_sigmoid.py | Guilyx/neural-network-logicgates | bdf4b69ec7d0a424637556d3ff0f85072dd17555 | [
"MIT"
] | null | null | null | import numpy as np
from random import randint
from random import uniform
import sys
from os import path

sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))

from lib.network import NeuralNetwork
from lib.gate_predictions import LogicGate


def test_outputs_or():
    gate = 'or'
    learning_rate = 0.1
    error = .25
    epochs = 100000
    mode = 'perceptron'

    expected_inputs = np.array([[0, 0],
                                [0, 1],
                                [1, 0],
                                [1, 1]])
    expected_outputs = np.array([[0, 1, 1, 1]]).T

    gate_ = LogicGate(gate, 'sigmoid')
    gate_.train(epochs, learning_rate, mode)

    i = 0
    for elem in expected_inputs:
        pr_out = gate_.madame_irma.predict(elem)
        print(elem, ' ---> ', pr_out[0])
        assert(pr_out[0] > expected_outputs[i][0] - error and pr_out[0] < expected_outputs[i][0] + error)
        i += 1


def test_outputs_nor():
    gate = 'nor'
    learning_rate = 0.1
    error = .25
    epochs = 100000
    mode = 'perceptron'

    expected_inputs = np.array([[0, 0],
                                [0, 1],
                                [1, 0],
                                [1, 1]])
    expected_outputs = np.array([[1, 0, 0, 0]]).T

    gate_ = LogicGate(gate, 'sigmoid')
    gate_.train(epochs, learning_rate, mode)

    i = 0
    for elem in expected_inputs:
        pr_out = gate_.madame_irma.predict(elem)
        print(elem, ' ---> ', pr_out[0])
        assert(pr_out[0] > expected_outputs[i][0] - error and pr_out[0] < expected_outputs[i][0] + error)
        i += 1


def test_outputs_xor():
    gate = 'xor'
    learning_rate = 0.1
    error = .25
    epochs = 100000
    mode = 'perceptron'

    expected_inputs = np.array([[0, 0],
                                [0, 1],
                                [1, 0],
                                [1, 1]])
    expected_outputs = np.array([[0, 1, 1, 0]]).T

    gate_ = LogicGate(gate, 'sigmoid')
    gate_.train(epochs, learning_rate, mode)

    i = 0
    for elem in expected_inputs:
        pr_out = gate_.madame_irma.predict(elem)
        print(elem, ' ---> ', pr_out[0])
        assert(pr_out[0] > expected_outputs[i][0] - error and pr_out[0] < expected_outputs[i][0] + error)
        i += 1


def test_outputs_and():
    gate = 'and'
    learning_rate = 0.1
    error = .25
    epochs = 100000
    mode = 'perceptron'

    expected_inputs = np.array([[0, 0],
                                [0, 1],
                                [1, 0],
                                [1, 1]])
    expected_outputs = np.array([[0, 0, 0, 1]]).T

    gate_ = LogicGate(gate, 'sigmoid')
    gate_.train(epochs, learning_rate, mode)

    i = 0
    for elem in expected_inputs:
        pr_out = gate_.madame_irma.predict(elem)
        print(elem, ' ---> ', pr_out[0])
        assert(pr_out[0] > expected_outputs[i][0] - error and pr_out[0] < expected_outputs[i][0] + error)
        i += 1


def test_outputs_nand():
    gate = 'nand'
    learning_rate = 0.1
    error = .25
    epochs = 100000
    mode = 'perceptron'

    expected_inputs = np.array([[0, 0],
                                [0, 1],
                                [1, 0],
                                [1, 1]])
    expected_outputs = np.array([[1, 1, 1, 0]]).T

    gate_ = LogicGate(gate, 'sigmoid')
    gate_.train(epochs, learning_rate, mode)

    i = 0
    for elem in expected_inputs:
        pr_out = gate_.madame_irma.predict(elem)
        print(elem, ' ---> ', pr_out[0])
        assert(pr_out[0] > expected_outputs[i][0] - error and pr_out[0] < expected_outputs[i][0] + error)
        i += 1
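# The tests above exercise a `LogicGate` wrapper whose internals are not shown
# here. As a rough, self-contained illustration of what they assume (a single
# sigmoid neuron trained by gradient descent can fit a linearly separable gate
# such as OR to within the 0.25 tolerance), here is a standalone sketch; all
# names below are illustrative and not taken from the library.

```python
import math
import random


def train_or_perceptron(epochs=20000, lr=0.1, seed=0):
    """Train a single sigmoid neuron on the OR truth table."""
    random.seed(seed)
    w = [random.uniform(-1, 1) for _ in range(2)]
    b = random.uniform(-1, 1)
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
    for _ in range(epochs):
        for x, target in data:
            z = w[0] * x[0] + w[1] * x[1] + b
            out = 1.0 / (1.0 + math.exp(-z))
            # Gradient of the squared error through the sigmoid.
            delta = (out - target) * out * (1.0 - out)
            w[0] -= lr * delta * x[0]
            w[1] -= lr * delta * x[1]
            b -= lr * delta
    return lambda x: 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))


predict = train_or_perceptron()
for x, target in [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]:
    # Same 0.25 tolerance as the tests above.
    assert abs(predict(x) - target) < 0.25
```

Note that XOR, which the third test attempts, is not linearly separable, so a single-neuron perceptron of this form cannot fit it; a hidden layer is required.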
# --- File: pynos/versions/ver_7/ver_7_1_0/yang/brocade_http_redirect.py (pynos, Apache-2.0) ---
#!/usr/bin/env python
import xml.etree.ElementTree as ET


class brocade_http_redirect(object):
    """Auto generated class.
    """

    def __init__(self, **kwargs):
        self._callback = kwargs.pop('callback')

    def set_http_application_url_input_config_http_app_url_url(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        input = ET.SubElement(set_http_application_url, "input")
        config_http_app_url = ET.SubElement(input, "config-http-app-url")
        url = ET.SubElement(config_http_app_url, "url")
        url.text = kwargs.pop('url')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_input_config_http_app_url_op_type(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        input = ET.SubElement(set_http_application_url, "input")
        config_http_app_url = ET.SubElement(input, "config-http-app-url")
        op_type = ET.SubElement(config_http_app_url, "op-type")
        op_type.text = kwargs.pop('op_type')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_output_status_code(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        output = ET.SubElement(set_http_application_url, "output")
        status_code = ET.SubElement(output, "status-code")
        status_code.text = kwargs.pop('status_code')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_output_status_string(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        output = ET.SubElement(set_http_application_url, "output")
        status_string = ET.SubElement(output, "status-string")
        status_string.text = kwargs.pop('status_string')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_input_config_http_app_url_url(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        input = ET.SubElement(set_http_application_url, "input")
        config_http_app_url = ET.SubElement(input, "config-http-app-url")
        url = ET.SubElement(config_http_app_url, "url")
        url.text = kwargs.pop('url')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_input_config_http_app_url_op_type(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        input = ET.SubElement(set_http_application_url, "input")
        config_http_app_url = ET.SubElement(input, "config-http-app-url")
        op_type = ET.SubElement(config_http_app_url, "op-type")
        op_type.text = kwargs.pop('op_type')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_output_status_code(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        output = ET.SubElement(set_http_application_url, "output")
        status_code = ET.SubElement(output, "status-code")
        status_code.text = kwargs.pop('status_code')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def set_http_application_url_output_status_string(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        set_http_application_url = ET.Element("set_http_application_url")
        config = set_http_application_url
        output = ET.SubElement(set_http_application_url, "output")
        status_string = ET.SubElement(output, "status-string")
        status_string.text = kwargs.pop('status_string')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)
# --- File: eqb/scripts/hexstring_to_bytearray.py (cryptotechguru/tesseract, MIT) ---
# this script converts a hex string to an array of bytes in hex format.
input = "000000df2478e3e79196f5876fa1c02f2d48c6cec2a89560562ff452714bcbda9196884f032101a50d7c35e8960aa602936c22652daf940ce9eb02fd34aa7b8134035be215bf1b200260e196a84bfcdc06adc0e17fad86f90fb6bad93543e6f8604d25a774a5b30db40021024530480287bbdf8dfe1956312f02ae53c430eb56d7e7292ae614a9170000000002625a008743073c11901c981166115d83477765ad769564ed14a91700000000042c108802fffffffe51bc1fcde5f1d7a84999a70f166acbaa3a9e72c51400161700000000570880a3e3fad00e4b8fae6d4e2261cbd38771c8f68c2a63e4fe8d3f167e037c01010000000002"
for (op, code) in zip(input[0::2], input[1::2]):
    print("0x{}, ".format(op + code), end="")
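# For reference, the same pairwise conversion can be written with the
# standard-library bytes.fromhex; shown here on a short illustrative prefix of
# the payload rather than the full string.

```python
# A short prefix of the payload, used only for illustration.
payload = "000000df2478"
for byte in bytes.fromhex(payload):
    print("0x{:02x}, ".format(byte), end="")
# 0x00, 0x00, 0x00, 0xdf, 0x24, 0x78,
```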
# --- File: sahara/tests/unit/service/validation/edp/test_data_source.py (redhat-openstack/sahara, Apache-2.0) ---
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import mock
import testtools
import sahara.exceptions as ex
from sahara.service import api
from sahara.service.validations.edp import data_source as ds
from sahara.swift import utils as su
from sahara.tests.unit.service.validation import utils as u
SAMPLE_SWIFT_URL = "swift://1234/object"
SAMPLE_SWIFT_URL_WITH_SUFFIX = "swift://1234%s/object" % su.SWIFT_URL_SUFFIX
class TestDataSourceValidation(u.ValidationTestCase):
    def setUp(self):
        super(TestDataSourceValidation, self).setUp()
        self._create_object_fun = ds.check_data_source_create
        self.scheme = ds.DATA_SOURCE_SCHEMA
        api.plugin_base.setup_plugins()

    def test_swift_creation(self):
        data = {
            "name": "test_data_data_source",
            "url": SAMPLE_SWIFT_URL,
            "type": "swift",
            "credentials": {
                "user": "user",
                "password": "password"
            },
            "description": "long description"
        }
        self._assert_types(data)

    @mock.patch("sahara.service.validations."
                "edp.base.check_data_source_unique_name")
    def test_swift_creation_missing_credentials(self,
                                                check_data_source_unique_name):
        check_data_source_unique_name.return_value = True
        data = {
            "name": "test_data_data_source",
            "url": SAMPLE_SWIFT_URL,
            "type": "swift",
            "description": "long description"
        }
        with testtools.ExpectedException(ex.InvalidCredentials):
            ds.check_data_source_create(data)
        # proxy enabled should allow creation without credentials
        self.override_config('use_domain_for_proxy_users', True)
        ds.check_data_source_create(data)

    @mock.patch("sahara.service.validations."
                "edp.base.check_data_source_unique_name")
    def test_swift_creation_credentials_missing_user(
            self,
            check_data_source_unique_name):
        check_data_source_unique_name.return_value = True
        data = {
            "name": "test_data_data_source",
            "url": SAMPLE_SWIFT_URL,
            "type": "swift",
            "credentials": {
                "password": "password"
            },
            "description": "long description"
        }
        with testtools.ExpectedException(ex.InvalidCredentials):
            ds.check_data_source_create(data)
        # proxy enabled should allow creation without credentials
        self.override_config('use_domain_for_proxy_users', True)
        ds.check_data_source_create(data)

    @mock.patch("sahara.service.validations."
                "edp.base.check_data_source_unique_name")
    def test_swift_creation_credentials_missing_password(
            self,
            check_data_source_unique_name):
        check_data_source_unique_name.return_value = True
        data = {
            "name": "test_data_data_source",
            "url": SAMPLE_SWIFT_URL,
            "type": "swift",
            "credentials": {
                "user": "user",
            },
            "description": "long description"
        }
        with testtools.ExpectedException(ex.InvalidCredentials):
            ds.check_data_source_create(data)
        # proxy enabled should allow creation without credentials
        self.override_config('use_domain_for_proxy_users', True)
        ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_swift_creation_wrong_schema(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "swif://1234/object",
"type": "swift",
"description": "incorrect url schema"
}
with testtools.ExpectedException(ex.InvalidDataException):
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_swift_creation_explicit_suffix(self,
check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": SAMPLE_SWIFT_URL_WITH_SUFFIX,
"type": "swift",
"description": "incorrect url schema",
"credentials": {
"user": "user",
"password": "password"
}
}
self._assert_types(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_swift_creation_wrong_suffix(self,
check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "swift://1234.suffix/object",
"type": "swift",
"description": "incorrect url schema"
}
with testtools.ExpectedException(ex.InvalidDataException):
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_swift_creation_missing_object(self,
check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "swift://1234/",
"type": "swift",
"description": "incorrect url schema"
}
with testtools.ExpectedException(ex.InvalidDataException):
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_hdfs_creation_wrong_schema(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "hdf://test_cluster/",
"type": "hdfs",
"description": "incorrect url schema"
}
with testtools.ExpectedException(ex.InvalidDataException):
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_hdfs_creation_correct_url(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "hdfs://test_cluster/",
"type": "hdfs",
"description": "correct url schema"
}
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_hdfs_creation_local_rel_url(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "mydata/input",
"type": "hdfs",
"description": "correct url schema for relative path on local hdfs"
}
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_hdfs_creation_local_abs_url(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "/tmp/output",
"type": "hdfs",
"description": "correct url schema for absolute path on local hdfs"
}
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_maprfs_creation_wrong_schema(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "maprf://test_cluster/",
"type": "maprfs",
"description": "incorrect url schema"
}
with testtools.ExpectedException(ex.InvalidDataException):
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_maprfs_creation_correct_url(self, check_data_source_unique_name):
check_data_source_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "maprfs:///test_cluster/",
"type": "maprfs",
"description": "correct url schema"
}
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_maprfs_creation_local_rel_url(self, check_ds_unique_name):
check_ds_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "mydata/input",
"type": "maprfs",
"description": ("correct url schema for"
" relative path on local maprfs")
}
ds.check_data_source_create(data)
@mock.patch("sahara.service.validations."
"edp.base.check_data_source_unique_name")
def test_maprfs_creation_local_abs_url(self, check_ds_unique_name):
check_ds_unique_name.return_value = True
data = {
"name": "test_data_data_source",
"url": "/tmp/output",
"type": "maprfs",
"description": ("correct url schema for"
" absolute path on local maprfs")
}
ds.check_data_source_create(data)
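The cases above all exercise URL-scheme validation through `ds.check_data_source_create`: a typed scheme must match the data source type, while bare relative or absolute paths pass for local HDFS/MapR-FS. A standalone sketch of that rule (an illustration only, not Sahara's actual implementation; the function name and the accepted-scheme logic are assumptions):

```python
from urllib.parse import urlparse

def url_matches_type(url, ds_type):
    # Hypothetical mirror of the cases tested above: a data source URL is
    # acceptable if it carries the type's own scheme ("hdfs" / "maprfs"),
    # or if it is a bare relative/absolute path on the local filesystem.
    scheme = urlparse(url).scheme
    if scheme == "":
        return bool(url)          # e.g. "mydata/input" or "/tmp/output"
    return scheme == ds_type      # "maprf://..." is rejected for "maprfs"

assert url_matches_type("mydata/input", "hdfs")           # relative local path
assert url_matches_type("/tmp/output", "maprfs")          # absolute local path
assert url_matches_type("maprfs:///test_cluster/", "maprfs")
assert not url_matches_type("maprf://test_cluster/", "maprfs")  # typo'd scheme
```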
# test/acquisition/test_monte_carlo.py (SamuelMarks/botorch, MIT license)
#!/usr/bin/env python3
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
import warnings
from unittest import mock
import torch
from botorch import settings
from botorch.acquisition.monte_carlo import (
MCAcquisitionFunction,
qExpectedImprovement,
qNoisyExpectedImprovement,
qProbabilityOfImprovement,
qSimpleRegret,
qUpperConfidenceBound,
)
from botorch.acquisition.objective import ScalarizedObjective
from botorch.exceptions import BotorchWarning, UnsupportedError
from botorch.sampling.samplers import IIDNormalSampler, SobolQMCNormalSampler
from botorch.utils.testing import BotorchTestCase, MockModel, MockPosterior
class DummyMCAcquisitionFunction(MCAcquisitionFunction):
def forward(self, X):
pass
class TestMCAcquisitionFunction(BotorchTestCase):
def test_abstract_raises(self):
with self.assertRaises(TypeError):
MCAcquisitionFunction()
# raise if model is multi-output, but no objective is given
no = "botorch.utils.testing.MockModel.num_outputs"
with mock.patch(no, new_callable=mock.PropertyMock) as mock_num_outputs:
mock_num_outputs.return_value = 2
mm = MockModel(MockPosterior())
with self.assertRaises(UnsupportedError):
DummyMCAcquisitionFunction(model=mm)
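`num_outputs` is a read-only property on `MockModel`, which is why the test patches it with `new_callable=mock.PropertyMock` instead of assigning to the attribute. The same stdlib idiom in isolation (the `Model` class below is a stand-in for illustration, not BoTorch's):

```python
from unittest import mock

class Model:
    @property
    def num_outputs(self):
        return 1

with mock.patch.object(Model, "num_outputs",
                       new_callable=mock.PropertyMock) as mock_num_outputs:
    mock_num_outputs.return_value = 2
    assert Model().num_outputs == 2  # the property reports the mocked value
assert Model().num_outputs == 1      # original property restored on exit
```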
class TestQExpectedImprovement(BotorchTestCase):
def test_q_expected_improvement(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 1 x 1 x 1
samples = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
mm = MockModel(MockPosterior(samples=samples))
# X is `q x d` = 1 x 1. X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, device=self.device, dtype=dtype)
# basic test
sampler = IIDNormalSampler(num_samples=2)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
# test shifting best_f value
acqf = qExpectedImprovement(model=mm, best_f=-1, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 1.0)
# TODO: Test batched best_f, batched model, batched evaluation
# basic test, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
res = acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# basic test for X_pending and warning
acqf.set_X_pending()
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(None)
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(X)
self.assertEqual(acqf.X_pending, X)
res = acqf(X)
X2 = torch.zeros(
1, 1, 1, device=self.device, dtype=dtype, requires_grad=True
)
with warnings.catch_warnings(record=True) as ws, settings.debug(True):
acqf.set_X_pending(X2)
self.assertEqual(acqf.X_pending, X2)
self.assertEqual(len(ws), 1)
self.assertTrue(issubclass(ws[-1].category, BotorchWarning))
# test bad objective type
obj = ScalarizedObjective(
weights=torch.rand(2, device=self.device, dtype=dtype)
)
with self.assertRaises(UnsupportedError):
qExpectedImprovement(model=mm, best_f=0, sampler=sampler, objective=obj)
def test_q_expected_improvement_batch(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 2 x 2 x 1
samples = torch.zeros(2, 2, 1, device=self.device, dtype=dtype)
samples[0, 0, 0] = 1.0
mm = MockModel(MockPosterior(samples=samples))
# X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
# test batch mode
sampler = IIDNormalSampler(num_samples=2)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# test shifting best_f value
acqf = qExpectedImprovement(model=mm, best_f=-1, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 2.0)
self.assertEqual(res[1].item(), 1.0)
# test batch mode, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qExpectedImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# TODO: Test different objectives (incl. constraints)
class TestQNoisyExpectedImprovement(BotorchTestCase):
def test_q_noisy_expected_improvement(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 1 x 2 x 1
samples_noisy = torch.tensor([1.0, 0.0], device=self.device, dtype=dtype)
samples_noisy = samples_noisy.view(1, 2, 1)
# X_baseline is `q' x d` = 1 x 1
X_baseline = torch.zeros(1, 1, device=self.device, dtype=dtype)
mm_noisy = MockModel(MockPosterior(samples=samples_noisy))
# X is `q x d` = 1 x 1
X = torch.zeros(1, 1, device=self.device, dtype=dtype)
# basic test
sampler = IIDNormalSampler(num_samples=2)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X)
self.assertEqual(res.item(), 1.0)
# basic test, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X)
self.assertEqual(res.item(), 1.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X)
self.assertEqual(res.item(), 1.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True, seed=12345)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X)
self.assertEqual(res.item(), 1.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# basic test for X_pending and warning
sampler = SobolQMCNormalSampler(num_samples=2)
samples_noisy_pending = torch.tensor(
[1.0, 0.0, 0.0], device=self.device, dtype=dtype
)
samples_noisy_pending = samples_noisy_pending.view(1, 3, 1)
mm_noisy_pending = MockModel(MockPosterior(samples=samples_noisy_pending))
acqf = qNoisyExpectedImprovement(
model=mm_noisy_pending, X_baseline=X_baseline, sampler=sampler
)
acqf.set_X_pending()
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(None)
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(X)
self.assertEqual(acqf.X_pending, X)
res = acqf(X)
X2 = torch.zeros(
1, 1, 1, device=self.device, dtype=dtype, requires_grad=True
)
with warnings.catch_warnings(record=True) as ws, settings.debug(True):
acqf.set_X_pending(X2)
self.assertEqual(acqf.X_pending, X2)
self.assertEqual(len(ws), 1)
self.assertTrue(issubclass(ws[-1].category, BotorchWarning))
def test_q_noisy_expected_improvement_batch(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 2 x 3 x 1
samples_noisy = torch.zeros(2, 3, 1, device=self.device, dtype=dtype)
samples_noisy[0, 0, 0] = 1.0
mm_noisy = MockModel(MockPosterior(samples=samples_noisy))
# X is `q x d` = 1 x 1
X = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
X_baseline = torch.zeros(1, 1, device=self.device, dtype=dtype)
# test batch mode
sampler = IIDNormalSampler(num_samples=2)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# test batch mode, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 3, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 3, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 3, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test X_pending w/ batch mode, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True, seed=12345)
acqf = qNoisyExpectedImprovement(
model=mm_noisy, X_baseline=X_baseline, sampler=sampler
)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 3, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 3, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
def test_prune_baseline(self):
no = "botorch.utils.testing.MockModel.num_outputs"
prune = "botorch.acquisition.monte_carlo.prune_inferior_points"
for dtype in (torch.float, torch.double):
X_baseline = torch.zeros(1, 1, device=self.device, dtype=dtype)
X_pruned = torch.rand(1, 1, device=self.device, dtype=dtype)
with mock.patch(no, new_callable=mock.PropertyMock) as mock_num_outputs:
mock_num_outputs.return_value = 1
mm = MockModel(mock.Mock())
with mock.patch(prune, return_value=X_pruned) as mock_prune:
acqf = qNoisyExpectedImprovement(
model=mm, X_baseline=X_baseline, prune_baseline=True
)
mock_prune.assert_called_once()
self.assertTrue(torch.equal(acqf.X_baseline, X_pruned))
# TODO: Test different objectives (incl. constraints)
class TestQProbabilityOfImprovement(BotorchTestCase):
def test_q_probability_of_improvement(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 1 x 1 x 1
samples = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
mm = MockModel(MockPosterior(samples=samples))
# X is `q x d` = 1 x 1. X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, device=self.device, dtype=dtype)
# basic test
sampler = IIDNormalSampler(num_samples=2)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.5)
# basic test, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.5)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
res = acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.5)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.5)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# basic test for X_pending and warning
acqf.set_X_pending()
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(None)
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(X)
self.assertEqual(acqf.X_pending, X)
res = acqf(X)
X2 = torch.zeros(
1, 1, 1, device=self.device, dtype=dtype, requires_grad=True
)
with warnings.catch_warnings(record=True) as ws, settings.debug(True):
acqf.set_X_pending(X2)
self.assertEqual(acqf.X_pending, X2)
self.assertEqual(len(ws), 1)
self.assertTrue(issubclass(ws[-1].category, BotorchWarning))
def test_q_probability_of_improvement_batch(self):
# the event shape is `b x q x t` = 2 x 2 x 1
for dtype in (torch.float, torch.double):
samples = torch.zeros(2, 2, 1, device=self.device, dtype=dtype)
samples[0, 0, 0] = 1.0
mm = MockModel(MockPosterior(samples=samples))
# X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
# test batch mode
sampler = IIDNormalSampler(num_samples=2)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.5)
# test batch mode, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.5)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.5)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.5)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qProbabilityOfImprovement(model=mm, best_f=0, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.5)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.5)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# TODO: Test different objectives (incl. constraints)
class TestQSimpleRegret(BotorchTestCase):
def test_q_simple_regret(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 1 x 1 x 1
samples = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
mm = MockModel(MockPosterior(samples=samples))
# X is `q x d` = 1 x 1. X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, device=self.device, dtype=dtype)
# basic test
sampler = IIDNormalSampler(num_samples=2)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
# basic test, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
res = acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# basic test for X_pending and warning
acqf.set_X_pending()
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(None)
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(X)
self.assertEqual(acqf.X_pending, X)
res = acqf(X)
X2 = torch.zeros(
1, 1, 1, device=self.device, dtype=dtype, requires_grad=True
)
with warnings.catch_warnings(record=True) as ws, settings.debug(True):
acqf.set_X_pending(X2)
self.assertEqual(acqf.X_pending, X2)
self.assertEqual(len(ws), 1)
self.assertTrue(issubclass(ws[-1].category, BotorchWarning))
def test_q_simple_regret_batch(self):
# the event shape is `b x q x t` = 2 x 2 x 1
for dtype in (torch.float, torch.double):
samples = torch.zeros(2, 2, 1, device=self.device, dtype=dtype)
samples[0, 0, 0] = 1.0
mm = MockModel(MockPosterior(samples=samples))
# X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
# test batch mode
sampler = IIDNormalSampler(num_samples=2)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# test batch mode, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qSimpleRegret(model=mm, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# TODO: Test different objectives (incl. constraints)
class TestQUpperConfidenceBound(BotorchTestCase):
def test_q_upper_confidence_bound(self):
for dtype in (torch.float, torch.double):
# the event shape is `b x q x t` = 1 x 1 x 1
samples = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
mm = MockModel(MockPosterior(samples=samples))
# X is `q x d` = 1 x 1. X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, device=self.device, dtype=dtype)
# basic test
sampler = IIDNormalSampler(num_samples=2)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
# basic test, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
res = acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# basic test, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X)
self.assertEqual(res.item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 1, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# basic test for X_pending and warning
acqf.set_X_pending()
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(None)
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(X)
self.assertEqual(acqf.X_pending, X)
res = acqf(X)
X2 = torch.zeros(
1, 1, 1, device=self.device, dtype=dtype, requires_grad=True
)
with warnings.catch_warnings(record=True) as ws, settings.debug(True):
acqf.set_X_pending(X2)
self.assertEqual(acqf.X_pending, X2)
self.assertEqual(len(ws), 1)
self.assertTrue(issubclass(ws[-1].category, BotorchWarning))
def test_q_upper_confidence_bound_batch(self):
# TODO: T41739913 Implement tests for all MCAcquisitionFunctions
for dtype in (torch.float, torch.double):
samples = torch.zeros(2, 2, 1, device=self.device, dtype=dtype)
samples[0, 0, 0] = 1.0
mm = MockModel(MockPosterior(samples=samples))
# X is a dummy and unused b/c of mocking
X = torch.zeros(1, 1, 1, device=self.device, dtype=dtype)
# test batch mode
sampler = IIDNormalSampler(num_samples=2)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# test batch mode, no resample
sampler = IIDNormalSampler(num_samples=2, seed=12345)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, no resample
sampler = SobolQMCNormalSampler(num_samples=2)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X)
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertTrue(torch.equal(acqf.sampler.base_samples, bs))
# test batch mode, qmc, resample
sampler = SobolQMCNormalSampler(num_samples=2, resample=True)
acqf = qUpperConfidenceBound(model=mm, beta=0.5, sampler=sampler)
res = acqf(X) # 1-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X)
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
res = acqf(X.expand(2, 1, 1)) # 2-dim batch
self.assertEqual(res[0].item(), 1.0)
self.assertEqual(res[1].item(), 0.0)
# the base samples should have the batch dim collapsed
self.assertEqual(acqf.sampler.base_samples.shape, torch.Size([2, 1, 2, 1]))
bs = acqf.sampler.base_samples.clone()
acqf(X.expand(2, 1, 1))
self.assertFalse(torch.equal(acqf.sampler.base_samples, bs))
# basic test for X_pending and warning
acqf.set_X_pending()
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(None)
self.assertIsNone(acqf.X_pending)
acqf.set_X_pending(X)
self.assertEqual(acqf.X_pending, X)
res = acqf(X)
X2 = torch.zeros(
1, 1, 1, device=self.device, dtype=dtype, requires_grad=True
)
with warnings.catch_warnings(record=True) as ws, settings.debug(True):
acqf.set_X_pending(X2)
self.assertEqual(acqf.X_pending, X2)
self.assertEqual(len(ws), 1)
self.assertTrue(issubclass(ws[-1].category, BotorchWarning))
# TODO: Test different objectives (incl. constraints)
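The `warnings.catch_warnings(record=True)` blocks repeated throughout these tests follow a standard stdlib pattern for asserting that exactly one warning fired and checking its category. A minimal standalone version, with a plain `UserWarning` standing in for `BotorchWarning` and a hypothetical `set_pending` helper in place of `set_X_pending`:

```python
import warnings

def set_pending(requires_grad):
    # Warn, as set_X_pending does, when pending points carry gradient info.
    if requires_grad:
        warnings.warn("pending points require a gradient", UserWarning)

with warnings.catch_warnings(record=True) as ws:
    warnings.simplefilter("always")  # ensure the warning is not filtered out
    set_pending(requires_grad=True)

assert len(ws) == 1
assert issubclass(ws[-1].category, UserWarning)
```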
# woeclipse/tests/test_routes.py (Costa-Alexandre/woeclipse, MIT license)
def test_index_get(test_client):
# mimic a browser: 'GET /', as if you visit the site
response = test_client.get('/')
# check that the HTTP response is a success
assert response.status_code == 200
# Store the contents of the html response in a local variable.
# This should be a string with the same content as the file index.html
html_content = response.data.decode()
assert "<html>" in html_content
assert "Warriors of Eclipse" in html_content
assert '<header>' in html_content
assert '<footer>' in html_content
def test_index_post(test_client):
    # mimic a browser: 'POST /', as if you submit a form to the site
    response = test_client.post('/')
    # check that the HTTP POST method is not allowed on this route
    assert response.status_code == 405
    # Store the contents of the html response in a local variable.
    # This should be the 405 error page, not the content of index.html
    html_content = response.data.decode()
assert "Warriors of Eclipse" not in html_content
def test_signup_post(test_client):
    # mimic a browser: 'POST /signup', as if you submit the signup form
    response = test_client.post('/signup')
    # check that the HTTP response is a success
    assert response.status_code == 200
    # Store the contents of the html response in a local variable.
    # This should be a string containing the rendered signup page
html_content = response.data.decode()
assert "</form>" in html_content

def test_signup_get(test_client):
    # mimic a browser: 'GET /signup', as if you visit the signup page
    response = test_client.get('/signup')
    # check that the HTTP response is a success
    assert response.status_code == 200
    # the rendered page should contain the signup form
    html_content = response.data.decode()
    assert "</form>" in html_content

def test_signin_post(test_client):
    # mimic a browser: 'POST /signin', as if the signin form were submitted
    response = test_client.post('/signin')
    # check that the HTTP response is a success
    assert response.status_code == 200
    # the rendered page should contain the signin form
    html_content = response.data.decode()
    assert "</form>" in html_content

def test_signin_get(test_client):
    # mimic a browser: 'GET /signin', as if you visit the signin page
    response = test_client.get('/signin')
    # check that the HTTP response is a success
    assert response.status_code == 200
    # the rendered page should contain the signin form
    html_content = response.data.decode()
    assert "</form>" in html_content

def test_edit_profile(test_client):
    # mimic a browser: 'GET /edit_profile' without being signed in
    response = test_client.get('/edit_profile')
    # check that the HTTP response is a redirect (login required)
    assert response.status_code == 302
    # Flask's redirect response body contains a "Redirecting..." notice
    html_content = response.data.decode()
    assert "Redirecting..." in html_content
| 32.088235 | 74 | 0.702414 | 503 | 3,273 | 4.465209 | 0.115308 | 0.083259 | 0.057881 | 0.049866 | 0.893143 | 0.884239 | 0.869991 | 0.869991 | 0.869991 | 0.869991 | 0 | 0.008258 | 0.223037 | 3,273 | 101 | 75 | 32.405941 | 0.874951 | 0.477849 | 0 | 0.421053 | 0 | 0 | 0.086412 | 0 | 0 | 0 | 0 | 0 | 0.447368 | 1 | 0.184211 | false | 0 | 0 | 0 | 0.184211 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
460e1227eed0c4400fd8d2cef0112bd9f56f8c70 | 57 | py | Python | build/lib/abdal-net-py/__init__.py | abdal-security-group/abdal-net-py | 209035681836cb80553b10d6f885fab650a76aa3 | [
"MIT"
] | null | null | null | build/lib/abdal-net-py/__init__.py | abdal-security-group/abdal-net-py | 209035681836cb80553b10d6f885fab650a76aa3 | [
"MIT"
] | null | null | null | build/lib/abdal-net-py/__init__.py | abdal-security-group/abdal-net-py | 209035681836cb80553b10d6f885fab650a76aa3 | [
"MIT"
] | null | null | null | from abdal_net_py.abdal_net_py_unit import AbdalNetPy
| 19 | 54 | 0.859649 | 10 | 57 | 4.4 | 0.7 | 0.363636 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122807 | 57 | 2 | 55 | 28.5 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1cc215e66567fb1cee536225a713751d015d5f68 | 38,030 | py | Python | aqt/jax_legacy/jax/wmt_mlperf/models_test.py | jihwanlee-alphago/aqt | dfc0761f8db13b10174550979b0a3c8b32fd3d01 | [
"Apache-2.0"
] | null | null | null | aqt/jax_legacy/jax/wmt_mlperf/models_test.py | jihwanlee-alphago/aqt | dfc0761f8db13b10174550979b0a3c8b32fd3d01 | [
"Apache-2.0"
] | null | null | null | aqt/jax_legacy/jax/wmt_mlperf/models_test.py | jihwanlee-alphago/aqt | dfc0761f8db13b10174550979b0a3c8b32fd3d01 | [
"Apache-2.0"
] | null | null | null | # Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for wmt_mlperf.models."""
from unittest import mock
from absl.testing import absltest
from absl.testing import parameterized
from aqt.jax_legacy.jax import get_bounds
from aqt.jax_legacy.jax import hlo_utils
from aqt.jax_legacy.jax import primitives
from aqt.jax_legacy.jax import quant_config
from aqt.jax_legacy.jax import test_utils
from aqt.jax_legacy.jax.quantization import QuantType
from aqt.jax_legacy.jax.wmt_mlperf import models
from aqt.jax_legacy.jax.wmt_mlperf import training_hparams_generator_lib
import flax
import jax
import jax.numpy as jnp
import numpy as onp


class ModelsTest(parameterized.TestCase):

  def setUp(self):
    super().setUp()
self.input_shape = (1, 1)
self.target_shape = (1, 1)
self.inputs = jnp.ones(self.input_shape, dtype=jnp.float32)
self.target = jnp.ones(self.target_shape, dtype=jnp.float32)
self.key = jax.random.PRNGKey(0)
self.transformer_small_kwargs = {
'vocab_size': 1,
'output_vocab_size': 1,
'max_len': 1,
'train': False,
}
self.transformer_full_kwargs = {
'vocab_size': 4,
'output_vocab_size': 4,
'max_len': 2,
'train': False,
}
def init_model(self, transformer_kwargs):
model = models.Transformer(
use_bfloat16=False,
quant_context=quant_config.QuantContext(
collect_acts_stats=False, update_bounds=False),
dropout_rate=.1,
attention_dropout_rate=.1,
should_decode=False,
**transformer_kwargs)
state = model.init(self.key, jnp.zeros(self.input_shape, jnp.float32),
jnp.zeros(self.target_shape, jnp.float32))
return model, state
@parameterized.named_parameters(
dict(testcase_name='test_mlp_weight_quant_8bit', mlp_weight_prec=8),
dict(testcase_name='test_mlp_weight_quant_4bit', mlp_weight_prec=4),
dict(testcase_name='test_mlp_weight_quant_1bit', mlp_weight_prec=1),
)
@mock.patch.object(primitives, 'round_with_gradient')
@mock.patch.object(primitives, 'floor_with_gradient')
def test_mlp_weight_quant(self, floor_with_gradient, round_with_gradient,
mlp_weight_prec):
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=mlp_weight_prec,
embedding_weight_prec=None,
attention_weight_prec=None,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper=None,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper=None,
attention_kqv_inputs_prec=None,
attention_kqv_inputs_hyper=None,
attention_out_inputs_prec=None,
attention_out_inputs_hyper=None,
logits_inputs_prec=None,
logits_inputs_hyper=None,
logits_via_embeddings=True,
attention_act_q_inputs_prec=None,
attention_act_q_inputs_hyper=None,
attention_act_k_inputs_prec=None,
attention_act_k_inputs_hyper=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
attention_act_v_inputs_hyper=None,
num_layers=1,
emb_dim=1,
num_heads=1,
qkv_dim=1,
mlp_dim=1,
quant_type=QuantType.FAKE_QUANT)
transformer_kwargs = self.transformer_small_kwargs
transformer_kwargs['hparams'] = hparams
round_with_gradient.side_effect = lambda x: x
floor_with_gradient.side_effect = lambda x: x
model, init_state = self.init_model(transformer_kwargs)
# there are 2 MLP blocks in this model, with 2 quant ops each, so both clip
# and round should be called 4 times each.
round_with_gradient.assert_called_with(mock.ANY)
self.assertEqual(round_with_gradient.call_count, 4)
floor_with_gradient.assert_not_called()
round_with_gradient.reset_mock()
floor_with_gradient.reset_mock()
output = model.apply(init_state, self.inputs, self.target)
self.assertEqual(output.shape, (1, 1))
round_with_gradient.assert_called_with(mock.ANY)
self.assertEqual(round_with_gradient.call_count, 4)
floor_with_gradient.assert_not_called()
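The patching pattern in the test above, replacing the rounding primitive with an identity `side_effect` so that call counts can be checked while the numerics stay unchanged, can be illustrated standalone. The `primitives` class and `quantize` helper here are hypothetical stand-ins, not part of the AQT library:

```python
from unittest import mock


class primitives:
    """Stand-in for the module whose rounding primitive the tests patch."""

    @staticmethod
    def round_with_gradient(x):
        return float(round(x))


def quantize(x, scale=4.0):
    # Hypothetical fake-quant helper: snap x to a grid of spacing 1/scale.
    return primitives.round_with_gradient(x * scale) / scale


with mock.patch.object(primitives, 'round_with_gradient') as round_op:
    round_op.side_effect = lambda v: v  # identity: quantization disabled
    assert quantize(0.25) == 0.25       # numerics unchanged under the patch
    round_op.assert_called_once_with(1.0)
```

Because the mock forwards its argument untouched, the model under test still produces finite, sensible outputs while the test counts exactly how many quantization ops were exercised.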
@mock.patch.object(primitives, 'round_with_gradient')
@mock.patch.object(primitives, 'floor_with_gradient')
def test_without_mlp_weight_quant(self, floor_with_gradient,
round_with_gradient):
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=None,
embedding_weight_prec=None,
attention_weight_prec=None,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper=None,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper=None,
attention_kqv_inputs_prec=None,
attention_kqv_inputs_hyper=None,
attention_out_inputs_prec=None,
attention_out_inputs_hyper=None,
logits_inputs_prec=None,
logits_inputs_hyper=None,
logits_via_embeddings=True,
attention_act_q_inputs_prec=None,
attention_act_q_inputs_hyper=None,
attention_act_k_inputs_prec=None,
attention_act_k_inputs_hyper=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
attention_act_v_inputs_hyper=None,
num_layers=1,
emb_dim=1,
num_heads=1,
qkv_dim=1,
mlp_dim=1,
quant_type=QuantType.FAKE_QUANT)
transformer_kwargs = self.transformer_small_kwargs
transformer_kwargs['hparams'] = hparams
round_with_gradient.side_effect = lambda x: x
floor_with_gradient.side_effect = lambda x: x
model, init_state = self.init_model(transformer_kwargs)
model.apply(init_state, self.inputs, self.target)
round_with_gradient.assert_not_called()
floor_with_gradient.assert_not_called()
def _num_mlp_floors(self, weight_quant, pos_input_quant, neg_input_quant):
# There are 2 MLP blocks per layer (1 encoder, 1 decoder) and 2 weight quant
# ops per MLP block, so 4 in total per layer.
mlp_floors_per_layer = 4 if weight_quant else 0
# There are 2 MLP blocks per layer (1 encoder, 1 decoder) and 1 input quant
# op per unsigned MLP block, so 2 in total per layer.
if pos_input_quant:
mlp_floors_per_layer = mlp_floors_per_layer + 2
# There are 2 MLP blocks per layer (1 encoder, 1 decoder) and 1 input quant
# op per signed MLP block, so 2 in total per layer.
if neg_input_quant:
mlp_floors_per_layer = mlp_floors_per_layer + 2
return mlp_floors_per_layer
def _num_embedding_floors(self, weight_quant, act_quant):
# 3 embedding layers in the whole model
embedding_floors = 3 if weight_quant else 0
# logits activation quantization
if act_quant:
embedding_floors = embedding_floors + 1
return embedding_floors
def _num_attention_floors(self, weight_quant, kqv_input_quant,
out_input_quant, act_q_input_quant,
act_k_input_quant, act_qk_input_quant,
act_v_input_quant):
# 3 attention blocks per layer (1 on encoder, 2 on decoder), each
# attention block has 4 weight quant ops, so 12 in total per layer.
attention_floors_per_layer = 12 if weight_quant else 0
# 3 attention blocks per layer (1 on encoder, 2 on decoder), each
# attention block has 3 kqv activation quant ops, so 9 in total per layer.
if kqv_input_quant:
attention_floors_per_layer = attention_floors_per_layer + 9
# 3 attention blocks per layer (1 on encoder, 2 on decoder), each attention
# block has 1 dense out activation quant op, so 3 in total per layer.
if out_input_quant:
attention_floors_per_layer = attention_floors_per_layer + 3
# 3 attention blocks per layer (1 on encoder, 2 on decoder), each attention
# block has 1 act*act activation quant op for each act per layer.
for act_quant in [
act_q_input_quant, act_k_input_quant, act_qk_input_quant,
act_v_input_quant
]:
if act_quant:
attention_floors_per_layer = attention_floors_per_layer + 3
return attention_floors_per_layer
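The three helpers above encode fixed per-layer op counts. As a sanity check, the same arithmetic can be restated standalone and evaluated for the fully quantized case (every precision set, 3 layers), mirroring the `test_3layers_full_quant` parameters:

```python
# Standalone restatement of the floor-op counting logic used by the tests.
def num_mlp_floors(weight, pos_input, signed_input):
    # 2 MLP blocks per layer x 2 weight quant ops, plus 1 input quant op
    # per block for each enabled input kind.
    return ((4 if weight else 0) + (2 if pos_input else 0) +
            (2 if signed_input else 0))


def num_attention_floors(weight, kqv, out, act_q, act_k, act_qk, act_v):
    # 3 attention blocks per layer: 4 weight ops, 3 kqv input ops, 1 out
    # input op, and 1 op per enabled act*act input, each counted per block.
    total = 12 if weight else 0
    total += 9 if kqv else 0
    total += 3 if out else 0
    total += sum(3 for act in (act_q, act_k, act_qk, act_v) if act)
    return total


def num_embedding_floors(weight, logits_act):
    # 3 embedding layers model-wide, plus 1 op for logits input quantization.
    return (3 if weight else 0) + (1 if logits_act else 0)


# Fully quantized, 3 layers: 3 * (8 + 36) + 4 floor ops expected in the HLO.
num_layers = 3
expected = num_layers * (num_mlp_floors(True, True, True) +
                         num_attention_floors(*[True] * 7))
expected += num_embedding_floors(True, True)
```

This is the number the HLO-based tests below compare against `count_ops_in_hlo_proto(hlo_proto, r'floor')`.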
@parameterized.named_parameters(
dict(
testcase_name='test_2layers_no_quant',
num_layers=2,
mlp_weight_prec=None,
mlp_pos_inputs_prec=None,
mlp_signed_inputs_prec=None,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=None,
embedding_weight_prec=None,
attention_weight_prec=None,
logits_inputs_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
),
dict(
testcase_name='test_3layers_full_quant',
num_layers=3,
mlp_weight_prec=4,
mlp_pos_inputs_prec=8,
mlp_signed_inputs_prec=8,
attention_kqv_inputs_prec=2,
attention_out_inputs_prec=4,
embedding_weight_prec=4,
attention_weight_prec=4,
logits_inputs_prec=8,
attention_act_q_inputs_prec=4,
attention_act_k_inputs_prec=8,
attention_act_probs_inputs_prec=8,
attention_act_v_inputs_prec=4,
),
)
def test_number_of_floor_ops(
self, num_layers, mlp_weight_prec, mlp_pos_inputs_prec,
mlp_signed_inputs_prec, attention_kqv_inputs_prec,
attention_out_inputs_prec, embedding_weight_prec, attention_weight_prec,
logits_inputs_prec, attention_act_q_inputs_prec,
attention_act_k_inputs_prec, attention_act_probs_inputs_prec,
attention_act_v_inputs_prec):
# Counts number of floor ops as a proxy for quantization ops.
act_fixed_clip_bound = 3.0
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=mlp_weight_prec,
embedding_weight_prec=embedding_weight_prec,
attention_weight_prec=attention_weight_prec,
mlp_pos_inputs_prec=mlp_pos_inputs_prec,
mlp_pos_inputs_hyper=act_fixed_clip_bound,
mlp_signed_inputs_prec=mlp_signed_inputs_prec,
mlp_signed_inputs_hyper=act_fixed_clip_bound,
attention_kqv_inputs_prec=attention_kqv_inputs_prec,
attention_kqv_inputs_hyper=act_fixed_clip_bound,
attention_out_inputs_prec=attention_out_inputs_prec,
attention_out_inputs_hyper=act_fixed_clip_bound,
logits_inputs_prec=logits_inputs_prec,
logits_inputs_hyper=act_fixed_clip_bound,
logits_via_embeddings=True,
attention_act_q_inputs_prec=attention_act_q_inputs_prec,
attention_act_q_inputs_hyper=act_fixed_clip_bound,
attention_act_k_inputs_prec=attention_act_k_inputs_prec,
attention_act_k_inputs_hyper=act_fixed_clip_bound,
attention_act_probs_inputs_prec=attention_act_probs_inputs_prec,
attention_act_v_inputs_prec=attention_act_v_inputs_prec,
attention_act_v_inputs_hyper=act_fixed_clip_bound,
num_layers=num_layers,
emb_dim=5,
num_heads=8,
qkv_dim=8,
mlp_dim=7,
quant_type=QuantType.FAKE_QUANT)
transformer_kwargs = self.transformer_full_kwargs
transformer_kwargs['hparams'] = hparams
input_shape = (2, 4)
target_shape = input_shape
model, init_state = self.init_model(transformer_kwargs)
hlo_proto = hlo_utils.load_hlo_proto_from_model(model, init_state,
[input_shape, target_shape])
floor_count = hlo_utils.count_ops_in_hlo_proto(hlo_proto, r'floor')
mlp_floors_per_layer = self._num_mlp_floors(
(mlp_weight_prec is not None), (mlp_pos_inputs_prec is not None),
(mlp_signed_inputs_prec is not None))
attention_floors_per_layer = self._num_attention_floors(
(attention_weight_prec is not None),
(attention_kqv_inputs_prec is not None),
(attention_out_inputs_prec is not None),
(attention_act_q_inputs_prec is not None),
(attention_act_k_inputs_prec is not None),
(attention_act_probs_inputs_prec is not None),
(attention_act_v_inputs_prec is not None))
embedding_floors = self._num_embedding_floors(
(embedding_weight_prec is not None), (logits_inputs_prec is not None))
expected_floor_count = num_layers * (
mlp_floors_per_layer + attention_floors_per_layer) + embedding_floors
self.assertEqual(floor_count, expected_floor_count)
@parameterized.named_parameters(
dict(
testcase_name='test_2layers_att8bit_weight_quant',
num_layers=2,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=None,
attention_weight_prec=8,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_3layers_att8bit_kqv_inputs_quant',
num_layers=3,
attention_kqv_inputs_prec=8,
attention_out_inputs_prec=None,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_3layers_att8bit_act_q_inputs_quant',
num_layers=3,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=None,
attention_weight_prec=None,
attention_act_q_inputs_prec=8,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_att8bit_act_k_inputs_quant',
num_layers=2,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=None,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=8,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_3layers_att8bit_act_qk_inputs_quant',
num_layers=3,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=None,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=4,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_att8bit_act_v_inputs_quant',
num_layers=2,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=None,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=8,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_3layers_att8bit_kqv_inputs_auto_quant',
num_layers=3,
attention_kqv_inputs_prec=8,
attention_out_inputs_prec=None,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=False,
),
dict(
testcase_name='test_2layers_att8bit_out_inputs_quant',
num_layers=2,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=8,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_att8bit_out_inputs_auto_quant',
num_layers=2,
attention_kqv_inputs_prec=None,
attention_out_inputs_prec=8,
attention_weight_prec=None,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=False,
),
dict(
testcase_name='test_2layers_att_weight_kqv_out_quant',
num_layers=2,
attention_kqv_inputs_prec=8,
attention_out_inputs_prec=4,
attention_weight_prec=2,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_att_weight_kqv_out_act_quant',
num_layers=2,
attention_kqv_inputs_prec=8,
attention_out_inputs_prec=4,
attention_weight_prec=2,
attention_act_q_inputs_prec=4,
attention_act_k_inputs_prec=8,
attention_act_probs_inputs_prec=4,
attention_act_v_inputs_prec=2,
inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_att_weight_kqv_out_auto_quant',
num_layers=2,
attention_kqv_inputs_prec=8,
attention_out_inputs_prec=4,
attention_weight_prec=2,
attention_act_q_inputs_prec=None,
attention_act_k_inputs_prec=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
inputs_hyper_is_float=False,
),
dict(
testcase_name='test_3layers_att_weight_kqv_out_act_auto_quant',
num_layers=3,
attention_kqv_inputs_prec=8,
attention_out_inputs_prec=4,
attention_weight_prec=2,
attention_act_q_inputs_prec=8,
attention_act_k_inputs_prec=4,
attention_act_probs_inputs_prec=4,
attention_act_v_inputs_prec=2,
inputs_hyper_is_float=False,
),
)
def test_number_of_floor_ops_attention(
self,
num_layers,
attention_kqv_inputs_prec,
attention_out_inputs_prec,
attention_weight_prec,
attention_act_q_inputs_prec,
attention_act_k_inputs_prec,
attention_act_probs_inputs_prec,
attention_act_v_inputs_prec,
inputs_hyper_is_float,
):
# Counts number of floor ops as a proxy for quantization ops.
if inputs_hyper_is_float:
inputs_hyper = 6.0
else:
inputs_hyper = get_bounds.GetBounds.Hyper(
initial_bound=6.0,
stddev_coeff=3.0,
absdev_coeff=2.0,
mix_coeff=0.5,
granularity=quant_config.QuantGranularity.PER_TENSOR)
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=None,
embedding_weight_prec=None,
attention_weight_prec=attention_weight_prec,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper=None,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper=None,
attention_kqv_inputs_prec=attention_kqv_inputs_prec,
attention_kqv_inputs_hyper=inputs_hyper,
attention_out_inputs_prec=attention_out_inputs_prec,
attention_out_inputs_hyper=inputs_hyper,
logits_inputs_prec=None,
logits_inputs_hyper=None,
logits_via_embeddings=True,
attention_act_q_inputs_prec=attention_act_q_inputs_prec,
attention_act_q_inputs_hyper=inputs_hyper,
attention_act_k_inputs_prec=attention_act_k_inputs_prec,
attention_act_k_inputs_hyper=inputs_hyper,
attention_act_probs_inputs_prec=attention_act_probs_inputs_prec,
attention_act_v_inputs_prec=attention_act_v_inputs_prec,
attention_act_v_inputs_hyper=inputs_hyper,
num_layers=num_layers,
emb_dim=5,
num_heads=8,
qkv_dim=8,
mlp_dim=7,
quant_type=QuantType.FAKE_QUANT)
transformer_kwargs = self.transformer_full_kwargs
transformer_kwargs['hparams'] = hparams
input_shape = (2, 4)
target_shape = input_shape
model, init_state = self.init_model(transformer_kwargs)
hlo_proto = hlo_utils.load_hlo_proto_from_model(model, init_state,
[input_shape, target_shape])
floor_count = hlo_utils.count_ops_in_hlo_proto(hlo_proto, r'floor')
attention_floors_per_layer = self._num_attention_floors(
(attention_weight_prec is not None),
(attention_kqv_inputs_prec is not None),
(attention_out_inputs_prec is not None),
(attention_act_q_inputs_prec is not None),
(attention_act_k_inputs_prec is not None),
(attention_act_probs_inputs_prec is not None),
(attention_act_v_inputs_prec is not None))
expected_floor_count = num_layers * attention_floors_per_layer
self.assertEqual(floor_count, expected_floor_count)
@parameterized.named_parameters(
dict(
testcase_name='test_3layers_mlp8bit_weight_quant',
num_layers=3,
mlp_weight_prec=8,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper_is_float=True,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_mlp8bit_pos_inputs_quant',
num_layers=2,
mlp_weight_prec=None,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper_is_float=True,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_mlp8bit_pos_inputs_auto_quant',
num_layers=2,
mlp_weight_prec=None,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper_is_float=False,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_mlp8bit_pos_inputs_weights_quant',
num_layers=2,
mlp_weight_prec=8,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper_is_float=True,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_mlp8bit_neg_inputs_quant',
num_layers=2,
mlp_weight_prec=None,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper_is_float=True,
mlp_signed_inputs_prec=8,
mlp_signed_inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_mlp8bit_neg_inputs_auto_quant',
num_layers=2,
mlp_weight_prec=None,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper_is_float=True,
mlp_signed_inputs_prec=8,
mlp_signed_inputs_hyper_is_float=False,
),
dict(
testcase_name='test_2layers_mlp8bit_all_inputs_weights_quant',
num_layers=2,
mlp_weight_prec=8,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper_is_float=True,
mlp_signed_inputs_prec=8,
mlp_signed_inputs_hyper_is_float=True,
),
dict(
testcase_name='test_2layers_mlp8bit_all_inputs_weights_auto_quant',
num_layers=2,
mlp_weight_prec=8,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper_is_float=False,
mlp_signed_inputs_prec=8,
mlp_signed_inputs_hyper_is_float=False,
),
)
def test_number_of_floor_ops_mlp(self, num_layers, mlp_weight_prec,
mlp_pos_inputs_prec,
mlp_pos_inputs_hyper_is_float,
mlp_signed_inputs_prec,
mlp_signed_inputs_hyper_is_float):
# Counts number of floor ops as a proxy for quantization ops.
if mlp_pos_inputs_hyper_is_float:
mlp_pos_inputs_hyper = 6.0
else:
mlp_pos_inputs_hyper = get_bounds.GetBounds.Hyper(
initial_bound=6.0,
stddev_coeff=3.0,
absdev_coeff=2.0,
mix_coeff=0.5,
granularity=quant_config.QuantGranularity.PER_TENSOR)
    if mlp_signed_inputs_hyper_is_float:
      mlp_signed_inputs_hyper = 6.0
    else:
      mlp_signed_inputs_hyper = get_bounds.GetBounds.Hyper(
          initial_bound=6.0,
          stddev_coeff=3.0,
          absdev_coeff=2.0,
          mix_coeff=0.5,
          granularity=quant_config.QuantGranularity.PER_TENSOR)
    hparams = training_hparams_generator_lib.create_base_transformer_hparams(
        mlp_weight_prec=mlp_weight_prec,
        embedding_weight_prec=None,
        attention_weight_prec=None,
        mlp_pos_inputs_prec=mlp_pos_inputs_prec,
        mlp_pos_inputs_hyper=mlp_pos_inputs_hyper,
        mlp_signed_inputs_prec=mlp_signed_inputs_prec,
        mlp_signed_inputs_hyper=mlp_signed_inputs_hyper,
attention_kqv_inputs_prec=None,
attention_kqv_inputs_hyper=None,
attention_out_inputs_prec=None,
attention_out_inputs_hyper=None,
logits_inputs_prec=None,
logits_inputs_hyper=None,
logits_via_embeddings=True,
attention_act_q_inputs_prec=None,
attention_act_q_inputs_hyper=None,
attention_act_k_inputs_prec=None,
attention_act_k_inputs_hyper=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
attention_act_v_inputs_hyper=None,
num_layers=num_layers,
emb_dim=5,
num_heads=8,
qkv_dim=8,
mlp_dim=7,
quant_type=QuantType.FAKE_QUANT)
transformer_kwargs = self.transformer_full_kwargs
transformer_kwargs['hparams'] = hparams
input_shape = (2, 4)
target_shape = input_shape
model, init_state = self.init_model(transformer_kwargs)
hlo_proto = hlo_utils.load_hlo_proto_from_model(model, init_state,
[input_shape, target_shape])
floor_count = hlo_utils.count_ops_in_hlo_proto(hlo_proto, r'floor')
mlp_floors_per_layer = self._num_mlp_floors(
(mlp_weight_prec is not None), (mlp_pos_inputs_prec is not None),
(mlp_signed_inputs_prec is not None))
expected_floor_count = num_layers * mlp_floors_per_layer
self.assertEqual(floor_count, expected_floor_count)
@parameterized.named_parameters(
dict(
testcase_name='test_3layers_embedding8bit_weight_quant',
num_layers=3,
embedding_weight_prec=8,
logits_inputs_prec=None,
logits_inputs_hyper_is_float=True,
logits_via_embeddings=True,
),
dict(
testcase_name='test_2layers_embedding8bit_inputs_auto_quant',
num_layers=2,
embedding_weight_prec=None,
logits_inputs_prec=8,
logits_inputs_hyper_is_float=False,
logits_via_embeddings=True,
),
dict(
testcase_name='test_2layers_embedding8bit_inputs_weights_quant_fixed',
num_layers=2,
embedding_weight_prec=8,
logits_inputs_prec=8,
logits_inputs_hyper_is_float=True,
logits_via_embeddings=True,
),
dict(
testcase_name='test_2layers_embedding8bit_inputs_weights_auto_quant',
num_layers=2,
embedding_weight_prec=8,
logits_inputs_prec=8,
logits_inputs_hyper_is_float=False,
logits_via_embeddings=True,
),
dict(
testcase_name='test_2layers_embedding8bit_without_logit_sharing',
num_layers=2,
embedding_weight_prec=8,
logits_inputs_prec=8,
logits_inputs_hyper_is_float=False,
logits_via_embeddings=False,
),
)
def test_number_of_floor_ops_embedding(self, num_layers,
embedding_weight_prec,
logits_inputs_prec,
logits_inputs_hyper_is_float,
logits_via_embeddings):
# Counts number of floor ops as a proxy for quantization ops.
if logits_inputs_hyper_is_float:
logits_inputs_hyper = 6.0
else:
logits_inputs_hyper = get_bounds.GetBounds.Hyper(
initial_bound=6.0,
stddev_coeff=3.0,
absdev_coeff=2.0,
mix_coeff=0.5,
granularity=quant_config.QuantGranularity.PER_TENSOR)
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=None,
embedding_weight_prec=embedding_weight_prec,
attention_weight_prec=None,
mlp_pos_inputs_prec=None,
mlp_pos_inputs_hyper=None,
mlp_signed_inputs_prec=None,
mlp_signed_inputs_hyper=None,
attention_kqv_inputs_prec=None,
attention_kqv_inputs_hyper=None,
attention_out_inputs_prec=None,
attention_out_inputs_hyper=None,
logits_inputs_prec=logits_inputs_prec,
logits_inputs_hyper=logits_inputs_hyper,
logits_via_embeddings=logits_via_embeddings,
attention_act_q_inputs_prec=None,
attention_act_q_inputs_hyper=None,
attention_act_k_inputs_prec=None,
attention_act_k_inputs_hyper=None,
attention_act_probs_inputs_prec=None,
attention_act_v_inputs_prec=None,
attention_act_v_inputs_hyper=None,
num_layers=num_layers,
emb_dim=5,
num_heads=8,
qkv_dim=8,
mlp_dim=7,
quant_type=QuantType.FAKE_QUANT)
transformer_kwargs = self.transformer_full_kwargs
transformer_kwargs['hparams'] = hparams
input_shape = (2, 4)
target_shape = input_shape
model, init_state = self.init_model(transformer_kwargs)
hlo_proto = hlo_utils.load_hlo_proto_from_model(model, init_state,
[input_shape, target_shape])
floor_count = hlo_utils.count_ops_in_hlo_proto(hlo_proto, r'floor')
embedding_floor_ops = self._num_embedding_floors(
(embedding_weight_prec is not None), (logits_inputs_prec is not None))
self.assertEqual(floor_count, embedding_floor_ops)

  def test_padding_mask(self):
    # Fuzzing test to make sure activation statistics aren't affected by
    # padding tokens.
    #
    # This test works by changing the embedding of the padding token (the
    # token with id '0') and making sure all the stats stay the same.
    #
    # It also tests that the stats *do* change when the embedding of a
    # non-padding token changes.
inputs_hyper = get_bounds.GetBounds.Hyper(
initial_bound=6.0,
stddev_coeff=3.0,
absdev_coeff=2.0,
mix_coeff=0.5,
granularity=quant_config.QuantGranularity.PER_CHANNEL)
# Set logits_via_embedding to false so that the embedding of the padding
# token doesn't affect the logits calculation at the end of the decoder.
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=8,
embedding_weight_prec=None,
attention_weight_prec=8,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper=inputs_hyper,
mlp_signed_inputs_prec=8,
mlp_signed_inputs_hyper=inputs_hyper,
attention_kqv_inputs_prec=8,
attention_kqv_inputs_hyper=inputs_hyper,
attention_out_inputs_prec=8,
attention_out_inputs_hyper=inputs_hyper,
logits_inputs_prec=8,
logits_inputs_hyper=inputs_hyper,
logits_via_embeddings=False,
attention_act_q_inputs_prec=8,
attention_act_q_inputs_hyper=inputs_hyper,
attention_act_k_inputs_prec=8,
attention_act_k_inputs_hyper=inputs_hyper,
attention_act_probs_inputs_prec=8,
attention_act_v_inputs_prec=8,
attention_act_v_inputs_hyper=inputs_hyper,
num_layers=2,
emb_dim=5,
num_heads=2,
qkv_dim=4,
mlp_dim=4,
quant_type=QuantType.FAKE_QUANT)
module = models.Transformer(
hparams=hparams,
quant_context=quant_config.QuantContext(
update_bounds=True, collect_acts_stats=True),
vocab_size=3,
output_vocab_size=3,
max_len=10,
train=False,
use_bfloat16=False,
dropout_rate=.1,
attention_dropout_rate=.1,
should_decode=False)
key = jax.random.PRNGKey(0)
# Mark the first token of the target and last token of the inputs as padding
# tokens.
targets = onp.array([[0, 2]])
inputs = onp.array([[1, 0]])
initial_state = module.init(key, inputs=inputs, targets=targets)
# Change the embedding of the padding token.
initial_state = initial_state.unfreeze()
initial_state['params']['shared_embedding']['embedding'] = initial_state[
'params']['shared_embedding']['embedding'].at[0, :].set(10.0)
module.train = True
_, state1 = module.apply(
flax.core.freeze(initial_state),
inputs=inputs,
targets=targets,
mutable=True,
rngs={'dropout': key})
initial_state['params']['shared_embedding']['embedding'] = initial_state[
'params']['shared_embedding']['embedding'].at[0, :].set(20.0)
_, state2 = module.apply(
flax.core.freeze(initial_state),
inputs=inputs,
targets=targets,
mutable=True,
rngs={'dropout': key})
# This tests the statistics in both the GetBounds and StatsTag modules.
test_utils.assert_stats_are_equal(state1, state2)
    # Now we repeat the test, but changing the embedding of a non-padding token
    # (the token with id '1' here). We expect to see the stats change.
initial_state['params']['shared_embedding']['embedding'] = initial_state[
'params']['shared_embedding']['embedding'].at[1, :].set(10.0)
_, state1 = module.apply(
flax.core.freeze(initial_state),
inputs=inputs,
targets=targets,
mutable=True,
rngs={'dropout': key})
initial_state['params']['shared_embedding']['embedding'] = initial_state[
'params']['shared_embedding']['embedding'].at[1, :].set(200.0)
_, state2 = module.apply(
flax.core.freeze(initial_state),
inputs=inputs,
targets=targets,
mutable=True,
rngs={'dropout': key})
test_utils.assert_stats_are_unequal(state1, state2)
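The test above checks an invariant: activation statistics must ignore embeddings at padding positions, so perturbing a padding token's embedding leaves the collected bounds unchanged, while perturbing a real token's embedding moves them. A minimal stand-alone sketch of that masking idea in plain Python (names here are illustrative, not from the codebase):

```python
# Sketch of the invariant tested above: a statistic computed under a
# padding mask is unaffected by values at masked-out (padding) positions.

def masked_max(values, mask):
    """Max over positions where mask is 1 (i.e. non-padding tokens)."""
    kept = [v for v, m in zip(values, mask) if m]
    return max(kept)

# Position 1 is a padding token (mask 0), as in the test's inputs.
mask = [1, 0]

stats_before = masked_max([3.0, 10.0], mask)
stats_after = masked_max([3.0, 20.0], mask)   # only the padded slot changed
assert stats_before == stats_after == 3.0

# Changing a non-padded slot does move the statistic.
assert masked_max([200.0, 10.0], mask) != stats_before
```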
def test_hparams_without_logits_when_logits_not_shared_raises_error(self):
# Create hparams without logits hparams by passing in
# logits_via_embeddings=True.
inputs_hyper = get_bounds.GetBounds.Hyper(
initial_bound=6.0,
stddev_coeff=3.0,
absdev_coeff=2.0,
mix_coeff=0.5,
granularity=quant_config.QuantGranularity.PER_CHANNEL)
hparams = training_hparams_generator_lib.create_base_transformer_hparams(
mlp_weight_prec=8,
embedding_weight_prec=None,
attention_weight_prec=8,
mlp_pos_inputs_prec=8,
mlp_pos_inputs_hyper=inputs_hyper,
mlp_signed_inputs_prec=8,
mlp_signed_inputs_hyper=inputs_hyper,
attention_kqv_inputs_prec=8,
attention_kqv_inputs_hyper=inputs_hyper,
attention_out_inputs_prec=8,
attention_out_inputs_hyper=inputs_hyper,
logits_inputs_prec=8,
logits_inputs_hyper=inputs_hyper,
logits_via_embeddings=True,
attention_act_q_inputs_prec=8,
attention_act_q_inputs_hyper=inputs_hyper,
attention_act_k_inputs_prec=8,
attention_act_k_inputs_hyper=inputs_hyper,
attention_act_probs_inputs_prec=8,
attention_act_v_inputs_prec=8,
attention_act_v_inputs_hyper=inputs_hyper,
num_layers=2,
emb_dim=5,
num_heads=2,
qkv_dim=4,
mlp_dim=4,
quant_type=QuantType.FAKE_QUANT)
self.assertIsNone(hparams.decoder.logits)
# Now set logits_via_embedding in the model hparams to False.
hparams.logits_via_embedding = False
module = models.Transformer(
hparams=hparams,
quant_context=quant_config.QuantContext(
update_bounds=True, collect_acts_stats=True),
vocab_size=3,
output_vocab_size=3,
max_len=10,
use_bfloat16=False,
train=False,
dropout_rate=.1,
attention_dropout_rate=.1,
should_decode=False)
key = jax.random.PRNGKey(0)
# Mark the first token of the target and last token of the inputs as padding
# tokens.
targets = onp.array([[0, 2]])
inputs = onp.array([[1, 0]])
# Because the model is not sharing logits with embeddings, but the logits
# hparams are missing, it should raise an error.
with self.assertRaises(ValueError):
module.init(key, inputs=inputs, targets=targets)
if __name__ == '__main__':
absltest.main()
| 38.766565 | 80 | 0.690797 | 5,039 | 38,030 | 4.739432 | 0.065886 | 0.10175 | 0.062725 | 0.071267 | 0.861109 | 0.839126 | 0.815384 | 0.790302 | 0.771795 | 0.760447 | 0 | 0.014401 | 0.24223 | 38,030 | 980 | 81 | 38.806122 | 0.814317 | 0.080358 | 0 | 0.777906 | 0 | 0 | 0.053079 | 0.03446 | 0 | 0 | 0 | 0 | 0.019563 | 1 | 0.01496 | false | 0 | 0.017261 | 0 | 0.037975 | 0.004603 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1ccafcd4ed1307062a8d86b4c48723421357c4fe | 231 | py | Python | data_service/api/observability_api.py | statisticsnorway/microdata-data-service | d477b7b75589d4c977771122558c948c040a1106 | [
"Apache-2.0"
] | null | null | null | data_service/api/observability_api.py | statisticsnorway/microdata-data-service | d477b7b75589d4c977771122558c948c040a1106 | [
"Apache-2.0"
] | 7 | 2021-10-08T13:40:33.000Z | 2022-02-04T10:37:55.000Z | data_service/api/observability_api.py | statisticsnorway/microdata-data-service | d477b7b75589d4c977771122558c948c040a1106 | [
"Apache-2.0"
] | null | null | null | from fastapi import APIRouter
observability_router = APIRouter()
@observability_router.get('/health/alive')
def alive():
return "I'm alive!"
@observability_router.get('/health/ready')
def ready():
return "I'm ready!"
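The `@observability_router.get(...)` decorators above register a handler function against an HTTP method and path. A rough stdlib-only sketch of that decorator-registration pattern (a hypothetical `Router` class for illustration — FastAPI's real `APIRouter` is far more elaborate):

```python
# Hypothetical, simplified router illustrating the decorator-registration
# pattern used by APIRouter above; not FastAPI's actual implementation.
class Router:
    def __init__(self):
        self.routes = {}

    def get(self, path):
        def register(handler):
            # Record the handler under (method, path), return it unchanged.
            self.routes[('GET', path)] = handler
            return handler
        return register

router = Router()

@router.get('/health/alive')
def alive():
    return "I'm alive!"

assert router.routes[('GET', '/health/alive')]() == "I'm alive!"
```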
| 15.4 | 42 | 0.714286 | 29 | 231 | 5.586207 | 0.482759 | 0.351852 | 0.345679 | 0.345679 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 231 | 14 | 43 | 16.5 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.199134 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0.25 | 0.625 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
98ca84d23474e1cdbe64542b8383df8c8c79507e | 40 | py | Python | yama_util/temp.py | yamato-kaeng/yama_util | 2fa31edf7ba501fc2e01f58511c185c8850d58f5 | [
"MIT"
] | null | null | null | yama_util/temp.py | yamato-kaeng/yama_util | 2fa31edf7ba501fc2e01f58511c185c8850d58f5 | [
"MIT"
] | null | null | null | yama_util/temp.py | yamato-kaeng/yama_util | 2fa31edf7ba501fc2e01f58511c185c8850d58f5 | [
"MIT"
] | null | null | null | def yama():
return 'Hello Yama Util' | 20 | 28 | 0.65 | 6 | 40 | 4.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225 | 40 | 2 | 28 | 20 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0.365854 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
98f5a610eed41584f28aa50bed247b9b12843c48 | 26,799 | py | Python | test/test_unit_api.py | cyberark/conjur-api-python3 | b4529e2921e96d7442032bff206cec052bbda569 | [
"Apache-2.0"
] | 16 | 2019-05-17T15:34:59.000Z | 2021-11-08T10:30:21.000Z | test/test_unit_api.py | cyberark/conjur-api-python3 | b4529e2921e96d7442032bff206cec052bbda569 | [
"Apache-2.0"
] | 301 | 2019-05-07T18:27:10.000Z | 2022-01-26T13:03:49.000Z | test/test_unit_api.py | cyberark/conjur-api-python3 | b4529e2921e96d7442032bff206cec052bbda569 | [
"Apache-2.0"
] | 10 | 2019-07-30T17:00:13.000Z | 2022-01-20T17:00:34.000Z | import json
import unittest
from datetime import datetime
from unittest.mock import patch, MagicMock
from urllib import parse
from conjur.data_object.create_token_data import CreateTokenData
from conjur.errors import MissingRequiredParameterException
from conjur.wrapper.http_wrapper import HttpVerb
from conjur.api.endpoints import ConjurEndpoint
from conjur.api import Api
from conjur.resource import Resource
MOCK_RESOURCE_LIST = [
{
'a': 'a value',
'b': 'b value',
'id': 'first:id',
'c': 'c value',
},
{
'x': 'x value',
'y': 'y value',
'id': 'second:id',
'z': 'z value',
},
]
MOCK_BATCH_GET_RESPONSE = '{"myaccount:variable:foo": "a", "myaccount:variable:bar": "b"}'
MOCK_POLICY_CHANGE_OBJECT = {
"created_roles": {
"myorg:user:alice": {
"id": "myorg:user:alice",
"api_key": "apikey1"
},
"myorg:user:bob": {
"id": "myorg:user:bob",
"api_key": "apikey2"
}
},
"version": 4
}
MOCK_HOSTFACTORY_OBJECT = CreateTokenData(host_factory="some_host_factory", cidr="1.2.3.4,0.0.0.0", days=1)
MOCK_HOSTFACTORY_WITHOUT_CIDR_OBJECT = CreateTokenData(host_factory="some_host_factory", days=1)
class ApiTest(unittest.TestCase):
class MockClientResponse():
def __init__(self, text='myretval', content='mycontent'):
setattr(self, 'content', content.encode('utf-8'))
setattr(self, 'text', text)
POLICY_FILE = './test/test_config/policies/variables.yml'
def verify_http_call(self, http_client, method, endpoint, *args,
ssl_verify=None, api_token='apitoken', auth=None, query=None,
account='default', headers={}, **kwargs):
params = {
'url': 'http://localhost',
'account': account,
}
for name, value in kwargs.items():
params[name] = value
extra_args = {}
for extra_arg_name in ['api_token', 'auth', 'query', 'headers']:
if locals()[extra_arg_name]:
extra_args[extra_arg_name] = locals()[extra_arg_name]
http_client.assert_called_once_with(method, endpoint, params, *args,
**extra_args,
ssl_verify=ssl_verify)
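`verify_http_call` builds `extra_args` by looking each optional parameter up in `locals()` and forwarding only the truthy ones. A stand-alone sketch of that filtering idiom (parameter names mirror the helper, but this is illustrative code, not part of the test suite):

```python
# Sketch of the "forward only the optional kwargs that were provided"
# idiom used in verify_http_call above.
def collect_extra_args(api_token=None, auth=None, query=None, headers=None):
    provided = locals()  # snapshot of the parameters at call time
    return {name: provided[name]
            for name in ('api_token', 'auth', 'query', 'headers')
            if provided[name]}

assert collect_extra_args(api_token='t', query={'limit': 1}) == \
    {'api_token': 't', 'query': {'limit': 1}}
assert collect_extra_args() == {}
```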
def test_new_client_throws_error_when_no_url(self):
with self.assertRaises(Exception):
Api(login_id='mylogin', api_key='apikey', ssl_verify=False)
# Hostfactory - create token
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_host_factory_create_token_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.create_token(MOCK_HOSTFACTORY_OBJECT)
MOCK_EXPECTED_HOSTFACTORY_PARAM = parse.urlencode(MOCK_HOSTFACTORY_OBJECT.to_dict(),
doseq=True)
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.HOST_FACTORY_TOKENS,
MOCK_EXPECTED_HOSTFACTORY_PARAM,
query={},
api_token='apitoken',
headers={'Content-Type': 'application/x-www-form-urlencoded'},
ssl_verify=True)
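The expected request body here comes from `parse.urlencode(..., doseq=True)`, which expands a sequence-valued field (such as a CIDR list) into repeated query parameters. A stand-alone example with stand-in values (assuming the comma-separated CIDR string has been split into a list, as the `doseq` flag suggests):

```python
from urllib import parse

# doseq=True turns a list value into repeated key=value pairs,
# which is how a multi-entry cidr field gets serialized.
form = {'host_factory': 'some_host_factory',
        'cidr': ['1.2.3.4', '0.0.0.0'],
        'days': 1}
encoded = parse.urlencode(form, doseq=True)
assert encoded == 'host_factory=some_host_factory&cidr=1.2.3.4&cidr=0.0.0.0&days=1'
```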
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_host_factory_create_token_with_no_cidr_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.create_token(MOCK_HOSTFACTORY_WITHOUT_CIDR_OBJECT)
MOCK_EXPECTED_HOSTFACTORY_PARAM = parse.urlencode(MOCK_HOSTFACTORY_WITHOUT_CIDR_OBJECT.to_dict(),
doseq=True)
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.HOST_FACTORY_TOKENS,
MOCK_EXPECTED_HOSTFACTORY_PARAM,
query={},
api_token='apitoken',
headers={'Content-Type': 'application/x-www-form-urlencoded'},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_new_client_delegates_ssl_verify_flag(self, mock_http_client):
Api(url='http://localhost', ssl_verify=True).login('myuser', 'mypass')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.LOGIN,
auth=('myuser', 'mypass'),
api_token=False,
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_new_client_overrides_ssl_verify_flag_with_ca_bundle_if_provided(self, mock_http_client):
Api(url='http://localhost', ssl_verify=True,
ca_bundle='cabundle').login('myuser', 'mypass')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.LOGIN,
auth=('myuser', 'mypass'),
api_token=False,
ssl_verify='cabundle')
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_login_invokes_http_client_correctly(self, mock_http_client):
Api(url='http://localhost').login('myuser', 'mypass')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.LOGIN,
auth=('myuser', 'mypass'),
api_token=False,
ssl_verify=True)
def test_login_throws_error_when_username_not_provided(self):
with self.assertRaises(MissingRequiredParameterException):
Api(url='http://localhost').login(None, 'mypass')
def test_login_throws_error_when_password_not_provided(self):
with self.assertRaises(MissingRequiredParameterException):
Api(url='http://localhost').login('myuser', None)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_login_saves_login_id(self, _):
api = Api(url='http://localhost')
api.login('myuser', 'mypass')
self.assertEqual(api.login_id, 'myuser')
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_if_api_token_is_missing_fetch_a_new_one(self, mock_http_client):
api = Api(url='http://localhost')
api.authenticate = MagicMock(return_value='mytoken')
self.assertEqual(api.api_token, 'mytoken')
api.authenticate.assert_called_once_with()
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_if_account_is_empty_throw_an_error(self, mock_http_client):
empty_values = [None, ""]
for empty_value in empty_values:
with self.subTest(account=empty_value):
with self.assertRaises(MissingRequiredParameterException):
api = Api(url='http://localhost', account=empty_value)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_if_api_token_is_not_expired_dont_fetch_new_one(self, mock_http_client):
api = Api(url='http://localhost')
api.authenticate = MagicMock(return_value='mytoken')
token = api.api_token
api.authenticate = MagicMock(return_value='newtoken')
self.assertEqual(api.api_token, 'mytoken')
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_if_api_token_is_expired_fetch_new_one(self, mock_http_client):
api = Api(url='http://localhost')
api.authenticate = MagicMock(return_value='mytoken')
api.api_token  # trigger the initial token fetch
api.api_token_expiration = datetime.now()
api.authenticate = MagicMock(return_value='newtoken')
self.assertEqual(api.api_token, 'newtoken')
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_authenticate_invokes_http_client_correctly(self, mock_http_client):
Api(url='http://localhost', login_id='mylogin', api_key='apikey').authenticate()
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.AUTHENTICATE,
'apikey',
login='mylogin',
api_token=False,
ssl_verify=True)
def test_authenticate_throws_error_without_login_id_specified(self):
with self.assertRaises(MissingRequiredParameterException):
Api(url='http://localhost', api_key='apikey').authenticate()
def test_authenticate_throws_error_without_api_key_specified(self):
with self.assertRaises(MissingRequiredParameterException):
Api(url='http://localhost', login_id='mylogin').authenticate()
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_account_info_is_passed_down_to_http_call(self, mock_http_client):
Api(url='http://localhost',
account='myacct',
login_id='mylogin',
api_key='apikey').authenticate()
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.AUTHENTICATE,
'apikey',
headers={},
login='mylogin',
account='myacct',
api_token=False,
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_authenticate_passes_down_ssl_verify_param(self, mock_http_client):
Api(url='http://localhost', login_id='mylogin', api_key='apikey',
ssl_verify='verify').authenticate()
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.AUTHENTICATE,
'apikey',
api_token=False,
login='mylogin',
ssl_verify='verify')
# Get variable
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_get_variable_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.get_variable('myvar')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.SECRETS,
kind='variable',
identifier='myvar',
query={},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_get_variable_with_version_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.get_variable('myvar', '1')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.SECRETS,
kind='variable',
identifier='myvar',
query={'version': '1'},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_get_variable_passes_down_ssl_verify_param(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey',
ssl_verify='verify')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.get_variable('myvar')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.SECRETS,
kind='variable',
identifier='myvar',
query={},
ssl_verify='verify')
# Set variable
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_set_variable_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.set_variable('myvar', 'myvalue')
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.SECRETS,
'myvalue',
kind='variable',
identifier='myvar',
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse())
def test_set_variable_passes_down_ssl_verify_param(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey',
ssl_verify='verify')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.set_variable('myvar', 'myvalue')
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.SECRETS,
'myvalue',
kind='variable',
identifier='myvar',
ssl_verify='verify')
# Policy load
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(text='{}'))
def test_load_policy_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.load_policy_file('mypolicyname', self.POLICY_FILE)
policy_data = None
with open(self.POLICY_FILE, 'r') as content_file:
policy_data = content_file.read()
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.POLICIES,
policy_data,
identifier='mypolicyname',
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint',
return_value=MockClientResponse(text=json.dumps(MOCK_POLICY_CHANGE_OBJECT)))
def test_load_policy_converts_returned_data_to_expected_objects(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
output = api.load_policy_file('mypolicyname', self.POLICY_FILE)
self.assertEqual(output, MOCK_POLICY_CHANGE_OBJECT)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(text='{}'))
def test_load_policy_passes_down_ssl_verify_parameter(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey', ssl_verify='ssl_verify')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.load_policy_file('mypolicyname', self.POLICY_FILE)
policy_data = None
with open(self.POLICY_FILE, 'r') as content_file:
policy_data = content_file.read()
self.verify_http_call(mock_http_client, HttpVerb.POST, ConjurEndpoint.POLICIES,
policy_data,
identifier='mypolicyname',
ssl_verify='ssl_verify')
# Policy replace
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(text='{}'))
def test_replace_policy_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.replace_policy_file('mypolicyname', self.POLICY_FILE)
policy_data = None
with open(self.POLICY_FILE, 'r') as content_file:
policy_data = content_file.read()
self.verify_http_call(mock_http_client, HttpVerb.PUT, ConjurEndpoint.POLICIES,
policy_data,
identifier='mypolicyname',
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint',
return_value=MockClientResponse(text=json.dumps(MOCK_POLICY_CHANGE_OBJECT)))
def test_replace_policy_converts_returned_data_to_expected_objects(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
output = api.replace_policy_file('mypolicyname', self.POLICY_FILE)
self.assertEqual(output, MOCK_POLICY_CHANGE_OBJECT)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(text='{}'))
def test_replace_policy_passes_down_ssl_verify_parameter(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey', ssl_verify='ssl_verify')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.replace_policy_file('mypolicyname', self.POLICY_FILE)
policy_data = None
with open(self.POLICY_FILE, 'r') as content_file:
policy_data = content_file.read()
self.verify_http_call(mock_http_client, HttpVerb.PUT, ConjurEndpoint.POLICIES,
policy_data,
identifier='mypolicyname',
ssl_verify='ssl_verify')
# Policy update
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(text='{}'))
def test_update_policy_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.update_policy_file('mypolicyname', self.POLICY_FILE)
policy_data = None
with open(self.POLICY_FILE, 'r') as content_file:
policy_data = content_file.read()
self.verify_http_call(mock_http_client, HttpVerb.PATCH, ConjurEndpoint.POLICIES,
policy_data,
identifier='mypolicyname',
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint',
return_value=MockClientResponse(text=json.dumps(MOCK_POLICY_CHANGE_OBJECT)))
def test_update_policy_converts_returned_data_to_expected_objects(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
output = api.update_policy_file('mypolicyname', self.POLICY_FILE)
self.assertEqual(output, MOCK_POLICY_CHANGE_OBJECT)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(text='{}'))
def test_update_policy_passes_down_ssl_verify_parameter(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey', ssl_verify='ssl_verify')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.update_policy_file('mypolicyname', self.POLICY_FILE)
policy_data = None
with open(self.POLICY_FILE, 'r') as content_file:
policy_data = content_file.read()
self.verify_http_call(mock_http_client, HttpVerb.PATCH, ConjurEndpoint.POLICIES,
policy_data,
identifier='mypolicyname',
ssl_verify='ssl_verify')
# Get variables
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(content='{"foo": "a", "bar": "b"}'))
def test_get_variables_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.get_variables('myvar', 'myvar2')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.BATCH_SECRETS,
query={
'variable_ids': 'default:variable:myvar,default:variable:myvar2'
},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(content=MOCK_BATCH_GET_RESPONSE))
def test_get_variables_converts_returned_data_to_expected_objects(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey', account='myaccount')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
output = api.get_variables('myvar', 'myvar2')
self.assertEqual(output,
{
'foo': 'a',
'bar': 'b',
})
@patch('conjur.api.api.invoke_endpoint', return_value=MockClientResponse(content='{"foo": "a", "bar": "b"}'))
def test_get_variables_passes_down_ssl_verify_parameter(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey', ssl_verify='sslverify')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.get_variables('myvar', 'myvar2')
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.BATCH_SECRETS,
query={
'variable_ids': 'default:variable:myvar,default:variable:myvar2'
},
ssl_verify='sslverify')
# List resources
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps(MOCK_RESOURCE_LIST)))
def test_get_resources_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.resources_list()
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.RESOURCES,
query={},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps(MOCK_RESOURCE_LIST)))
def test_get_resources_with_constraints_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.resources_list({'limit': 1})
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.RESOURCES,
query={'limit': 1},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps({})))
def test_whoami_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.whoami()
self.verify_http_call(mock_http_client, HttpVerb.GET, ConjurEndpoint.WHOAMI,
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps({})))
def test_rotate_personal_api_key_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.rotate_personal_api_key("mylogin", "somepass")
self.verify_http_call(mock_http_client, HttpVerb.PUT, ConjurEndpoint.ROTATE_API_KEY,
auth=('mylogin', 'somepass'),
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps({})))
def test_rotate_other_api_key_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.rotate_other_api_key(Resource(type_='user', name="somename"))
self.verify_http_call(mock_http_client, HttpVerb.PUT, ConjurEndpoint.ROTATE_API_KEY,
query={'role': 'user:somename'},
ssl_verify=True)
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps({})))
def test_rotate_other_api_key_can_raise_exception_when_resource_is_invalid(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
with self.assertRaises(Exception):
api.rotate_other_api_key(Resource(type_='someinvalidresource', name="somename"))
@patch('conjur.api.api.invoke_endpoint', \
return_value=MockClientResponse(content=json.dumps({})))
def test_change_password_invokes_http_client_correctly(self, mock_http_client):
api = Api(url='http://localhost', login_id='mylogin', api_key='apikey')
def mock_auth():
return 'apitoken'
api.authenticate = mock_auth
api.change_personal_password("someloggedinuser", "somecurrentpass", "somenewpass")
self.verify_http_call(mock_http_client, HttpVerb.PUT, ConjurEndpoint.CHANGE_PASSWORD,
"somenewpass",
auth=('someloggedinuser', 'somecurrentpass'),
ssl_verify=True)
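Nearly every test above follows one stubbing pattern: replace `api.authenticate` with a `MagicMock(return_value=...)` so that lazy properties like `api_token` can be exercised without a network call, then assert on the recorded calls. A minimal stand-alone version of that pattern (the `Api` class here is a hypothetical stand-in, not the conjur one):

```python
from unittest.mock import MagicMock

# Hypothetical stand-in for the conjur Api class, to show the
# "stub authenticate, probe the lazy property" pattern used above.
class Api:
    def authenticate(self):
        raise RuntimeError("would hit the network in production")

    @property
    def api_token(self):
        # Lazily fetches a token by delegating to authenticate().
        return self.authenticate()

api = Api()
api.authenticate = MagicMock(return_value='mytoken')  # instance-level stub

assert api.api_token == 'mytoken'
api.authenticate.assert_called_once_with()
```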
| 40.420814 | 113 | 0.620769 | 2,900 | 26,799 | 5.407931 | 0.077586 | 0.052286 | 0.056239 | 0.049672 | 0.843142 | 0.828987 | 0.815405 | 0.805331 | 0.797105 | 0.789071 | 0 | 0.001177 | 0.27098 | 26,799 | 662 | 114 | 40.481873 | 0.801556 | 0.004552 | 0 | 0.647799 | 0 | 0 | 0.149093 | 0.050997 | 0 | 0 | 0 | 0 | 0.035639 | 1 | 0.146751 | false | 0.050314 | 0.023061 | 0.054507 | 0.230608 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
c73fc60cbe3ccb4af0755388cdda0f3f63b94c44 | 2,935 | py | Python | congress/migrations/0012_auto_20220120_1345.py | InsiderUnlocked/Backend | efd71d3eb874b9bfdbca89d0aa9e610338f9fa52 | [
"blessing"
] | 3 | 2022-01-22T06:53:52.000Z | 2022-02-13T10:16:29.000Z | congress/migrations/0012_auto_20220120_1345.py | InsiderUnlocked/Backend | efd71d3eb874b9bfdbca89d0aa9e610338f9fa52 | [
"blessing"
] | null | null | null | congress/migrations/0012_auto_20220120_1345.py | InsiderUnlocked/Backend | efd71d3eb874b9bfdbca89d0aa9e610338f9fa52 | [
"blessing"
] | null | null | null | # Generated by Django 3.2.6 on 2022-01-20 18:45
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('congress', '0011_auto_20220118_2357'),
]
operations = [
migrations.AlterModelOptions(
name='congresstrade',
options={},
),
migrations.AlterField(
model_name='congressperson',
name='purchases',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterField(
model_name='congressperson',
name='sales',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterField(
model_name='congressperson',
name='totalTransactions',
field=models.BigIntegerField(default=0),
),
migrations.AlterField(
model_name='congressperson',
name='totalVolumeTransactions',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterField(
model_name='summarystat',
name='purchases',
field=models.BigIntegerField(default=0),
),
migrations.AlterField(
model_name='summarystat',
name='sales',
field=models.BigIntegerField(default=0),
),
migrations.AlterField(
model_name='summarystat',
name='timeframe',
field=models.BigIntegerField(unique=True),
),
migrations.AlterField(
model_name='summarystat',
name='total',
field=models.BigIntegerField(default=0),
),
migrations.AlterField(
model_name='summarystat',
name='totalVolume',
field=models.BigIntegerField(default=0),
),
migrations.AlterField(
model_name='ticker',
name='marketcap',
field=models.BigIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='ticker',
name='purchases',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterField(
model_name='ticker',
name='sales',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterField(
model_name='ticker',
name='totalTransactions',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterField(
model_name='ticker',
name='totalVolumeTransactions',
field=models.BigIntegerField(blank=True, default=0, null=True),
),
migrations.AlterUniqueTogether(
name='congresstrade',
unique_together=set(),
),
]
| 31.902174 | 75 | 0.561499 | 239 | 2,935 | 6.820084 | 0.225941 | 0.171779 | 0.214724 | 0.24908 | 0.803067 | 0.752147 | 0.711043 | 0.672393 | 0.646012 | 0.568712 | 0 | 0.021717 | 0.325383 | 2,935 | 91 | 76 | 32.252747 | 0.801515 | 0.015332 | 0 | 0.8 | 1 | 0 | 0.122576 | 0.023892 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.011765 | 0 | 0.047059 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c791a8e86a0892c85eab1b3888bf572156b6f281 | 27,337 | py | Python | test/test_cluster_service.py | reportportal/service-auto-analyzer | 38508e012d0ceeb621c508ad09f16ad6b5a88602 | [
"Apache-2.0"
] | 8 | 2020-06-04T10:32:27.000Z | 2022-02-17T08:11:00.000Z | test/test_cluster_service.py | reportportal/service-auto-analyzer | 38508e012d0ceeb621c508ad09f16ad6b5a88602 | [
"Apache-2.0"
] | 9 | 2019-12-12T11:18:37.000Z | 2022-02-19T16:17:28.000Z | test/test_cluster_service.py | reportportal/service-auto-analyzer | 38508e012d0ceeb621c508ad09f16ad6b5a88602 | [
"Apache-2.0"
] | 12 | 2020-04-01T15:19:40.000Z | 2022-03-03T14:41:55.000Z | """
* Copyright 2019 EPAM Systems
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
"""
import unittest
from http import HTTPStatus
import sure # noqa
import httpretty
import json
from unittest.mock import MagicMock
import commons.launch_objects as launch_objects
from utils import utils
from service.cluster_service import ClusterService
from test.test_service import TestService
from freezegun import freeze_time
class TestClusterService(TestService):
@freeze_time("2021-10-18 17:00:00")
@utils.ignore_warnings
def test_find_clusters(self):
"""Test finding clusters"""
tests = [
{
"test_calls": [{"method": httpretty.GET,
"uri": "/1",
"status": HTTPStatus.OK,
}],
"query_logs_result": {},
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=1,
forUpdate=False,
numberOfLogLines=-1),
"expected_result": launch_objects.ClusterResult(
project=1,
launchId=1,
clusters=[])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/rp_2",
"status": HTTPStatus.OK,
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=False,
numberOfLogLines=-1),
"app_config": {
"esHost": "http://localhost:9200",
"esUser": "",
"esPassword": "",
"esVerifyCerts": False,
"esUseSsl": False,
"esSslShowWarn": False,
"turnOffSslVerification": True,
"esCAcert": "",
"esClientCert": "",
"esClientKey": "",
"appVersion": "",
"minioRegion": "",
"minioBucketPrefix": "",
"filesystemDefaultPath": "",
"esChunkNumber": 1000,
"binaryStoreType": "minio",
"minioHost": "",
"minioAccessKey": "",
"minioSecretKey": "",
"esProjectIndexPrefix": "rp_"
},
"query_logs_result": {},
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_not_for_update),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_second_group_not_for_update),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=False,
numberOfLogLines=-1),
"query_logs_result": utils.get_fixture(self.launch_w_items_clustering, to_json=True),
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId=5130555442447530,
clusterMessage="error occured \r\n error found \r\n error mined",
logIds=[4, 5]),
launch_objects.ClusterInfo(
clusterId=247493849502166,
clusterMessage="error occured \r\n error found \r\n assert query",
logIds=[9])
])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_2lines_not_for_update),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update_all_the_same),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=False,
numberOfLogLines=2),
"query_logs_result": utils.get_fixture(self.launch_w_items_clustering, to_json=True),
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId="5349085043832165",
clusterMessage="error occured \r\n error found",
logIds=[4, 5, 9])
])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_second_group),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=True,
numberOfLogLines=-1),
"query_logs_result": utils.get_fixture(self.launch_w_items_clustering, to_json=True),
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId="5130555442447530",
clusterMessage="error occured \r\n error found \r\n error mined",
logIds=[4, 5]),
launch_objects.ClusterInfo(
clusterId="247493849502166",
clusterMessage="error occured \r\n error found \r\n assert query",
logIds=[9]),
])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group),
"rs": utils.get_fixture(
self.one_hit_search_rs_clustering)
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_second_group),
"rs": utils.get_fixture(
self.one_hit_search_rs_clustering)
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update_es_update),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=True,
numberOfLogLines=-1),
"query_logs_result": utils.get_fixture(self.launch_w_items_clustering, to_json=True),
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId="123",
clusterMessage="error occured \n error found \n error mined",
logIds=[4, 5, 111]),
launch_objects.ClusterInfo(
clusterId="247493849502166",
clusterMessage="error occured \r\n error found \r\n assert query",
logIds=[9])
])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_2lines),
"rs": utils.get_fixture(
self.one_hit_search_rs_clustering),
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update_all_the_same_es_update),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=True,
numberOfLogLines=2),
"query_logs_result": utils.get_fixture(self.launch_w_items_clustering, to_json=True),
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId="123",
clusterMessage="error occured \n error found \n error mined",
logIds=[4, 5, 9, 111])
])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/rp_2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/rp_2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_2lines),
"rs": utils.get_fixture(
self.one_hit_search_rs_clustering),
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update_all_the_same_es_update_with_prefix),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=True,
numberOfLogLines=2),
"query_logs_result": utils.get_fixture(
self.launch_w_items_clustering_with_prefix, to_json=True),
"app_config": {
"esHost": "http://localhost:9200",
"esUser": "",
"esPassword": "",
"esVerifyCerts": False,
"esUseSsl": False,
"esSslShowWarn": False,
"turnOffSslVerification": True,
"esCAcert": "",
"esClientCert": "",
"esClientKey": "",
"appVersion": "",
"minioRegion": "",
"minioBucketPrefix": "",
"filesystemDefaultPath": "",
"esChunkNumber": 1000,
"binaryStoreType": "minio",
"minioHost": "",
"minioAccessKey": "",
"minioSecretKey": "",
"esProjectIndexPrefix": "rp_"
},
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId="123",
clusterMessage="error occured \n error found \n error mined",
logIds=[4, 5, 9, 111])
])
},
{
"test_calls": [{"method": httpretty.GET,
"uri": "/2",
"status": HTTPStatus.OK,
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_assertion_error),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_assertion_error_status_code),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.GET,
"uri": "/2/_search",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.search_logs_rq_first_group_no_such_element),
"rs": utils.get_fixture(
self.no_hits_search_rs),
},
{"method": httpretty.POST,
"uri": "/_bulk?refresh=true",
"status": HTTPStatus.OK,
"content_type": "application/json",
"rq": utils.get_fixture(
self.cluster_update_all_the_same_es_with_different_errors),
"rs": utils.get_fixture(
self.index_logs_rs),
}],
"launch_info": launch_objects.LaunchInfoForClustering(
launchId=1,
launchName="Launch name",
project=2,
forUpdate=False,
numberOfLogLines=2),
"query_logs_result": utils.get_fixture(
self.launch_w_items_clustering_with_different_errors, to_json=True),
"expected_result": launch_objects.ClusterResult(
project=2,
launchId=1,
clusters=[
launch_objects.ClusterInfo(
clusterId="6653850107754598",
clusterMessage="AssertionError error occured \r\n error found",
logIds=[4]),
launch_objects.ClusterInfo(
clusterId="3007109971644807",
clusterMessage="AssertionError status code: 500 error occured \r\n error found",
logIds=[5]),
launch_objects.ClusterInfo(
clusterId="5952168702333922",
clusterMessage="NoSuchElementException error occured \r\n error found",
logIds=[9]),
])
}
]
for idx, test in enumerate(tests):
with sure.ensure('Error in the test case number: {0}', idx):
self._start_server(test["test_calls"])
config = self.get_default_search_config()
app_config = self.app_config
if "app_config" in test:
app_config = test["app_config"]
_cluster_service = ClusterService(app_config=app_config,
search_cfg=config)
_cluster_service.es_client.es_client.scroll = MagicMock(return_value=json.loads(
utils.get_fixture(self.no_hits_search_rs)))
_cluster_service.query_logs = MagicMock(return_value=test["query_logs_result"])
response = _cluster_service.find_clusters(test["launch_info"])
response.clusters.should.have.length_of(len(test["expected_result"].clusters))
for i in range(len(response.clusters)):
test["expected_result"].clusters[i].should.equal(
response.clusters[i])
TestClusterService.shutdown_server(test["test_calls"])
if __name__ == '__main__':
unittest.main()
| 55.789796 | 108 | 0.345685 | 1,639 | 27,337 | 5.516168 | 0.14155 | 0.040703 | 0.076319 | 0.096671 | 0.794824 | 0.791063 | 0.782325 | 0.77414 | 0.76131 | 0.759429 | 0 | 0.025177 | 0.577203 | 27,337 | 489 | 109 | 55.903885 | 0.757051 | 0.021912 | 0 | 0.770563 | 0 | 0 | 0.123592 | 0.004041 | 0 | 0 | 0 | 0 | 0.015152 | 1 | 0.002165 | false | 0.004329 | 0.02381 | 0 | 0.028139 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c7acbf7d1789478fd53567db24853da678bd7bc7 | 7,339 | py | Python | manager/integration/tests/test_rwx.py | ttpcodes/longhorn-tests | 5932833fd01f4dbd6052270b9fc46777fa18252e | [
"Apache-2.0"
] | null | null | null | manager/integration/tests/test_rwx.py | ttpcodes/longhorn-tests | 5932833fd01f4dbd6052270b9fc46777fa18252e | [
"Apache-2.0"
] | null | null | null | manager/integration/tests/test_rwx.py | ttpcodes/longhorn-tests | 5932833fd01f4dbd6052270b9fc46777fa18252e | [
"Apache-2.0"
] | null | null | null | import pytest
'''
Fixture required for this module to clean up longhorn-nfs-provisioner related
resources.
'''
@pytest.mark.skip(reason="TODO")
def test_rwx_with_statefulset_multi_pods():
"""
Test writing of data in same volume from 2 pods
1. Create PodSecurityPolicy for longhorn-nfs-provisioner as sample yaml
https://raw.githubusercontent.com/longhorn/longhorn/master/examples
/rwx/01-security.yaml
2. Create serviceAccount, service, longhorn-nfs-provisioner deployment,
PVC and StorageClass as sample yaml https://raw.githubusercontent.com/
longhorn/longhorn/master/examples/rwx/02-longhorn-nfs-provisioner.yaml
3. Wait for longhorn-nfs-provisioner deployment to come up healthy.
4. Verify that the volume named after the PV attached to longhorn-nfs-provisioner
has `volume.ready == True`
5. Create a StatefulSet of 2 pods with VolumeClaimTemplate and
longhorn-nfs class in default namespace.
6. Wait for both pods to come up running.
7. Verify two folders get created with PV names in longhorn-nfs-provisioner
volume.
8. Write data in both pods and compute md5sum.
9. Compare md5sum of the data in longhorn-nfs-provisioner volume
"""
pass
@pytest.mark.skip(reason="TODO")
def test_rwx_multi_statefulset_with_same_pvc():
"""
Test writing of data with multiple pods using same PVC
1. Create PodSecurityPolicy for longhorn-nfs-provisioner as sample yaml
https://raw.githubusercontent.com/longhorn/longhorn/master/examples
/rwx/01-security.yaml
2. Create serviceAccount, service, longhorn-nfs-provisioner deployment,
PVC and StorageClass as sample yaml https://raw.githubusercontent.com/
longhorn/longhorn/master/examples/rwx/02-longhorn-nfs-provisioner.yaml
3. Wait for longhorn-nfs-provisioner deployment to come up healthy.
4. Verify that the volume named after the PV attached to longhorn-nfs-provisioner
has `volume.ready == True`
5. Create a StatefulSet of 1 pod with VolumeClaimTemplate and
longhorn-nfs class in default namespace.
6. Wait for StatefulSet to come up healthy.
7. Write data and compute md5sum.
8. Create another StatefulSet with the same PVC that was created with the
first StatefulSet.
9. Wait for the StatefulSet to come up healthy.
10. Check the data md5sum.
11. Write more data and compute md5sum
12. Check the data md5sum in longhorn-nfs-provisioner volume
"""
pass
@pytest.mark.skip(reason="TODO")
def test_rwx_parallel_writing():
"""
Test parallel writing of data
1. Create PodSecurityPolicy for longhorn-nfs-provisioner as sample yaml
https://raw.githubusercontent.com/longhorn/longhorn/master/examples
/rwx/01-security.yaml
2. Create serviceAccount, service, longhorn-nfs-provisioner deployment,
PVC and StorageClass as sample yaml https://raw.githubusercontent.com/
longhorn/longhorn/master/examples/rwx/02-longhorn-nfs-provisioner.yaml
3. Wait for longhorn-nfs-provisioner deployment to come up healthy.
4. Verify that the volume named after the PV attached to longhorn-nfs-provisioner
has `volume.ready == True`
5. Create a StatefulSet of 1 pod with VolumeClaimTemplate and
longhorn-nfs class in default namespace.
6. Wait for StatefulSet to come up healthy.
7. Create another StatefulSet with the same PVC that was created with the
first StatefulSet.
8. Wait for the StatefulSet to come up healthy.
9. Start writing 800 MB of data in the first StatefulSet `file 1` and start
writing 500 MB of data in the second StatefulSet `file 2`.
10. Compute md5sum.
11. Check the data md5sum in longhorn-nfs-provisioner volume
"""
pass
@pytest.mark.skip(reason="TODO")
def test_rwx_statefulset_scale_down_up():
"""
Test Scaling up and down of pods attached to longhorn-nfs-provisioner
volume.
1. Create PodSecurityPolicy for longhorn-nfs-provisioner as sample yaml
https://raw.githubusercontent.com/longhorn/longhorn/master/examples
/rwx/01-security.yaml
2. Create serviceAccount, service, longhorn-nfs-provisioner deployment,
PVC and StorageClass as sample yaml https://raw.githubusercontent.com/
longhorn/longhorn/master/examples/rwx/02-longhorn-nfs-provisioner.yaml
3. Wait for longhorn-nfs-provisioner deployment to come up healthy.
4. Verify that the volume named after the PV attached to longhorn-nfs-provisioner
has `volume.ready == True`
5. Create a StatefulSet of 2 pods with VolumeClaimTemplate and
longhorn-nfs class in default namespace.
6. Wait for StatefulSet pods to come up healthy.
7. Write data and compute md5sum in both pods
8. Delete pods.
9. Wait for pods to terminate.
10. Recreate the pods
11. Wait for new pods to come up.
12. Check the data md5sum in new pods
"""
pass
@pytest.mark.skip(reason="TODO")
def test_rwx_offline_node_longhorn_nfs_provisioner():
"""
Test moving the longhorn-nfs-provisioner pod from one node to another.
1. Create PodSecurityPolicy for longhorn-nfs-provisioner as sample yaml
https://raw.githubusercontent.com/longhorn/longhorn/master/examples
/rwx/01-security.yaml
2. Create serviceAccount, service, longhorn-nfs-provisioner deployment,
PVC and StorageClass as sample yaml https://raw.githubusercontent.com/
longhorn/longhorn/master/examples/rwx/02-longhorn-nfs-provisioner.yaml
Make sure a liveness probe is added to the deployment.
3. Wait for longhorn-nfs-provisioner deployment to come up healthy.
4. Verify that the volume named after the PV attached to longhorn-nfs-provisioner
has `volume.ready == True`
5. Create a StatefulSet of 1 pod with VolumeClaimTemplate and
longhorn-nfs class in default namespace.
6. Wait for StatefulSet to come up healthy.
7. Write data and compute md5sum.
8. Shutdown the node where longhorn-nfs-provisioner is running. The
liveness probe will restart the pod on another node.
9. Wait for a new pod to be created and volume getting attached.
10. Check the data md5sum in statefulSet
11. Write more data to it and compute md5sum.
12. Check the data md5sum in longhorn-nfs-provisioner volume
"""
pass
@pytest.mark.skip(reason="TODO")
def test_rwx_deployment_with_multi_pods():
"""
Test deployment of 2 pods with same PVC.
1. Create PodSecurityPolicy for longhorn-nfs-provisioner as sample yaml
https://raw.githubusercontent.com/longhorn/longhorn/master/examples
/rwx/01-security.yaml
2. Create serviceAccount, service, longhorn-nfs-provisioner deployment,
PVC and StorageClass as sample yaml https://raw.githubusercontent.com/
longhorn/longhorn/master/examples/rwx/02-longhorn-nfs-provisioner.yaml
Make sure a liveness probe is added to the deployment.
3. Wait for longhorn-nfs-provisioner deployment to come up healthy.
4. Verify that the volume named after the PV attached to longhorn-nfs-provisioner
has `volume.ready == True`
5. Create a deployment of 2 pods with PVC created with longhorn-nfs class
6. Wait for 2 pods to come up healthy.
7. Write data in both pods and compute md5sum.
8. Check the data md5sum in longhorn-nfs-provisioner volume
"""
pass
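# The test plans above repeatedly write data in pods and compare md5sums across
# the shared RWX volume. A minimal sketch of that comparison step, assuming the
# data written in one pod is read back from the shared mount; the helper name
# here is illustrative and not part of this suite:
import hashlib

def data_md5sum(data: bytes) -> str:
    """Return the hex md5 digest used to compare data written by pods."""
    return hashlib.md5(data).hexdigest()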
| 41 | 79 | 0.730345 | 1,037 | 7,339 | 5.139826 | 0.128255 | 0.096998 | 0.169231 | 0.036585 | 0.82364 | 0.802251 | 0.790994 | 0.778612 | 0.754597 | 0.747467 | 0 | 0.022495 | 0.200436 | 7,339 | 178 | 80 | 41.230337 | 0.885821 | 0.838125 | 0 | 0.631579 | 0 | 0 | 0.041885 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.315789 | true | 0.315789 | 0.052632 | 0 | 0.368421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 10 |
402a086f2a6049176465a7de176d140320d3b243 | 103 | py | Python | memorable_password/__init__.py | patarapolw/mnemopass | f53a2afa4104238e1770dfd4d85710bc00719302 | [
"Apache-2.0"
] | 10 | 2018-04-24T20:12:51.000Z | 2018-08-29T11:09:26.000Z | memorable_password/__init__.py | patarapolw/mnemopass | f53a2afa4104238e1770dfd4d85710bc00719302 | [
"Apache-2.0"
] | null | null | null | memorable_password/__init__.py | patarapolw/mnemopass | f53a2afa4104238e1770dfd4d85710bc00719302 | [
"Apache-2.0"
] | null | null | null | from .sentence import ToSentence
from .password import GeneratePassword
from .mnemonic import Mnemonic
| 25.75 | 38 | 0.854369 | 12 | 103 | 7.333333 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116505 | 103 | 3 | 39 | 34.333333 | 0.967033 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
403deebe9eebf2c01d84f42ca204fd9fca4fb202 | 20,859 | py | Python | tests/changes_including_zero_counts.py | UMNLibraries/pureapi | 4e711d4831578e5d9afef04654d2bbfe5cfa4524 | [
"MIT"
] | 11 | 2017-12-15T23:04:59.000Z | 2021-12-06T13:37:08.000Z | tests/changes_including_zero_counts.py | UMNLibraries/pureapi | 4e711d4831578e5d9afef04654d2bbfe5cfa4524 | [
"MIT"
] | 5 | 2018-09-26T20:11:47.000Z | 2020-08-28T18:13:08.000Z | tests/changes_including_zero_counts.py | UMNLibraries/pureapi | 4e711d4831578e5d9afef04654d2bbfe5cfa4524 | [
"MIT"
] | 1 | 2017-12-30T19:54:18.000Z | 2017-12-30T19:54:18.000Z | changes = {
# We give the first element a date string instead of a resumption token to
# remind ourselves that we first noticed these multiple consecutive responses
# with zero counts on this date, and to make the tests easier to understand.
# The actual resumption token was:
# eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1MjM2fQ==
'2020-03-12': '''
{
"count": 52,
"resumptionToken": "eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1MzcyfQ==",
"moreChanges": true,
"items": [
{
"uuid": "d09e017d-4e1d-403c-a703-6c3d11b039b4",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "d09e017d-4e1d-403c-a703-6c3d11b039b4",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "d09e017d-4e1d-403c-a703-6c3d11b039b4",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1daa02ae-f98a-46f5-ae1b-f73bc63439bb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 32
},
{
"uuid": "f4d569ea-d049-43cc-bd6e-fb3adcbb5801",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "f4d569ea-d049-43cc-bd6e-fb3adcbb5801",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "f4d569ea-d049-43cc-bd6e-fb3adcbb5801",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1c55fb9c-3dba-4f97-bbc0-94731d3f0ea3",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 31
},
{
"uuid": "3ee475f7-fc41-49d5-a405-91c70e6b5459",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 30
},
{
"uuid": "40497ba3-1757-4aff-8b95-57e477fef2eb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 32
},
{
"uuid": "74f22156-2308-4b22-bc1f-a009e43e5037",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 30
},
{
"uuid": "dded545c-c5ae-42d8-be58-f363e8151487",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 13
},
{
"uuid": "b2320e0a-3e89-40ab-9f5d-bd541859b50b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": -1
},
{
"uuid": "ecc0d398-f903-4ef5-8895-788894700376",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.person.Person",
"familySystemName": "Person",
"version": 22
},
{
"uuid": "68d4c12b-1caf-455e-927b-755613103f28",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.person.Person",
"familySystemName": "Person",
"version": 22
},
{
"uuid": "38a2eb02-282a-4cb5-b516-ec38d6aa542b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.person.Person",
"familySystemName": "Person",
"version": 8
},
{
"uuid": "44f55c93-a5fe-4497-b5b4-a5d1982b1133",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "44f55c93-a5fe-4497-b5b4-a5d1982b1133",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "44f55c93-a5fe-4497-b5b4-a5d1982b1133",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "a15c321b-48e3-429f-8ca4-2980bddf7c45",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "a15c321b-48e3-429f-8ca4-2980bddf7c45",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "a15c321b-48e3-429f-8ca4-2980bddf7c45",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "6f4b8e9a-0007-4702-99bb-53630dc2ad5c",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "6f4b8e9a-0007-4702-99bb-53630dc2ad5c",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "6f4b8e9a-0007-4702-99bb-53630dc2ad5c",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "4b02f9da-d9d7-466c-a2d5-6b0a8bfd771b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "4b02f9da-d9d7-466c-a2d5-6b0a8bfd771b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "4b02f9da-d9d7-466c-a2d5-6b0a8bfd771b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "ed7dc3aa-e015-4eff-a6e1-a5143bea23f2",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "ed7dc3aa-e015-4eff-a6e1-a5143bea23f2",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "ed7dc3aa-e015-4eff-a6e1-a5143bea23f2",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "855c32af-53a5-47df-a196-7f603bb5b0c5",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "855c32af-53a5-47df-a196-7f603bb5b0c5",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "855c32af-53a5-47df-a196-7f603bb5b0c5",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1daa02ae-f98a-46f5-ae1b-f73bc63439bb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1daa02ae-f98a-46f5-ae1b-f73bc63439bb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1daa02ae-f98a-46f5-ae1b-f73bc63439bb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1c55fb9c-3dba-4f97-bbc0-94731d3f0ea3",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1c55fb9c-3dba-4f97-bbc0-94731d3f0ea3",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "1c55fb9c-3dba-4f97-bbc0-94731d3f0ea3",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "3ee475f7-fc41-49d5-a405-91c70e6b5459",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "3ee475f7-fc41-49d5-a405-91c70e6b5459",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "3ee475f7-fc41-49d5-a405-91c70e6b5459",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "40497ba3-1757-4aff-8b95-57e477fef2eb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "40497ba3-1757-4aff-8b95-57e477fef2eb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "40497ba3-1757-4aff-8b95-57e477fef2eb",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "74f22156-2308-4b22-bc1f-a009e43e5037",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "74f22156-2308-4b22-bc1f-a009e43e5037",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "74f22156-2308-4b22-bc1f-a009e43e5037",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "dded545c-c5ae-42d8-be58-f363e8151487",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "dded545c-c5ae-42d8-be58-f363e8151487",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
},
{
"uuid": "dded545c-c5ae-42d8-be58-f363e8151487",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": -1
}
],
"navigationLinks": [
{
"ref": "next",
"href": "https://experts.umn.edu/ws/api/516/changes/eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1MzcyfQ=="
}
]
}
''',
'eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1MzcyfQ==': '''
{
"count": 0,
"resumptionToken": "eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NDcyfQ==",
"moreChanges": true,
"navigationLinks": [
{
"ref": "next",
"href": "https://experts.umn.edu/ws/api/516/changes/eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NDcyfQ=="
}
]
}
''',
'eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NDcyfQ==': '''
{
"count": 0,
"resumptionToken": "eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NTcyfQ==",
"moreChanges": true,
"navigationLinks": [
{
"ref": "next",
"href": "https://experts.umn.edu/ws/api/516/changes/eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NTcyfQ=="
}
]
}
''',
'eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NTcyfQ==': '''
{
"count": 0,
"resumptionToken": "eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NjcyfQ==",
"moreChanges": true,
"navigationLinks": [
{
"ref": "next",
"href": "https://experts.umn.edu/ws/api/516/changes/eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NjcyfQ=="
}
]
}
''',
'eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NjcyfQ==': '''
{
"count": 0,
"resumptionToken": "eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NzcyfQ==",
"moreChanges": true,
"navigationLinks": [
{
"ref": "next",
"href": "https://experts.umn.edu/ws/api/516/changes/eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NzcyfQ=="
}
]
}
''',
'eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NzcyfQ==': '''
{
"count": 24,
"resumptionToken": "eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NzcyfQ==",
"moreChanges": false,
"items": [
{
"uuid": "67073a6c-e84a-470f-9b45-d780cfe0d7cc",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 3
},
{
"uuid": "960c880e-4835-4850-9461-d0415c57abd4",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 7
},
{
"uuid": "35890d58-587a-4b59-a258-c7a70a1e49dd",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 2
},
{
"uuid": "06a05d12-513f-4aa3-bed6-5363cc1d22a1",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 52
},
{
"uuid": "c99cd3de-0318-4abc-95fb-53b2102c649b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 81
},
{
"uuid": "21780f24-8ebe-4dcf-88df-bf2167d1be35",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 33
},
{
"uuid": "da18acc0-63b2-4e5a-b5d3-6c966f3a296c",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 9
},
{
"uuid": "a7ebb1b3-8b4f-4675-bf06-5532bc5a6c76",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 2
},
{
"uuid": "069ca0b7-e86a-4828-a4f6-fccf8f91e0f9",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 3
},
{
"uuid": "e52284ee-3285-4df7-9e42-9f25ea8fb4c0",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 60
},
{
"uuid": "4b21473f-b4b1-48ca-9a18-b4b9c26e463b",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 3
},
{
"uuid": "b8557f3b-1152-4de0-9bdc-0a9df931eb42",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 78
},
{
"uuid": "b9e3e456-4e29-4a43-8abb-4b72f535caf0",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 96
},
{
"uuid": "e2765b38-f2b3-4659-a76f-b8fa28ac2ab8",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 71
},
{
"uuid": "c715365c-2cd7-4679-a353-3d9de34c2de9",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 4
},
{
"uuid": "d7c1176b-2494-4a4d-931a-29b7fa2a5092",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 2
},
{
"uuid": "1e84a6b6-6e78-462f-aeaa-1b8db5a379be",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 31
},
{
"uuid": "fb72962f-7fa5-4f59-8060-9a09ee4ea7f9",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 3
},
{
"uuid": "ff79fa08-5f48-4c09-9031-c3017e941b73",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 26
},
{
"uuid": "00a6cba7-f5ce-49cd-834a-320df1b2bb2d",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 120
},
{
"uuid": "b4e83ad7-2af9-4050-85be-ff8b04fd32f5",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 51
},
{
"uuid": "cf1899b7-4373-4a3f-883c-1a6492432878",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.contentimport.model.ImportResult",
"familySystemName": "ImportResult",
"version": 27
},
{
"uuid": "5c10e2f1-a533-4d8e-bd41-6b71020112b2",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 143,
"relationChanges": [
{
"uuid": "0d3ae5da-5317-48ec-b892-1d0b65351b3f",
"family": "dk.atira.pure.api.shared.model.keyword.KeywordGroupConfiguration",
"familySystemName": "KeywordGroupConfiguration",
"changeType": "ADDED"
}
]
},
{
"uuid": "ca2a2e83-ac8c-4193-9afa-812030a96dde",
"changeType": "UPDATE",
"family": "dk.atira.pure.api.shared.model.researchoutput.ResearchOutput",
"familySystemName": "ResearchOutput",
"version": 108,
"relationChanges": [
{
"uuid": "0d3ae5da-5317-48ec-b892-1d0b65351b3f",
"family": "dk.atira.pure.api.shared.model.keyword.KeywordGroupConfiguration",
"familySystemName": "KeywordGroupConfiguration",
"changeType": "ADDED"
}
]
}
],
"navigationLinks": [
{
"ref": "next",
"href": "https://experts.umn.edu/ws/api/516/changes/eyJzZXF1ZW5jZU51bWJlciI6MTk0MTM1NzcyfQ=="
}
]
}
''',
}
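The fixtures above mock a Pure `changes` feed in which each page carries a `resumptionToken` naming the next page and a `moreChanges` flag saying whether to keep polling. A minimal client loop over responses shaped like these might look as follows; the `fetch_page` callable is an assumption standing in for an HTTP GET against `/ws/api/516/changes/<token>`:

```python
import json

def collect_changes(fetch_page, start_token):
    """Follow resumption tokens until the API reports no more changes."""
    items = []
    token = start_token
    while True:
        page = json.loads(fetch_page(token))
        items.extend(page.get("items", []))
        if not page.get("moreChanges"):
            # Stop here: the final fixture page repeats its own token,
            # so looping on the token alone would never terminate.
            break
        token = page["resumptionToken"]
    return items

# Example with in-memory pages shaped like the fixtures above:
pages = {
    "t0": '{"count": 0, "resumptionToken": "t1", "moreChanges": true}',
    "t1": '{"count": 1, "moreChanges": false, "items": [{"uuid": "x"}]}',
}
print(collect_changes(pages.__getitem__, "t0"))  # [{'uuid': 'x'}]
```

Breaking on `moreChanges` rather than on a missing token matters because the last fixture page returns its own token with `moreChanges: false`.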
| 32.694357 | 99 | 0.61652 | 1,805 | 20,859 | 7.124654 | 0.177839 | 0.048523 | 0.078849 | 0.10311 | 0.843313 | 0.843313 | 0.843313 | 0.843313 | 0.843313 | 0.843313 | 0 | 0.108638 | 0.209646 | 20,859 | 637 | 100 | 32.745683 | 0.671418 | 0.014238 | 0 | 0.599684 | 0 | 0.009494 | 0.993433 | 0.402802 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.158228 | 0 | 0.158228 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4060e2a012434d13041e939259ee8a54d67e0261 | 2,229 | py | Python | MusicModes/AllIWantStrategy.py | Kaleb-Bickmore/chrismas-light-display | dcc1a20d242d4e77739afe2c8bd6c9a142ab88e2 | [
"MIT"
] | 1 | 2022-03-29T03:36:40.000Z | 2022-03-29T03:36:40.000Z | MusicModes/AllIWantStrategy.py | Kaleb-Bickmore/chrismas-light-display | dcc1a20d242d4e77739afe2c8bd6c9a142ab88e2 | [
"MIT"
] | null | null | null | MusicModes/AllIWantStrategy.py | Kaleb-Bickmore/chrismas-light-display | dcc1a20d242d4e77739afe2c8bd6c9a142ab88e2 | [
"MIT"
] | null | null | null | import time
from Lights.Pixels import Pixels
class AllIWantStrategy:
    def __init__(self):
        self._pixels = Pixels()

    def run(self, cycle_time, bps):
        t_end = time.time() + cycle_time
        self._pixels._pixels.fill((0, 0, 0))
        self._pixels._pixels.show()
        while t_end >= time.time():
            for _ in range(5):
                # Left pillar red for one interval.
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_group("left-pillar", (255, 0, 0))
                self._pixels._pixels.show()
                time.sleep(bps / 60)
                # Right pillar green.
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_group("right-pillar", (0, 255, 0))
                self._pixels._pixels.show()
                time.sleep(bps / 60)
                # All garage outlines blue.
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_group("top-main-garage", (0, 0, 255))
                self._pixels.fill_group("left-main-garage", (0, 0, 255))
                self._pixels.fill_group("right-main-garage", (0, 0, 255))
                self._pixels.fill_group("top-side-garage", (0, 0, 255))
                self._pixels.fill_group("left-side-garage", (0, 0, 255))
                self._pixels.fill_group("right-side-garage", (0, 0, 255))
                self._pixels._pixels.show()
                time.sleep(bps / 60)
                # Quick sweep: left, right, then top, each for a third of the interval.
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_left_side((255, 0, 0))
                self._pixels._pixels.show()
                time.sleep(bps / 180)
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_right_side((0, 255, 0))
                self._pixels._pixels.show()
                time.sleep(bps / 180)
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_top_side((0, 0, 255))
                self._pixels._pixels.show()
                time.sleep(bps / 180)
                # Finish with all three sides lit together.
                self._pixels._pixels.fill((0, 0, 0))
                self._pixels.fill_left_side((255, 0, 0))
                self._pixels.fill_right_side((0, 255, 0))
                self._pixels.fill_top_side((0, 0, 255))
                self._pixels._pixels.show()
                time.sleep(bps / 60)
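One way to exercise a strategy like this without LED hardware is to swap in a recording stand-in for `Pixels`. The sketch below is hypothetical — the `FakeStrip`/`FakePixels` names and counters are not part of this project — and only mirrors the surface the `run` loop touches:

```python
class FakeStrip:
    """Records fill/show calls instead of driving an LED buffer."""
    def __init__(self):
        self.fills = 0
        self.shows = 0

    def fill(self, color):
        self.fills += 1

    def show(self):
        self.shows += 1


class FakePixels:
    """Stand-in exposing the same surface the run() loop uses."""
    def __init__(self):
        self._pixels = FakeStrip()
        self.calls = []

    def fill_group(self, name, color):
        self.calls.append((name, color))

    def fill_left_side(self, color):
        self.calls.append(("left", color))

    def fill_right_side(self, color):
        self.calls.append(("right", color))

    def fill_top_side(self, color):
        self.calls.append(("top", color))


# Hypothetical wiring: replace the real strip before running a short cycle.
# strategy = AllIWantStrategy()
# strategy._pixels = FakePixels()
# strategy.run(cycle_time=0.05, bps=0.01)
fake = FakePixels()
fake.fill_group("left-pillar", (255, 0, 0))
fake._pixels.show()
print(fake.calls, fake._pixels.shows)  # [('left-pillar', (255, 0, 0))] 1
```

Asserting on `calls`, `fills`, and `shows` then verifies the flash sequence without sleeping through a real cycle.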
| 37.779661 | 70 | 0.515478 | 281 | 2,229 | 3.822064 | 0.131673 | 0.288641 | 0.253259 | 0.122905 | 0.83054 | 0.816574 | 0.807263 | 0.797952 | 0.797952 | 0.592179 | 0 | 0.076242 | 0.34096 | 2,229 | 58 | 71 | 38.431034 | 0.654867 | 0 | 0 | 0.617021 | 0 | 0 | 0.053387 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042553 | false | 0 | 0.06383 | 0 | 0.12766 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
408e76c37304f3afe76346cdf4127d53577614ba | 19,230 | py | Python | src/py/flwr_experimental/baseline/tf_hotkey/settings.py | vballoli/flower | e8c58c09a8fd4d29186b2f590b0cbb44bb022e9a | [
"Apache-2.0"
] | 2 | 2021-06-07T21:44:28.000Z | 2022-03-27T18:56:13.000Z | src/py/flwr_experimental/baseline/tf_hotkey/settings.py | vballoli/flower | e8c58c09a8fd4d29186b2f590b0cbb44bb022e9a | [
"Apache-2.0"
] | null | null | null | src/py/flwr_experimental/baseline/tf_hotkey/settings.py | vballoli/flower | e8c58c09a8fd4d29186b2f590b0cbb44bb022e9a | [
"Apache-2.0"
] | 1 | 2020-08-23T18:08:54.000Z | 2020-08-23T18:08:54.000Z | # Copyright 2020 Adap GmbH. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Provides a variaty of baseline settings for Spoken Keyword classification."""
from typing import List
from flwr_experimental.baseline.common import (
configure_client_instances,
sample_delay_factors,
sample_real_delay_factors,
)
from flwr_experimental.baseline.setting import ClientSetting, ServerSetting, Setting
from flwr_experimental.ops.cluster import Instance
ROUNDS = 50
MIN_NUM_CLIENTS = 45
SAMPLE_FRACTION = 0.2
MIN_SAMPLE_SIZE = 10
LR_INITIAL = 0.01
IID_FRACTION = 0.1
MAX_DELAY_FACTOR = 4.0 # Equals a 5x slowdown
FN_NUM_CLIENTS = 50
FN_ROUNDS = 50
FN_MIN_NUM_CLIENTS = 45
FN_LR_INITIAL = 0.001
FN_IID_FRACTION = 0.1
FN_MAX_DELAY_FACTOR = 4.0
FN_SAMPLE_FRACTION_25 = 0.5
FN_MIN_SAMPLE_SIZE_25 = 25
FN_SAMPLE_FRACTION_10 = 0.2
FN_MIN_SAMPLE_SIZE_10 = 10
FN_TRAINING_ROUND_TIMEOUT = 230
def get_setting(name: str) -> Setting:
    """Return appropriate setting."""
    if name not in SETTINGS:
        raise Exception(
            f"Setting {name} does not exist. Valid settings are: {list(SETTINGS.keys())}"
        )
    return SETTINGS[name]
def get_instance_name(
    instance_names: List[str], num_clients: int, client_index: int
) -> str:
    """Return the instance name responsible for the client at `client_index`."""
    idx = client_index // (num_clients // len(instance_names))
    return instance_names[min(idx, len(instance_names) - 1)]
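For illustration, the index arithmetic above can be sketched standalone (a simplified re-implementation for this example, not imported from the module): each instance hosts `num_clients // len(instance_names)` clients, and any remainder spills onto the last instance.

```python
from typing import List

def assign_instance(instance_names: List[str], num_clients: int, client_index: int) -> str:
    # num_clients // len(instance_names) clients per instance;
    # the remainder is clamped onto the last instance.
    idx = client_index // (num_clients // len(instance_names))
    return instance_names[min(idx, len(instance_names) - 1)]

names = ["c0", "c1", "c2"]
print([assign_instance(names, 7, i) for i in range(7)])
# ['c0', 'c0', 'c1', 'c1', 'c2', 'c2', 'c2']
```

With 7 clients over 3 instances, `7 // 3 == 2` clients land on each instance and the seventh is clamped onto `c2`.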
def configure_uniform_clients(
    iid_fraction: float, instance_names: List[str], num_clients: int, dry_run: bool,
) -> List[ClientSetting]:
    """Configure `num_clients`, all using the same delay factor."""
    clients = []
    for i in range(num_clients):
        client = ClientSetting(
            # Set instance on which to run
            instance_name=get_instance_name(instance_names, num_clients, i),
            # Individual
            cid=str(i),
            partition=i,
            delay_factor=0.0,
            # Shared
            iid_fraction=iid_fraction,
            num_clients=num_clients,
            dry_run=dry_run,
        )
        clients.append(client)
    return clients
# pylint: disable=too-many-arguments
def configure_clients(
    iid_fraction: float,
    instance_names: List[str],
    num_clients: int,
    dry_run: bool,
    delay_factor_fast: float,
    delay_factor_slow: float,
    sample_delays: bool = True,
    real_delays: bool = False,
) -> List[ClientSetting]:
    """Configure `num_clients` with different delay factors."""
    if sample_delays:
        # Configure clients with sampled delay factors
        if real_delays:
            delay_factors = sample_real_delay_factors(
                num_clients=num_clients, seed=2020
            )
        else:
            delay_factors = sample_delay_factors(
                num_clients=num_clients, max_delay=delay_factor_slow, seed=2020
            )
        return [
            ClientSetting(
                # Set instance on which to run
                instance_name=get_instance_name(instance_names, num_clients, i),
                # Individual
                cid=str(i),
                partition=i,
                delay_factor=delay_factors[i],
                # Shared
                iid_fraction=iid_fraction,
                num_clients=num_clients,
                dry_run=dry_run,
            )
            for i in range(num_clients)
        ]

    # Configure clients with fixed delay factors
    clients = []
    for i in range(num_clients):
        client = ClientSetting(
            # Set instance on which to run
            instance_name=get_instance_name(instance_names, num_clients, i),
            # Individual
            cid=str(i),
            partition=i,
            # Indices 0 to 49 fast, 50 to 99 slow
            delay_factor=delay_factor_fast
            if i < int(num_clients / 2)
            else delay_factor_slow,
            # Shared
            iid_fraction=iid_fraction,
            num_clients=num_clients,
            dry_run=dry_run,
        )
        clients.append(client)
    return clients
client_instances_50, client_names_50 = configure_client_instances(
num_clients=50, num_cpu=2, num_ram=8
)
client_instances_10, client_names_10 = configure_client_instances(
num_clients=10, num_cpu=2, num_ram=8
)
SETTINGS = {
###
### FedFS vs FedAvg
###
"fn-c25-r50-fedavg-230": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=FN_ROUNDS,
min_num_clients=FN_MIN_NUM_CLIENTS,
sample_fraction=FN_SAMPLE_FRACTION_25,
min_sample_size=FN_MIN_SAMPLE_SIZE_25,
training_round_timeout=FN_TRAINING_ROUND_TIMEOUT,
lr_initial=FN_LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
),
clients=configure_clients(
iid_fraction=FN_IID_FRACTION,
instance_names=client_names_50,
num_clients=FN_NUM_CLIENTS,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=FN_MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fn-c25-r50-fedfs-v0-230-230": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedfs-v0",
rounds=FN_ROUNDS,
min_num_clients=FN_MIN_NUM_CLIENTS,
sample_fraction=FN_SAMPLE_FRACTION_25,
min_sample_size=FN_MIN_SAMPLE_SIZE_25,
training_round_timeout=FN_TRAINING_ROUND_TIMEOUT,
lr_initial=FN_LR_INITIAL,
partial_updates=True,
importance_sampling=False,
dynamic_timeout=False,
training_round_timeout_short=FN_TRAINING_ROUND_TIMEOUT,
),
clients=configure_clients(
iid_fraction=FN_IID_FRACTION,
instance_names=client_names_50,
num_clients=FN_NUM_CLIENTS,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=FN_MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fn-c10-r50-fedavg-230": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=FN_ROUNDS,
min_num_clients=FN_MIN_NUM_CLIENTS,
sample_fraction=FN_SAMPLE_FRACTION_10,
min_sample_size=FN_MIN_SAMPLE_SIZE_10,
training_round_timeout=FN_TRAINING_ROUND_TIMEOUT,
lr_initial=FN_LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
),
clients=configure_clients(
iid_fraction=FN_IID_FRACTION,
instance_names=client_names_50,
num_clients=FN_NUM_CLIENTS,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=FN_MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fn-c10-r50-fedfs-v0-230-230": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedfs-v0",
rounds=FN_ROUNDS,
min_num_clients=FN_MIN_NUM_CLIENTS,
sample_fraction=FN_SAMPLE_FRACTION_10,
min_sample_size=FN_MIN_SAMPLE_SIZE_10,
training_round_timeout=FN_TRAINING_ROUND_TIMEOUT,
lr_initial=FN_LR_INITIAL,
partial_updates=True,
importance_sampling=False,
dynamic_timeout=False,
training_round_timeout_short=FN_TRAINING_ROUND_TIMEOUT,
),
clients=configure_clients(
iid_fraction=FN_IID_FRACTION,
instance_names=client_names_50,
num_clients=FN_NUM_CLIENTS,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=FN_MAX_DELAY_FACTOR,
real_delays=True,
),
),
###
###
###
"n2020-fedfs": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fast-and-slow",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=200,
lr_initial=LR_INITIAL,
partial_updates=True,
importance_sampling=True,
dynamic_timeout=True,
alternating_timeout=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=50,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"n2020-fedavg": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=200,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
alternating_timeout=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=50,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"dry-run": Setting(
instances=[
Instance(name="server", group="server", num_cpu=4, num_ram=16),
Instance(name="client", group="clients", num_cpu=4, num_ram=16),
],
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=1,
min_num_clients=1,
sample_fraction=1.0,
min_sample_size=1,
training_round_timeout=600,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
dry_run=True,
),
clients=configure_uniform_clients(
iid_fraction=IID_FRACTION,
instance_names=["client"],
num_clients=4,
dry_run=True,
),
),
"minimal": Setting(
instances=[Instance(name="server", group="server", num_cpu=4, num_ram=16)]
+ client_instances_10,
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=2,
min_num_clients=4,
sample_fraction=1.0,
min_sample_size=3,
training_round_timeout=3600,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_10,
num_clients=10,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fedavg-sync": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=None,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fedavg-async": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fedavg",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=20,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fast-and-slow-only-partial-updates": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fast-and-slow",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=20,
lr_initial=LR_INITIAL,
partial_updates=True,
importance_sampling=False,
dynamic_timeout=False,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fast-and-slow-only-dynamic-timeouts": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fast-and-slow",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=20,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=True,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fast-and-slow-only-importance-sampling": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fast-and-slow",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=20,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=True,
dynamic_timeout=False,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"fast-and-slow": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="fast-and-slow",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=60,
lr_initial=LR_INITIAL,
partial_updates=True,
importance_sampling=True,
dynamic_timeout=True,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
"qffedavg": Setting(
instances=[Instance(name="server", group="server", num_cpu=8, num_ram=32)]
+ client_instances_50,
server=ServerSetting(
instance_name="server",
strategy="qffedavg",
rounds=ROUNDS,
min_num_clients=MIN_NUM_CLIENTS,
sample_fraction=SAMPLE_FRACTION,
min_sample_size=MIN_SAMPLE_SIZE,
training_round_timeout=None,
lr_initial=LR_INITIAL,
partial_updates=False,
importance_sampling=False,
dynamic_timeout=False,
dry_run=False,
),
clients=configure_clients(
iid_fraction=IID_FRACTION,
instance_names=client_names_50,
num_clients=100,
dry_run=False,
delay_factor_fast=0.0,
delay_factor_slow=MAX_DELAY_FACTOR,
real_delays=True,
),
),
}
| 33.501742 | 89 | 0.599896 | 2,143 | 19,230 | 5.020999 | 0.098927 | 0.069703 | 0.037454 | 0.037639 | 0.812546 | 0.784572 | 0.767472 | 0.758086 | 0.74303 | 0.74303 | 0 | 0.02629 | 0.313625 | 19,230 | 573 | 90 | 33.560209 | 0.788923 | 0.063547 | 0 | 0.778431 | 0 | 0 | 0.043374 | 0.0126 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007843 | false | 0 | 0.039216 | 0 | 0.056863 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
409038c253040a2dcf08340b86965a856eb23a7e | 129 | py | Python | function/python/brightics/function/clustering/__init__.py | nohkwangsun/studio | b2dd7da1d73d83bef6c046d73fb85639d3006fc2 | [
"Apache-2.0"
] | null | null | null | function/python/brightics/function/clustering/__init__.py | nohkwangsun/studio | b2dd7da1d73d83bef6c046d73fb85639d3006fc2 | [
"Apache-2.0"
] | null | null | null | function/python/brightics/function/clustering/__init__.py | nohkwangsun/studio | b2dd7da1d73d83bef6c046d73fb85639d3006fc2 | [
"Apache-2.0"
] | null | null | null | from .kmeans import kmeans_train_predict
from .kmeans import kmeans_predict
from .kmeans import kmeans_silhouette_train_predict | 43 | 51 | 0.875969 | 18 | 129 | 5.944444 | 0.333333 | 0.280374 | 0.448598 | 0.616822 | 0.542056 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.100775 | 129 | 3 | 51 | 43 | 0.922414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
40c144d3090eaf13c05cd96304b872ecbf5b0b9b | 75,806 | py | Python | project/apps/birth_registration/migrations/0007_auto_20150827_0305.py | kostik/vrs | a347c2d901e1a6b60a85480c9d2b247157881fce | [
"BSD-3-Clause"
] | 1 | 2016-11-09T18:57:23.000Z | 2016-11-09T18:57:23.000Z | project/apps/birth_registration/migrations/0007_auto_20150827_0305.py | kostik/vrs | a347c2d901e1a6b60a85480c9d2b247157881fce | [
"BSD-3-Clause"
] | null | null | null | project/apps/birth_registration/migrations/0007_auto_20150827_0305.py | kostik/vrs | a347c2d901e1a6b60a85480c9d2b247157881fce | [
"BSD-3-Clause"
] | 4 | 2016-09-30T08:24:09.000Z | 2019-02-28T14:09:19.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
import birth_registration.fields
import django.core.validators
import birth_registration.validators
class Migration(migrations.Migration):
    dependencies = [
        ('birth_registration', '0006_auto_20150820_0524'),
    ]

    operations = [
migrations.RemoveField(
model_name='f201',
name='f101',
),
migrations.AlterField(
model_name='f101',
name='DIS',
field=birth_registration.fields.DistrictField(choices=[('01 - Kachin', [(1, '01 - Myitkyina'), (2, '02 - Bamaw'), (3, '03 - Puta O')]), ('02 - Kayh', [(4, '04 - Loikaw'), (5, '05 - Bawlakhei')]), ('03 - Kayin', [(6, '06 - Pha An'), (7, '07 - Myawaddy'), (8, '08 - Kawtkaraik')]), ('04 - Chin', [(9, '09 - Phalan'), (10, '10 - Mintatt')]), ('05 - Sagaing', [(11, '11 - Sagaing'), (12, '12 - Shwebo'), (13, '13 - Monywar'), (14, '14 - Kathar'), (15, '15 - Kalay'), (16, '16 - Tamu'), (17, '17 - Mawlaik'), (18, '18 - Khantee')]), ('06 - Tanintharyi', [(19, '19 - Dawei'), (20, '20 - Myeik'), (21, '21 - Kautthaung')]), ('07 - Bago', [(22, '22 - Bago'), (23, '23 - Pyay'), (24, '24 - Tharyarwaddy'), (25, '25 - Taungoo')]), ('08 - Magway', [(26, '26 - Magway'), (27, '27 - Minbu'), (28, '28 - Thayet'), (29, '29 - Pakokku'), (30, '30 - Gantgaw')]), ('09 - Mandalay', [(31, '31 - Mandalay'), (32, '32 - Pyin Oo Lwin'), (33, '33 - Kyaukse'), (34, '34 - Myingyan'), (35, '35 - Nyaung Oo'), (36, '36 - Yameithinn'), (37, '37 - Meikhtila')]), ('10 - Mon', [(38, '38 - Mawlamyaing'), (39, '39 - Thahton')]), ('11 - Rakhine', [(40, '40 - Sittwe'), (41, '41 - Maungdaw'), (42, '42 - Buthidaung'), (43, '43 - Kyaukphyu'), (44, '44 - Thandwe')]), ('12 - Yangon', [(45, '45 - Ahshaytbine (East)'), (46, '46 - Ahnoutpine (West)'), (47, '47 - Taungbine (South)'), (48, '48 - Myaukpine (North)')]), ('13 - Shan', [(49, '49 - Taungyi'), (50, '50 - Loilin'), (51, '51 - Lahsio'), (52, '52 - Muse'), (53, '53 - Kyaukmei'), (54, '54 - Kunloan'), (55, '55 - Laukkaing'), (56, '56 - Kyaington'), (57, '57 - Minesatt'), (58, '58 - Tachilaik'), (59, '59 - Minephyat')]), ('14 - Ayyarwaddy', [(60, '60 - Pathein'), (61, '61 - Hinthada'), (62, '62 - Myaungmya'), (63, '63 - Ma U Bin'), (64, '64 - Phyarpon')]), ('15 - NayPyiTaw', [(65, '65 - Zayarthiri'), (66, '66 - Dakhenathiri'), (67, '67 - Oktayathiri'), (68, '68 - Potebathiri'), (69, '69 - Zabuthiri'), (70, '70 - Tatkon'), (71, '71 - Pyinmana'), (72, '72 - Lewe')])], 
verbose_name='District', validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f101',
name='Father_NRC',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name="Father's NRC", blank=True),
),
migrations.AlterField(
model_name='f101',
name='Father_name',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name="Father's Name", blank=True),
),
migrations.AlterField(
model_name='f101',
name='Informer',
field=birth_registration.fields.Char300Field(max_length=300, null=True, verbose_name="Informer's name and address", blank=True),
),
migrations.AlterField(
model_name='f101',
name='Mother_NRC',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name="Mother's NRC", blank=True),
),
migrations.AlterField(
model_name='f101',
name='Mother_name',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name="Mother's Name"),
),
migrations.AlterField(
model_name='f101',
name='NCZN_F',
field=birth_registration.fields.CitizenshipField(blank=True, choices=[(1, '01 - Myanmar'), (2, '02 - Indian'), (3, '03 - Pakistani'), (4, '04 - Bangladesh'), (5, '05 - Nepalese'), (6, '06 - Chinese'), (7, '07 - European/American'), (8, '08 - Other Asian'), (9, '09 - Others'), (10, '10 - Not stated')], null=True, verbose_name="Father's Citizenship", validators=[django.core.validators.MaxValueValidator(10)]),
),
migrations.AlterField(
model_name='f101',
name='NCZN_M',
field=birth_registration.fields.CitizenshipField(blank=True, choices=[(1, '01 - Myanmar'), (2, '02 - Indian'), (3, '03 - Pakistani'), (4, '04 - Bangladesh'), (5, '05 - Nepalese'), (6, '06 - Chinese'), (7, '07 - European/American'), (8, '08 - Other Asian'), (9, '09 - Others'), (10, '10 - Not stated')], null=True, verbose_name="Mother's Citizenship", validators=[django.core.validators.MaxValueValidator(10)]),
),
migrations.AlterField(
model_name='f101',
name='NRACE_F',
field=birth_registration.fields.RaceField(blank=True, null=True, verbose_name="Father's Race", choices=[(1, '01 - Kachin'), (2, '02 - Kayah'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Bamar'), (6, '06 - Mon'), (7, '07 - Rakhine'), (8, '08 - Shan'), (9, '09 - Other indigenous Races'), (10, '10 - Myanmar/Foreigners'), (11, '11 - Chinese'), (12, '12 - Indian'), (13, '13 - Pakistani'), (14, '14 - Bangladesh'), (15, '15 - Nepal'), (16, '16 - Other Asian'), (17, '17 - Others'), (18, '18 - Not stated')]),
),
migrations.AlterField(
model_name='f101',
name='NRACE_M',
field=birth_registration.fields.RaceField(blank=True, null=True, verbose_name="Mother's Race", choices=[(1, '01 - Kachin'), (2, '02 - Kayah'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Bamar'), (6, '06 - Mon'), (7, '07 - Rakhine'), (8, '08 - Shan'), (9, '09 - Other indigenous Races'), (10, '10 - Myanmar/Foreigners'), (11, '11 - Chinese'), (12, '12 - Indian'), (13, '13 - Pakistani'), (14, '14 - Bangladesh'), (15, '15 - Nepal'), (16, '16 - Other Asian'), (17, '17 - Others'), (18, '18 - Not stated')]),
),
migrations.AlterField(
model_name='f101',
name='NR_AREA',
field=birth_registration.fields.AreaField(help_text='Urban/Rural', verbose_name='Area', choices=[(1, '01 - Urban'), (2, '02 - Rural')]),
),
migrations.AlterField(
model_name='f101',
name='NR_SNO',
field=birth_registration.fields.SerialNoField(help_text='Enter serial No. directly into the coding column. Watch carefully for the sequence of serial Nos.; if one is out of sequence, raise a query.<br/>Code the actual serial No. in full to 5 digits: 1 to 9 as 00001-00009, 10 to 99 as 00010, 00011, \u2026, 00099, and 00100 and over in full', verbose_name='Serial No. in Registration book', validators=[django.core.validators.MaxValueValidator(99999)]),
),
migrations.AlterField(
model_name='f101',
name='Name_of_child',
field=birth_registration.fields.Char100Field(help_text='(if any)', max_length=100, null=True, verbose_name='Name of child', blank=True),
),
migrations.AlterField(
model_name='f101',
name='Original_form',
field=birth_registration.fields.OriginalFormField(help_text='Please attach a scanned copy of a photograph on the original form', upload_to='F101/%Y/%m/%d/', null=True, verbose_name='Original Form Image', blank=True),
),
migrations.AlterField(
model_name='f101',
name='RCIR',
field=birth_registration.fields.Char300Field(max_length=300, null=True, verbose_name="Mother's Address", blank=True),
),
migrations.AlterField(
model_name='f101',
name='RDIS',
field=birth_registration.fields.DistrictField(blank=True, choices=[('01 - Kachin', [(1, '01 - Myitkyina'), (2, '02 - Bamaw'), (3, '03 - Puta O')]), ('02 - Kayh', [(4, '04 - Loikaw'), (5, '05 - Bawlakhei')]), ('03 - Kayin', [(6, '06 - Pha An'), (7, '07 - Myawaddy'), (8, '08 - Kawtkaraik')]), ('04 - Chin', [(9, '09 - Phalan'), (10, '10 - Mintatt')]), ('05 - Sagaing', [(11, '11 - Sagaing'), (12, '12 - Shwebo'), (13, '13 - Monywar'), (14, '14 - Kathar'), (15, '15 - Kalay'), (16, '16 - Tamu'), (17, '17 - Mawlaik'), (18, '18 - Khantee')]), ('06 - Tanintharyi', [(19, '19 - Dawei'), (20, '20 - Myeik'), (21, '21 - Kautthaung')]), ('07 - Bago', [(22, '22 - Bago'), (23, '23 - Pyay'), (24, '24 - Tharyarwaddy'), (25, '25 - Taungoo')]), ('08 - Magway', [(26, '26 - Magway'), (27, '27 - Minbu'), (28, '28 - Thayet'), (29, '29 - Pakokku'), (30, '30 - Gantgaw')]), ('09 - Mandalay', [(31, '31 - Mandalay'), (32, '32 - Pyin Oo Lwin'), (33, '33 - Kyaukse'), (34, '34 - Myingyan'), (35, '35 - Nyaung Oo'), (36, '36 - Yameithinn'), (37, '37 - Meikhtila')]), ('10 - Mon', [(38, '38 - Mawlamyaing'), (39, '39 - Thahton')]), ('11 - Rakhine', [(40, '40 - Sittwe'), (41, '41 - Maungdaw'), (42, '42 - Buthidaung'), (43, '43 - Kyaukphyu'), (44, '44 - Thandwe')]), ('12 - Yangon', [(45, '45 - Ahshaytbine (East)'), (46, '46 - Ahnoutpine (West)'), (47, '47 - Taungbine (South)'), (48, '48 - Myaukpine (North)')]), ('13 - Shan', [(49, '49 - Taungyi'), (50, '50 - Loilin'), (51, '51 - Lahsio'), (52, '52 - Muse'), (53, '53 - Kyaukmei'), (54, '54 - Kunloan'), (55, '55 - Laukkaing'), (56, '56 - Kyaington'), (57, '57 - Minesatt'), (58, '58 - Tachilaik'), (59, '59 - Minephyat')]), ('14 - Ayyarwaddy', [(60, '60 - Pathein'), (61, '61 - Hinthada'), (62, '62 - Myaungmya'), (63, '63 - Ma U Bin'), (64, '64 - Phyarpon')]), ('15 - NayPyiTaw', [(65, '65 - Zayarthiri'), (66, '66 - Dakhenathiri'), (67, '67 - Oktayathiri'), (68, '68 - Potebathiri'), (69, '69 - Zabuthiri'), (70, '70 - Tatkon'), (71, '71 - Pyinmana'), (72, '72 - Lewe')])], null=True, verbose_name="Mother's District", validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f101',
name='RST_DV',
field=birth_registration.fields.StateDivisionField(blank=True, choices=[(1, '01 - Kachin'), (2, '02 - Kayh'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Sagaing'), (6, '06 - Tanintharyi'), (7, '07 - Bago'), (8, '08 - Magway'), (9, '09 - Mandalay'), (10, '10 - Mon'), (11, '11 - Rakhine'), (12, '12 - Yangon'), (13, '13 - Shan'), (14, '14 - Ayyarwaddy'), (15, '15 - NayPyiTaw')], null=True, verbose_name="Mother's State/Division", validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f101',
name='RTWN',
field=birth_registration.fields.TownshipField(blank=True, null=True, verbose_name="Mother's Township", choices=[('01 - Myitkyina', [(1, '01 - Myitkyina'), (2, '02 - Waingmaw'), (3, '03 - Ingyan Yan'), (4, '04 - Moekaung'), (5, '05 - Moehnyin'), (6, '06 - Phakant'), (7, '07 - Karmine*'), (8, '08 - Ta Naing'), (9, '09 - Chibway'), (10, '10 - Sautlaw')]), ('02 - Bamaw', [(11, '01 - Bamaw'), (12, '02 - Shwegu'), (13, '03 - Moemauk'), (14, '04 - Mansi')]), ('03 - Puta O', [(15, '01 - Puta O'), (16, '02 - Swanprabum'), (17, '03 - Machanbaw'), (18, '04 - Khaunglanphoo'), (19, '05 - Naungmon')]), ('04 - Loikaw', [(20, '01 - Loikaw'), (21, '02 - Dimawsoe'), (22, '03 - Phrusoe'), (23, '04 - Shartaw')]), ('05 - Bawlakhei', [(24, '01 - Bawlakhei'), (25, '02 - Pharsaung'), (26, '03 - Meiseit')]), ('06 - Pha An', [(27, '01 - Pha An'), (28, '02 - Hlaingbweit'), (29, '03 - Pharpon'), (30, '04 - Thandaung'), (31, '05 - Thandaungkyee')]), ('07 - Myawaddy', [(32, '01 - Myawaddy')]), ('08 - Kawtkaraik', [(33, '01 - Kawtkaraik'), (34, '02 - Kyarinseikkyi'), (35, '03 - Phayathonesu*'), (36, '04 - Kyoandoe')]), ('09 - Phalan', [(37, '01 - Phalan'), (38, '02 - Hakha'), (39, '03 - Htantalan'), (40, '04 - Teetain'), (41, '05 - Tunzan')]), ('10 - Mintatt', [(42, '01 - Mintatt'), (43, '02 - Matupi'), (44, '03 - Kanpetlet'), (45, '04 - Paletwa')]), ('11 - Sagaing', [(46, '01 - Sagaing'), (47, '02 - Myinmu'), (48, '03 - Myaung')]), ('12 - Shwebo', [(49, '01 - Shwebo'), (50, '02 - Khin Oo'), (51, '03 - Wetlet'), (52, '04 - Kantbalu'), (53, '05 - Kyunhla'), (54, '06 - Yay Oo'), (55, '07 - Dipeiyin'), (56, '08 - Tantsei')]), ('13 - Monywar', [(57, '01 - Monywar'), (58, '02 - Butalin'), (59, '03 - Ahyartaw'), (60, '04 - Chaung Oo'), (61, '05 - Yinmarbin'), (62, '06 - Kani'), (63, '07 - Salingyee'), (64, '08 - Palei')]), ('14 - Kathar', [(65, '01 - Kathar'), (66, '02 - Indaw'), (67, '03 - Hteechaink'), (68, '04 - Bamauk'), (69, '05 - Kawlin'), (70, '06 - Wuntho'), (71, '07 - Pinleibu')]), ('15 - Kalay', [(72, '01 - Kalay'), (73, '02 - Kalaywa'), (74, '03 - Minkin')]), ('16 - Tamu', [(75, '01 - Tamu')]), ('17 - Mawlaik', [(76, '01 - Mawlaik'), (77, '02 - Phaungpyin')]), ('18 - Khantee', [(78, '01 - Khantee'), (79, '02 - Hoamalin'), (80, '03 - Layshee'), (81, '04 - Lahei'), (82, '05 - Nanyon')]), ('19 - Dawei', [(83, '01 - Dawei'), (84, '02 - Launglon'), (85, '03 - Thayetchaung'), (86, '04 - Yayphyu')]), ('20 - Myeik', [(87, '01 - Myeik'), (88, '02 - Kyunsu'), (89, '03 - Pulaw'), (90, '04 - Tanintharyi')]), ('21 - Kautthaung', [(91, '01 - Kautthaung'), (92, '02 - Boatpyin')]), ('22 - Bago', [(93, '01 - Bago'), (94, '02 - Thanatpin'), (95, '03 - Kawa'), (96, '04 - Waw'), (97, '05 - Nyaunglaybin'), (98, '06 - Madauk*'), (99, '07 - Pyuntanzar*'), (100, '08 - Kyauktaga'), (101, '09 - Peinweikone*'), (102, '10 - Daik Oo'), (103, '11 - Shwekyin')]), ('23 - Pyay', [(104, '01 - Pyay'), (105, '02 - Pauk Khaung'), (106, '03 - Padaung'), (107, '04 - Paungtei'), (108, '05 - Theikone'), (109, '06 - Shwetaung')]), ('24 - Tharyarwaddy', [(110, '01 - Tharyarwaddy'), (111, '02 - Thonesei*'), (112, '03 - Letpandan'), (113, '04 - Minhla'), (114, '05 - Oakpho'), (115, '06 - Zeekone'), (116, '07 - Nattalin'), (117, '08 - Moenyo'), (118, '09 - Kyoetbinkaut')]), ('25 - Taungoo', [(119, '01 - Taungoo'), (120, '02 - Yaytarshay'), (121, '03 - Kyaukyee'), (122, '04 - Phyu'), (123, '05 - Oaktwin'), (124, '06 - Htandabin')]), ('26 - Magway', [(125, '01 - Magway'), (126, '02 - Yenanchaung'), (127, '03 - Chauk'), (128, '04 - Taungtwingyee'), (129, '05 - Myoethit'), (130, '06 - Natmauk')]), ('27 - Minbu', [(131, '01 - Minbu'), (132, '02 - Saku*'), (133, '03 - Pwintphyu'), (134, '04 - Ngaphei'), (135, '05 - Salin'), (136, '06 - Sinphyukyun*'), (137, '07 - Saytoattaya')]), ('28 - Thayet', [(138, '01 - Thayet'), (139, '02 - Minhla'), (140, '03 - Mintone'), (141, '04 - Kanma'), (142, '05 - Aunglan'), (143, '06 - Sinpaungwei')]), ('29 - Pakokku', [(144, '01 - Pakokku'), (145, '02 - Yesagyo'),
(146, '03 - Myaing'), (147, '04 - Pauk'), (148, '05 - Saikphyu')]), ('30 - Gantgaw', [(149, '01 - Gantgaw'), (150, '02 - Hteelin'), (151, '03 - Saw')]), ('31 - Mandalay', [(152, '01 - Aungmyaytharzan'), (153, '02 - Chanayetharzan'), (154, '03 - MahaAungmyay'), (155, '04 - Chanmyatharsi'), (156, '05 - Pyigyeetakhun'), (157, '06 - Amarapura'), (158, '07 - Myitnge*'), (159, '08 - Patheingyee')]), ('32 - Pyin Oo Lwin', [(160, '01 - Pyin Oo Lwin'), (161, '02 - Madayar'), (162, '03 - Sintkuu'), (163, '04 - Moegauk'), (164, '05 - Thabaikkyin')]), ('33 - Kyaukse', [(165, '01 - Kyaukse'), (166, '02 - Sintkai'), (167, '03 - Myitthar'), (168, '04 - Tadaoo')]), ('34 - Myingyan', [(169, '01 - Myingyan'), (170, '02 - Taungthar'), (171, '03 - Nahtoegyee'), (172, '04 - Kyaukbadaung'), (173, '05 - Nganzun')]), ('35 - Nyaung Oo', [(174, '01 - Nyaung Oo'), (175, '02 - Bagan*'), (176, '03 - Ngatharauk*')]), ('36 - Yameithinn', [(177, '01 - Yameithinn'), (178, '02 - Pyawbwei'), (179, '03 - Tatkone'), (180, '04 - Pyinmana'), (181, '05 - Leiway')]), ('37 - Meikhtila', [(182, '01 - Meikhtila'), (183, '02 - Mahlaing'), (184, '03 - Tharzi'), (185, '04 - Wantwin')]), ('38 - Mawlamyaing', [(186, '01 - Mawlamyaing'), (187, '02 - Kyaikmayaw'), (188, '03 - Chaungzon'), (189, '04 - Thanphyuzayat'), (190, '05 - Kyaikkhami*'), (191, '06 - Mudon'), (192, '07 - Yay')]), ('39 - Thahton', [(193, '01 - Thahton'), (194, '02 - Paung'), (195, '03 - Kyaikhto'), (196, '04 - Beelin')]), ('40 - Sittwe', [(197, '01 - Sittwe'), (198, '02 - Poannakyun'), (199, '03 - Myauk Oo'), (200, '04 - Kyauktaw'), (201, '05 - Minbya'), (202, '06 - Pauktaw'), (203, '07 - Myaybon')]), ('41 - Maungdaw', [(204, '01 - Maungdaw')]), ('42 - Buthidaung', [(205, '01 - Buthidaung'), (206, '02 - Rathedaung')]), ('43 - Kyaukphyu', [(207, '01 - Kyaukphyu'), (208, '02 - Man Aung'), (209, '03 - Ranbyei'), (210, '04 - Ann')]), ('44 - Thandwe', [(211, '01 - Thandwe'), (212, '02 - Taungkauk'), (213, '03 - Gwa')]), ('45 - Ahshaytbine (East)', 
[(214, '01 - Thingankyun'), (215, '02 - Yankin'), (216, '03 - Taung Okkalapa'), (217, '04 - Myauk Okkalapa'), (218, '05 - Tharketa'), (219, '06 - Dawbon'), (220, '07 - Tarmwe'), (221, '08 - Pazuntaung'), (222, '09 - Botahtaung'), (223, '10 - Dagon Myothit Taung (Sounth)'), (224, '11 - Dagon Myothi Myauk (North)'), (225, '12 - Dagon Myothit Ahshayt (East)'), (226, '13 - Dagon Myothit (Seikkan)'), (227, '14 - Mingalataungnyunt')]), ('46 - Ahnoutpine (West)', [(228, '01 - Kyauktada'), (229, '02 - Panbedan'), (230, '03 - Lanmadaw'), (231, '04 - Lathar'), (232, '05 - Ahlon'), (233, '06 - Kyeemyindine'), (234, '07 - Sanchaung'), (235, '08 - Hlaing'), (236, '09 - Kamaryut'), (237, '10 - Mayangone'), (238, '11 - Dagon'), (239, '12 - Bahan'), (240, '13 - Seikkan')]), ('47 - Taungbine (South)', [(241, '01 - Thanhlyin'), (242, '02 - Kyauktan'), (243, '03 - Thonekhwa'), (244, '04 - Khayan'), (245, '05 - Tonte'), (246, '06 - Kauthmu'), (247, '07 - Kunchangone'), (248, '08 - Dala'), (249, '09 - Seikkyee'), (250, '10 - Khanaungto'), (251, '11 - Kokoe Island')]), ('48 - Myaukpine (North)', [(252, '01 - Insein'), (253, '02 - Mingalardon'), (254, '03 - Htaunkkyant*'), (255, '04 - Hmawbi'), (256, '05 - Hlegu'), (257, '06 - Tiakkyee'), (258, '07 - Oakkan*'), (259, '08 - Htantabin'), (260, '09 - Shwepyithar'), (261, '10 - Hlaingtharyar'), (262, '11 - Ahphyauk*')]), ('49 - Taungyi', [(263, '01 - Taungyi'), (264, '02 - Ayetharyar*'), (265, '03 - Hopone'), (266, '04 - Nyaungshwe'), (267, '05 - Sisaing'), (268, '06 - Kalaw'), (269, '07 - Aungban*'), (270, '08 - Pindaya'), (271, '09 - Ywarngan'), (272, '10 - Yatsauk'), (273, '11 - Pinlaung'), (274, '12 - Phekhoan')]), ('50 - Loilin', [(275, '01 - Loilin'), (276, '02 - Pinlon'), (277, '03 - Leichar'), (278, '04 - Nantsam(South)'), (279, '05 - Kunhein'), (280, '06 - Moenei'), (281, '07 - Linkhay'), (282, '08 - Maukmei'), (283, '09 - Minepan'), (284, '10 - Kyaythee'), (285, '11 - Minekaing'), (286, '12 - Mineshu')]), ('51 - Lahsio', [(287, '01 - Lashio'), (288, '02 - Theinni'), (289, '03 - Mineyei'), (290, '04 - Tantyan'), (291, '05 - Minephant'), (292, '06 - Panyang'), (293, '07 - Narphan'), (294, '08 - Panwaing'), (295, '09 - Minemaw'), (296, '10 - Pansan (Pankhan)')]), ('52 - Muse', [(297, '01 - Muse'), (298, '02 - Nantkhan'), (299, '03 - Kutkhaing'), (300, '04 - Monkoe'), (301, '05 - Kyukoak')]), ('53 - Kyaukmei', [(302, '01 - Kyaukmei'), (303, '02 - Naungcho'), (304, '03 - Thibaw'), (305, '04 - Namtu'), (306, '05 - Nantsam(North)'), (307, '06 - Moemaik'), (308, '07 - Mabain'), (309, '08 - Mantoan')]), ('54 - Kunloan', [(310, '01 - Kunloan'), (311, '02 - Hopan')]), ('55 - Laukkaing', [(312, '01 - Laukkaing'), (313, '02 - Chinshwehaw*'), (314, '03 - Koankyan')]), ('56 - Kyaington', [(315, '01 - Kyaington'), (316, '02 - Minekhat'), (317, '03 - Mineyang'), (318, '04 - Minelar'), (319, '05 - Metman')]), ('57 - Minesatt', [(320, '01 - Minesatt'), (321, '02 - Minepyinn'), (322, '03 - Minetoan')]), ('58 - Tachilaik', [(323, '01 - Tachilaik')]), ('59 - Minephyat', [(324, '01 - Minephyat'), (325, '02 - Mineyaung')]), ('60 - Pathein', [(326, '01 - Pathein'), (327, '02 - Kangyidaunt'), (328, '03 - Tharpaung'), (329, '04 - Ngaputaw'), (330, '05 - Kyoanpyaw'), (331, '06 - Yaykyi'), (332, '07 - Ngathaingchaung'), (333, '08 - Kyaungkone'), (334, '09 - Haigyikyun')]), ('61 - Hinthada', [(335, '01 - Hinthada'), (336, '02 - Zalun'), (337, '03 - Laymyethnar'), (338, '04 - Myan Aung'), (339, '05 - Ka Naung'), (340, '06 - Kyankhin'), (341, '07 - Ingapu')]), ('62 - Myaungmya', [(342, '01 - Myaungmya'), (343, '02 - Ainmei'), (344, '03 - Laputta'), (345, '04 - Warkhema'), (346, '05 - Mawlamyaingkyun')]), ('63 - Ma U Bin', [(347, '01 - Ma U Bin'), (348, '02 - Pantanaw'), (349, '03 - Nyaungtone'), (350, '04 - Danubyu')]), ('64 - Phyarpon', [(351, '01 - Phyarpon'), (352, '02 - Bogalay'), (353, '03 - Kyaiklatt'), (354, '04 - Daydayei')]), ('65 - Zayarthiri', [(355, '01 - Zayarthiri')]), ('66 - Dakhenathiri', [(356, '01 - Dakhenathiri')]), ('67 - Oktayathiri', [(357, '01 - Oktayathiri')]), ('68 - Potebathiri', [(358, '01 - Potebathiri')]), ('69 - Zabuthiri', [(359, '01 - Zabuthiri')]), ('70 - Tatkon', [(360, '01 - Tatkon')]), ('71 - Pyinmana', [(361, '01 - Pyinmana')]), ('72 - Lewe', [(362, '01 - Lewe')])]),
),
migrations.AlterField(
model_name='f101',
name='ST_DV',
field=birth_registration.fields.StateDivisionField(choices=[(1, '01 - Kachin'), (2, '02 - Kayh'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Sagaing'), (6, '06 - Tanintharyi'), (7, '07 - Bago'), (8, '08 - Magway'), (9, '09 - Mandalay'), (10, '10 - Mon'), (11, '11 - Rakhine'), (12, '12 - Yangon'), (13, '13 - Shan'), (14, '14 - Ayyarwaddy'), (15, '15 - NayPyiTaw')], verbose_name='State/Division', validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f101',
name='TWN',
field=birth_registration.fields.TownshipField(verbose_name='Township or town', choices=[('01 - Myitkyina', [(1, '01 - Myitkyina'), (2, '02 - Waingmaw'), (3, '03 - Ingyan Yan'), (4, '04 - Moekaung'), (5, '05 - Moehnyin'), (6, '06 - Phakant'), (7, '07 - Karmine*'), (8, '08 - Ta Naing'), (9, '09 - Chibway'), (10, '10 - Sautlaw')]), ('02 - Bamaw', [(11, '01 - Bamaw'), (12, '02 - Shwegu'), (13, '03 - Moemauk'), (14, '04 - Mansi')]), ('03 - Puta O', [(15, '01 - Puta O'), (16, '02 - Swanprabum'), (17, '03 - Machanbaw'), (18, '04 - Khaunglanphoo'), (19, '05 - Naungmon')]), ('04 - Loikaw', [(20, '01 - Loikaw'), (21, '02 - Dimawsoe'), (22, '03 - Phrusoe'), (23, '04 - Shartaw')]), ('05 - Bawlakhei', [(24, '01 - Bawlakhei'), (25, '02 - Pharsaung'), (26, '03 - Meiseit')]), ('06 - Pha An', [(27, '01 - Pha An'), (28, '02 - Hlaingbweit'), (29, '03 - Pharpon'), (30, '04 - Thandaung'), (31, '05 - Thandaungkyee')]), ('07 - Myawaddy', [(32, '01 - Myawaddy')]), ('08 - Kawtkaraik', [(33, '01 - Kawtkaraik'), (34, '02 - Kyarinseikkyi'), (35, '03 - Phayathonesu*'), (36, '04 - Kyoandoe')]), ('09 - Phalan', [(37, '01 - Phalan'), (38, '02 - Hakha'), (39, '03 - Htantalan'), (40, '04 - Teetain'), (41, '05 - Tunzan')]), ('10 - Mintatt', [(42, '01 - Mintatt'), (43, '02 - Matupi'), (44, '03 - Kanpetlet'), (45, '04 - Paletwa')]), ('11 - Sagaing', [(46, '01 - Sagaing'), (47, '02 - Myinmu'), (48, '03 - Myaung')]), ('12 - Shwebo', [(49, '01 - Shwebo'), (50, '02 - Khin Oo'), (51, '03 - Wetlet'), (52, '04 - Kantbalu'), (53, '05 - Kyunhla'), (54, '06 - Yay Oo'), (55, '07 - Dipeiyin'), (56, '08 - Tantsei')]), ('13 - Monywar', [(57, '01 - Monywar'), (58, '02 - Butalin'), (59, '03 - Ahyartaw'), (60, '04 - Chaung Oo'), (61, '05 - Yinmarbin'), (62, '06 - Kani'), (63, '07 - Salingyee'), (64, '08 - Palei')]), ('14 - Kathar', [(65, '01 - Kathar'), (66, '02 - Indaw'), (67, '03 - Hteechaink'), (68, '04 - Bamauk'), (69, '05 - Kawlin'), (70, '06 - Wuntho'), (71, '07 - Pinleibu')]), ('15 - Kalay', [(72, '01 - Kalay'), (73, '02 - Kalaywa'), (74, '03 - Minkin')]), ('16 - Tamu', [(75, '01 - Tamu')]), ('17 - Mawlaik', [(76, '01 - Mawlaik'), (77, '02 - Phaungpyin')]), ('18 - Khantee', [(78, '01 - Khantee'), (79, '02 - Hoamalin'), (80, '03 - Layshee'), (81, '04 - Lahei'), (82, '05 - Nanyon')]), ('19 - Dawei', [(83, '01 - Dawei'), (84, '02 - Launglon'), (85, '03 - Thayetchaung'), (86, '04 - Yayphyu')]), ('20 - Myeik', [(87, '01 - Myeik'), (88, '02 - Kyunsu'), (89, '03 - Pulaw'), (90, '04 - Tanintharyi')]), ('21 - Kautthaung', [(91, '01 - Kautthaung'), (92, '02 - Boatpyin')]), ('22 - Bago', [(93, '01 - Bago'), (94, '02 - Thanatpin'), (95, '03 - Kawa'), (96, '04 - Waw'), (97, '05 - Nyaunglaybin'), (98, '06 - Madauk*'), (99, '07 - Pyuntanzar*'), (100, '08 - Kyauktaga'), (101, '09 - Peinweikone*'), (102, '10 - Daik Oo'), (103, '11 - Shwekyin')]), ('23 - Pyay', [(104, '01 - Pyay'), (105, '02 - Pauk Khaung'), (106, '03 - Padaung'), (107, '04 - Paungtei'), (108, '05 - Theikone'), (109, '06 - Shwetaung')]), ('24 - Tharyarwaddy', [(110, '01 - Tharyarwaddy'), (111, '02 - Thonesei*'), (112, '03 - Letpandan'), (113, '04 - Minhla'), (114, '05 - Oakpho'), (115, '06 - Zeekone'), (116, '07 - Nattalin'), (117, '08 - Moenyo'), (118, '09 - Kyoetbinkaut')]), ('25 - Taungoo', [(119, '01 - Taungoo'), (120, '02 - Yaytarshay'), (121, '03 - Kyaukyee'), (122, '04 - Phyu'), (123, '05 - Oaktwin'), (124, '06 - Htandabin')]), ('26 - Magway', [(125, '01 - Magway'), (126, '02 - Yenanchaung'), (127, '03 - Chauk'), (128, '04 - Taungtwingyee'), (129, '05 - Myoethit'), (130, '06 - Natmauk')]), ('27 - Minbu', [(131, '01 - Minbu'), (132, '02 - Saku*'), (133, '03 - Pwintphyu'), (134, '04 - Ngaphei'), (135, '05 - Salin'), (136, '06 - Sinphyukyun*'), (137, '07 - Saytoattaya')]), ('28 - Thayet', [(138, '01 - Thayet'), (139, '02 - Minhla'), (140, '03 - Mintone'), (141, '04 - Kanma'), (142, '05 - Aunglan'), (143, '06 - Sinpaungwei')]), ('29 - Pakokku', [(144, '01 - Pakokku'), (145, '02 - Yesagyo'), (146, '03 - Myaing'),
(147, '04 - Pauk'), (148, '05 - Saikphyu')]), ('30 - Gantgaw', [(149, '01 - Gantgaw'), (150, '02 - Hteelin'), (151, '03 - Saw')]), ('31 - Mandalay', [(152, '01 - Aungmyaytharzan'), (153, '02 - Chanayetharzan'), (154, '03 - MahaAungmyay'), (155, '04 - Chanmyatharsi'), (156, '05 - Pyigyeetakhun'), (157, '06 - Amarapura'), (158, '07 - Myitnge*'), (159, '08 - Patheingyee')]), ('32 - Pyin Oo Lwin', [(160, '01 - Pyin Oo Lwin'), (161, '02 - Madayar'), (162, '03 - Sintkuu'), (163, '04 - Moegauk'), (164, '05 - Thabaikkyin')]), ('33 - Kyaukse', [(165, '01 - Kyaukse'), (166, '02 - Sintkai'), (167, '03 - Myitthar'), (168, '04 - Tadaoo')]), ('34 - Myingyan', [(169, '01 - Myingyan'), (170, '02 - Taungthar'), (171, '03 - Nahtoegyee'), (172, '04 - Kyaukbadaung'), (173, '05 - Nganzun')]), ('35 - Nyaung Oo', [(174, '01 - Nyaung Oo'), (175, '02 - Bagan*'), (176, '03 - Ngatharauk*')]), ('36 - Yameithinn', [(177, '01 - Yameithinn'), (178, '02 - Pyawbwei'), (179, '03 - Tatkone'), (180, '04 - Pyinmana'), (181, '05 - Leiway')]), ('37 - Meikhtila', [(182, '01 - Meikhtila'), (183, '02 - Mahlaing'), (184, '03 - Tharzi'), (185, '04 - Wantwin')]), ('38 - Mawlamyaing', [(186, '01 - Mawlamyaing'), (187, '02 - Kyaikmayaw'), (188, '03 - Chaungzon'), (189, '04 - Thanphyuzayat'), (190, '05 - Kyaikkhami*'), (191, '06 - Mudon'), (192, '07 - Yay')]), ('39 - Thahton', [(193, '01 - Thahton'), (194, '02 - Paung'), (195, '03 - Kyaikhto'), (196, '04 - Beelin')]), ('40 - Sittwe', [(197, '01 - Sittwe'), (198, '02 - Poannakyun'), (199, '03 - Myauk Oo'), (200, '04 - Kyauktaw'), (201, '05 - Minbya'), (202, '06 - Pauktaw'), (203, '07 - Myaybon')]), ('41 - Maungdaw', [(204, '01 - Maungdaw')]), ('42 - Buthidaung', [(205, '01 - Buthidaung'), (206, '02 - Rathedaung')]), ('43 - Kyaukphyu', [(207, '01 - Kyaukphyu'), (208, '02 - Man Aung'), (209, '03 - Ranbyei'), (210, '04 - Ann')]), ('44 - Thandwe', [(211, '01 - Thandwe'), (212, '02 - Taungkauk'), (213, '03 - Gwa')]), ('45 - Ahshaytbine (East)', [(214, '01 - Thingankyun'), (215, '02 - Yankin'), (216, '03 - Taung Okkalapa'), (217, '04 - Myauk Okkalapa'), (218, '05 - Tharketa'), (219, '06 - Dawbon'), (220, '07 - Tarmwe'), (221, '08 - Pazuntaung'), (222, '09 - Botahtaung'), (223, '10 - Dagon Myothit Taung (Sounth)'), (224, '11 - Dagon Myothi Myauk (North)'), (225, '12 - Dagon Myothit Ahshayt (East)'), (226, '13 - Dagon Myothit (Seikkan)'), (227, '14 - Mingalataungnyunt')]), ('46 - Ahnoutpine (West)', [(228, '01 - Kyauktada'), (229, '02 - Panbedan'), (230, '03 - Lanmadaw'), (231, '04 - Lathar'), (232, '05 - Ahlon'), (233, '06 - Kyeemyindine'), (234, '07 - Sanchaung'), (235, '08 - Hlaing'), (236, '09 - Kamaryut'), (237, '10 - Mayangone'), (238, '11 - Dagon'), (239, '12 - Bahan'), (240, '13 - Seikkan')]), ('47 - Taungbine (South)', [(241, '01 - Thanhlyin'), (242, '02 - Kyauktan'), (243, '03 - Thonekhwa'), (244, '04 - Khayan'), (245, '05 - Tonte'), (246, '06 - Kauthmu'), (247, '07 - Kunchangone'), (248, '08 - Dala'), (249, '09 - Seikkyee'), (250, '10 - Khanaungto'), (251, '11 - Kokoe Island')]), ('48 - Myaukpine (North)', [(252, '01 - Insein'), (253, '02 - Mingalardon'), (254, '03 - Htaunkkyant*'), (255, '04 - Hmawbi'), (256, '05 - Hlegu'), (257, '06 - Tiakkyee'), (258, '07 - Oakkan*'), (259, '08 - Htantabin'), (260, '09 - Shwepyithar'), (261, '10 - Hlaingtharyar'), (262, '11 - Ahphyauk*')]), ('49 - Taungyi', [(263, '01 - Taungyi'), (264, '02 - Ayetharyar*'), (265, '03 - Hopone'), (266, '04 - Nyaungshwe'), (267, '05 - Sisaing'), (268, '06 - Kalaw'), (269, '07 - Aungban*'), (270, '08 - Pindaya'), (271, '09 - Ywarngan'), (272, '10 - Yatsauk'), (273, '11 - Pinlaung'), (274, '12 - Phekhoan')]), ('50 - Loilin', [(275, '01 - Loilin'), (276, '02 - Pinlon'), (277, '03 - Leichar'), (278, '04 - Nantsam(South)'), (279, '05 - Kunhein'), (280, '06 - Moenei'), (281, '07 - Linkhay'), (282, '08 - Maukmei'), (283, '09 - Minepan'), (284, '10 - Kyaythee'), (285, '11 - Minekaing'), (286, '12 - Mineshu')]), ('51 - Lahsio', [(287, '01 - Lashio'),
(288, '02 - Theinni'), (289, '03 - Mineyei'), (290, '04 - Tantyan'), (291, '05 - Minephant'), (292, '06 - Panyang'), (293, '07 - Narphan'), (294, '08 - Panwaing'), (295, '09 - Minemaw'), (296, '10 - Pansan (Pankhan)')]), ('52 - Muse', [(297, '01 - Muse'), (298, '02 - Nantkhan'), (299, '03 - Kutkhaing'), (300, '04 - Monkoe'), (301, '05 - Kyukoak')]), ('53 - Kyaukmei', [(302, '01 - Kyaukmei'), (303, '02 - Naungcho'), (304, '03 - Thibaw'), (305, '04 - Namtu'), (306, '05 - Nantsam(North)'), (307, '06 - Moemaik'), (308, '07 - Mabain'), (309, '08 - Mantoan')]), ('54 - Kunloan', [(310, '01 - Kunloan'), (311, '02 - Hopan')]), ('55 - Laukkaing', [(312, '01 - Laukkaing'), (313, '02 - Chinshwehaw*'), (314, '03 - Koankyan')]), ('56 - Kyaington', [(315, '01 - Kyaington'), (316, '02 - Minekhat'), (317, '03 - Mineyang'), (318, '04 - Minelar'), (319, '05 - Metman')]), ('57 - Minesatt', [(320, '01 - Minesatt'), (321, '02 - Minepyinn'), (322, '03 - Minetoan')]), ('58 - Tachilaik', [(323, '01 - Tachilaik')]), ('59 - Minephyat', [(324, '01 - Minephyat'), (325, '02 - Mineyaung')]), ('60 - Pathein', [(326, '01 - Pathein'), (327, '02 - Kangyidaunt'), (328, '03 - Tharpaung'), (329, '04 - Ngaputaw'), (330, '05 - Kyoanpyaw'), (331, '06 - Yaykyi'), (332, '07 - Ngathaingchaung'), (333, '08 - Kyaungkone'), (334, '09 - Haigyikyun')]), ('61 - Hinthada', [(335, '01 - Hinthada'), (336, '02 - Zalun'), (337, '03 - Laymyethnar'), (338, '04 - Myan Aung'), (339, '05 - Ka Naung'), (340, '06 - Kyankhin'), (341, '07 - Ingapu')]), ('62 - Myaungmya', [(342, '01 - Myaungmya'), (343, '02 - Ainmei'), (344, '03 - Laputta'), (345, '04 - Warkhema'), (346, '05 - Mawlamyaingkyun')]), ('63 - Ma U Bin', [(347, '01 - Ma U Bin'), (348, '02 - Pantanaw'), (349, '03 - Nyaungtone'), (350, '04 - Danubyu')]), ('64 - Phyarpon', [(351, '01 - Phyarpon'), (352, '02 - Bogalay'), (353, '03 - Kyaiklatt'), (354, '04 - Daydayei')]), ('65 - Zayarthiri', [(355, '01 - Zayarthiri')]), ('66 - Dakhenathiri', [(356, '01 - Dakhenathiri')]), 
('67 - Oktayathiri', [(357, '01 - Oktayathiri')]), ('68 - Potebathiri', [(358, '01 - Potebathiri')]), ('69 - Zabuthiri', [(359, '01 - Zabuthiri')]), ('70 - Tatkon', [(360, '01 - Tatkon')]), ('71 - Pyinmana', [(361, '01 - Pyinmana')]), ('72 - Lewe', [(362, '01 - Lewe')])]),
),
migrations.AlterField(
model_name='f201',
name='CTIZ',
field=birth_registration.fields.CitizenshipField(choices=[(1, '01 - Myanmar'), (2, '02 - Indian'), (3, '03 - Pakistani'), (4, '04 - Bangladesh'), (5, '05 - Nepalese'), (6, '06 - Chinese'), (7, '07 - European/American'), (8, '08 - Other Asian'), (9, '09 - Others'), (10, '10 - Not stated')], verbose_name='Citizenship', validators=[django.core.validators.MaxValueValidator(10)]),
),
migrations.AlterField(
model_name='f201',
name='Father_name',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name="Father's Name", blank=True),
),
migrations.AlterField(
model_name='f201',
name='Mother_name',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name="Mother's Name", blank=True),
),
migrations.AlterField(
model_name='f201',
name='NNRT',
field=birth_registration.fields.StateDivisionField(choices=[(1, '01 - Kachin'), (2, '02 - Kayh'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Sagaing'), (6, '06 - Tanintharyi'), (7, '07 - Bago'), (8, '08 - Magway'), (9, '09 - Mandalay'), (10, '10 - Mon'), (11, '11 - Rakhine'), (12, '12 - Yangon'), (13, '13 - Shan'), (14, '14 - Ayyarwaddy'), (15, '15 - NayPyiTaw')], verbose_name='State/Division', validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f201',
name='NNRT1',
field=birth_registration.fields.DistrictField(choices=[('01 - Kachin', [(1, '01 - Myitkyina'), (2, '02 - Bamaw'), (3, '03 - Puta O')]), ('02 - Kayh', [(4, '04 - Loikaw'), (5, '05 - Bawlakhei')]), ('03 - Kayin', [(6, '06 - Pha An'), (7, '07 - Myawaddy'), (8, '08 - Kawtkaraik')]), ('04 - Chin', [(9, '09 - Phalan'), (10, '10 - Mintatt')]), ('05 - Sagaing', [(11, '11 - Sagaing'), (12, '12 - Shwebo'), (13, '13 - Monywar'), (14, '14 - Kathar'), (15, '15 - Kalay'), (16, '16 - Tamu'), (17, '17 - Mawlaik'), (18, '18 - Khantee')]), ('06 - Tanintharyi', [(19, '19 - Dawei'), (20, '20 - Myeik'), (21, '21 - Kautthaung')]), ('07 - Bago', [(22, '22 - Bago'), (23, '23 - Pyay'), (24, '24 - Tharyarwaddy'), (25, '25 - Taungoo')]), ('08 - Magway', [(26, '26 - Magway'), (27, '27 - Minbu'), (28, '28 - Thayet'), (29, '29 - Pakokku'), (30, '30 - Gantgaw')]), ('09 - Mandalay', [(31, '31 - Mandalay'), (32, '32 - Pyin Oo Lwin'), (33, '33 - Kyaukse'), (34, '34 - Myingyan'), (35, '35 - Nyaung Oo'), (36, '36 - Yameithinn'), (37, '37 - Meikhtila')]), ('10 - Mon', [(38, '38 - Mawlamyaing'), (39, '39 - Thahton')]), ('11 - Rakhine', [(40, '40 - Sittwe'), (41, '41 - Maungdaw'), (42, '42 - Buthidaung'), (43, '43 - Kyaukphyu'), (44, '44 - Thandwe')]), ('12 - Yangon', [(45, '45 - Ahshaytbine (East)'), (46, '46 - Ahnoutpine (West)'), (47, '47 - Taungbine (South)'), (48, '48 - Myaukpine (North)')]), ('13 - Shan', [(49, '49 - Taungyi'), (50, '50 - Loilin'), (51, '51 - Lahsio'), (52, '52 - Muse'), (53, '53 - Kyaukmei'), (54, '54 - Kunloan'), (55, '55 - Laukkaing'), (56, '56 - Kyaington'), (57, '57 - Minesatt'), (58, '58 - Tachilaik'), (59, '59 - Minephyat')]), ('14 - Ayyarwaddy', [(60, '60 - Pathein'), (61, '61 - Hinthada'), (62, '62 - Myaungmya'), (63, '63 - Ma U Bin'), (64, '64 - Phyarpon')]), ('15 - NayPyiTaw', [(65, '65 - Zayarthiri'), (66, '66 - Dakhenathiri'), (67, '67 - Oktayathiri'), (68, '68 - Potebathiri'), (69, '69 - Zabuthiri'), (70, '70 - Tatkon'), (71, '71 - Pyinmana'), (72, '72 - Lewe')])], 
verbose_name='District', validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f201',
name='NNST',
field=birth_registration.fields.TownshipField(verbose_name='Township or town', choices=[('01 - Myitkyina', [(1, '01 - Myitkyina'), (2, '02 - Waingmaw'), (3, '03 - Ingyan Yan'), (4, '04 - Moekaung'), (5, '05 - Moehnyin'), (6, '06 - Phakant'), (7, '07 - Karmine*'), (8, '08 - Ta Naing'), (9, '09 - Chibway'), (10, '10 - Sautlaw')]), ('02 - Bamaw', [(11, '01 - Bamaw'), (12, '02 - Shwegu'), (13, '03 - Moemauk'), (14, '04 - Mansi')]), ('03 - Puta O', [(15, '01 - Puta O'), (16, '02 - Swanprabum'), (17, '03 - Machanbaw'), (18, '04 - Khaunglanphoo'), (19, '05 - Naungmon')]), ('04 - Loikaw', [(20, '01 - Loikaw'), (21, '02 - Dimawsoe'), (22, '03 - Phrusoe'), (23, '04 - Shartaw')]), ('05 - Bawlakhei', [(24, '01 - Bawlakhei'), (25, '02 - Pharsaung'), (26, '03 - Meiseit')]), ('06 - Pha An', [(27, '01 - Pha An'), (28, '02 - Hlaingbweit'), (29, '03 - Pharpon'), (30, '04 - Thandaung'), (31, '05 - Thandaungkyee')]), ('07 - Myawaddy', [(32, '01 - Myawaddy')]), ('08 - Kawtkaraik', [(33, '01 - Kawtkaraik'), (34, '02 - Kyarinseikkyi'), (35, '03 - Phayathonesu*'), (36, '04 - Kyoandoe')]), ('09 - Phalan', [(37, '01 - Phalan'), (38, '02 - Hakha'), (39, '03 - Htantalan'), (40, '04 - Teetain'), (41, '05 - Tunzan')]), ('10 - Mintatt', [(42, '01 - Mintatt'), (43, '02 - Matupi'), (44, '03 - Kanpetlet'), (45, '04 - Paletwa')]), ('11 - Sagaing', [(46, '01 - Sagaing'), (47, '02 - Myinmu'), (48, '03 - Myaung')]), ('12 - Shwebo', [(49, '01 - Shwebo'), (50, '02 - Khin Oo'), (51, '03 - Wetlet'), (52, '04 - Kantbalu'), (53, '05 - Kyunhla'), (54, '06 - Yay Oo'), (55, '07 - Dipeiyin'), (56, '08 - Tantsei')]), ('13 - Monywar', [(57, '01 - Monywar'), (58, '02 - Butalin'), (59, '03 - Ahyartaw'), (60, '04 - Chaung Oo'), (61, '05 - Yinmarbin'), (62, '06 - Kani'), (63, '07 - Salingyee'), (64, '08 - Palei')]), ('14 - Kathar', [(65, '01 - Kathar'), (66, '02 - Indaw'), (67, '03 - Hteechaink'), (68, '04 - Bamauk'), (69, '05 - Kawlin'), (70, '06 - Wuntho'), (71, '07 - Pinleibu')]), ('15 - Kalay', [(72, '01 - Kalay'), (73, '02 - Kalaywa'), (74, '03 - Minkin')]), ('16 - Tamu', [(75, '01 - Tamu')]), ('17 - Mawlaik', [(76, '01 - Mawlaik'), (77, '02 - Phaungpyin')]), ('18 - Khantee', [(78, '01 - Khantee'), (79, '02 - Hoamalin'), (80, '03 - Layshee'), (81, '04 - Lahei'), (82, '05 - Nanyon')]), ('19 - Dawei', [(83, '01 - Dawei'), (84, '02 - Launglon'), (85, '03 - Thayetchaung'), (86, '04 - Yayphyu')]), ('20 - Myeik', [(87, '01 - Myeik'), (88, '02 - Kyunsu'), (89, '03 - Pulaw'), (90, '04 - Tanintharyi')]), ('21 - Kautthaung', [(91, '01 - Kautthaung'), (92, '02 - Boatpyin')]), ('22 - Bago', [(93, '01 - Bago'), (94, '02 - Thanatpin'), (95, '03 - Kawa'), (96, '04 - Waw'), (97, '05 - Nyaunglaybin'), (98, '06 - Madauk*'), (99, '07 - Pyuntanzar*'), (100, '08 - Kyauktaga'), (101, '09 - Peinweikone*'), (102, '10 - Daik Oo'), (103, '11 - Shwekyin')]), ('23 - Pyay', [(104, '01 - Pyay'), (105, '02 - Pauk Khaung'), (106, '03 - Padaung'), (107, '04 - Paungtei'), (108, '05 - Theikone'), (109, '06 - Shwetaung')]), ('24 - Tharyarwaddy', [(110, '01 - Tharyarwaddy'), (111, '02 - Thonesei*'), (112, '03 - Letpandan'), (113, '04 - Minhla'), (114, '05 - Oakpho'), (115, '06 - Zeekone'), (116, '07 - Nattalin'), (117, '08 - Moenyo'), (118, '09 - Kyoetbinkaut')]), ('25 - Taungoo', [(119, '01 - Taungoo'), (120, '02 - Yaytarshay'), (121, '03 - Kyaukyee'), (122, '04 - Phyu'), (123, '05 - Oaktwin'), (124, '06 - Htandabin')]), ('26 - Magway', [(125, '01 - Magway'), (126, '02 - Yenanchaung'), (127, '03 - Chauk'), (128, '04 - Taungtwingyee'), (129, '05 - Myoethit'), (130, '06 - Natmauk')]), ('27 - Minbu', [(131, '01 - Minbu'), (132, '02 - Saku*'), (133, '03 - Pwintphyu'), (134, '04 - Ngaphei'), (135, '05 - Salin'), (136, '06 - Sinphyukyun*'), (137, '07 - Saytoattaya')]), ('28 - Thayet', [(138, '01 - Thayet'), (139, '02 - Minhla'), (140, '03 - Mintone'), (141, '04 - Kanma'), (142, '05 - Aunglan'), (143, '06 - Sinpaungwei')]), ('29 - Pakokku', [(144, '01 - Pakokku'), (145, '02 - Yesagyo'), (146, '03 - Myaing'),
(147, '04 - Pauk'), (148, '05 - Saikphyu')]), ('30 - Gantgaw', [(149, '01 - Gantgaw'), (150, '02 - Hteelin'), (151, '03 - Saw')]), ('31 - Mandalay', [(152, '01 - Aungmyaytharzan'), (153, '02 - Chanayetharzan'), (154, '03 - MahaAungmyay'), (155, '04 - Chanmyatharsi'), (156, '05 - Pyigyeetakhun'), (157, '06 - Amarapura'), (158, '07 - Myitnge*'), (159, '08 - Patheingyee')]), ('32 - Pyin Oo Lwin', [(160, '01 - Pyin Oo Lwin'), (161, '02 - Madayar'), (162, '03 - Sintkuu'), (163, '04 - Moegauk'), (164, '05 - Thabaikkyin')]), ('33 - Kyaukse', [(165, '01 - Kyaukse'), (166, '02 - Sintkai'), (167, '03 - Myitthar'), (168, '04 - Tadaoo')]), ('34 - Myingyan', [(169, '01 - Myingyan'), (170, '02 - Taungthar'), (171, '03 - Nahtoegyee'), (172, '04 - Kyaukbadaung'), (173, '05 - Nganzun')]), ('35 - Nyaung Oo', [(174, '01 - Nyaung Oo'), (175, '02 - Bagan*'), (176, '03 - Ngatharauk*')]), ('36 - Yameithinn', [(177, '01 - Yameithinn'), (178, '02 - Pyawbwei'), (179, '03 - Tatkone'), (180, '04 - Pyinmana'), (181, '05 - Leiway')]), ('37 - Meikhtila', [(182, '01 - Meikhtila'), (183, '02 - Mahlaing'), (184, '03 - Tharzi'), (185, '04 - Wantwin')]), ('38 - Mawlamyaing', [(186, '01 - Mawlamyaing'), (187, '02 - Kyaikmayaw'), (188, '03 - Chaungzon'), (189, '04 - Thanphyuzayat'), (190, '05 - Kyaikkhami*'), (191, '06 - Mudon'), (192, '07 - Yay')]), ('39 - Thahton', [(193, '01 - Thahton'), (194, '02 - Paung'), (195, '03 - Kyaikhto'), (196, '04 - Beelin')]), ('40 - Sittwe', [(197, '01 - Sittwe'), (198, '02 - Poannakyun'), (199, '03 - Myauk Oo'), (200, '04 - Kyauktaw'), (201, '05 - Minbya'), (202, '06 - Pauktaw'), (203, '07 - Myaybon')]), ('41 - Maungdaw', [(204, '01 - Maungdaw')]), ('42 - Buthidaung', [(205, '01 - Buthidaung'), (206, '02 - Rathedaung')]), ('43 - Kyaukphyu', [(207, '01 - Kyaukphyu'), (208, '02 - Man Aung'), (209, '03 - Ranbyei'), (210, '04 - Ann')]), ('44 - Thandwe', [(211, '01 - Thandwe'), (212, '02 - Taungkauk'), (213, '03 - Gwa')]), ('45 - Ahshaytbine (East)', [(214, '01 - 
Thingankyun'), (215, '02 - Yankin'), (216, '03 - Taung Okkalapa'), (217, '04 - Myauk Okkalapa'), (218, '05 - Tharketa'), (219, '06 - Dawbon'), (220, '07 - Tarmwe'), (221, '08 - Pazuntaung'), (222, '09 - Botahtaung'), (223, '10 - Dagon Myothit Taung (South)'), (224, '11 - Dagon Myothit Myauk (North)'), (225, '12 - Dagon Myothit Ahshayt (East)'), (226, '13 - Dagon Myothit (Seikkan)'), (227, '14 - Mingalataungnyunt')]), ('46 - Ahnoutpine (West)', [(228, '01 - Kyauktada'), (229, '02 - Panbedan'), (230, '03 - Lanmadaw'), (231, '04 - Lathar'), (232, '05 - Ahlon'), (233, '06 - Kyeemyindine'), (234, '07 - Sanchaung'), (235, '08 - Hlaing'), (236, '09 - Kamaryut'), (237, '10 - Mayangone'), (238, '11 - Dagon'), (239, '12 - Bahan'), (240, '13 - Seikkan')]), ('47 - Taungbine (South)', [(241, '01 - Thanhlyin'), (242, '02 - Kyauktan'), (243, '03 - Thonekhwa'), (244, '04 - Khayan'), (245, '05 - Tonte'), (246, '06 - Kauthmu'), (247, '07 - Kunchangone'), (248, '08 - Dala'), (249, '09 - Seikkyee'), (250, '10 - Khanaungto'), (251, '11 - Kokoe Island')]), ('48 - Myaukpine (North)', [(252, '01 - Insein'), (253, '02 - Mingalardon'), (254, '03 - Htaunkkyant*'), (255, '04 - Hmawbi'), (256, '05 - Hlegu'), (257, '06 - Tiakkyee'), (258, '07 - Oakkan*'), (259, '08 - Htantabin'), (260, '09 - Shwepyithar'), (261, '10 - Hlaingtharyar'), (262, '11 - Ahphyauk*')]), ('49 - Taungyi', [(263, '01 - Taungyi'), (264, '02 - Ayetharyar*'), (265, '03 - Hopone'), (266, '04 - Nyaungshwe'), (267, '05 - Sisaing'), (268, '06 - Kalaw'), (269, '07 - Aungban*'), (270, '08 - Pindaya'), (271, '09 - Ywarngan'), (272, '10 - Yatsauk'), (273, '11 - Pinlaung'), (274, '12 - Phekhoan')]), ('50 - Loilin', [(275, '01 - Loilin'), (276, '02 - Pinlon'), (277, '03 - Leichar'), (278, '04 - Nantsam(South)'), (279, '05 - Kunhein'), (280, '06 - Moenei'), (281, '07 - Linkhay'), (282, '08 - Maukmei'), (283, '09 - Minepan'), (284, '10 - Kyaythee'), (285, '11 - Minekaing'), (286, '12 - Mineshu')]), ('51 - Lahsio', [(287, '01 - Lashio'), 
(288, '02 - Theinni'), (289, '03 - Mineyei'), (290, '04 - Tantyan'), (291, '05 - Minephant'), (292, '06 - Panyang'), (293, '07 - Narphan'), (294, '08 - Panwaing'), (295, '09 - Minemaw'), (296, '10 - Pansan (Pankhan)')]), ('52 - Muse', [(297, '01 - Muse'), (298, '02 - Nantkhan'), (299, '03 - Kutkhaing'), (300, '04 - Monkoe'), (301, '05 - Kyukoak')]), ('53 - Kyaukmei', [(302, '01 - Kyaukmei'), (303, '02 - Naungcho'), (304, '03 - Thibaw'), (305, '04 - Namtu'), (306, '05 - Nantsam(North)'), (307, '06 - Moemaik'), (308, '07 - Mabain'), (309, '08 - Mantoan')]), ('54 - Kunloan', [(310, '01 - Kunloan'), (311, '02 - Hopan')]), ('55 - Laukkaing', [(312, '01 - Laukkaing'), (313, '02 - Chinshwehaw*'), (314, '03 - Koankyan')]), ('56 - Kyaington', [(315, '01 - Kyaington'), (316, '02 - Minekhat'), (317, '03 - Mineyang'), (318, '04 - Minelar'), (319, '05 - Metman')]), ('57 - Minesatt', [(320, '01 - Minesatt'), (321, '02 - Minepyinn'), (322, '03 - Minetoan')]), ('58 - Tachilaik', [(323, '01 - Tachilaik')]), ('59 - Minephyat', [(324, '01 - Minephyat'), (325, '02 - Mineyaung')]), ('60 - Pathein', [(326, '01 - Pathein'), (327, '02 - Kangyidaunt'), (328, '03 - Tharpaung'), (329, '04 - Ngaputaw'), (330, '05 - Kyoanpyaw'), (331, '06 - Yaykyi'), (332, '07 - Ngathaingchaung'), (333, '08 - Kyaungkone'), (334, '09 - Haigyikyun')]), ('61 - Hinthada', [(335, '01 - Hinthada'), (336, '02 - Zalun'), (337, '03 - Laymyethnar'), (338, '04 - Myan Aung'), (339, '05 - Ka Naung'), (340, '06 - Kyankhin'), (341, '07 - Ingapu')]), ('62 - Myaungmya', [(342, '01 - Myaungmya'), (343, '02 - Ainmei'), (344, '03 - Laputta'), (345, '04 - Warkhema'), (346, '05 - Mawlamyaingkyun')]), ('63 - Ma U Bin', [(347, '01 - Ma U Bin'), (348, '02 - Pantanaw'), (349, '03 - Nyaungtone'), (350, '04 - Danubyu')]), ('64 - Phyarpon', [(351, '01 - Phyarpon'), (352, '02 - Bogalay'), (353, '03 - Kyaiklatt'), (354, '04 - Daydayei')]), ('65 - Zayarthiri', [(355, '01 - Zayarthiri')]), ('66 - Dakhenathiri', [(356, '01 - Dakhenathiri')]), 
('67 - Oktayathiri', [(357, '01 - Oktayathiri')]), ('68 - Potebathiri', [(358, '01 - Potebathiri')]), ('69 - Zabuthiri', [(359, '01 - Zabuthiri')]), ('70 - Tatkon', [(360, '01 - Tatkon')]), ('71 - Pyinmana', [(361, '01 - Pyinmana')]), ('72 - Lewe', [(362, '01 - Lewe')])]),
),
migrations.AlterField(
model_name='f201',
name='NNVD',
field=birth_registration.fields.AreaField(help_text='Urban/Rural', verbose_name='Area', choices=[(1, '01 - Urban'), (2, '02 - Rural')]),
),
migrations.AlterField(
model_name='f201',
name='Name',
field=birth_registration.fields.Char100Field(max_length=100, null=True, verbose_name='Name', blank=True),
),
migrations.AlterField(
model_name='f201',
name='OCCU',
field=birth_registration.fields.OccupationField(verbose_name='Occupation', choices=[(1, '01 - Professional, Technical and related workers'), (2, '02 - Administrative and managerial workers'), (3, '03 - Clerical and related workers'), (4, '04 - Sales workers'), (5, '05 - Services workers'), (6, '06 - Agriculture, Animal Husbandry and Forest workers, Fishermen, Hunters'), (7, '07 - Production and related workers, Transport equipment operators and labourers'), (8, '08 - Not classified by occupation'), (9, '09 - Armed Forces'), (0, '00 - Economically inactive')]),
),
migrations.AlterField(
model_name='f201',
name='Original_form',
field=birth_registration.fields.OriginalFormField(help_text='Please attach a scanned copy of a photograph on the original form', upload_to=b'F201/%Y/%m/%d/', null=True, verbose_name='Original Form Image', blank=True),
),
migrations.AlterField(
model_name='f201',
name='RACE',
field=birth_registration.fields.RaceField(verbose_name='Race', choices=[(1, '01 - Kachin'), (2, '02 - Kayah'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Bamar'), (6, '06 - Mon'), (7, '07 - Rakhine'), (8, '08 - Shan'), (9, '09 - Other indigenous Races'), (10, '10 - Myanmar/Foreigners'), (11, '11 - Chinese'), (12, '12 - Indian'), (13, '13 - Pakistani'), (14, '14 - Bangladesh'), (15, '15 - Nepal'), (16, '16 - Other Asian'), (17, '17 - Others'), (18, '18 - Not stated')]),
),
migrations.AlterField(
model_name='f201',
name='REL',
field=birth_registration.fields.ReligionField(choices=[(1, '01 - Buddhist'), (2, '02 - Islam'), (3, '03 - Christian'), (4, '04 - Hindu'), (5, '05 - Animist'), (6, '06 - Confucian'), (7, '07 - Sikh'), (8, '08 - Jew'), (9, '09 - Others'), (10, '10 - Not stated')], verbose_name='Religion', validators=[django.core.validators.MaxValueValidator(10)]),
),
migrations.AlterField(
model_name='f201',
name='SEX',
field=models.PositiveSmallIntegerField(help_text='<ul><li>Male = 1 </li><li>Female = 2 </li><li>Not stated = 9</li></ul>', choices=[(1, '01 - Male'), (2, '02 - Female'), (9, '09 - Not stated')], verbose_name='Sex', validators=[django.core.validators.RegexValidator(regex=b'(1|2|9)', message='Incorrect choice. Available choices: 1 - Male, 2 - Female, 9 - Not stated.')]),
),
migrations.AlterField(
model_name='f201',
name='SNER',
field=birth_registration.fields.SerialNoField(help_text='Enter the serial No. directly into the coding column. Watch carefully for the sequence of serial Nos.; if it is not in sequence, make a query.<br/>Code the actual serial No. in full 4 digits: if the serial No. is 1 to 9, code 0001-0009; if 10 to 99, code 0010, 0011, ... 0099; code 0100 and over in full', verbose_name='Serial No. in Registration book', validators=[django.core.validators.MaxValueValidator(99999)]),
),
migrations.AlterField(
model_name='f201',
name='UPCR',
field=birth_registration.fields.TownshipField(verbose_name='Usual place of residence - Township', choices=[('01 - Myitkyina', [(1, '01 - Myitkyina'), (2, '02 - Waingmaw'), (3, '03 - Ingyan Yan'), (4, '04 - Moekaung'), (5, '05 - Moehnyin'), (6, '06 - Phakant'), (7, '07 - Karmine*'), (8, '08 - Ta Naing'), (9, '09 - Chibway'), (10, '10 - Sautlaw')]), ('02 - Bamaw', [(11, '01 - Bamaw'), (12, '02 - Shwegu'), (13, '03 - Moemauk'), (14, '04 - Mansi')]), ('03 - Puta O', [(15, '01 - Puta O'), (16, '02 - Swanprabum'), (17, '03 - Machanbaw'), (18, '04 - Khaunglanphoo'), (19, '05 - Naungmon')]), ('04 - Loikaw', [(20, '01 - Loikaw'), (21, '02 - Dimawsoe'), (22, '03 - Phrusoe'), (23, '04 - Shartaw')]), ('05 - Bawlakhei', [(24, '01 - Bawlakhei'), (25, '02 - Pharsaung'), (26, '03 - Meiseit')]), ('06 - Pha An', [(27, '01 - Pha An'), (28, '02 - Hlaingbweit'), (29, '03 - Pharpon'), (30, '04 - Thandaung'), (31, '05 - Thandaungkyee')]), ('07 - Myawaddy', [(32, '01 - Myawaddy')]), ('08 - Kawtkaraik', [(33, '01 - Kawtkaraik'), (34, '02 - Kyarinseikkyi'), (35, '03 - Phayathonesu*'), (36, '04 - Kyoandoe')]), ('09 - Phalan', [(37, '01 - Phalan'), (38, '02 - Hakha'), (39, '03 - Htantalan'), (40, '04 - Teetain'), (41, '05 - Tunzan')]), ('10 - Mintatt', [(42, '01 - Mintatt'), (43, '02 - Matupi'), (44, '03 - Kanpetlet'), (45, '04 - Paletwa')]), ('11 - Sagaing', [(46, '01 - Sagaing'), (47, '02 - Myinmu'), (48, '03 - Myaung')]), ('12 - Shwebo', [(49, '01 - Shwebo'), (50, '02 - Khin Oo'), (51, '03 - Wetlet'), (52, '04 - Kantbalu'), (53, '05 - Kyunhla'), (54, '06 - Yay Oo'), (55, '07 - Dipeiyin'), (56, '08 - Tantsei')]), ('13 - Monywar', [(57, '01 - Monywar'), (58, '02 - Butalin'), (59, '03 - Ahyartaw'), (60, '04 - Chaung Oo'), (61, '05 - Yinmarbin'), (62, '06 - Kani'), (63, '07 - Salingyee'), (64, '08 - Palei')]), ('14 - Kathar', [(65, '01 - Kathar'), (66, '02 - Indaw'), (67, '03 - Hteechaink'), (68, '04 - Bamauk'), (69, '05 - Kawlin'), (70, '06 - Wuntho'), (71, '07 - Pinleibu')]), ('15 - Kalay', 
[(72, '01 - Kalay'), (73, '02 - Kalaywa'), (74, '03 - Minkin')]), ('16 - Tamu', [(75, '01 - Tamu')]), ('17 - Mawlaik', [(76, '01 - Mawlaik'), (77, '02 - Phaungpyin')]), ('18 - Khantee', [(78, '01 - Khantee'), (79, '02 - Hoamalin'), (80, '03 - Layshee'), (81, '04 - Lahei'), (82, '05 - Nanyon')]), ('19 - Dawei', [(83, '01 - Dawei'), (84, '02 - Launglon'), (85, '03 - Thayetchaung'), (86, '04 - Yayphyu')]), ('20 - Myeik', [(87, '01 - Myeik'), (88, '02 - Kyunsu'), (89, '03 - Pulaw'), (90, '04 - Tanintharyi')]), ('21 - Kautthaung', [(91, '01 - Kautthaung'), (92, '02 - Boatpyin')]), ('22 - Bago', [(93, '01 - Bago'), (94, '02 - Thanatpin'), (95, '03 - Kawa'), (96, '04 - Waw'), (97, '05 - Nyaunglaybin'), (98, '06 - Madauk*'), (99, '07 - Pyuntanzar*'), (100, '08 - Kyauktaga'), (101, '09 - Peinweikone*'), (102, '10 - Daik Oo'), (103, '11 - Shwekyin')]), ('23 - Pyay', [(104, '01 - Pyay'), (105, '02 - Pauk Khaung'), (106, '03 - Padaung'), (107, '04 - Paungtei'), (108, '05 - Theikone'), (109, '06 - Shwetaung')]), ('24 - Tharyarwaddy', [(110, '01 - Tharyarwaddy'), (111, '02 - Thonesei*'), (112, '03 - Letpandan'), (113, '04 - Minhla'), (114, '05 - Oakpho'), (115, '06 - Zeekone'), (116, '07 - Nattalin'), (117, '08 - Moenyo'), (118, '09 - Kyoetbinkaut')]), ('25 - Taungoo', [(119, '01 - Taungoo'), (120, '02 - Yaytarshay'), (121, '03 - Kyaukyee'), (122, '04 - Phyu'), (123, '05 - Oaktwin'), (124, '06 - Htandabin')]), ('26 - Magway', [(125, '01 - Magway'), (126, '02 - Yenanchaung'), (127, '03 - Chauk'), (128, '04 - Taungtwingyee'), (129, '05 - Myoethit'), (130, '06 - Natmauk')]), ('27 - Minbu', [(131, '01 - Minbu'), (132, '02 - Saku*'), (133, '03 - Pwintphyu'), (134, '04 - Ngaphei'), (135, '05 - Salin'), (136, '06 - Sinphyukyun*'), (137, '07 - Saytoattaya')]), ('28 - Thayet', [(138, '01 - Thayet'), (139, '02 - Minhla'), (140, '03 - Mintone'), (141, '04 - Kanma'), (142, '05 - Aunglan'), (143, '06 - Sinpaungwei')]), ('29 - Pakokku', [(144, '01 - Pakokku'), (145, '02 - Yesagyo'), (146, '03 
- Myaing'), (147, '04 - Pauk'), (148, '05 - Saikphyu')]), ('30 - Gantgaw', [(149, '01 - Gantgaw'), (150, '02 - Hteelin'), (151, '03 - Saw')]), ('31 - Mandalay', [(152, '01 - Aungmyaytharzan'), (153, '02 - Chanayetharzan'), (154, '03 - MahaAungmyay'), (155, '04 - Chanmyatharsi'), (156, '05 - Pyigyeetakhun'), (157, '06 - Amarapura'), (158, '07 - Myitnge*'), (159, '08 - Patheingyee')]), ('32 - Pyin Oo Lwin', [(160, '01 - Pyin Oo Lwin'), (161, '02 - Madayar'), (162, '03 - Sintkuu'), (163, '04 - Moegauk'), (164, '05 - Thabaikkyin')]), ('33 - Kyaukse', [(165, '01 - Kyaukse'), (166, '02 - Sintkai'), (167, '03 - Myitthar'), (168, '04 - Tadaoo')]), ('34 - Myingyan', [(169, '01 - Myingyan'), (170, '02 - Taungthar'), (171, '03 - Nahtoegyee'), (172, '04 - Kyaukbadaung'), (173, '05 - Nganzun')]), ('35 - Nyaung Oo', [(174, '01 - Nyaung Oo'), (175, '02 - Bagan*'), (176, '03 - Ngatharauk*')]), ('36 - Yameithinn', [(177, '01 - Yameithinn'), (178, '02 - Pyawbwei'), (179, '03 - Tatkone'), (180, '04 - Pyinmana'), (181, '05 - Leiway')]), ('37 - Meikhtila', [(182, '01 - Meikhtila'), (183, '02 - Mahlaing'), (184, '03 - Tharzi'), (185, '04 - Wantwin')]), ('38 - Mawlamyaing', [(186, '01 - Mawlamyaing'), (187, '02 - Kyaikmayaw'), (188, '03 - Chaungzon'), (189, '04 - Thanphyuzayat'), (190, '05 - Kyaikkhami*'), (191, '06 - Mudon'), (192, '07 - Yay')]), ('39 - Thahton', [(193, '01 - Thahton'), (194, '02 - Paung'), (195, '03 - Kyaikhto'), (196, '04 - Beelin')]), ('40 - Sittwe', [(197, '01 - Sittwe'), (198, '02 - Poannakyun'), (199, '03 - Myauk Oo'), (200, '04 - Kyauktaw'), (201, '05 - Minbya'), (202, '06 - Pauktaw'), (203, '07 - Myaybon')]), ('41 - Maungdaw', [(204, '01 - Maungdaw')]), ('42 - Buthidaung', [(205, '01 - Buthidaung'), (206, '02 - Rathedaung')]), ('43 - Kyaukphyu', [(207, '01 - Kyaukphyu'), (208, '02 - Man Aung'), (209, '03 - Ranbyei'), (210, '04 - Ann')]), ('44 - Thandwe', [(211, '01 - Thandwe'), (212, '02 - Taungkauk'), (213, '03 - Gwa')]), ('45 - Ahshaytbine (East)', [(214, '01 
- Thingankyun'), (215, '02 - Yankin'), (216, '03 - Taung Okkalapa'), (217, '04 - Myauk Okkalapa'), (218, '05 - Tharketa'), (219, '06 - Dawbon'), (220, '07 - Tarmwe'), (221, '08 - Pazuntaung'), (222, '09 - Botahtaung'), (223, '10 - Dagon Myothit Taung (South)'), (224, '11 - Dagon Myothit Myauk (North)'), (225, '12 - Dagon Myothit Ahshayt (East)'), (226, '13 - Dagon Myothit (Seikkan)'), (227, '14 - Mingalataungnyunt')]), ('46 - Ahnoutpine (West)', [(228, '01 - Kyauktada'), (229, '02 - Panbedan'), (230, '03 - Lanmadaw'), (231, '04 - Lathar'), (232, '05 - Ahlon'), (233, '06 - Kyeemyindine'), (234, '07 - Sanchaung'), (235, '08 - Hlaing'), (236, '09 - Kamaryut'), (237, '10 - Mayangone'), (238, '11 - Dagon'), (239, '12 - Bahan'), (240, '13 - Seikkan')]), ('47 - Taungbine (South)', [(241, '01 - Thanhlyin'), (242, '02 - Kyauktan'), (243, '03 - Thonekhwa'), (244, '04 - Khayan'), (245, '05 - Tonte'), (246, '06 - Kauthmu'), (247, '07 - Kunchangone'), (248, '08 - Dala'), (249, '09 - Seikkyee'), (250, '10 - Khanaungto'), (251, '11 - Kokoe Island')]), ('48 - Myaukpine (North)', [(252, '01 - Insein'), (253, '02 - Mingalardon'), (254, '03 - Htaunkkyant*'), (255, '04 - Hmawbi'), (256, '05 - Hlegu'), (257, '06 - Tiakkyee'), (258, '07 - Oakkan*'), (259, '08 - Htantabin'), (260, '09 - Shwepyithar'), (261, '10 - Hlaingtharyar'), (262, '11 - Ahphyauk*')]), ('49 - Taungyi', [(263, '01 - Taungyi'), (264, '02 - Ayetharyar*'), (265, '03 - Hopone'), (266, '04 - Nyaungshwe'), (267, '05 - Sisaing'), (268, '06 - Kalaw'), (269, '07 - Aungban*'), (270, '08 - Pindaya'), (271, '09 - Ywarngan'), (272, '10 - Yatsauk'), (273, '11 - Pinlaung'), (274, '12 - Phekhoan')]), ('50 - Loilin', [(275, '01 - Loilin'), (276, '02 - Pinlon'), (277, '03 - Leichar'), (278, '04 - Nantsam(South)'), (279, '05 - Kunhein'), (280, '06 - Moenei'), (281, '07 - Linkhay'), (282, '08 - Maukmei'), (283, '09 - Minepan'), (284, '10 - Kyaythee'), (285, '11 - Minekaing'), (286, '12 - Mineshu')]), ('51 - Lahsio', [(287, '01 - Lashio'), (288, '02 - Theinni'), (289, '03 - Mineyei'), (290, '04 - Tantyan'), (291, '05 - Minephant'), (292, '06 - Panyang'), (293, '07 - Narphan'), (294, '08 - Panwaing'), (295, '09 - Minemaw'), (296, '10 - Pansan (Pankhan)')]), ('52 - Muse', [(297, '01 - Muse'), (298, '02 - Nantkhan'), (299, '03 - Kutkhaing'), (300, '04 - Monkoe'), (301, '05 - Kyukoak')]), ('53 - Kyaukmei', [(302, '01 - Kyaukmei'), (303, '02 - Naungcho'), (304, '03 - Thibaw'), (305, '04 - Namtu'), (306, '05 - Nantsam(North)'), (307, '06 - Moemaik'), (308, '07 - Mabain'), (309, '08 - Mantoan')]), ('54 - Kunloan', [(310, '01 - Kunloan'), (311, '02 - Hopan')]), ('55 - Laukkaing', [(312, '01 - Laukkaing'), (313, '02 - Chinshwehaw*'), (314, '03 - Koankyan')]), ('56 - Kyaington', [(315, '01 - Kyaington'), (316, '02 - Minekhat'), (317, '03 - Mineyang'), (318, '04 - Minelar'), (319, '05 - Metman')]), ('57 - Minesatt', [(320, '01 - Minesatt'), (321, '02 - Minepyinn'), (322, '03 - Minetoan')]), ('58 - Tachilaik', [(323, '01 - Tachilaik')]), ('59 - Minephyat', [(324, '01 - Minephyat'), (325, '02 - Mineyaung')]), ('60 - Pathein', [(326, '01 - Pathein'), (327, '02 - Kangyidaunt'), (328, '03 - Tharpaung'), (329, '04 - Ngaputaw'), (330, '05 - Kyoanpyaw'), (331, '06 - Yaykyi'), (332, '07 - Ngathaingchaung'), (333, '08 - Kyaungkone'), (334, '09 - Haigyikyun')]), ('61 - Hinthada', [(335, '01 - Hinthada'), (336, '02 - Zalun'), (337, '03 - Laymyethnar'), (338, '04 - Myan Aung'), (339, '05 - Ka Naung'), (340, '06 - Kyankhin'), (341, '07 - Ingapu')]), ('62 - Myaungmya', [(342, '01 - Myaungmya'), (343, '02 - Ainmei'), (344, '03 - Laputta'), (345, '04 - Warkhema'), (346, '05 - Mawlamyaingkyun')]), ('63 - Ma U Bin', [(347, '01 - Ma U Bin'), (348, '02 - Pantanaw'), (349, '03 - Nyaungtone'), (350, '04 - Danubyu')]), ('64 - Phyarpon', [(351, '01 - Phyarpon'), (352, '02 - Bogalay'), (353, '03 - Kyaiklatt'), (354, '04 - Daydayei')]), ('65 - Zayarthiri', [(355, '01 - Zayarthiri')]), ('66 - Dakhenathiri', [(356, '01 - Dakhenathiri')]), ('67 - Oktayathiri', [(357, '01 - Oktayathiri')]), ('68 - Potebathiri', [(358, '01 - Potebathiri')]), ('69 - Zabuthiri', [(359, '01 - Zabuthiri')]), ('70 - Tatkon', [(360, '01 - Tatkon')]), ('71 - Pyinmana', [(361, '01 - Pyinmana')]), ('72 - Lewe', [(362, '01 - Lewe')])]),
),
migrations.AlterField(
model_name='f201',
name='UPRS',
field=birth_registration.fields.StateDivisionField(choices=[(1, '01 - Kachin'), (2, '02 - Kayh'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Sagaing'), (6, '06 - Tanintharyi'), (7, '07 - Bago'), (8, '08 - Magway'), (9, '09 - Mandalay'), (10, '10 - Mon'), (11, '11 - Rakhine'), (12, '12 - Yangon'), (13, '13 - Shan'), (14, '14 - Ayyarwaddy'), (15, '15 - NayPyiTaw')], verbose_name='Usual place of residence - State/Division', validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='f201',
name='UPRS1',
field=birth_registration.fields.DistrictField(choices=[('01 - Kachin', [(1, '01 - Myitkyina'), (2, '02 - Bamaw'), (3, '03 - Puta O')]), ('02 - Kayh', [(4, '04 - Loikaw'), (5, '05 - Bawlakhei')]), ('03 - Kayin', [(6, '06 - Pha An'), (7, '07 - Myawaddy'), (8, '08 - Kawtkaraik')]), ('04 - Chin', [(9, '09 - Phalan'), (10, '10 - Mintatt')]), ('05 - Sagaing', [(11, '11 - Sagaing'), (12, '12 - Shwebo'), (13, '13 - Monywar'), (14, '14 - Kathar'), (15, '15 - Kalay'), (16, '16 - Tamu'), (17, '17 - Mawlaik'), (18, '18 - Khantee')]), ('06 - Tanintharyi', [(19, '19 - Dawei'), (20, '20 - Myeik'), (21, '21 - Kautthaung')]), ('07 - Bago', [(22, '22 - Bago'), (23, '23 - Pyay'), (24, '24 - Tharyarwaddy'), (25, '25 - Taungoo')]), ('08 - Magway', [(26, '26 - Magway'), (27, '27 - Minbu'), (28, '28 - Thayet'), (29, '29 - Pakokku'), (30, '30 - Gantgaw')]), ('09 - Mandalay', [(31, '31 - Mandalay'), (32, '32 - Pyin Oo Lwin'), (33, '33 - Kyaukse'), (34, '34 - Myingyan'), (35, '35 - Nyaung Oo'), (36, '36 - Yameithinn'), (37, '37 - Meikhtila')]), ('10 - Mon', [(38, '38 - Mawlamyaing'), (39, '39 - Thahton')]), ('11 - Rakhine', [(40, '40 - Sittwe'), (41, '41 - Maungdaw'), (42, '42 - Buthidaung'), (43, '43 - Kyaukphyu'), (44, '44 - Thandwe')]), ('12 - Yangon', [(45, '45 - Ahshaytbine (East)'), (46, '46 - Ahnoutpine (West)'), (47, '47 - Taungbine (South)'), (48, '48 - Myaukpine (North)')]), ('13 - Shan', [(49, '49 - Taungyi'), (50, '50 - Loilin'), (51, '51 - Lahsio'), (52, '52 - Muse'), (53, '53 - Kyaukmei'), (54, '54 - Kunloan'), (55, '55 - Laukkaing'), (56, '56 - Kyaington'), (57, '57 - Minesatt'), (58, '58 - Tachilaik'), (59, '59 - Minephyat')]), ('14 - Ayyarwaddy', [(60, '60 - Pathein'), (61, '61 - Hinthada'), (62, '62 - Myaungmya'), (63, '63 - Ma U Bin'), (64, '64 - Phyarpon')]), ('15 - NayPyiTaw', [(65, '65 - Zayarthiri'), (66, '66 - Dakhenathiri'), (67, '67 - Oktayathiri'), (68, '68 - Potebathiri'), (69, '69 - Zabuthiri'), (70, '70 - Tatkon'), (71, '71 - Pyinmana'), (72, '72 - Lewe')])], 
verbose_name='Usual place of residence - District', validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='rhc',
name='DIS',
field=birth_registration.fields.DistrictField(choices=[('01 - Kachin', [(1, '01 - Myitkyina'), (2, '02 - Bamaw'), (3, '03 - Puta O')]), ('02 - Kayh', [(4, '04 - Loikaw'), (5, '05 - Bawlakhei')]), ('03 - Kayin', [(6, '06 - Pha An'), (7, '07 - Myawaddy'), (8, '08 - Kawtkaraik')]), ('04 - Chin', [(9, '09 - Phalan'), (10, '10 - Mintatt')]), ('05 - Sagaing', [(11, '11 - Sagaing'), (12, '12 - Shwebo'), (13, '13 - Monywar'), (14, '14 - Kathar'), (15, '15 - Kalay'), (16, '16 - Tamu'), (17, '17 - Mawlaik'), (18, '18 - Khantee')]), ('06 - Tanintharyi', [(19, '19 - Dawei'), (20, '20 - Myeik'), (21, '21 - Kautthaung')]), ('07 - Bago', [(22, '22 - Bago'), (23, '23 - Pyay'), (24, '24 - Tharyarwaddy'), (25, '25 - Taungoo')]), ('08 - Magway', [(26, '26 - Magway'), (27, '27 - Minbu'), (28, '28 - Thayet'), (29, '29 - Pakokku'), (30, '30 - Gantgaw')]), ('09 - Mandalay', [(31, '31 - Mandalay'), (32, '32 - Pyin Oo Lwin'), (33, '33 - Kyaukse'), (34, '34 - Myingyan'), (35, '35 - Nyaung Oo'), (36, '36 - Yameithinn'), (37, '37 - Meikhtila')]), ('10 - Mon', [(38, '38 - Mawlamyaing'), (39, '39 - Thahton')]), ('11 - Rakhine', [(40, '40 - Sittwe'), (41, '41 - Maungdaw'), (42, '42 - Buthidaung'), (43, '43 - Kyaukphyu'), (44, '44 - Thandwe')]), ('12 - Yangon', [(45, '45 - Ahshaytbine (East)'), (46, '46 - Ahnoutpine (West)'), (47, '47 - Taungbine (South)'), (48, '48 - Myaukpine (North)')]), ('13 - Shan', [(49, '49 - Taungyi'), (50, '50 - Loilin'), (51, '51 - Lahsio'), (52, '52 - Muse'), (53, '53 - Kyaukmei'), (54, '54 - Kunloan'), (55, '55 - Laukkaing'), (56, '56 - Kyaington'), (57, '57 - Minesatt'), (58, '58 - Tachilaik'), (59, '59 - Minephyat')]), ('14 - Ayyarwaddy', [(60, '60 - Pathein'), (61, '61 - Hinthada'), (62, '62 - Myaungmya'), (63, '63 - Ma U Bin'), (64, '64 - Phyarpon')]), ('15 - NayPyiTaw', [(65, '65 - Zayarthiri'), (66, '66 - Dakhenathiri'), (67, '67 - Oktayathiri'), (68, '68 - Potebathiri'), (69, '69 - Zabuthiri'), (70, '70 - Tatkon'), (71, '71 - Pyinmana'), (72, '72 - Lewe')])], 
validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='rhc',
name='ST_DV',
field=birth_registration.fields.StateDivisionField(choices=[(1, '01 - Kachin'), (2, '02 - Kayh'), (3, '03 - Kayin'), (4, '04 - Chin'), (5, '05 - Sagaing'), (6, '06 - Tanintharyi'), (7, '07 - Bago'), (8, '08 - Magway'), (9, '09 - Mandalay'), (10, '10 - Mon'), (11, '11 - Rakhine'), (12, '12 - Yangon'), (13, '13 - Shan'), (14, '14 - Ayyarwaddy'), (15, '15 - NayPyiTaw')], validators=[birth_registration.validators.validate_2digits]),
),
migrations.AlterField(
model_name='rhc',
name='TWN',
field=birth_registration.fields.TownshipField(choices=[('01 - Myitkyina', [(1, '01 - Myitkyina'), (2, '02 - Waingmaw'), (3, '03 - Ingyan Yan'), (4, '04 - Moekaung'), (5, '05 - Moehnyin'), (6, '06 - Phakant'), (7, '07 - Karmine*'), (8, '08 - Ta Naing'), (9, '09 - Chibway'), (10, '10 - Sautlaw')]), ('02 - Bamaw', [(11, '01 - Bamaw'), (12, '02 - Shwegu'), (13, '03 - Moemauk'), (14, '04 - Mansi')]), ('03 - Puta O', [(15, '01 - Puta O'), (16, '02 - Swanprabum'), (17, '03 - Machanbaw'), (18, '04 - Khaunglanphoo'), (19, '05 - Naungmon')]), ('04 - Loikaw', [(20, '01 - Loikaw'), (21, '02 - Dimawsoe'), (22, '03 - Phrusoe'), (23, '04 - Shartaw')]), ('05 - Bawlakhei', [(24, '01 - Bawlakhei'), (25, '02 - Pharsaung'), (26, '03 - Meiseit')]), ('06 - Pha An', [(27, '01 - Pha An'), (28, '02 - Hlaingbweit'), (29, '03 - Pharpon'), (30, '04 - Thandaung'), (31, '05 - Thandaungkyee')]), ('07 - Myawaddy', [(32, '01 - Myawaddy')]), ('08 - Kawtkaraik', [(33, '01 - Kawtkaraik'), (34, '02 - Kyarinseikkyi'), (35, '03 - Phayathonesu*'), (36, '04 - Kyoandoe')]), ('09 - Phalan', [(37, '01 - Phalan'), (38, '02 - Hakha'), (39, '03 - Htantalan'), (40, '04 - Teetain'), (41, '05 - Tunzan')]), ('10 - Mintatt', [(42, '01 - Mintatt'), (43, '02 - Matupi'), (44, '03 - Kanpetlet'), (45, '04 - Paletwa')]), ('11 - Sagaing', [(46, '01 - Sagaing'), (47, '02 - Myinmu'), (48, '03 - Myaung')]), ('12 - Shwebo', [(49, '01 - Shwebo'), (50, '02 - Khin Oo'), (51, '03 - Wetlet'), (52, '04 - Kantbalu'), (53, '05 - Kyunhla'), (54, '06 - Yay Oo'), (55, '07 - Dipeiyin'), (56, '08 - Tantsei')]), ('13 - Monywar', [(57, '01 - Monywar'), (58, '02 - Butalin'), (59, '03 - Ahyartaw'), (60, '04 - Chaung Oo'), (61, '05 - Yinmarbin'), (62, '06 - Kani'), (63, '07 - Salingyee'), (64, '08 - Palei')]), ('14 - Kathar', [(65, '01 - Kathar'), (66, '02 - Indaw'), (67, '03 - Hteechaink'), (68, '04 - Bamauk'), (69, '05 - Kawlin'), (70, '06 - Wuntho'), (71, '07 - Pinleibu')]), ('15 - Kalay', [(72, '01 - Kalay'), (73, '02 - Kalaywa'), (74, '03 
- Minkin')]), ('16 - Tamu', [(75, '01 - Tamu')]), ('17 - Mawlaik', [(76, '01 - Mawlaik'), (77, '02 - Phaungpyin')]), ('18 - Khantee', [(78, '01 - Khantee'), (79, '02 - Hoamalin'), (80, '03 - Layshee'), (81, '04 - Lahei'), (82, '05 - Nanyon')]), ('19 - Dawei', [(83, '01 - Dawei'), (84, '02 - Launglon'), (85, '03 - Thayetchaung'), (86, '04 - Yayphyu')]), ('20 - Myeik', [(87, '01 - Myeik'), (88, '02 - Kyunsu'), (89, '03 - Pulaw'), (90, '04 - Tanintharyi')]), ('21 - Kautthaung', [(91, '01 - Kautthaung'), (92, '02 - Boatpyin')]), ('22 - Bago', [(93, '01 - Bago'), (94, '02 - Thanatpin'), (95, '03 - Kawa'), (96, '04 - Waw'), (97, '05 - Nyaunglaybin'), (98, '06 - Madauk*'), (99, '07 - Pyuntanzar*'), (100, '08 - Kyauktaga'), (101, '09 - Peinweikone*'), (102, '10 - Daik Oo'), (103, '11 - Shwekyin')]), ('23 - Pyay', [(104, '01 - Pyay'), (105, '02 - Pauk Khaung'), (106, '03 - Padaung'), (107, '04 - Paungtei'), (108, '05 - Theikone'), (109, '06 - Shwetaung')]), ('24 - Tharyarwaddy', [(110, '01 - Tharyarwaddy'), (111, '02 - Thonesei*'), (112, '03 - Letpandan'), (113, '04 - Minhla'), (114, '05 - Oakpho'), (115, '06 - Zeekone'), (116, '07 - Nattalin'), (117, '08 - Moenyo'), (118, '09 - Kyoetbinkaut')]), ('25 - Taungoo', [(119, '01 - Taungoo'), (120, '02 - Yaytarshay'), (121, '03 - Kyaukyee'), (122, '04 - Phyu'), (123, '05 - Oaktwin'), (124, '06 - Htandabin')]), ('26 - Magway', [(125, '01 - Magway'), (126, '02 - Yenanchaung'), (127, '03 - Chauk'), (128, '04 - Taungtwingyee'), (129, '05 - Myoethit'), (130, '06 - Natmauk')]), ('27 - Minbu', [(131, '01 - Minbu'), (132, '02 - Saku*'), (133, '03 - Pwintphyu'), (134, '04 - Ngaphei'), (135, '05 - Salin'), (136, '06 - Sinphyukyun*'), (137, '07 - Saytoattaya')]), ('28 - Thayet', [(138, '01 - Thayet'), (139, '02 - Minhla'), (140, '03 - Mintone'), (141, '04 - Kanma'), (142, '05 - Aunglan'), (143, '06 - Sinpaungwei')]), ('29 - Pakokku', [(144, '01 - Pakokku'), (145, '02 - Yesagyo'), (146, '03 - Myaing'), (147, '04 - Pauk'), (148, '05 - 
Saikphyu')]), ('30 - Gantgaw', [(149, '01 - Gantgaw'), (150, '02 - Hteelin'), (151, '03 - Saw')]), ('31 - Mandalay', [(152, '01 - Aungmyaytharzan'), (153, '02 - Chanayetharzan'), (154, '03 - MahaAungmyay'), (155, '04 - Chanmyatharsi'), (156, '05 - Pyigyeetakhun'), (157, '06 - Amarapura'), (158, '07 - Myitnge*'), (159, '08 - Patheingyee')]), ('32 - Pyin Oo Lwin', [(160, '01 - Pyin Oo Lwin'), (161, '02 - Madayar'), (162, '03 - Sintkuu'), (163, '04 - Moegauk'), (164, '05 - Thabaikkyin')]), ('33 - Kyaukse', [(165, '01 - Kyaukse'), (166, '02 - Sintkai'), (167, '03 - Myitthar'), (168, '04 - Tadaoo')]), ('34 - Myingyan', [(169, '01 - Myingyan'), (170, '02 - Taungthar'), (171, '03 - Nahtoegyee'), (172, '04 - Kyaukbadaung'), (173, '05 - Nganzun')]), ('35 - Nyaung Oo', [(174, '01 - Nyaung Oo'), (175, '02 - Bagan*'), (176, '03 - Ngatharauk*')]), ('36 - Yameithinn', [(177, '01 - Yameithinn'), (178, '02 - Pyawbwei'), (179, '03 - Tatkone'), (180, '04 - Pyinmana'), (181, '05 - Leiway')]), ('37 - Meikhtila', [(182, '01 - Meikhtila'), (183, '02 - Mahlaing'), (184, '03 - Tharzi'), (185, '04 - Wantwin')]), ('38 - Mawlamyaing', [(186, '01 - Mawlamyaing'), (187, '02 - Kyaikmayaw'), (188, '03 - Chaungzon'), (189, '04 - Thanphyuzayat'), (190, '05 - Kyaikkhami*'), (191, '06 - Mudon'), (192, '07 - Yay')]), ('39 - Thahton', [(193, '01 - Thahton'), (194, '02 - Paung'), (195, '03 - Kyaikhto'), (196, '04 - Beelin')]), ('40 - Sittwe', [(197, '01 - Sittwe'), (198, '02 - Poannakyun'), (199, '03 - Myauk Oo'), (200, '04 - Kyauktaw'), (201, '05 - Minbya'), (202, '06 - Pauktaw'), (203, '07 - Myaybon')]), ('41 - Maungdaw', [(204, '01 - Maungdaw')]), ('42 - Buthidaung', [(205, '01 - Buthidaung'), (206, '02 - Rathedaung')]), ('43 - Kyaukphyu', [(207, '01 - Kyaukphyu'), (208, '02 - Man Aung'), (209, '03 - Ranbyei'), (210, '04 - Ann')]), ('44 - Thandwe', [(211, '01 - Thandwe'), (212, '02 - Taungkauk'), (213, '03 - Gwa')]), ('45 - Ahshaytbine (East)', [(214, '01 - Thingankyun'), (215, '02 - Yankin'), (216, 
'03 - Taung Okkalapa'), (217, '04 - Myauk Okkalapa'), (218, '05 - Tharketa'), (219, '06 - Dawbon'), (220, '07 - Tarmwe'), (221, '08 - Pazuntaung'), (222, '09 - Botahtaung'), (223, '10 - Dagon Myothit Taung (South)'), (224, '11 - Dagon Myothit Myauk (North)'), (225, '12 - Dagon Myothit Ahshayt (East)'), (226, '13 - Dagon Myothit (Seikkan)'), (227, '14 - Mingalataungnyunt')]), ('46 - Ahnoutpine (West)', [(228, '01 - Kyauktada'), (229, '02 - Panbedan'), (230, '03 - Lanmadaw'), (231, '04 - Lathar'), (232, '05 - Ahlon'), (233, '06 - Kyeemyindine'), (234, '07 - Sanchaung'), (235, '08 - Hlaing'), (236, '09 - Kamaryut'), (237, '10 - Mayangone'), (238, '11 - Dagon'), (239, '12 - Bahan'), (240, '13 - Seikkan')]), ('47 - Taungbine (South)', [(241, '01 - Thanhlyin'), (242, '02 - Kyauktan'), (243, '03 - Thonekhwa'), (244, '04 - Khayan'), (245, '05 - Tonte'), (246, '06 - Kauthmu'), (247, '07 - Kunchangone'), (248, '08 - Dala'), (249, '09 - Seikkyee'), (250, '10 - Khanaungto'), (251, '11 - Kokoe Island')]), ('48 - Myaukpine (North)', [(252, '01 - Insein'), (253, '02 - Mingalardon'), (254, '03 - Htaunkkyant*'), (255, '04 - Hmawbi'), (256, '05 - Hlegu'), (257, '06 - Tiakkyee'), (258, '07 - Oakkan*'), (259, '08 - Htantabin'), (260, '09 - Shwepyithar'), (261, '10 - Hlaingtharyar'), (262, '11 - Ahphyauk*')]), ('49 - Taungyi', [(263, '01 - Taungyi'), (264, '02 - Ayetharyar*'), (265, '03 - Hopone'), (266, '04 - Nyaungshwe'), (267, '05 - Sisaing'), (268, '06 - Kalaw'), (269, '07 - Aungban*'), (270, '08 - Pindaya'), (271, '09 - Ywarngan'), (272, '10 - Yatsauk'), (273, '11 - Pinlaung'), (274, '12 - Phekhoan')]), ('50 - Loilin', [(275, '01 - Loilin'), (276, '02 - Pinlon'), (277, '03 - Leichar'), (278, '04 - Nantsam(South)'), (279, '05 - Kunhein'), (280, '06 - Moenei'), (281, '07 - Linkhay'), (282, '08 - Maukmei'), (283, '09 - Minepan'), (284, '10 - Kyaythee'), (285, '11 - Minekaing'), (286, '12 - Mineshu')]), ('51 - Lahsio', [(287, '01 - Lashio'), (288, '02 - Theinni'), (289, '03 - Mineyei'), (290, '04 - Tantyan'), (291, '05 - Minephant'), (292, '06 - Panyang'), (293, '07 - Narphan'), (294, '08 - Panwaing'), (295, '09 - Minemaw'), (296, '10 - Pansan (Pankhan)')]), ('52 - Muse', [(297, '01 - Muse'), (298, '02 - Nantkhan'), (299, '03 - Kutkhaing'), (300, '04 - Monkoe'), (301, '05 - Kyukoak')]), ('53 - Kyaukmei', [(302, '01 - Kyaukmei'), (303, '02 - Naungcho'), (304, '03 - Thibaw'), (305, '04 - Namtu'), (306, '05 - Nantsam(North)'), (307, '06 - Moemaik'), (308, '07 - Mabain'), (309, '08 - Mantoan')]), ('54 - Kunloan', [(310, '01 - Kunloan'), (311, '02 - Hopan')]), ('55 - Laukkaing', [(312, '01 - Laukkaing'), (313, '02 - Chinshwehaw*'), (314, '03 - Koankyan')]), ('56 - Kyaington', [(315, '01 - Kyaington'), (316, '02 - Minekhat'), (317, '03 - Mineyang'), (318, '04 - Minelar'), (319, '05 - Metman')]), ('57 - Minesatt', [(320, '01 - Minesatt'), (321, '02 - Minepyinn'), (322, '03 - Minetoan')]), ('58 - Tachilaik', [(323, '01 - Tachilaik')]), ('59 - Minephyat', [(324, '01 - Minephyat'), (325, '02 - Mineyaung')]), ('60 - Pathein', [(326, '01 - Pathein'), (327, '02 - Kangyidaunt'), (328, '03 - Tharpaung'), (329, '04 - Ngaputaw'), (330, '05 - Kyoanpyaw'), (331, '06 - Yaykyi'), (332, '07 - Ngathaingchaung'), (333, '08 - Kyaungkone'), (334, '09 - Haigyikyun')]), ('61 - Hinthada', [(335, '01 - Hinthada'), (336, '02 - Zalun'), (337, '03 - Laymyethnar'), (338, '04 - Myan Aung'), (339, '05 - Ka Naung'), (340, '06 - Kyankhin'), (341, '07 - Ingapu')]), ('62 - Myaungmya', [(342, '01 - Myaungmya'), (343, '02 - Ainmei'), (344, '03 - Laputta'), (345, '04 - Warkhema'), (346, '05 - Mawlamyaingkyun')]), ('63 - Ma U Bin', [(347, '01 - Ma U Bin'), (348, '02 - Pantanaw'), (349, '03 - Nyaungtone'), (350, '04 - Danubyu')]), ('64 - Phyarpon', [(351, '01 - Phyarpon'), (352, '02 - Bogalay'), (353, '03 - Kyaiklatt'), (354, '04 - Daydayei')]), ('65 - Zayarthiri', [(355, '01 - Zayarthiri')]), ('66 - Dakhenathiri', [(356, '01 - Dakhenathiri')]), ('67 - Oktayathiri', [(357, '01 - Oktayathiri')]), ('68 - Potebathiri', [(358, '01 - Potebathiri')]), ('69 - Zabuthiri', [(359, '01 - Zabuthiri')]), ('70 - Tatkon', [(360, '01 - Tatkon')]), ('71 - Pyinmana', [(361, '01 - Pyinmana')]), ('72 - Lewe', [(362, '01 - Lewe')])]),
),
]
# tests/device_fixtures.py (zohassadar/netdisc @ 9ce4d5c2b0f30d36e71118ffbe6b7ffd93e0dfc8, MIT)
import yaml
import pytest
import pathlib
DEVICES_PATH = pathlib.Path(__file__).parent
# Build a small fake topology of three devices connected to each other
# d1 Fa1/2 <-> d2 Fa2/1
# d1 Fa1/3 <-> d3 Fa3/1
# d2 Fa2/1 <-> d1 Fa1/2
# d2 Fa2/3 <-> d3 Fa3/2
# d3 Fa3/1 <-> d1 Fa1/3
# d3 Fa3/2 <-> d2 Fa2/3
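The six links listed above are symmetric pairs, and that symmetry can be checked mechanically. The sketch below is illustrative only (the `TOPOLOGY` dict and `is_symmetric` helper are my own names, not part of these fixtures); it encodes the same links and verifies every link has its mirror:

```python
# Hypothetical encoding of the three-device topology described above.
# The real fixtures load this information from the device_*.yaml files.
TOPOLOGY = {
    ("d1", "Fa1/2"): ("d2", "Fa2/1"),
    ("d1", "Fa1/3"): ("d3", "Fa3/1"),
    ("d2", "Fa2/1"): ("d1", "Fa1/2"),
    ("d2", "Fa2/3"): ("d3", "Fa3/2"),
    ("d3", "Fa3/1"): ("d1", "Fa1/3"),
    ("d3", "Fa3/2"): ("d2", "Fa2/3"),
}

def is_symmetric(topology):
    """Every link A->B must have the mirror link B->A."""
    return all(topology.get(remote) == local
               for local, remote in topology.items())
```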
def open_n_yaml_safe_load(filename):
with open(filename) as f:
return yaml.safe_load(f)
DEVICE_1 = DEVICES_PATH / "device_1.yaml"
DEVICE_2 = DEVICES_PATH / "device_2.yaml"
DEVICE_3 = DEVICES_PATH / "device_3.yaml"
@pytest.fixture
def device_1_partial():
return open_n_yaml_safe_load(DEVICE_1)["device_1"]
@pytest.fixture
def device_2_partial():
return open_n_yaml_safe_load(DEVICE_2)["device_2"]
@pytest.fixture
def device_3_partial():
return open_n_yaml_safe_load(DEVICE_3)["device_3"]
@pytest.fixture
def device_1_interface_2():
return open_n_yaml_safe_load(DEVICE_1)["device_1_interface_2"]
@pytest.fixture
def device_1_interface_3():
return open_n_yaml_safe_load(DEVICE_1)["device_1_interface_3"]
@pytest.fixture
def device_1_interface_3_updated():
return open_n_yaml_safe_load(DEVICE_1)["device_1_interface_3_updated"]
@pytest.fixture
def device_2_interface_1():
return open_n_yaml_safe_load(DEVICE_2)["device_2_interface_1"]
@pytest.fixture
def device_2_interface_3():
return open_n_yaml_safe_load(DEVICE_2)["device_2_interface_3"]
@pytest.fixture
def device_3_interface_1():
return open_n_yaml_safe_load(DEVICE_3)["device_3_interface_1"]
@pytest.fixture
def device_3_interface_2():
return open_n_yaml_safe_load(DEVICE_3)["device_3_interface_2"]
@pytest.fixture
def device_1_neighbor_2():
return open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_2"]
@pytest.fixture
def device_1_neighbor_3():
return open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_3"]
@pytest.fixture
def device_2_neighbor_1():
return open_n_yaml_safe_load(DEVICE_2)["device_2_neighbor_1"]
@pytest.fixture
def device_2_neighbor_3():
return open_n_yaml_safe_load(DEVICE_2)["device_2_neighbor_3"]
@pytest.fixture
def device_3_neighbor_1():
return open_n_yaml_safe_load(DEVICE_3)["device_3_neighbor_1"]
@pytest.fixture
def device_3_neighbor_2():
return open_n_yaml_safe_load(DEVICE_3)["device_3_neighbor_2"]
@pytest.fixture
def device_1_interfaces():
return [
open_n_yaml_safe_load(DEVICE_1)["device_1_interface_2"],
open_n_yaml_safe_load(DEVICE_1)["device_1_interface_3"],
]
@pytest.fixture
def device_2_interfaces():
return [
open_n_yaml_safe_load(DEVICE_2)["device_2_interface_1"],
open_n_yaml_safe_load(DEVICE_2)["device_2_interface_3"],
]
@pytest.fixture
def device_3_interfaces():
return [
        open_n_yaml_safe_load(DEVICE_3)["device_3_interface_1"],
        open_n_yaml_safe_load(DEVICE_3)["device_3_interface_2"],
]
@pytest.fixture
def device_1_neighbors():
return [
open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_2"],
open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_3"],
]
@pytest.fixture
def device_2_neighbors():
return [
open_n_yaml_safe_load(DEVICE_2)["device_2_neighbor_1"],
open_n_yaml_safe_load(DEVICE_2)["device_2_neighbor_3"],
]
@pytest.fixture
def device_3_neighbors():
return [
        open_n_yaml_safe_load(DEVICE_3)["device_3_neighbor_1"],
        open_n_yaml_safe_load(DEVICE_3)["device_3_neighbor_2"],
]
@pytest.fixture
def device_1_loaded():
device = open_n_yaml_safe_load(DEVICE_1)["device_1"]
interfaces = [
open_n_yaml_safe_load(DEVICE_1)["device_1_interface_2"],
open_n_yaml_safe_load(DEVICE_1)["device_1_interface_3"],
]
neighbors = [
open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_2"],
open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_3"],
]
device["interfaces"].extend(interfaces)
device["neighbors"].extend(neighbors)
return device
@pytest.fixture
def device_1_loaded_with_update():
device = open_n_yaml_safe_load(DEVICE_1)["device_1"]
interfaces = [
open_n_yaml_safe_load(DEVICE_1)["device_1_interface_2"],
open_n_yaml_safe_load(DEVICE_1)["device_1_interface_3_updated"],
]
neighbors = [
open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_2"],
open_n_yaml_safe_load(DEVICE_1)["device_1_neighbor_3"],
]
device["interfaces"].extend(interfaces)
device["neighbors"].extend(neighbors)
return device
@pytest.fixture
def device_2_loaded():
device = open_n_yaml_safe_load(DEVICE_2)["device_2"]
interfaces = [
open_n_yaml_safe_load(DEVICE_2)["device_2_interface_1"],
open_n_yaml_safe_load(DEVICE_2)["device_2_interface_3"],
]
neighbors = [
open_n_yaml_safe_load(DEVICE_2)["device_2_neighbor_1"],
open_n_yaml_safe_load(DEVICE_2)["device_2_neighbor_3"],
]
device["interfaces"].extend(interfaces)
device["neighbors"].extend(neighbors)
return device
@pytest.fixture
def device_3_loaded():
device = open_n_yaml_safe_load(DEVICE_3)["device_3"]
interfaces = [
open_n_yaml_safe_load(DEVICE_3)["device_3_interface_1"],
open_n_yaml_safe_load(DEVICE_3)["device_3_interface_2"],
]
neighbors = [
open_n_yaml_safe_load(DEVICE_3)["device_3_neighbor_1"],
open_n_yaml_safe_load(DEVICE_3)["device_3_neighbor_2"],
]
device["interfaces"].extend(interfaces)
device["neighbors"].extend(neighbors)
return device
# bokeh/tests/test_colors.py (tswicegood/bokeh @ 2e74be5c9288306896e8c76af2e14a8c7513e0e3, BSD-3-Clause)
import unittest
import bokeh.colors as colors
class TestColor(unittest.TestCase):
def test_basic(self):
c = colors.Color()
def test_abstract(self):
c = colors.Color()
self.assertRaises(NotImplementedError, c.to_css)
self.assertRaises(NotImplementedError, c.to_rgb)
self.assertRaises(NotImplementedError, c.to_hsl)
self.assertRaises(NotImplementedError, c.from_rgb, "foo")
self.assertRaises(NotImplementedError, c.from_hsl,"foo")
def test_repr(self):
c = colors.Color()
self.assertRaises(NotImplementedError, repr, c)
def test_clamp(self):
self.assertEqual(colors.Color.clamp(10), 10)
self.assertEqual(colors.Color.clamp(10, 20), 10)
self.assertEqual(colors.Color.clamp(10, 5), 5)
self.assertEqual(colors.Color.clamp(-10), 0)
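The four assertions above pin down the contract of `Color.clamp`: the value is floored at 0 and, when a second argument is given, capped at that maximum. A minimal standalone sketch of that contract (inferred from the test expectations, not bokeh's actual implementation):

```python
def clamp(value, maximum=None):
    # Floor at zero; cap at `maximum` only when one is supplied.
    value = max(value, 0)
    if maximum is not None:
        value = min(value, maximum)
    return value
```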
def test_lighten(self):
c = colors.HSL(10, 0.2, 0.2, 0.2)
c1 = c.lighten(0.2)
self.assertEqual(c1.a, 0.2)
self.assertEqual(c1.h, 10)
self.assertEqual(c1.s, 0.2)
self.assertEqual(c1.l, 0.4)
def test_darken(self):
c = colors.HSL(10, 0.2, 0.2, 0.2)
c1 = c.darken(0.1)
self.assertEqual(c1.a, 0.2)
self.assertEqual(c1.h, 10)
self.assertEqual(c1.s, 0.2)
self.assertEqual(c1.l, 0.1)
c2 = c.darken(0.3)
self.assertEqual(c2.a, 0.2)
self.assertEqual(c2.h, 10)
self.assertEqual(c2.s, 0.2)
self.assertEqual(c2.l, 0)
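test_lighten and test_darken together imply that lighten/darken shift only the lightness channel, clamp it into [0, 1], and leave h, s, and a untouched. A minimal sketch of that behavior, assuming an immutable HSL tuple (illustrative only, not bokeh's class):

```python
from collections import namedtuple

class HSL(namedtuple("HSL", "h s l a")):
    def lighten(self, amount):
        # Raise lightness, clamped to [0, 1]; other channels unchanged.
        return self._replace(l=min(max(self.l + amount, 0.0), 1.0))

    def darken(self, amount):
        # Lower lightness, clamped to [0, 1]; other channels unchanged.
        return self._replace(l=min(max(self.l - amount, 0.0), 1.0))
```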
class TestRGB(unittest.TestCase):
def test_basic(self):
c = colors.RGB(10, 20, 30)
c = colors.RGB(10, 20, 30, 0.3)
def test_repr(self):
c = colors.RGB(10, 20, 30)
self.assertEqual(repr(c), c.to_css())
c = colors.RGB(10, 20, 30, 0.3)
self.assertEqual(repr(c), c.to_css())
def test_to_css(self):
c = colors.RGB(10, 20, 30)
self.assertEqual(c.to_css(), "rgb(10, 20, 30)")
c = colors.RGB(10, 20, 30, 0.3)
self.assertEqual(c.to_css(), "rgba(10, 20, 30, 0.3)")
def test_to_hex(self):
c = colors.RGB(10, 20, 30)
self.assertEqual(c.to_hex(), "#%02X%02X%02X" % (c.r, c.g, c.b))
def test_to_rgb(self):
c = colors.RGB(10, 20, 30)
c2 = c.to_rgb()
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.r, c2.r)
self.assertEqual(c.g, c2.g)
self.assertEqual(c.b, c2.b)
c = colors.RGB(10, 20, 30, 0.1)
c2 = c.to_rgb()
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.r, c2.r)
self.assertEqual(c.g, c2.g)
self.assertEqual(c.b, c2.b)
def test_from_rgb(self):
c = colors.RGB(10, 20, 30)
c2 = c.from_rgb(c)
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.r, c2.r)
self.assertEqual(c.g, c2.g)
self.assertEqual(c.b, c2.b)
c = colors.RGB(10, 20, 30, 0.1)
c2 = c.from_rgb(c)
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.r, c2.r)
self.assertEqual(c.g, c2.g)
self.assertEqual(c.b, c2.b)
class TestHSL(unittest.TestCase):
def test_basic(self):
c = colors.HSL(10, 0.2, 0.3)
c = colors.HSL(10, 0.2, 0.3, 0.3)
def test_repr(self):
c = colors.HSL(10, 0.2, 0.3)
self.assertEqual(repr(c), c.to_css())
c = colors.HSL(10, 0.2, 0.3, 0.3)
self.assertEqual(repr(c), c.to_css())
def test_to_css(self):
c = colors.HSL(10, 0.2, 0.3)
self.assertEqual(c.to_css(), "hsl(10, 20.0%, 30.0%)")
c = colors.HSL(10, 0.2, 0.3, 0.3)
self.assertEqual(c.to_css(), "hsla(10, 20.0%, 30.0%, 0.3)")
def test_to_hsl(self):
c = colors.HSL(10, 0.2, 0.3)
c2 = c.to_hsl()
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.h, c2.h)
self.assertEqual(c.s, c2.s)
self.assertEqual(c.l, c2.l)
c = colors.HSL(10, 0.2, 0.3, 0.1)
c2 = c.to_hsl()
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.h, c2.h)
self.assertEqual(c.s, c2.s)
self.assertEqual(c.l, c2.l)
def test_from_hsl(self):
c = colors.HSL(10, 0.2, 0.3)
c2 = c.from_hsl(c)
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.h, c2.h)
self.assertEqual(c.s, c2.s)
self.assertEqual(c.l, c2.l)
c = colors.HSL(10, 0.2, 0.3, 0.1)
c2 = c.from_hsl(c)
self.assertTrue(c is not c2)
self.assertEqual(c.a, c2.a)
self.assertEqual(c.h, c2.h)
self.assertEqual(c.s, c2.s)
self.assertEqual(c.l, c2.l)
class TestNamedColor(unittest.TestCase):
def test_basic(self):
c = colors.NamedColor("aliceblue", 240, 248, 255)
self.assertEqual(c.name, "aliceblue")
def test_to_css(self):
c = colors.NamedColor("aliceblue", 240, 248, 255)
self.assertEqual(c.to_css(), "aliceblue")
def test_repr(self):
c = colors.NamedColor("aliceblue", 240, 248, 255)
        self.assertEqual(repr(c), c.to_css())

# contrib/testmpi/test_mpi_derivatives.py (thearn/OpenMDAO-Framework @ c1b00dfdf4bf402061ddc52abce201e2329e5563, Apache-2.0)
import numpy as np
import sys
from openmdao.util.testutil import assert_rel_error
from openmdao.test.mpiunittest import MPITestCase, MPIContext
from openmdao.main.api import Assembly, Component, set_as_top, Driver
from openmdao.main.datatypes.api import Float, Array
from openmdao.main.test.simpledriver import SimpleDriver
from openmdao.test.execcomp import ExecCompWithDerivatives, ExecComp
class Paraboloid(Component):
""" Evaluates the equation f(x,y) = (x-3)^2 + xy + (y+4)^2 - 3 """
# set up interface to the framework
# pylint: disable=E1101
x = Float(0.0, iotype='in', desc='The variable x')
y = Float(0.0, iotype='in', desc='The variable y')
f_xy = Float(iotype='out', desc='F(x,y)')
def execute(self):
"""f(x,y) = (x-3)^2 + xy + (y+4)^2 - 3
Optimal solution (minimum): x = 6.6667; y = -7.3333
"""
x = self.x
y = self.y
self.f_xy = (x-3.0)**2 + x*y + (y+4.0)**2 - 3.0
def provideJ(self):
"""Analytical first derivatives"""
df_dx = 2.0*self.x - 6.0 + self.y
df_dy = 2.0*self.y + 8.0 + self.x
self.J = np.array([[df_dx, df_dy]])
return self.J
def list_deriv_vars(self):
input_keys = ('x', 'y')
output_keys = ('f_xy',)
return input_keys, output_keys
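Independently of the framework machinery below, the analytic Jacobian in `provideJ` can be cross-checked against a central finite difference of the same paraboloid; a quick standalone sketch (helper names are mine):

```python
def f_xy(x, y):
    # Same function as Paraboloid.execute.
    return (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

def analytic_grad(x, y):
    # Same derivatives as Paraboloid.provideJ.
    return 2.0 * x - 6.0 + y, 2.0 * y + 8.0 + x

def fd_grad(x, y, h=1.0e-6):
    # Second-order central difference approximation.
    dfdx = (f_xy(x + h, y) - f_xy(x - h, y)) / (2.0 * h)
    dfdy = (f_xy(x, y + h) - f_xy(x, y - h)) / (2.0 * h)
    return dfdx, dfdy
```

At the test point (x=3, y=5) this reproduces the (5.0, 21.0) gradient asserted by the tests below.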
class MPITests_2Proc(MPITestCase):
N_PROCS = 2
def setUp(self):
# this model mimics the one in test_derivatives, test_single_comp
self.top = top = set_as_top(Assembly())
top.add('comp', Paraboloid())
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp'])
top.driver.add_parameter('comp.x', low=-1000, high=1000)
top.driver.add_parameter('comp.y', low=-1000, high=1000)
top.driver.add_objective('comp.f_xy')
top.comp.x = 3
top.comp.y = 5
def test_run(self):
self.top.run()
if self.comm.rank == 0:
self.assertEqual(self.top.comp.f_xy, 93.)
self.assertEqual(self.top._pseudo_0.out0, 93.)
def test_calc_gradient_fwd(self):
self.top.run()
J = self.top.driver.calc_gradient(mode='forward',
return_format='dict')
if self.comm.rank == 0:
assert_rel_error(self, J['_pseudo_0.out0']['comp.x'][0][0],
5.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp.y'][0][0],
21.0, 0.0001)
def test_calc_gradient_adjoint(self):
self.top.run()
J = self.top.driver.calc_gradient(mode='adjoint',
return_format='dict')
if self.comm.rank == 0:
assert_rel_error(self, J['_pseudo_0.out0']['comp.x'][0][0],
5.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp.y'][0][0],
21.0, 0.0001)
def test_calc_gradient_fd(self):
self.top.run()
J = self.top.driver.calc_gradient(mode='fd',
return_format='dict')
if self.comm.rank == 0:
assert_rel_error(self, J['_pseudo_0.out0']['comp.x'][0][0],
5.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp.y'][0][0],
21.0, 0.0001)
def test_calc_gradient_fwd_linGS(self):
self.top.driver.gradient_options.lin_solver = 'linear_gs'
self.top.driver.gradient_options.maxiter = 1
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp.x','comp.y'], mode='forward',
return_format='dict')
J = self.top.driver.workflow._system.get_combined_J(J)
if self.comm.rank == 0:
assert_rel_error(self, J['_pseudo_0.out0']['comp.x'][0][0],
5.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp.y'][0][0],
21.0, 0.0001)
def test_two_to_one_forward(self):
top = set_as_top(Assembly())
exp1 = ["y = 3.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x1 + 4.0*x2"]
deriv1 = ["dy_dx = 3.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx1 = 5.0", "dy_dx2 = 4.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y', 'comp3.x1')
top.connect('comp2.y', 'comp3.x2')
top.driver.add_parameter('comp1.x', low=-100, high=100)
top.driver.add_parameter('comp2.x', low=-100, high=100)
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='forward',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
15.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp2.x'][0][0],
-8.0, 0.0001)
def test_two_to_one_adjoint(self):
top = set_as_top(Assembly())
exp1 = ["y = 3.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x1 + 4.0*x2"]
deriv1 = ["dy_dx = 3.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx1 = 5.0", "dy_dx2 = 4.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y', 'comp3.x1')
top.connect('comp2.y', 'comp3.x2')
top.driver.add_parameter('comp1.x', low=-100, high=100)
top.driver.add_parameter('comp2.x', low=-100, high=100)
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='adjoint',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
15.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp2.x'][0][0],
-8.0, 0.0001)
def test_two_to_one_fd(self):
top = set_as_top(Assembly())
exp1 = ["y = 3.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x1 + 4.0*x2"]
deriv1 = ["dy_dx = 3.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx1 = 5.0", "dy_dx2 = 4.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y', 'comp3.x1')
top.connect('comp2.y', 'comp3.x2')
top.driver.add_parameter('comp1.x', low=-100, high=100)
top.driver.add_parameter('comp2.x', low=-100, high=100)
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='fd',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
15.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['comp2.x'][0][0],
-8.0, 0.0001)
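The 15.0 and -8.0 references used by all three `test_two_to_one_*` variants are just the chain rule through this small expression graph: d(comp3.y)/d(comp1.x) = 5*3 = 15 and d(comp3.y)/d(comp2.x) = 4*(-2) = -8. A standalone sketch of that composition (function names are mine):

```python
def two_to_one(x1, x2):
    # comp1.y and comp2.y feed comp3.x1 and comp3.x2, as connected above.
    y1 = 3.0 * x1                 # comp1
    y2 = -2.0 * x2                # comp2
    return 5.0 * y1 + 4.0 * y2    # comp3

def two_to_one_grad():
    # The graph is linear, so the Jacobian is constant: (5*3, 4*(-2)).
    return 15.0, -8.0
```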
def test_two_to_one_forward_bcast(self):
top = set_as_top(Assembly())
exp1 = ["y = 3.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x1 + 4.0*x2"]
deriv1 = ["dy_dx = 3.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx1 = 5.0", "dy_dx2 = 4.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y', 'comp3.x1')
top.connect('comp2.y', 'comp3.x2')
top.driver.add_parameter(('comp1.x', 'comp2.x'),
low=-100, high=100)
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='forward',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
#print J
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
7.0, 0.0001)
def test_two_to_one_adjoint_bcast(self):
top = set_as_top(Assembly())
exp1 = ["y = 3.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x1 + 4.0*x2"]
deriv1 = ["dy_dx = 3.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx1 = 5.0", "dy_dx2 = 4.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y', 'comp3.x1')
top.connect('comp2.y', 'comp3.x2')
top.driver.add_parameter(('comp1.x', 'comp2.x'),
low=-100, high=100)
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='adjoint',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
#from openmdao.util.dotgraph import plot_system_tree
#plot_system_tree(top._system)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
7.0, 0.0001)
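In the `*_bcast` variants, one broadcast parameter drives both comp1.x and comp2.x, so the total derivative is the sum of the two branch derivatives: 5*3 + 4*(-2) = 7. A standalone sketch (function name is mine):

```python
def two_to_one_bcast(x):
    # The same graph with comp1.x and comp2.x tied to a single parameter.
    return 5.0 * (3.0 * x) + 4.0 * (-2.0 * x)

# d/dx = 5*3 + 4*(-2) = 15 - 8 = 7
```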
def test_one_to_two_forward(self):
top = set_as_top(Assembly())
exp1 = ["y1 = 3.0*x", "y2 = 4.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x"]
deriv1 = ["dy1_dx = 3.0", "dy2_dx = 4.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx = 5.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y1', 'comp2.x')
top.connect('comp1.y2', 'comp3.x')
top.driver.add_parameter('comp1.x', low=-100, high=100)
top.driver.add_constraint('comp2.y < 1000')
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='forward',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
-6.0, 0.0001)
assert_rel_error(self, J['_pseudo_1.out0']['comp1.x'][0][0],
20.0, 0.0001)
def test_one_to_two_adjoint(self):
top = set_as_top(Assembly())
exp1 = ["y1 = 3.0*x", "y2 = 4.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x"]
deriv1 = ["dy1_dx = 3.0", "dy2_dx = 4.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx = 5.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y1', 'comp2.x')
top.connect('comp1.y2', 'comp3.x')
top.driver.add_parameter('comp1.x', low=-100, high=100)
top.driver.add_constraint('comp2.y < 1000')
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='adjoint',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
-6.0, 0.0001)
assert_rel_error(self, J['_pseudo_1.out0']['comp1.x'][0][0],
20.0, 0.0001)
def test_one_to_two_fd(self):
top = set_as_top(Assembly())
exp1 = ["y1 = 3.0*x", "y2 = 4.0*x"]
exp2 = ["y = -2.0*x"]
exp3 = ["y = 5.0*x"]
deriv1 = ["dy1_dx = 3.0", "dy2_dx = 4.0"]
deriv2 = ["dy_dx = -2.0"]
deriv3 = ["dy_dx = 5.0"]
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('driver', SimpleDriver())
top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
top.connect('comp1.y1', 'comp2.x')
top.connect('comp1.y2', 'comp3.x')
top.driver.add_parameter('comp1.x', low=-100, high=100)
top.driver.add_constraint('comp2.y < 1000')
top.driver.add_constraint('comp3.y < 1000')
top.run()
J = top.driver.calc_gradient(mode='fd',
return_format='dict')
J = top.driver.workflow._system.get_combined_J(J)
assert_rel_error(self, J['_pseudo_0.out0']['comp1.x'][0][0],
-6.0, 0.0001)
assert_rel_error(self, J['_pseudo_1.out0']['comp1.x'][0][0],
20.0, 0.0001)
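The -6.0 and 20.0 references shared by the `test_one_to_two_*` variants follow from one fan-out plus the chain rule: d(comp2.y)/d(comp1.x) = -2*3 = -6 and d(comp3.y)/d(comp1.x) = 5*4 = 20. A standalone sketch (function name is mine):

```python
def one_to_two(x):
    # comp1 fans out into comp2 and comp3, as connected above.
    y1, y2 = 3.0 * x, 4.0 * x      # comp1
    return -2.0 * y1, 5.0 * y2     # comp2.y, comp3.y
```

Because both branches are linear, the unit-input responses equal the two slopes.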
def test_three_comp_diamond_forward(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 50.0*x1',
'y2 = 1.0*x1']
deriv1 = ['dy1_dx1 = 50.0',
'dy2_dx1 = 1.0']
exp2 = ['y1 = 1.2*x1']
deriv2 = ['dy1_dx1 = 1.2']
exp3 = ['y1 = 100.0*x1*x2 + 30*x1 + 0.3*x2']
deriv3 = ['dy1_dx1 = 100.0*x2 + 30',
'dy1_dx2 = 100.0*x1 + 0.3']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('driver', SimpleDriver())
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp3.x2')
self.top.driver.add_parameter('comp1.x1', low=-100, high=100)
self.top.driver.add_objective('comp3.y1')
self.top.comp1.x1 = 2.0
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp3.y1'],
mode='forward',
return_format='dict')
if self.comm.rank == 0:
assert_rel_error(self, J['comp3.y1']['comp1.x1'][0][0],
24048.0, 0.0001)
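The 24048.0 reference above can be derived by collapsing the diamond: comp3 sees x1 = x (via comp1.y2) and x2 = 1.2*50*x = 60x (via comp2), so y = 6000x^2 + 48x and dy/dx = 12000x + 48, which is 24048 at x = 2. A standalone check (function names are mine):

```python
def diamond(x1):
    # comp1 feeds comp2 and comp3; comp2 feeds comp3, as connected above.
    y1, y2 = 50.0 * x1, 1.0 * x1                      # comp1
    c2 = 1.2 * y1                                     # comp2
    return 100.0 * y2 * c2 + 30.0 * y2 + 0.3 * c2     # comp3

def diamond_grad(x1):
    # Collapsed form: y = 6000*x^2 + 48*x, so dy/dx = 12000*x + 48.
    return 12000.0 * x1 + 48.0
```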
def test_diverge_converge_forward(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp5.x2')
self.top.connect('comp4.y3', 'comp5.x3')
self.top.comp1.x1 = 2.0
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp5.y1'],
mode='forward',
return_format='dict')
assert_rel_error(self, J['comp5.y1']['comp1.x1'][0][0],
313.0, 0.0001)
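The 313.0 reference shared by the `test_diverge_converge*` variants reduces to a single polynomial: collapsing comp1..comp5 gives y = 21*x1^3 + 10*x1^2 + 21*x1, so dy/dx1 = 63*x1^2 + 20*x1 + 21 = 313 at x1 = 2. A standalone check (function names are mine):

```python
def diverge_converge(x1):
    # comp1..comp5 collapsed into one function, following the connections above.
    c1_y1, c1_y2 = 2.0 * x1 ** 2, 3.0 * x1      # comp1
    c2_y1 = 0.5 * c1_y1                          # comp2
    c3_y1 = 3.5 * c1_y2                          # comp3
    c4_y1 = c2_y1 + 2.0 * c3_y1                  # comp4
    c4_y2 = 3.0 * c2_y1
    c4_y3 = c2_y1 * c3_y1
    return c4_y1 + 3.0 * c4_y2 + 2.0 * c4_y3     # comp5

def diverge_converge_grad(x1):
    # Collapsed form: y = 21*x^3 + 10*x^2 + 21*x, so dy/dx = 63*x^2 + 20*x + 21.
    return 63.0 * x1 ** 2 + 20.0 * x1 + 21.0
```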
def test_diverge_converge_LinGS_forward(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp5.x2')
self.top.connect('comp4.y3', 'comp5.x3')
self.top.comp1.x1 = 2.0
self.top.driver.gradient_options.lin_solver = 'linear_gs'
self.top.driver.gradient_options.maxiter = 1
self.top.run()
# from openmdao.util.dotgraph import plot_system_tree
# plot_system_tree(self.top._system)
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp5.y1'],
mode='forward',
return_format='dict')
assert_rel_error(self, J['comp5.y1']['comp1.x1'][0][0],
313.0, 0.0001)
def test_diverge_converge_LinGS_adjoint(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp5.x2')
self.top.connect('comp4.y3', 'comp5.x3')
self.top.comp1.x1 = 2.0
self.top.driver.gradient_options.lin_solver = 'linear_gs'
self.top.driver.gradient_options.maxiter = 1
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp5.y1'],
mode='adjoint',
return_format='dict')
assert_rel_error(self, J['comp5.y1']['comp1.x1'][0][0],
313.0, 0.0001)
def test_diverge_converge_adjoint(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp5.x2')
self.top.connect('comp4.y3', 'comp5.x3')
self.top.comp1.x1 = 2.0
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp5.y1'],
mode='adjoint',
return_format='dict')
#from openmdao.util.dotgraph import plot_system_tree
#plot_system_tree(self.top._system)
#print J
assert_rel_error(self, J['comp5.y1']['comp1.x1'][0][0],
313.0, 0.0001)
def test_diverge_converge_extended_adjoint(self):
top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
top.add('comp2b', ExecCompWithDerivatives(exp3, deriv3))
top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
top.add('comp3b', ExecCompWithDerivatives(exp3, deriv3))
top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
top.driver.workflow.add(['comp1', 'comp2', 'comp2b', 'comp3', 'comp3b', 'comp4', 'comp5'])
top.connect('comp1.y1', 'comp2.x1')
top.connect('comp1.y2', 'comp3.x1')
top.connect('comp2.y1', 'comp2b.x1')
top.connect('comp3.y1', 'comp3b.x1')
top.connect('comp2b.y1', 'comp4.x1')
top.connect('comp3b.y1', 'comp4.x2')
top.connect('comp4.y1', 'comp5.x1')
top.connect('comp4.y2', 'comp5.x2')
top.connect('comp4.y3', 'comp5.x3')
top.comp1.x1 = 2.0
#top.driver.gradient_options.lin_solver = 'linear_gs'
#top.driver.gradient_options.maxiter = 1
top.run()
J = top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp5.y1'],
mode='adjoint',
return_format='dict')
#from openmdao.util.dotgraph import plot_system_tree
#plot_system_tree(self.top._system)
#print J
assert_rel_error(self, J['comp5.y1']['comp1.x1'][0][0],
3300.5, 0.0001)
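The 3300.5 reference for the extended chain can be hand-checked the same way. Collapsing the seven expressions (comp2b and comp3b each apply an extra factor of 3.5) gives y = 257.25*x1**3 + 35.0*x1**2 + 73.5*x1; the collapse is my own algebra, not stated in the source, so treat this as a hedged sketch.

```python
# Analytic derivative of the collapsed extended diverge-converge chain:
#   y = 257.25*x1**3 + 35.0*x1**2 + 73.5*x1
def extended_chain_grad(x1):
    return 771.75 * x1 ** 2 + 70.0 * x1 + 73.5

print(extended_chain_grad(2.0))  # 3300.5
```

The same polynomial explains the subassembly expectations further down: its derivative at x1 = 3 is 771.75*9 + 210 + 73.5 = 7229.25, matching the sub2.x1 entry in test_lin_GS_subassy.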
def test_diverge_converge_nondiff_comp3_forward(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.top.add('driver', SimpleDriver())
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecComp(exp3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp5.x2')
self.top.connect('comp4.y3', 'comp5.x3')
self.top.driver.add_parameter('comp1.x1', low=-100, high=100)
self.top.driver.add_objective('comp5.y1')
self.top.comp1.x1 = 2.0
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp5.y1'],
mode='forward',
return_format='dict')
assert_rel_error(self, J['comp5.y1']['comp1.x1'][0][0],
313.0, 0.0001)
def test_one_two_one_two_one_forward(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1 - 5.0*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = -5.0',]
exp5 = ['y1 = 0.8*x1']
deriv5 = ['dy1_dx1 = 0.8']
exp6 = ['y1 = 0.5*x1']
deriv6 = ['dy1_dx1 = 0.5']
exp7 = ['y1 = x1 + 3.0*x2']
deriv7 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.add('comp6', ExecCompWithDerivatives(exp6, deriv6))
self.top.add('comp7', ExecCompWithDerivatives(exp7, deriv7))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5', 'comp6', 'comp7'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp6.x1')
self.top.connect('comp5.y1', 'comp7.x1')
self.top.connect('comp6.y1', 'comp7.x2')
self.top.comp1.x1 = 2.0
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp7.y1'],
mode='forward',
return_format='dict')
assert_rel_error(self, J['comp7.y1']['comp1.x1'][0][0],
-40.75, 0.0001)
def test_one_two_one_two_one_adjoint(self):
self.top = set_as_top(Assembly())
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1 - 5.0*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = -5.0',]
exp5 = ['y1 = 0.8*x1']
deriv5 = ['dy1_dx1 = 0.8']
exp6 = ['y1 = 0.5*x1']
deriv6 = ['dy1_dx1 = 0.5']
exp7 = ['y1 = x1 + 3.0*x2']
deriv7 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0']
self.top.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.top.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.top.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.top.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.top.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.top.add('comp6', ExecCompWithDerivatives(exp6, deriv6))
self.top.add('comp7', ExecCompWithDerivatives(exp7, deriv7))
self.top.driver.workflow.add(['comp1', 'comp2', 'comp3', 'comp4', 'comp5', 'comp6', 'comp7'])
self.top.connect('comp1.y1', 'comp2.x1')
self.top.connect('comp1.y2', 'comp3.x1')
self.top.connect('comp2.y1', 'comp4.x1')
self.top.connect('comp3.y1', 'comp4.x2')
self.top.connect('comp4.y1', 'comp5.x1')
self.top.connect('comp4.y2', 'comp6.x1')
self.top.connect('comp5.y1', 'comp7.x1')
self.top.connect('comp6.y1', 'comp7.x2')
self.top.comp1.x1 = 2.0
self.top.run()
J = self.top.driver.calc_gradient(inputs=['comp1.x1'],
outputs=['comp7.y1'],
mode='adjoint',
return_format='dict')
assert_rel_error(self, J['comp7.y1']['comp1.x1'][0][0],
-40.75, 0.0001)
def test_lin_GS_subassy(self):
class Sub(Assembly):
def configure(self):
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.add('comp2b', ExecCompWithDerivatives(exp3, deriv3))
self.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.add('comp3b', ExecCompWithDerivatives(exp3, deriv3))
self.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.driver.workflow.add(['comp1', 'comp2', 'comp2b', 'comp3', 'comp3b', 'comp4', 'comp5'])
self.connect('comp1.y1', 'comp2.x1')
self.connect('comp1.y2', 'comp3.x1')
self.connect('comp2.y1', 'comp2b.x1')
self.connect('comp3.y1', 'comp3b.x1')
self.connect('comp2b.y1', 'comp4.x1')
self.connect('comp3b.y1', 'comp4.x2')
self.connect('comp4.y1', 'comp5.x1')
self.connect('comp4.y2', 'comp5.x2')
self.connect('comp4.y3', 'comp5.x3')
self.comp1.x1 = 2.0
self.create_passthrough('comp1.x1')
self.create_passthrough('comp5.y1')
self.create_passthrough('comp1.y2')
self.driver.system_type = 'serial'
top = set_as_top(Assembly())
top.add('sub1', Sub())
top.add('sub2', Sub())
top.replace('driver', SimpleDriver())
top.driver.workflow.add(['sub1', 'sub2'])
top.driver.add_parameter('sub1.x1', low=-10, high=10)
top.driver.add_parameter('sub2.x1', low=-10, high=10)
top.driver.add_objective('sub1.y1 + sub2.y1')
# These make it lock up
top.driver.add_constraint('sub1.y2 < 100')
#top.driver.add_constraint('sub2.y2 < 100')
top.sub1.x1 = 2.0
top.sub2.x1 = 3.0
top.driver.gradient_options.lin_solver = 'linear_gs'
top.driver.gradient_options.maxiter = 1
top.run()
# from openmdao.util.dotgraph import plot_system_tree
# plot_system_tree(top._system)
J = top.driver.calc_gradient(mode='adjoint',
return_format='dict')
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x1'][0][0],
3300.5, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x1'][0][0],
7229.25, 0.0001)
assert_rel_error(self, J['_pseudo_1.out0']['sub1.x1'][0][0],
3.0, 0.0001)
def test_parsys_transdriver(self):
class Sub(Assembly):
def __init__(self, factor):
super(Sub, self).__init__()
self.factor = factor
def configure(self):
exp = ['y = %f*x' % self.factor]
deriv = ['dy_dx = %f' % self.factor]
self.add('comp', ExecCompWithDerivatives(exp, deriv))
self.driver.workflow.add(['comp'])
self.create_passthrough('comp.x')
self.create_passthrough('comp.y')
self.driver.system_type = 'serial'
top = set_as_top(Assembly())
top.add('sub1', Sub(factor=1.0))
top.add('sub2', Sub(factor=3.0))
top.add('stuff', Driver())
top.stuff.system_type = "serial"
exp = ['y = 3.0*x1 + 5.0*x2']
deriv = ['dy_dx1 = 3.0', 'dy_dx2 = 5.0']
top.add('post', ExecCompWithDerivatives(exp, deriv))
top.connect('sub1.y', 'post.x1')
top.connect('sub2.y', 'post.x2')
top.replace('driver', SimpleDriver())
top.driver.workflow.add(['sub1', 'sub2', 'stuff'])
top.stuff.workflow.add(['post'])
top.driver.add_parameter('sub1.x', low=-10, high=10)
top.driver.add_parameter('sub2.x', low=-10, high=10)
top.driver.add_objective('post.y')
top.sub1.x = 2.0
top.sub2.x = 3.0
top.run()
with MPIContext():
assert_rel_error(self, top.post.y, 51.0, 0.0001)
# from openmdao.util.dotgraph import plot_system_tree
# plot_system_tree(top._system)
J = top.driver.calc_gradient(mode='forward', return_format='dict')
with MPIContext():
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x'][0][0],
3.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x'][0][0],
15.0, 0.0001)
J = top.driver.calc_gradient(mode='adjoint', return_format='dict')
with MPIContext():
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x'][0][0],
3.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x'][0][0],
15.0, 0.0001)
J = top.driver.calc_gradient(mode='fd', return_format='dict')
with MPIContext():
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x'][0][0],
3.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x'][0][0],
15.0, 0.0001)
def test_parsys_in_transdriver_and_transdriver(self):
class Sub(Assembly):
def __init__(self, factor):
super(Sub, self).__init__()
self.factor = factor
def configure(self):
exp = ['y = %f*x' % self.factor]
deriv = ['dy_dx = %f' % self.factor]
self.add('comp', ExecCompWithDerivatives(exp, deriv))
self.driver.workflow.add(['comp'])
self.create_passthrough('comp.x')
self.create_passthrough('comp.y')
self.driver.system_type = 'serial'
top = set_as_top(Assembly())
top.add('sub1', Sub(factor=1.0))
top.add('sub2', Sub(factor=3.0))
top.add('stuff', Driver())
top.add('missions', Driver())
top.stuff.system_type = "serial"
exp = ['y = 3.0*x1 + 5.0*x2']
deriv = ['dy_dx1 = 3.0', 'dy_dx2 = 5.0']
top.add('post', ExecCompWithDerivatives(exp, deriv))
top.connect('sub1.y', 'post.x1')
top.connect('sub2.y', 'post.x2')
top.replace('driver', SimpleDriver())
top.driver.workflow.add(['missions', 'stuff'])
top.missions.workflow.add(['sub1', 'sub2'])
top.stuff.workflow.add(['post'])
top.driver.add_parameter('sub1.x', low=-10, high=10)
top.driver.add_parameter('sub2.x', low=-10, high=10)
top.driver.add_objective('post.y')
top.sub1.x = 2.0
top.sub2.x = 3.0
top.run()
with MPIContext():
assert_rel_error(self, top.post.y, 51.0, 0.0001)
# from openmdao.util.dotgraph import plot_system_tree
# plot_system_tree(top._system)
J = top.driver.calc_gradient(mode='forward', return_format='dict')
with MPIContext():
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x'][0][0],
3.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x'][0][0],
15.0, 0.0001)
J = top.driver.calc_gradient(mode='adjoint', return_format='dict')
with MPIContext():
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x'][0][0],
3.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x'][0][0],
15.0, 0.0001)
J = top.driver.calc_gradient(mode='fd', return_format='dict')
with MPIContext():
assert_rel_error(self, J['_pseudo_0.out0']['sub1.x'][0][0],
3.0, 0.0001)
assert_rel_error(self, J['_pseudo_0.out0']['sub2.x'][0][0],
15.0, 0.0001)
def test_parallel_gather_for_objective(self):
class SpecialDriver(SimpleDriver):
def execute(self):
self.run_iteration()
self.func_dict = {}
for key, obj in self.get_objectives().iteritems():
name = '%s.out0' % obj.pcomp_name
self.func_dict[name] = np.array(obj.evaluate())
class Sub(Assembly):
def configure(self):
exp1 = ['y1 = 2.0*x1**2',
'y2 = 3.0*x1']
deriv1 = ['dy1_dx1 = 4.0*x1',
'dy2_dx1 = 3.0']
exp2 = ['y1 = 0.5*x1']
deriv2 = ['dy1_dx1 = 0.5']
exp3 = ['y1 = 3.5*x1']
deriv3 = ['dy1_dx1 = 3.5']
exp4 = ['y1 = x1 + 2.0*x2',
'y2 = 3.0*x1',
'y3 = x1*x2']
deriv4 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 2.0',
'dy2_dx1 = 3.0',
'dy2_dx2 = 0.0',
'dy3_dx1 = x2',
'dy3_dx2 = x1']
exp5 = ['y1 = x1 + 3.0*x2 + 2.0*x3']
deriv5 = ['dy1_dx1 = 1.0',
'dy1_dx2 = 3.0',
'dy1_dx3 = 2.0']
self.add('comp1', ExecCompWithDerivatives(exp1, deriv1))
self.add('comp2', ExecCompWithDerivatives(exp2, deriv2))
self.add('comp2b', ExecCompWithDerivatives(exp3, deriv3))
self.add('comp3', ExecCompWithDerivatives(exp3, deriv3))
self.add('comp3b', ExecCompWithDerivatives(exp3, deriv3))
self.add('comp4', ExecCompWithDerivatives(exp4, deriv4))
self.add('comp5', ExecCompWithDerivatives(exp5, deriv5))
self.driver.workflow.add(['comp1', 'comp2', 'comp2b', 'comp3', 'comp3b', 'comp4', 'comp5'])
self.connect('comp1.y1', 'comp2.x1')
self.connect('comp1.y2', 'comp3.x1')
self.connect('comp2.y1', 'comp2b.x1')
self.connect('comp3.y1', 'comp3b.x1')
self.connect('comp2b.y1', 'comp4.x1')
self.connect('comp3b.y1', 'comp4.x2')
self.connect('comp4.y1', 'comp5.x1')
self.connect('comp4.y2', 'comp5.x2')
self.connect('comp4.y3', 'comp5.x3')
self.comp1.x1 = 2.0
self.create_passthrough('comp1.x1')
self.create_passthrough('comp5.y1')
self.create_passthrough('comp1.y2')
self.driver.system_type = 'serial'
top = set_as_top(Assembly())
top.add('sub1', Sub())
top.add('sub2', Sub())
top.replace('driver', SpecialDriver())
top.driver.workflow.add(['sub1', 'sub2'])
top.driver.add_parameter('sub1.x1', low=-10, high=10)
top.driver.add_parameter('sub2.x1', low=-10, high=10)
top.driver.add_objective('sub1.y1 + sub2.y1')
top.sub1.x1 = 2.0
top.sub2.x1 = 3.0
top.run()
#from openmdao.util.dotgraph import plot_system_tree
#plot_system_tree(top._system)
assert_rel_error(self, 9826.25, top._pseudo_0.out0, 0.0001)
assert_rel_error(self, 9826.25, top.driver.func_dict['_pseudo_0.out0'], 0.0001)
def test_CADRE_bug1(self):
class AComp(Component):
x = Array(np.array([7.0]), iotype='in')
v = Array(np.array([5.0, 3.0]), iotype='in')
y = Float(1.0, iotype='out')
def execute(self):
self.y = self.x[0] * (self.v[0] + self.v[1])
def list_deriv_vars(self):
return ('x', 'v'), ('y', )
def provideJ(self):
self.J = np.array([self.v[0] + self.v[1], self.x[0], self.x[0]])
return None
def apply_deriv(self, arg, result):
if 'x' in arg:
result['y'] += self.J[0] * arg['x']
if 'v' in arg:
result['y'] += self.J[1:].dot(arg['v'])
def apply_derivT(self, arg, result):
if 'x' in result:
result['x'] += self.J[0] * arg['y']
if 'v' in result:
result['v'] += self.J[1:] * arg['y']
top = set_as_top(Assembly())
top.add('nest1', Assembly())
top.add('nest2', Assembly())
top.add('driver', SimpleDriver())
top.nest1.add('comp', AComp())
top.nest2.add('comp', AComp())
top.nest1.create_passthrough('comp.x')
top.nest1.create_passthrough('comp.v')
top.nest1.create_passthrough('comp.y')
top.nest2.create_passthrough('comp.x')
top.nest2.create_passthrough('comp.v')
top.nest2.create_passthrough('comp.y')
top.nest1.driver.workflow.add('comp')
top.nest2.driver.workflow.add('comp')
top.driver.workflow.add(['nest1', 'nest2'])
top.driver.add_parameter('nest1.x[0]', low=-10, high=10)
top.driver.add_parameter('nest1.v', low=-10, high=10)
top.driver.add_parameter('nest2.x[0]', low=-10, high=10)
top.driver.add_parameter('nest2.v', low=-10, high=10)
top.driver.add_objective('nest1.y + nest2.y')
# Answers: 8, 7, 7
top.run()
J = top.driver.calc_gradient(mode='forward', return_format='dict')
print J
# Check for Bret (Note: need a better way to figure out which rank contains an assembly.)
if self.comm.rank == 1:
asys = top.nest1._system
else:
asys = top.nest2._system
# Slice should be there
self.assertTrue(('x[0]', ('comp.x[0]', 'x[0]')) in asys.variables.keys())
# Full vec should not
self.assertTrue(('x', ('comp.x', 'x')) not in asys.variables.keys())
with MPIContext():
assert_rel_error(self,
J['_pseudo_0.out0']['nest1.x[0]'][0][0],
8.0, 0.0001)
if __name__ == '__main__':
from openmdao.test.mpiunittest import mpirun_tests
mpirun_tests()
| 35.475146 | 107 | 0.49796 | 6,079 | 48,530 | 3.84685 | 0.04458 | 0.052384 | 0.032328 | 0.039256 | 0.907462 | 0.876374 | 0.857558 | 0.847766 | 0.835707 | 0.829121 | 0 | 0.098593 | 0.335174 | 48,530 | 1,367 | 108 | 35.501097 | 0.626209 | 0.022728 | 0 | 0.827902 | 0 | 0 | 0.183215 | 0 | 0 | 0 | 0 | 0 | 0.057026 | 0 | null | null | 0.016293 | 0.009165 | null | null | 0.001018 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
40fe977a69f6676e55dbfb3a9e51bdbe83345a41 | 123 | py | Python | advkit/convnets/__init__.py | tqch/advkit | 5a5835317614717749e6fc4c577dad940920a9ca | [
"MIT"
] | null | null | null | advkit/convnets/__init__.py | tqch/advkit | 5a5835317614717749e6fc4c577dad940920a9ca | [
"MIT"
] | null | null | null | advkit/convnets/__init__.py | tqch/advkit | 5a5835317614717749e6fc4c577dad940920a9ca | [
"MIT"
] | null | null | null | import advkit.convnets.vgg
import advkit.convnets.resnet
import advkit.convnets.wide_resnet
import advkit.convnets.densenet | 30.75 | 34 | 0.878049 | 17 | 123 | 6.294118 | 0.411765 | 0.448598 | 0.747664 | 0.485981 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.056911 | 123 | 4 | 35 | 30.75 | 0.922414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
90b880586f0513f37a2b8f96c68960ffeb57756b | 2,241 | py | Python | Src/lr_policy.py | SivanKe/SyntheticDataHandwrittenCharacterRecognition | c2b299009ddc24eab1bc2074787e82e566f79abc | [
"MIT"
] | 7 | 2019-07-14T06:49:15.000Z | 2021-11-03T12:13:37.000Z | Src/lr_policy.py | SivanKe/SyntheticDataHandwrittenCharacterRecognition | c2b299009ddc24eab1bc2074787e82e566f79abc | [
"MIT"
] | 2 | 2019-07-16T07:44:37.000Z | 2019-07-18T10:57:23.000Z | Src/lr_policy.py | SivanKe/SyntheticDataHandwrittenCharacterRecognition | c2b299009ddc24eab1bc2074787e82e566f79abc | [
"MIT"
] | 1 | 2020-09-03T14:51:53.000Z | 2020-09-03T14:51:53.000Z | import numpy as np
class StepLR(object):
def __init__(self, optimizer, step_size=1000, max_iter=10000):
self.optimizer = optimizer
self.max_iter = max_iter
self.step_size = step_size
self.last_iter = -1
self.base_lrs = list(map(lambda group: group['lr'], optimizer.param_groups))
def get_lr(self):
return self.optimizer.param_groups[0]['lr']
def step(self, last_iter=None):
if last_iter is not None:
self.last_iter = last_iter
if self.last_iter + 1 == self.max_iter:
self.last_iter = -1
self.last_iter = (self.last_iter + 1) % self.max_iter
for ids, param_group in enumerate(self.optimizer.param_groups):
param_group['lr'] = self.base_lrs[ids] * 0.1 ** ( self.last_iter // self.step_size )
class StepLrOld(object):
def __init__(self, optimizer, step_size=1000, max_iter=10000):
self.optimizer = optimizer
self.max_iter = max_iter
self.step_size = step_size
self.last_iter = -1
self.base_lrs = list(map(lambda group: group['lr'], optimizer.param_groups))
def get_lr(self):
return self.optimizer.param_groups[0]['lr']
def step(self, last_iter=None):
if last_iter is not None:
self.last_iter = last_iter
if self.last_iter + 1 == self.max_iter:
self.last_iter = -1
self.last_iter = (self.last_iter + 1) % self.max_iter
for ids, param_group in enumerate(self.optimizer.param_groups):
param_group['lr'] = self.base_lrs[ids] * 0.1 ** ( self.last_iter // self.step_size )
class DannLR(object):
def __init__(self, optimizer, step_size=1000, max_iter=120000):
self.optimizer = optimizer
self.max_iter = max_iter
self.step_size = step_size
self.last_iter = -1
self.base_lrs = list(map(lambda group: group['lr'], optimizer.param_groups))
def get_lr(self):
return self.optimizer.param_groups[0]['lr']
def update(self, p):
self.last_iter += 1
lr_factor = 1./(1. + 10 * p)**0.75
for ids, param_group in enumerate(self.optimizer.param_groups):
param_group['lr'] = self.base_lrs[ids] * lr_factor
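A minimal usage sketch for these policies. Assumption: any object exposing a `param_groups` list of dicts with an `'lr'` key is enough, since the classes above touch nothing else on the optimizer. `FakeOptimizer` is a hypothetical stand-in, and `StepLR` is repeated here only so the sketch is self-contained.

```python
# `FakeOptimizer` is a hypothetical stand-in; the schedulers only read and
# write optimizer.param_groups[i]['lr'].
class FakeOptimizer(object):
    def __init__(self, lr):
        self.param_groups = [{'lr': lr}]

# Self-contained copy of the StepLR policy defined above.
class StepLR(object):
    def __init__(self, optimizer, step_size=1000, max_iter=10000):
        self.optimizer = optimizer
        self.max_iter = max_iter
        self.step_size = step_size
        self.last_iter = -1
        self.base_lrs = [g['lr'] for g in optimizer.param_groups]

    def get_lr(self):
        return self.optimizer.param_groups[0]['lr']

    def step(self, last_iter=None):
        if last_iter is not None:
            self.last_iter = last_iter
        if self.last_iter + 1 == self.max_iter:
            self.last_iter = -1
        self.last_iter = (self.last_iter + 1) % self.max_iter
        for ids, group in enumerate(self.optimizer.param_groups):
            group['lr'] = self.base_lrs[ids] * 0.1 ** (self.last_iter // self.step_size)

opt = FakeOptimizer(0.1)
sched = StepLR(opt, step_size=10, max_iter=100)
for _ in range(25):
    sched.step()   # last_iter ends at 24 -> two decades of 0.1 decay
```

For `DannLR`, the `update` argument `p` is presumably training progress in [0, 1]: the factor `1/(1 + 10*p)**0.75` is the standard DANN learning-rate schedule.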
| 37.35 | 96 | 0.631861 | 326 | 2,241 | 4.08589 | 0.144172 | 0.132132 | 0.162162 | 0.097598 | 0.928679 | 0.928679 | 0.928679 | 0.928679 | 0.928679 | 0.928679 | 0 | 0.030971 | 0.250781 | 2,241 | 59 | 97 | 37.983051 | 0.762359 | 0 | 0 | 0.816327 | 0 | 0 | 0.008032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.183673 | false | 0 | 0.020408 | 0.061224 | 0.326531 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
90d299e87a3114bfe9e98cd62e43f4202ebd5cf3 | 12,816 | py | Python | tests/test_auto_scan_grid_sample.py | neonhuang/Paddle2ONNX | 96640ccbeea1ae4e628dde20b3dbb1c3b4ef2702 | [
"Apache-2.0"
] | 95 | 2019-09-27T14:26:59.000Z | 2020-12-08T01:20:28.000Z | tests/test_auto_scan_grid_sample.py | neonhuang/Paddle2ONNX | 96640ccbeea1ae4e628dde20b3dbb1c3b4ef2702 | [
"Apache-2.0"
] | 51 | 2018-04-04T22:39:30.000Z | 2019-08-28T20:19:14.000Z | tests/test_auto_scan_grid_sample.py | neonhuang/Paddle2ONNX | 96640ccbeea1ae4e628dde20b3dbb1c3b4ef2702 | [
"Apache-2.0"
] | 22 | 2019-09-03T08:50:04.000Z | 2020-12-02T11:05:42.000Z | # Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from auto_scan_test import OPConvertAutoScanTest, BaseNet
from hypothesis import reproduce_failure
import hypothesis.strategies as st
import numpy as np
import unittest
import paddle
import paddle.fluid as fluid
from paddle2onnx.command import program2onnx
import logging
from onnxbase import randtool, compare
import onnxruntime as rt
paddle.enable_static()
np.random.seed(33)
def test_grid_sample_align_corners():
main_program = paddle.static.Program()
startup_program = paddle.static.Program()
# Floor in onnxruntime does not support float64
dtype = 'float32'
align_corners = True
N = 5
with paddle.static.program_guard(main_program, startup_program):
x = fluid.data(
name='x', shape=[-1, 6, -1, -1], dtype=dtype, lod_level=1)
grid = fluid.data(
name='grid', shape=[-1, 3, -1, -1], dtype=dtype, lod_level=1)
out = paddle.nn.functional.grid_sample(
x, grid, align_corners=align_corners)
exe = paddle.static.Executor(paddle.CPUPlace())
exe.run(paddle.static.default_startup_program())
x_data = randtool("int", 1, 10, [N, 6, 3, 3]).astype(dtype)
grid_data = randtool("float", 1, 10, [N, 3, 4, 2]).astype(dtype)
x_val = fluid.create_lod_tensor(x_data, [[N]], fluid.CPUPlace())
grid_val = fluid.create_lod_tensor(grid_data, [[N]], fluid.CPUPlace())
result, = exe.run(feed={"x": x_val,
"grid": grid_val},
fetch_list=[out],
return_numpy=False)
result = np.array(result)
path_prefix = "./grid_sampler"
fluid.io.save_inference_model(path_prefix, ["x", "grid"], [out], exe)
onnx_path = path_prefix + "/model.onnx"
program2onnx(
model_dir=path_prefix,
save_file=onnx_path,
opset_version=11,
enable_onnx_checker=True)
sess = rt.InferenceSession(onnx_path)
input_name1 = sess.get_inputs()[0].name
input_name2 = sess.get_inputs()[1].name
label_name = sess.get_outputs()[0].name
pred_onnx = sess.run([label_name],
{input_name1: x_data,
input_name2: grid_data})[0]
pred_onnx = np.array(pred_onnx)
compare(pred_onnx, result, 1e-5, 1e-5)
def test_grid_sample_align_corners_False():
main_program = paddle.static.Program()
startup_program = paddle.static.Program()
# Floor in onnxruntime does not support float64
dtype = 'float32'
align_corners = False
N = 5
with paddle.static.program_guard(main_program, startup_program):
x = fluid.data(
name='x', shape=[-1, 6, -1, -1], dtype=dtype, lod_level=1)
grid = fluid.data(
name='grid', shape=[-1, 3, -1, -1], dtype=dtype, lod_level=1)
out = paddle.nn.functional.grid_sample(
x, grid, align_corners=align_corners)
exe = paddle.static.Executor(paddle.CPUPlace())
exe.run(paddle.static.default_startup_program())
x_data = randtool("int", 1, 10, [N, 6, 3, 3]).astype(dtype)
grid_data = randtool("float", 1, 10, [N, 3, 4, 2]).astype(dtype)
x_val = fluid.create_lod_tensor(x_data, [[N]], fluid.CPUPlace())
grid_val = fluid.create_lod_tensor(grid_data, [[N]], fluid.CPUPlace())
result, = exe.run(feed={"x": x_val,
"grid": grid_val},
fetch_list=[out],
return_numpy=False)
result = np.array(result)
path_prefix = "./grid_sampler"
fluid.io.save_inference_model(path_prefix, ["x", "grid"], [out], exe)
onnx_path = path_prefix + "/model.onnx"
program2onnx(
model_dir=path_prefix,
save_file=onnx_path,
opset_version=11,
enable_onnx_checker=True)
sess = rt.InferenceSession(onnx_path)
input_name1 = sess.get_inputs()[0].name
input_name2 = sess.get_inputs()[1].name
label_name = sess.get_outputs()[0].name
pred_onnx = sess.run([label_name],
{input_name1: x_data,
input_name2: grid_data})[0]
pred_onnx = np.array(pred_onnx)
compare(pred_onnx, result, 1e-5, 1e-5)
def test_grid_sample_align_corners_Opset12():
main_program = paddle.static.Program()
startup_program = paddle.static.Program()
# Floor in onnxruntime does not support float64
dtype = 'float32'
align_corners = False
N = 5
with paddle.static.program_guard(main_program, startup_program):
x = fluid.data(
name='x', shape=[-1, 6, -1, -1], dtype=dtype, lod_level=1)
grid = fluid.data(
name='grid', shape=[-1, 3, -1, -1], dtype=dtype, lod_level=1)
out = paddle.nn.functional.grid_sample(
x, grid, align_corners=align_corners)
exe = paddle.static.Executor(paddle.CPUPlace())
exe.run(paddle.static.default_startup_program())
x_data = randtool("int", 1, 10, [N, 6, 3, 3]).astype(dtype)
grid_data = randtool("float", 1, 10, [N, 3, 4, 2]).astype(dtype)
x_val = fluid.create_lod_tensor(x_data, [[N]], fluid.CPUPlace())
grid_val = fluid.create_lod_tensor(grid_data, [[N]], fluid.CPUPlace())
result, = exe.run(feed={"x": x_val,
"grid": grid_val},
fetch_list=[out],
return_numpy=False)
result = np.array(result)
path_prefix = "./grid_sampler"
fluid.io.save_inference_model(path_prefix, ["x", "grid"], [out], exe)
onnx_path = path_prefix + "/model.onnx"
program2onnx(
model_dir=path_prefix,
save_file=onnx_path,
opset_version=12,
enable_onnx_checker=True)
sess = rt.InferenceSession(onnx_path)
input_name1 = sess.get_inputs()[0].name
input_name2 = sess.get_inputs()[1].name
label_name = sess.get_outputs()[0].name
pred_onnx = sess.run([label_name],
{input_name1: x_data,
input_name2: grid_data})[0]
pred_onnx = np.array(pred_onnx)
compare(pred_onnx, result, 1e-5, 1e-5)
def test_grid_sample_align_corners_Opset13():
main_program = paddle.static.Program()
startup_program = paddle.static.Program()
# Floor in onnxruntime does not support float64
dtype = 'float32'
align_corners = False
N = 5
with paddle.static.program_guard(main_program, startup_program):
x = fluid.data(
name='x', shape=[-1, 6, -1, -1], dtype=dtype, lod_level=1)
grid = fluid.data(
name='grid', shape=[-1, 3, -1, -1], dtype=dtype, lod_level=1)
out = paddle.nn.functional.grid_sample(
x, grid, align_corners=align_corners)
exe = paddle.static.Executor(paddle.CPUPlace())
exe.run(paddle.static.default_startup_program())
x_data = randtool("int", 1, 10, [N, 6, 3, 3]).astype(dtype)
grid_data = randtool("float", 1, 10, [N, 3, 4, 2]).astype(dtype)
x_val = fluid.create_lod_tensor(x_data, [[N]], fluid.CPUPlace())
grid_val = fluid.create_lod_tensor(grid_data, [[N]], fluid.CPUPlace())
result, = exe.run(feed={"x": x_val,
"grid": grid_val},
fetch_list=[out],
return_numpy=False)
result = np.array(result)
path_prefix = "./grid_sampler"
fluid.io.save_inference_model(path_prefix, ["x", "grid"], [out], exe)
onnx_path = path_prefix + "/model.onnx"
program2onnx(
model_dir=path_prefix,
save_file=onnx_path,
opset_version=13,
enable_onnx_checker=True)
sess = rt.InferenceSession(onnx_path)
input_name1 = sess.get_inputs()[0].name
input_name2 = sess.get_inputs()[1].name
label_name = sess.get_outputs()[0].name
pred_onnx = sess.run([label_name],
{input_name1: x_data,
input_name2: grid_data})[0]
pred_onnx = np.array(pred_onnx)
compare(pred_onnx, result, 1e-5, 1e-5)
def test_grid_sample_align_corners_Opset14():
main_program = paddle.static.Program()
startup_program = paddle.static.Program()
# Floor in onnxruntime does not support float64
dtype = 'float32'
align_corners = False
N = 5
with paddle.static.program_guard(main_program, startup_program):
x = fluid.data(
name='x', shape=[-1, 6, -1, -1], dtype=dtype, lod_level=1)
grid = fluid.data(
name='grid', shape=[-1, 3, -1, -1], dtype=dtype, lod_level=1)
out = paddle.nn.functional.grid_sample(
x, grid, align_corners=align_corners)
exe = paddle.static.Executor(paddle.CPUPlace())
exe.run(paddle.static.default_startup_program())
x_data = randtool("int", 1, 10, [N, 6, 3, 3]).astype(dtype)
grid_data = randtool("float", 1, 10, [N, 3, 4, 2]).astype(dtype)
x_val = fluid.create_lod_tensor(x_data, [[N]], fluid.CPUPlace())
grid_val = fluid.create_lod_tensor(grid_data, [[N]], fluid.CPUPlace())
result, = exe.run(feed={"x": x_val,
"grid": grid_val},
fetch_list=[out],
return_numpy=False)
result = np.array(result)
path_prefix = "./grid_sampler"
fluid.io.save_inference_model(path_prefix, ["x", "grid"], [out], exe)
onnx_path = path_prefix + "/model.onnx"
program2onnx(
model_dir=path_prefix,
save_file=onnx_path,
opset_version=14,
enable_onnx_checker=True)
sess = rt.InferenceSession(onnx_path)
input_name1 = sess.get_inputs()[0].name
input_name2 = sess.get_inputs()[1].name
label_name = sess.get_outputs()[0].name
pred_onnx = sess.run([label_name],
{input_name1: x_data,
input_name2: grid_data})[0]
pred_onnx = np.array(pred_onnx)
compare(pred_onnx, result, 1e-5, 1e-5)
def test_grid_sample_align_corners_Opset15():
    main_program = paddle.static.Program()
    startup_program = paddle.static.Program()
    # Floor in onnxruntime does not support float64
    dtype = 'float32'
    align_corners = False
    N = 5
    with paddle.static.program_guard(main_program, startup_program):
        x = fluid.data(
            name='x', shape=[-1, 6, -1, -1], dtype=dtype, lod_level=1)
        grid = fluid.data(
            name='grid', shape=[-1, 3, -1, -1], dtype=dtype, lod_level=1)
        out = paddle.nn.functional.grid_sample(
            x, grid, align_corners=align_corners)
        exe = paddle.static.Executor(paddle.CPUPlace())
        exe.run(paddle.static.default_startup_program())
        x_data = randtool("int", 1, 10, [N, 6, 3, 3]).astype(dtype)
        grid_data = randtool("float", 1, 10, [N, 3, 4, 2]).astype(dtype)
        x_val = fluid.create_lod_tensor(x_data, [[N]], fluid.CPUPlace())
        grid_val = fluid.create_lod_tensor(grid_data, [[N]], fluid.CPUPlace())
        result, = exe.run(feed={"x": x_val,
                                "grid": grid_val},
                          fetch_list=[out],
                          return_numpy=False)
        result = np.array(result)
        path_prefix = "./grid_sampler"
        fluid.io.save_inference_model(path_prefix, ["x", "grid"], [out], exe)
        onnx_path = path_prefix + "/model.onnx"
        program2onnx(
            model_dir=path_prefix,
            save_file=onnx_path,
            opset_version=15,
            enable_onnx_checker=True)
        sess = rt.InferenceSession(onnx_path)
        input_name1 = sess.get_inputs()[0].name
        input_name2 = sess.get_inputs()[1].name
        label_name = sess.get_outputs()[0].name
        pred_onnx = sess.run([label_name],
                             {input_name1: x_data,
                              input_name2: grid_data})[0]
        pred_onnx = np.array(pred_onnx)
        compare(pred_onnx, result, 1e-5, 1e-5)
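Each of the opset tests above ends by checking the ONNX Runtime output against the Paddle result within relative and absolute tolerances via `compare(pred_onnx, result, 1e-5, 1e-5)`. The actual helper is defined elsewhere in this suite; a minimal numpy-only sketch of what it could look like (this version is an assumption, not the suite's implementation):

```python
import numpy as np

# Hypothetical stand-in for the compare(pred, expected, rtol, atol) helper
# called by the tests above: fails loudly when the two outputs diverge.
def compare(pred, expected, rtol=1e-5, atol=1e-5):
    assert pred.shape == expected.shape, "shape mismatch between ONNX and framework output"
    assert np.allclose(pred, expected, rtol=rtol, atol=atol), "ONNX output diverges from framework output"
```

When the exported model is numerically faithful, `compare(pred_onnx, result, 1e-5, 1e-5)` returns silently; any divergence beyond tolerance raises an `AssertionError`.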
# ==== pb/risk_ban_buysell_pb2.py (repo: zheng-zy/ot_root, license: Artistic-2.0) ====
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: risk_ban_buysell.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='risk_ban_buysell.proto',
package='risk_ban_buysell',
serialized_pb=_b('\n\x16risk_ban_buysell.proto\x12\x10risk_ban_buysell\"8\n\x0cproduct_info\x12\x12\n\nproduct_id\x18\x01 \x02(\t\x12\x14\n\x0cproduct_name\x18\x02 \x02(\t\"\x8a\x03\n\x15\x62\x61n_buy_ban_sell_info\x12>\n\x18\x62\x61sket_not_ban_buy_stock\x18\x01 \x03(\x0b\x32\x1c.risk_ban_buysell.stock_info\x12?\n\x19\x62\x61sket_not_ban_sell_stock\x18\x02 \x03(\x0b\x32\x1c.risk_ban_buysell.stock_info\x12:\n\x14\x62\x61sket_ban_buy_stock\x18\x03 \x03(\x0b\x32\x1c.risk_ban_buysell.stock_info\x12;\n\x15\x62\x61sket_ban_sell_stock\x18\x04 \x03(\x0b\x32\x1c.risk_ban_buysell.stock_info\x12:\n\x14single_ban_buy_stock\x18\x05 \x03(\x0b\x32\x1c.risk_ban_buysell.stock_info\x12;\n\x15single_ban_sell_stock\x18\x06 \x03(\x0b\x32\x1c.risk_ban_buysell.stock_info\"!\n\x0bProduct_Req\x12\x12\n\nproduct_id\x18\x01 \x01(\t\"@\n\x0cProduct_Resp\x12\x30\n\x08products\x18\x01 \x03(\x0b\x32\x1e.risk_ban_buysell.product_info\"#\n\rStock_Ban_Req\x12\x12\n\nproduct_id\x18\x01 \x02(\t\"_\n\x0eStock_Ban_Resp\x12\x12\n\nproduct_id\x18\x01 \x02(\t\x12\x39\n\x08\x62\x61n_info\x18\x02 \x01(\x0b\x32\'.risk_ban_buysell.ban_buy_ban_sell_info\"4\n\nstock_info\x12\x12\n\nstock_code\x18\x01 \x02(\t\x12\x12\n\nstock_name\x18\x02 \x02(\t\"b\n\x11Stock_Ban_Add_Req\x12\x12\n\nproduct_id\x18\x01 \x02(\t\x12\x39\n\x08\x62\x61n_info\x18\x02 \x01(\x0b\x32\'.risk_ban_buysell.ban_buy_ban_sell_info\"n\n\x12Stock_Ban_Add_Resp\x12\x10\n\x08ret_code\x18\x01 \x02(\x05\x12\x39\n\x08\x62\x61n_info\x18\x02 \x01(\x0b\x32\'.risk_ban_buysell.ban_buy_ban_sell_info\x12\x0b\n\x03msg\x18\x03 \x01(\t\"b\n\x11Stock_Ban_Del_Req\x12\x12\n\nproduct_id\x18\x01 \x02(\t\x12\x39\n\x08\x62\x61n_info\x18\x02 \x01(\x0b\x32\'.risk_ban_buysell.ban_buy_ban_sell_info\"n\n\x12Stock_Ban_Del_Resp\x12\x10\n\x08ret_code\x18\x01 \x02(\x05\x12\x39\n\x08\x62\x61n_info\x18\x02 \x01(\x0b\x32\'.risk_ban_buysell.ban_buy_ban_sell_info\x12\x0b\n\x03msg\x18\x03 \x01(\t')
)
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_PRODUCT_INFO = _descriptor.Descriptor(
name='product_info',
full_name='risk_ban_buysell.product_info',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='product_id', full_name='risk_ban_buysell.product_info.product_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='product_name', full_name='risk_ban_buysell.product_info.product_name', index=1,
number=2, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=44,
serialized_end=100,
)
_BAN_BUY_BAN_SELL_INFO = _descriptor.Descriptor(
name='ban_buy_ban_sell_info',
full_name='risk_ban_buysell.ban_buy_ban_sell_info',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='basket_not_ban_buy_stock', full_name='risk_ban_buysell.ban_buy_ban_sell_info.basket_not_ban_buy_stock', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='basket_not_ban_sell_stock', full_name='risk_ban_buysell.ban_buy_ban_sell_info.basket_not_ban_sell_stock', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='basket_ban_buy_stock', full_name='risk_ban_buysell.ban_buy_ban_sell_info.basket_ban_buy_stock', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='basket_ban_sell_stock', full_name='risk_ban_buysell.ban_buy_ban_sell_info.basket_ban_sell_stock', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='single_ban_buy_stock', full_name='risk_ban_buysell.ban_buy_ban_sell_info.single_ban_buy_stock', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='single_ban_sell_stock', full_name='risk_ban_buysell.ban_buy_ban_sell_info.single_ban_sell_stock', index=5,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=103,
serialized_end=497,
)
_PRODUCT_REQ = _descriptor.Descriptor(
name='Product_Req',
full_name='risk_ban_buysell.Product_Req',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='product_id', full_name='risk_ban_buysell.Product_Req.product_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=499,
serialized_end=532,
)
_PRODUCT_RESP = _descriptor.Descriptor(
name='Product_Resp',
full_name='risk_ban_buysell.Product_Resp',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='products', full_name='risk_ban_buysell.Product_Resp.products', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=534,
serialized_end=598,
)
_STOCK_BAN_REQ = _descriptor.Descriptor(
name='Stock_Ban_Req',
full_name='risk_ban_buysell.Stock_Ban_Req',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='product_id', full_name='risk_ban_buysell.Stock_Ban_Req.product_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=600,
serialized_end=635,
)
_STOCK_BAN_RESP = _descriptor.Descriptor(
name='Stock_Ban_Resp',
full_name='risk_ban_buysell.Stock_Ban_Resp',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='product_id', full_name='risk_ban_buysell.Stock_Ban_Resp.product_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ban_info', full_name='risk_ban_buysell.Stock_Ban_Resp.ban_info', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=637,
serialized_end=732,
)
_STOCK_INFO = _descriptor.Descriptor(
name='stock_info',
full_name='risk_ban_buysell.stock_info',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='stock_code', full_name='risk_ban_buysell.stock_info.stock_code', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_name', full_name='risk_ban_buysell.stock_info.stock_name', index=1,
number=2, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=734,
serialized_end=786,
)
_STOCK_BAN_ADD_REQ = _descriptor.Descriptor(
name='Stock_Ban_Add_Req',
full_name='risk_ban_buysell.Stock_Ban_Add_Req',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='product_id', full_name='risk_ban_buysell.Stock_Ban_Add_Req.product_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ban_info', full_name='risk_ban_buysell.Stock_Ban_Add_Req.ban_info', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=788,
serialized_end=886,
)
_STOCK_BAN_ADD_RESP = _descriptor.Descriptor(
name='Stock_Ban_Add_Resp',
full_name='risk_ban_buysell.Stock_Ban_Add_Resp',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='ret_code', full_name='risk_ban_buysell.Stock_Ban_Add_Resp.ret_code', index=0,
number=1, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ban_info', full_name='risk_ban_buysell.Stock_Ban_Add_Resp.ban_info', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='msg', full_name='risk_ban_buysell.Stock_Ban_Add_Resp.msg', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=888,
serialized_end=998,
)
_STOCK_BAN_DEL_REQ = _descriptor.Descriptor(
name='Stock_Ban_Del_Req',
full_name='risk_ban_buysell.Stock_Ban_Del_Req',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='product_id', full_name='risk_ban_buysell.Stock_Ban_Del_Req.product_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ban_info', full_name='risk_ban_buysell.Stock_Ban_Del_Req.ban_info', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=1000,
serialized_end=1098,
)
_STOCK_BAN_DEL_RESP = _descriptor.Descriptor(
name='Stock_Ban_Del_Resp',
full_name='risk_ban_buysell.Stock_Ban_Del_Resp',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='ret_code', full_name='risk_ban_buysell.Stock_Ban_Del_Resp.ret_code', index=0,
number=1, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ban_info', full_name='risk_ban_buysell.Stock_Ban_Del_Resp.ban_info', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='msg', full_name='risk_ban_buysell.Stock_Ban_Del_Resp.msg', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=1100,
serialized_end=1210,
)
_BAN_BUY_BAN_SELL_INFO.fields_by_name['basket_not_ban_buy_stock'].message_type = _STOCK_INFO
_BAN_BUY_BAN_SELL_INFO.fields_by_name['basket_not_ban_sell_stock'].message_type = _STOCK_INFO
_BAN_BUY_BAN_SELL_INFO.fields_by_name['basket_ban_buy_stock'].message_type = _STOCK_INFO
_BAN_BUY_BAN_SELL_INFO.fields_by_name['basket_ban_sell_stock'].message_type = _STOCK_INFO
_BAN_BUY_BAN_SELL_INFO.fields_by_name['single_ban_buy_stock'].message_type = _STOCK_INFO
_BAN_BUY_BAN_SELL_INFO.fields_by_name['single_ban_sell_stock'].message_type = _STOCK_INFO
_PRODUCT_RESP.fields_by_name['products'].message_type = _PRODUCT_INFO
_STOCK_BAN_RESP.fields_by_name['ban_info'].message_type = _BAN_BUY_BAN_SELL_INFO
_STOCK_BAN_ADD_REQ.fields_by_name['ban_info'].message_type = _BAN_BUY_BAN_SELL_INFO
_STOCK_BAN_ADD_RESP.fields_by_name['ban_info'].message_type = _BAN_BUY_BAN_SELL_INFO
_STOCK_BAN_DEL_REQ.fields_by_name['ban_info'].message_type = _BAN_BUY_BAN_SELL_INFO
_STOCK_BAN_DEL_RESP.fields_by_name['ban_info'].message_type = _BAN_BUY_BAN_SELL_INFO
DESCRIPTOR.message_types_by_name['product_info'] = _PRODUCT_INFO
DESCRIPTOR.message_types_by_name['ban_buy_ban_sell_info'] = _BAN_BUY_BAN_SELL_INFO
DESCRIPTOR.message_types_by_name['Product_Req'] = _PRODUCT_REQ
DESCRIPTOR.message_types_by_name['Product_Resp'] = _PRODUCT_RESP
DESCRIPTOR.message_types_by_name['Stock_Ban_Req'] = _STOCK_BAN_REQ
DESCRIPTOR.message_types_by_name['Stock_Ban_Resp'] = _STOCK_BAN_RESP
DESCRIPTOR.message_types_by_name['stock_info'] = _STOCK_INFO
DESCRIPTOR.message_types_by_name['Stock_Ban_Add_Req'] = _STOCK_BAN_ADD_REQ
DESCRIPTOR.message_types_by_name['Stock_Ban_Add_Resp'] = _STOCK_BAN_ADD_RESP
DESCRIPTOR.message_types_by_name['Stock_Ban_Del_Req'] = _STOCK_BAN_DEL_REQ
DESCRIPTOR.message_types_by_name['Stock_Ban_Del_Resp'] = _STOCK_BAN_DEL_RESP
product_info = _reflection.GeneratedProtocolMessageType('product_info', (_message.Message,), dict(
DESCRIPTOR = _PRODUCT_INFO,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.product_info)
))
_sym_db.RegisterMessage(product_info)
ban_buy_ban_sell_info = _reflection.GeneratedProtocolMessageType('ban_buy_ban_sell_info', (_message.Message,), dict(
DESCRIPTOR = _BAN_BUY_BAN_SELL_INFO,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.ban_buy_ban_sell_info)
))
_sym_db.RegisterMessage(ban_buy_ban_sell_info)
Product_Req = _reflection.GeneratedProtocolMessageType('Product_Req', (_message.Message,), dict(
DESCRIPTOR = _PRODUCT_REQ,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Product_Req)
))
_sym_db.RegisterMessage(Product_Req)
Product_Resp = _reflection.GeneratedProtocolMessageType('Product_Resp', (_message.Message,), dict(
DESCRIPTOR = _PRODUCT_RESP,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Product_Resp)
))
_sym_db.RegisterMessage(Product_Resp)
Stock_Ban_Req = _reflection.GeneratedProtocolMessageType('Stock_Ban_Req', (_message.Message,), dict(
DESCRIPTOR = _STOCK_BAN_REQ,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Stock_Ban_Req)
))
_sym_db.RegisterMessage(Stock_Ban_Req)
Stock_Ban_Resp = _reflection.GeneratedProtocolMessageType('Stock_Ban_Resp', (_message.Message,), dict(
DESCRIPTOR = _STOCK_BAN_RESP,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Stock_Ban_Resp)
))
_sym_db.RegisterMessage(Stock_Ban_Resp)
stock_info = _reflection.GeneratedProtocolMessageType('stock_info', (_message.Message,), dict(
DESCRIPTOR = _STOCK_INFO,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.stock_info)
))
_sym_db.RegisterMessage(stock_info)
Stock_Ban_Add_Req = _reflection.GeneratedProtocolMessageType('Stock_Ban_Add_Req', (_message.Message,), dict(
DESCRIPTOR = _STOCK_BAN_ADD_REQ,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Stock_Ban_Add_Req)
))
_sym_db.RegisterMessage(Stock_Ban_Add_Req)
Stock_Ban_Add_Resp = _reflection.GeneratedProtocolMessageType('Stock_Ban_Add_Resp', (_message.Message,), dict(
DESCRIPTOR = _STOCK_BAN_ADD_RESP,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Stock_Ban_Add_Resp)
))
_sym_db.RegisterMessage(Stock_Ban_Add_Resp)
Stock_Ban_Del_Req = _reflection.GeneratedProtocolMessageType('Stock_Ban_Del_Req', (_message.Message,), dict(
DESCRIPTOR = _STOCK_BAN_DEL_REQ,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Stock_Ban_Del_Req)
))
_sym_db.RegisterMessage(Stock_Ban_Del_Req)
Stock_Ban_Del_Resp = _reflection.GeneratedProtocolMessageType('Stock_Ban_Del_Resp', (_message.Message,), dict(
DESCRIPTOR = _STOCK_BAN_DEL_RESP,
__module__ = 'risk_ban_buysell_pb2'
# @@protoc_insertion_point(class_scope:risk_ban_buysell.Stock_Ban_Del_Resp)
))
_sym_db.RegisterMessage(Stock_Ban_Del_Resp)
# @@protoc_insertion_point(module_scope)
# ==== descarteslabs/common/proto/testing/testing_pb2_grpc.py (repo: carderne/descarteslabs-python, license: Apache-2.0) ====
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from descarteslabs.common.proto.testing import testing_pb2 as descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2
class TestServiceStub(object):
  # missing associated documentation comment in .proto file
  pass

  def __init__(self, channel):
    """Constructor.

    Args:
      channel: A grpc.Channel.
    """
    self.StreamStream = channel.stream_stream(
        '/testing.v1.TestService/StreamStream',
        request_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.SerializeToString,
        response_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.FromString,
        )
    self.StreamUnary = channel.stream_unary(
        '/testing.v1.TestService/StreamUnary',
        request_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.SerializeToString,
        response_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.FromString,
        )
    self.UnaryStream = channel.unary_stream(
        '/testing.v1.TestService/UnaryStream',
        request_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.SerializeToString,
        response_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.FromString,
        )
    self.UnaryUnary = channel.unary_unary(
        '/testing.v1.TestService/UnaryUnary',
        request_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.SerializeToString,
        response_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.FromString,
        )


class TestServiceServicer(object):
  # missing associated documentation comment in .proto file
  pass

  def StreamStream(self, request_iterator, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')

  def StreamUnary(self, request_iterator, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')

  def UnaryStream(self, request, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')

  def UnaryUnary(self, request, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')


def add_TestServiceServicer_to_server(servicer, server):
  rpc_method_handlers = {
      'StreamStream': grpc.stream_stream_rpc_method_handler(
          servicer.StreamStream,
          request_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.FromString,
          response_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.SerializeToString,
      ),
      'StreamUnary': grpc.stream_unary_rpc_method_handler(
          servicer.StreamUnary,
          request_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.FromString,
          response_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.SerializeToString,
      ),
      'UnaryStream': grpc.unary_stream_rpc_method_handler(
          servicer.UnaryStream,
          request_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.FromString,
          response_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.SerializeToString,
      ),
      'UnaryUnary': grpc.unary_unary_rpc_method_handler(
          servicer.UnaryUnary,
          request_deserializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Request.FromString,
          response_serializer=descarteslabs_dot_common_dot_proto_dot_testing_dot_testing__pb2.Response.SerializeToString,
      ),
  }
  generic_handler = grpc.method_handlers_generic_handler(
      'testing.v1.TestService', rpc_method_handlers)
  server.add_generic_rpc_handlers((generic_handler,))
# ==== tltorch/_tensor_lasso.py (repo: isabella232/torch-2, license: BSD-3-Clause) ====
import tensorly as tl
tl.set_backend('pytorch')
from tensorly import random
from tensorly.testing import assert_
import numpy as np
import warnings
import torch
from torch import nn
from torch.nn import functional as F
import tltorch as tltorch
from .base import ParameterList
# Author: Jean Kossaifi
# License: BSD 3 clause
class TuckerL1Regularizer():
    """Decomposition Hook for Tensor Lasso on Tucker tensors

    Applies a generalized Lasso (l1 regularization) on the tensor layers the regularization is applied to.

    Parameters
    ----------
    penalty : float, default is 0.01
        scaling factor for the loss
    clamp_weights : bool, default is True
        if True, the lasso weights are clamped between -1 and 1
    threshold : float, default is 1e-6
        if a lasso weight is lower than the set threshold, it is set to 0
    normalize_loss : bool, default is True
        If True, the loss will be between 0 and 1.
        Otherwise, the raw sum of absolute weights will be returned.

    Examples
    --------
    First you need to create an instance of the regularizer:

    >>> regularizer = TuckerL1Regularizer(penalty=penalty)

    You can apply the regularizer to one or several layers:

    >>> trl = TRL((5, 5), (5, 5), rank='same')
    >>> trl2 = TRL((5, 5), (2, ), rank='same')
    >>> regularizer.apply(trl)
    >>> regularizer.apply(trl2)

    The lasso is automatically applied:

    >>> x = trl(x)
    >>> pred = trl2(x)
    >>> loss = your_loss_function(pred)

    Add the Lasso loss:

    >>> loss = loss + regularizer.loss

    You can now backpropagate through your loss as usual:

    >>> loss.backward()

    After you finish updating the weights, don't forget to reset the regularizer,
    otherwise it will keep accumulating values!

    >>> regularizer.reset()

    You can also remove the regularizer with `regularizer.remove(trl)`.
    """
    _log = []

    def __init__(self, penalty=0.01, clamp_weights=True, threshold=1e-6, normalize_loss=True):
        self.penalty = penalty
        self.clamp_weights = clamp_weights
        self.threshold = threshold
        self.normalize_loss = normalize_loss

        # Initialize the counters
        self.reset()

    def reset(self):
        """Reset the loss, should be called at the end of each iteration.
        """
        self._loss = 0
        self.n_element = 0

    @property
    def loss(self):
        """Returns the current Lasso (l1) loss for the layers that have been called so far.

        Returns
        -------
        float
            l1 regularization on the tensor layers the regularization has been applied to.
        """
        if self.n_element == 0:
            warnings.warn('The L1Regularization was not applied to any weights.')
            return 0
        elif self.normalize_loss:
            return self.penalty*self._loss/self.n_element
        else:
            return self.penalty*self._loss

    def __call__(self, module, tucker_tensor):
        lasso_weights = getattr(module, 'lasso_weights')
        order = len(lasso_weights)
        with torch.no_grad():
            for i in range(order):
                if self.clamp_weights:
                    lasso_weights[i].data = torch.clamp(lasso_weights[i].data, -1, 1)
                if self.threshold:
                    lasso_weights[i] = F.threshold(lasso_weights[i], threshold=self.threshold, value=0, inplace=True)
            setattr(module, 'lasso_weights', lasso_weights)

        for weight in lasso_weights:
            self.n_element += weight.numel()
            self._loss = self._loss + torch.sum(torch.abs(weight))

        return self.apply_lasso(tucker_tensor, lasso_weights)

    def apply_lasso(self, tucker_tensor, lasso_weights):
        """Applies the lasso to a decomposed tensor
        """
        core, factors = tucker_tensor
        factors = [factor*w for (factor, w) in zip(factors, lasso_weights)]
        return core, factors

    def apply(self, module):
        """Apply an instance of the L1Regularizer to a tensor module

        Parameters
        ----------
        module : TensorModule
            module on which to add the regularization

        Returns
        -------
        TensorModule (with Regularization hook)
        """
        if not isinstance(module, tltorch.TensorModule):
            raise ValueError(f'L1Regularizer can only be applied onto a TensorModule but got {module.__class__.__name__}.')

        rank = module.rank
        context = tl.context(module.core)
        lasso_weights = ParameterList([nn.Parameter(torch.ones(r, **context)) for r in rank])
        setattr(module, 'lasso_weights', lasso_weights)
        handle = module.register_decomposition_forward_pre_hook(self, 'L1Regularizer')
        return module

    def remove(self, module):
        """Remove the Regularization from a module.
        """
        for key in module._decomposition_forward_pre_hooks:
            if key == 'L1Regularizer':
                delattr(module, 'lasso_weights')
                del module._decomposition_forward_pre_hooks[key]
                break
class TTL1Regularizer():
"""Decomposition Hook for Tensor Lasso on TT tensors
Parameters
----------
penalty : float, default is 0.01
scaling factor for the loss
clamp_weights : bool, default is True
if True, the lasso weights are clamp between -1 and 1
threshold : float, default is 1e-6
if a lasso weight is lower than the set threshold, it is set to 0
normalize_loss : bool, default is True
If True, the loss will be between 0 and 1.
Otherwise, the raw sum of absolute weights will be returned.
Examples
--------
First you need to create an instance of the regularizer:
>>> regularizer = TTL1Regularizer(penalty=penalty)
You can apply the regularizer to one or several layers:
>>> trl = TensorTrainTRL((5, 5), (5, 5), rank='same')
>>> trl2 = TensorTrainTRL((5, 5), (2, ), rank='same')
>>> regularizer.apply(trl)
>>> regularizer.apply(trl2)
The lasso is automatically applied:
>>> x = trl(x)
>>> pred = trl2(x)
>>> loss = your_loss_function(pred)
Add the Lasso loss:
>>> loss = loss + regularizer.loss
You can now backpropagate through your loss as usual:
    >>> loss.backward()
    After you finish updating the weights, don't forget to reset the regularizer,
    otherwise it will keep accumulating values!
    >>> regularizer.reset()
You can also remove the regularizer with `regularizer.remove(trl)`.
"""
def __init__(self, penalty=0.01, clamp_weights=True, threshold=1e-6, normalize_loss=True):
self.penalty = penalty
self.clamp_weights = clamp_weights
self.threshold = threshold
self.normalize_loss = normalize_loss
# Initialize the counters
self.reset()
def reset(self):
"""Reset the loss, should be called at the end of each iteration.
"""
self._loss = 0
self.n_element = 0
@property
def loss(self):
"""Returns the current Lasso (l1) loss for the layers that have been called so far.
Returns
-------
float
l1 regularization on the tensor layers the regularization has been applied to.
"""
if self.n_element == 0:
warnings.warn('The L1Regularization was not applied to any weights.')
return 0
elif self.normalize_loss:
return self.penalty*self._loss/self.n_element
else:
return self.penalty*self._loss
def __call__(self, module, tt_tensor):
lasso_weights = getattr(module, 'lasso_weights')
order = len(lasso_weights)
with torch.no_grad():
for i in range(order):
if self.clamp_weights:
lasso_weights[i].data = torch.clamp(lasso_weights[i].data, -1, 1)
if self.threshold:
lasso_weights[i] = F.threshold(lasso_weights[i], threshold=self.threshold, value=0, inplace=True)
setattr(module, 'lasso_weights', lasso_weights)
for weight in lasso_weights:
self.n_element += weight.numel()
self._loss = self._loss + torch.sum(torch.abs(weight))
return self.apply_lasso(tt_tensor, lasso_weights)
def apply_lasso(self, tt_tensor, lasso_weights):
"""Applies the lasso to a decomposed tensor
"""
factors = [factor*w for (factor, w) in zip(tt_tensor, lasso_weights)] + [tt_tensor[-1]]
return factors
def apply(self, module):
"""Apply an instance of the L1Regularizer to a tensor module
Parameters
----------
module : TensorModule
module on which to add the regularization
Returns
-------
TensorModule (with Regularization hook)
"""
rank = module.rank[1:-1]
lasso_weights = ParameterList([nn.Parameter(torch.ones(1, 1, r)) for r in rank])
setattr(module, 'lasso_weights', lasso_weights)
handle = module.register_decomposition_forward_pre_hook(self, 'L1Regularizer')
return module
def remove(self, module):
"""Remove the Regularization from a module.
"""
for key, hook in module._decomposition_forward_pre_hooks.items():
if key == 'L1Regularizer':
delattr(module, 'lasso_weights')
del module._decomposition_forward_pre_hooks[key]
break
class CPL1Regularizer():
"""Decomposition Hook for Tensor Lasso on TT tensors
Parameters
----------
penalty : float, default is 0.01
scaling factor for the loss
clamp_weights : bool, default is True
if True, the lasso weights are clamp between -1 and 1
threshold : float, default is 1e-6
if a lasso weight is lower than the set threshold, it is set to 0
normalize_loss : bool, default is True
If True, the loss will be between 0 and 1.
Otherwise, the raw sum of absolute weights will be returned.
Examples
--------
First you need to create an instance of the regularizer:
>>> regularizer = CPL1Regularizer(penalty=penalty)
You can apply the regularizer to one or several layers:
>>> trl = CPTRL((5, 5), (5, 5), rank='same')
>>> trl2 = CPTRL((5, 5), (2, ), rank='same')
>>> regularizer.apply(trl)
>>> regularizer.apply(trl2)
The lasso is automatically applied:
>>> x = trl(x)
>>> pred = trl2(x)
>>> loss = your_loss_function(pred)
Add the Lasso loss:
>>> loss = loss + regularizer.loss
You can now backpropagate through your loss as usual:
    >>> loss.backward()
    After you finish updating the weights, don't forget to reset the regularizer,
    otherwise it will keep accumulating values!
    >>> regularizer.reset()
You can also remove the regularizer with `regularizer.remove(trl)`.
"""
def __init__(self, penalty=0.01, clamp_weights=True, threshold=1e-6, normalize_loss=True):
self.penalty = penalty
self.clamp_weights = clamp_weights
self.threshold = threshold
self.normalize_loss = normalize_loss
# Initialize the counters
self.reset()
def reset(self):
"""Reset the loss, should be called at the end of each iteration.
"""
self._loss = 0
self.n_element = 0
@property
def loss(self):
"""Returns the current Lasso (l1) loss for the layers that have been called so far.
Returns
-------
float
l1 regularization on the tensor layers the regularization has been applied to.
"""
if self.n_element == 0:
warnings.warn('The L1Regularization was not applied to any weights.')
return 0
elif self.normalize_loss:
return self.penalty*self._loss/self.n_element
else:
return self.penalty*self._loss
def __call__(self, module, cp_tensor):
"""CP already includes weights, we'll just take their l1 norm
"""
weights = getattr(module, 'weights')
with torch.no_grad():
if self.clamp_weights:
weights.data = torch.clamp(weights.data, -1, 1)
setattr(module, 'weights', weights)
if self.threshold:
weights.data = F.threshold(weights.data, threshold=self.threshold, value=0, inplace=True)
setattr(module, 'weights', weights)
self.n_element += weights.numel()
        self._loss = self._loss + torch.norm(weights, 1)  # penalty is applied once, in the `loss` property
return cp_tensor
def apply(self, module):
"""Apply an instance of the L1Regularizer to a tensor module
Parameters
----------
module : TensorModule
module on which to add the regularization
Returns
-------
TensorModule (with Regularization hook)
"""
handle = module.register_decomposition_forward_pre_hook(self, 'L1Regularizer')
return module
def remove(self, module):
"""Remove the Regularization from a module.
"""
for key, hook in module._decomposition_forward_pre_hooks.items():
if key == 'L1Regularizer':
del module._decomposition_forward_pre_hooks[key]
break
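All three regularizers above share the same bookkeeping: the raw absolute-value sum and element count are accumulated across calls, and the penalty (plus optional normalization) is only applied when `.loss` is read. A dependency-free sketch of that accumulate-then-normalize pattern (hypothetical `ToyL1` class, plain floats instead of tensors):

```python
class ToyL1:
    """Accumulate |w| sums across calls; scale/normalize only on read."""
    def __init__(self, penalty=0.01, normalize_loss=True):
        self.penalty = penalty
        self.normalize_loss = normalize_loss
        self.reset()

    def reset(self):
        self._loss = 0.0
        self.n_element = 0

    def __call__(self, weights):
        self.n_element += len(weights)
        self._loss += sum(abs(w) for w in weights)

    @property
    def loss(self):
        if self.n_element == 0:
            return 0.0
        if self.normalize_loss:
            return self.penalty * self._loss / self.n_element
        return self.penalty * self._loss

reg = ToyL1(penalty=0.1)
reg([1.0, -2.0, 3.0])   # running sum |w| = 6, n = 3
reg([-4.0])             # running sum |w| = 10, n = 4
print(reg.loss)         # 0.1 * 10 / 4 = 0.25
```

Reading `.loss` twice is safe because it is a pure function of the accumulators; only forgetting `reset()` between iterations changes the result.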
# molsysmt/forms/api_openmm_AmberPrmtopFile.py (dprada/molsysmt, MIT license)
from molsysmt._private_tools.exceptions import *
from molsysmt.forms.common_gets import *
import numpy as np
from molsysmt.native.molecular_system import molecular_system_components
import sys
import importlib
form_name='openmm.AmberPrmtopFile'
is_form={
'openmm.AmberPrmtopFile' : form_name,
}
info=["",""]
has = molecular_system_components.copy()
for ii in ['elements', 'bonds', 'ff_parameters']:
has[ii]=True
def to_molsysmt_Topology(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from molsysmt.forms.api_openmm_Topology import to_molsysmt_Topology as openmm_Topology_to_molsysmt_Topology
tmp_item, tmp_molecular_system = to_openmm_Topology(item, molecular_system=molecular_system)
    tmp_item, tmp_molecular_system = openmm_Topology_to_molsysmt_Topology(tmp_item, molecular_system=tmp_molecular_system, atom_indices=atom_indices, frame_indices=frame_indices)
return tmp_item, tmp_molecular_system
def to_openmm_Topology(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from molsysmt.forms.api_openmm_Topology import to_openmm_Topology as openmm_Topology_to_openmm_Topology
tmp_item = item.topology
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item)
else:
tmp_molecular_system = None
tmp_item, tmp_molecular_system = openmm_Topology_to_openmm_Topology(tmp_item, molecular_system=tmp_molecular_system, atom_indices=atom_indices, frame_indices=frame_indices, copy_if_all=False)
return tmp_item, tmp_molecular_system
def to_openmm_System(item, molecular_system=None, atom_indices='all', frame_indices='all'):
    # assumed import, following this module's api_openmm_Topology naming convention
    from molsysmt.forms.api_openmm_System import to_openmm_System as openmm_System_to_openmm_System
    tmp_item = item.createSystem()
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item)
else:
tmp_molecular_system = None
tmp_item, tmp_molecular_system = openmm_System_to_openmm_System(tmp_item, tmp_molecular_system, atom_indices=atom_indices, frame_indices=frame_indices, copy_if_all=False)
return tmp_item, tmp_molecular_system
def to_openmm_AmberPrmtopFile(item, molecular_system=None, atom_indices='all', frame_indices='all', copy_if_all=True):
tmp_molecular_system = None
    if (atom_indices == 'all') and (frame_indices == 'all'):
if copy_if_all:
tmp_item = extract(item)
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item)
else:
tmp_item = item
if molecular_system is not None:
tmp_molecular_system = molecular_system
else:
tmp_item = extract(item, atom_indices=atom_indices, frame_indices=frame_indices)
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item, atom_indices=atom_indices, frame_indices=frame_indices)
return tmp_item, tmp_molecular_system
def extract(item, atom_indices='all', frame_indices='all'):
    if (atom_indices == 'all') and (frame_indices == 'all'):
raise NotImplementedError()
else:
raise NotImplementedError()
return tmp_item
def merge(item_1, item_2):
raise NotImplementedError
def add(to_item, item):
raise NotImplementedError
def append_frames(item, step=None, time=None, coordinates=None, box=None):
raise NotImplementedError()
def concatenate_frames(item, step=None, time=None, coordinates=None, box=None):
raise NotImplementedError
##### Set
def aux_get(item, indices='all', frame_indices='all'):
from molsysmt.forms import forms
method_name = sys._getframe(1).f_code.co_name
if 'openmm.Topology' in forms:
tmp_item, _ = to_openmm_Topology(item)
module = importlib.import_module('molsysmt.forms.api_openmm_Topology')
_get = getattr(module, method_name)
output = _get(tmp_item, indices=indices, frame_indices=frame_indices)
else:
raise NotImplementedError()
return output
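`aux_get` resolves which getter to run from the *caller's* function name via `sys._getframe(1)`, which is why every thin wrapper below can delegate with one identical line. A self-contained sketch of that dispatch trick (hypothetical registry standing in for the imported forms module):

```python
import sys

# Hypothetical stand-in for the dynamically imported forms module.
_REGISTRY = {
    "get_name": lambda item: f"name:{item}",
    "get_size": lambda item: len(item),
}

def aux_get(item):
    # Look one frame up the stack: the caller's function name picks the delegate.
    method_name = sys._getframe(1).f_code.co_name
    return _REGISTRY[method_name](item)

def get_name(item):
    return aux_get(item)

def get_size(item):
    return aux_get(item)

print(get_name("abc"))  # name:abc
print(get_size("abc"))  # 3
```

The upside is zero per-wrapper boilerplate; the downside is that renaming a wrapper silently breaks the lookup, since the binding is by string, not by reference.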
## Atom
def get_atom_id_from_atom(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_atom_name_from_atom(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_atom_type_from_atom(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_group_index_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_component_index_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_chain_index_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_molecule_index_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_entity_index_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_inner_bonded_atoms_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_inner_bonds_from_atom (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_coordinates_from_atom(item, indices='all', frame_indices='all'):
raise NotWithThisFormError()
def get_frame_from_atom(item, indices='all', frame_indices='all'):
raise NotWithThisFormError()
## group
def get_group_id_from_group(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_group_name_from_group(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_group_type_from_group(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
## component
def get_component_id_from_component (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_component_name_from_component (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_component_type_from_component (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
## molecule
def get_molecule_id_from_molecule (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_molecule_name_from_molecule (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_molecule_type_from_molecule (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
## chain
def get_chain_id_from_chain (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_chain_name_from_chain (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_chain_type_from_chain (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
## entity
def get_entity_id_from_entity (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_entity_name_from_entity (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_entity_type_from_entity (item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
# System
def get_n_atoms_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_groups_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_components_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_chains_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_molecules_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_entities_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_n_bonds_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_coordinates_from_system(item, indices='all', frame_indices='all'):
raise NotWithThisFormError()
def get_box_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_box_shape_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_box_lengths_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_box_angles_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_box_volume_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
def get_time_from_system(item, indices='all', frame_indices='all'):
raise NotWithThisFormError()
def get_step_from_system(item, indices='all', frame_indices='all'):
raise NotWithThisFormError()
def get_n_frames_from_system(item, indices='all', frame_indices='all'):
raise NotWithThisFormError()
def get_bonded_atoms_from_system(item, indices='all', frame_indices='all'):
return aux_get(item, indices=indices, frame_indices=frame_indices)
###### Set
def set_box_to_system(item, indices='all', frame_indices='all', value=None):
raise NotImplementedError
def set_coordinates_to_system(item, indices='all', frame_indices='all', value=None):
raise NotImplementedError
# modules/layers/decoders.py (king-menin/slavic-ner, MIT license)
import torch
from torch.nn import functional
from torch.autograd import Variable
from torch import nn
from .layers import Linears, MultiHeadAttention
from .crf import CRF
from .ncrf import NCRF
class CRFDecoder(nn.Module):
def __init__(self, crf, label_size, input_dim, input_dropout=0.5):
super(CRFDecoder, self).__init__()
self.input_dim = input_dim
self.input_dropout = nn.Dropout(p=input_dropout)
self.linear = Linears(in_features=input_dim,
out_features=label_size,
hiddens=[input_dim // 2])
self.crf = crf
self.label_size = label_size
def forward_model(self, inputs):
batch_size, seq_len, input_dim = inputs.size()
output = inputs.contiguous().view(-1, self.input_dim)
output = self.input_dropout(output)
# Fully-connected layer
output = self.linear.forward(output)
output = output.view(batch_size, seq_len, self.label_size)
return output
def forward(self, inputs, labels_mask):
self.eval()
lens = labels_mask.sum(-1)
logits = self.forward_model(inputs)
logits = self.crf.pad_logits(logits)
scores, preds = self.crf.viterbi_decode(logits, lens)
self.train()
return preds
def score(self, inputs, labels_mask, labels):
lens = labels_mask.sum(-1)
logits = self.forward_model(inputs)
logits = self.crf.pad_logits(logits)
norm_score = self.crf.calc_norm_score(logits, lens)
gold_score = self.crf.calc_gold_score(logits, labels, lens)
loglik = gold_score - norm_score
return -loglik.mean()
@classmethod
def create(cls, label_size, input_dim, input_dropout=0.5):
return cls(CRF(label_size+2), label_size, input_dim, input_dropout)
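`score` returns the CRF negative log-likelihood: `calc_norm_score` is the log-partition over all tag paths, `calc_gold_score` is the score of the reference path, and the loss is their (negated) difference. A toy scalar sketch of that quantity with made-up path scores (no torch):

```python
import math

def crf_nll(gold_score, all_path_scores):
    """-log p(gold) = log-sum-exp over all paths minus the gold path score."""
    log_z = math.log(sum(math.exp(s) for s in all_path_scores))
    return log_z - gold_score

# Three candidate tag paths; the gold path scores 2.0.
nll = crf_nll(2.0, [2.0, 0.5, -1.0])
print(nll)  # ~0.241: the gold path already holds most of the probability mass
```

In the real decoder the log-sum-exp is computed by the forward algorithm over the lattice rather than by enumerating paths, but the loss it minimizes is exactly this gap.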
class AttnCRFDecoder(nn.Module):
def __init__(self,
crf, label_size, input_dim, input_dropout=0.5,
key_dim=64, val_dim=64, num_heads=3):
super(AttnCRFDecoder, self).__init__()
self.input_dim = input_dim
self.attn = MultiHeadAttention(key_dim, val_dim, input_dim, num_heads, input_dropout)
self.linear = Linears(in_features=input_dim,
out_features=label_size,
hiddens=[input_dim // 2])
self.crf = crf
self.label_size = label_size
def forward_model(self, inputs, labels_mask=None):
batch_size, seq_len, input_dim = inputs.size()
inputs, _ = self.attn(inputs, inputs, inputs, labels_mask)
output = inputs.contiguous().view(-1, self.input_dim)
# Fully-connected layer
output = self.linear.forward(output)
output = output.view(batch_size, seq_len, self.label_size)
return output
def forward(self, inputs, labels_mask):
self.eval()
lens = labels_mask.sum(-1)
logits = self.forward_model(inputs)
logits = self.crf.pad_logits(logits)
scores, preds = self.crf.viterbi_decode(logits, lens)
self.train()
return preds
def score(self, inputs, labels_mask, labels):
lens = labels_mask.sum(-1)
logits = self.forward_model(inputs)
logits = self.crf.pad_logits(logits)
norm_score = self.crf.calc_norm_score(logits, lens)
gold_score = self.crf.calc_gold_score(logits, labels, lens)
loglik = gold_score - norm_score
return -loglik.mean()
@classmethod
def create(cls, label_size, input_dim, input_dropout=0.5, key_dim=64, val_dim=64, num_heads=3):
return cls(CRF(label_size+2), label_size, input_dim, input_dropout,
key_dim, val_dim, num_heads)
class NMTDecoder(nn.Module):
def __init__(self,
label_size,
embedding_dim=64, hidden_dim=256, rnn_layers=1,
dropout_p=0.1, pad_idx=0, use_cuda=True):
super(NMTDecoder, self).__init__()
self.slot_size = label_size
self.pad_idx = pad_idx
self.embedding_dim = embedding_dim
self.hidden_dim = hidden_dim
self.rnn_layers = rnn_layers
self.dropout_p = dropout_p
self.embedding = nn.Embedding(self.slot_size, self.embedding_dim)
self.lstm = nn.LSTM(self.embedding_dim + self.hidden_dim * 2,
self.hidden_dim, self.rnn_layers,
batch_first=True)
self.attn = nn.Linear(self.hidden_dim, self.hidden_dim)
self.slot_out = nn.Linear(self.hidden_dim * 2, self.slot_size)
self.loss = nn.CrossEntropyLoss(ignore_index=pad_idx)
self.use_cuda = use_cuda
if use_cuda:
self.cuda()
self.init_weights()
def init_weights(self):
        nn.init.xavier_normal_(self.embedding.weight)
        nn.init.xavier_normal_(self.attn.weight)
        nn.init.xavier_normal_(self.slot_out.weight)
def attention(self, hidden, encoder_outputs, input_mask):
"""
hidden : 1,B,D
encoder_outputs : B,T,D
input_mask : B,T # ByteTensor
"""
input_mask = input_mask == 0
hidden = hidden.squeeze(0).unsqueeze(2)
# B
batch_size = encoder_outputs.size(0)
# T
max_len = encoder_outputs.size(1)
# B*T,D -> B*T,D
energies = self.attn(encoder_outputs.contiguous().view(batch_size * max_len, -1))
energies = energies.view(batch_size, max_len, -1)
# B,T,D * B,D,1 --> B,1,T
attn_energies = energies.bmm(hidden).transpose(1, 2)
# PAD masking
attn_energies = attn_energies.squeeze(1).masked_fill(input_mask, -1e12)
# B,T
        alpha = functional.softmax(attn_energies, dim=-1)
# B,1,T
alpha = alpha.unsqueeze(1)
# B,1,T * B,T,D => B,1,D
context = alpha.bmm(encoder_outputs)
# B,1,D
return context
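The PAD-masking step above works by filling padded positions with a large negative constant before the softmax, so they receive essentially zero attention weight. A dependency-free sketch of that masked softmax (hypothetical helper, scalar lists instead of tensors):

```python
import math

def masked_softmax(scores, mask, neg_fill=-1e12):
    """Softmax over `scores`, forcing positions where mask == 0 to ~zero weight."""
    filled = [s if m else neg_fill for s, m in zip(scores, mask)]
    mx = max(filled)                            # subtract max for numerical stability
    exps = [math.exp(f - mx) for f in filled]
    total = sum(exps)
    return [e / total for e in exps]

alpha = masked_softmax([2.0, 1.0, 5.0], [1, 1, 0])
print(alpha)  # last (padded) position gets ~0 attention despite its high raw score
```

Note the fill happens on the raw energies, not the probabilities: `exp(-1e12)` underflows to 0, so the remaining weights still sum to 1 over the unmasked positions.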
def forward_model(self, encoder_outputs, input_mask):
real_context = []
for idx, o in enumerate(encoder_outputs):
real_length = input_mask[idx].sum().cpu().data.tolist()
real_context.append(o[real_length - 1])
context = torch.cat(real_context).view(encoder_outputs.size(0), -1).unsqueeze(1)
batch_size = encoder_outputs.size(0)
input_mask = input_mask == 0
# Get the embedding of the current input word
embedded = Variable(torch.zeros(batch_size, self.embedding_dim))
if self.use_cuda:
embedded = embedded.cuda()
embedded = embedded.unsqueeze(1)
decode = []
aligns = encoder_outputs.transpose(0, 1)
length = encoder_outputs.size(1)
for i in range(length):
# B,1,D
aligned = aligns[i].unsqueeze(1)
# input, context, aligned encoder hidden, hidden
_, hidden = self.lstm(torch.cat((embedded, context, aligned), 2))
# print(hidden[0].shape, context.transpose(0, 1).shape)
concated = torch.cat((hidden[0], context.transpose(0, 1)), 2)
score = self.slot_out(concated.squeeze(0))
            softmaxed = functional.log_softmax(score, dim=-1)
decode.append(softmaxed)
_, input = torch.max(softmaxed, 1)
embedded = self.embedding(input.unsqueeze(1))
context = self.attention(hidden[0], encoder_outputs, input_mask)
slot_scores = torch.cat(decode, 1)
return slot_scores.view(batch_size, length, -1)
def forward(self, encoder_outputs, input_mask):
scores = self.forward_model(encoder_outputs, input_mask)
return scores.argmax(-1)
def score(self, encoder_outputs, input_mask, labels_ids):
scores = self.forward_model(encoder_outputs, input_mask)
batch_size = encoder_outputs.shape[0]
len_ = encoder_outputs.shape[1]
return self.loss(scores.view(batch_size * len_, -1), labels_ids.view(-1))
@classmethod
def create(cls, label_size,
embedding_dim=64, hidden_dim=256, rnn_layers=1, dropout_p=0.1, pad_idx=0, use_cuda=True):
return cls(label_size=label_size,
embedding_dim=embedding_dim, hidden_dim=hidden_dim,
rnn_layers=rnn_layers, dropout_p=dropout_p, pad_idx=pad_idx, use_cuda=use_cuda)
class NMTCRFDecoder(nn.Module):
def __init__(self,
label_size, crf,
embedding_dim=64, hidden_dim=256, rnn_layers=1,
dropout_p=0.1, pad_idx=0, use_cuda=True):
super(NMTCRFDecoder, self).__init__()
self.slot_size = label_size
self.pad_idx = pad_idx
self.embedding_dim = embedding_dim
self.hidden_dim = hidden_dim
self.rnn_layers = rnn_layers
self.dropout_p = dropout_p
self.embedding = nn.Embedding(self.slot_size, self.embedding_dim)
self.lstm = nn.LSTM(self.embedding_dim + self.hidden_dim * 2,
self.hidden_dim, self.rnn_layers,
batch_first=True)
self.attn = nn.Linear(self.hidden_dim, self.hidden_dim)
self.slot_out = nn.Linear(self.hidden_dim * 2, self.slot_size)
self.loss = nn.CrossEntropyLoss(ignore_index=pad_idx)
self.crf = crf
self.use_cuda = use_cuda
if use_cuda:
self.cuda()
self.init_weights()
def init_weights(self):
        nn.init.xavier_normal_(self.embedding.weight)
        nn.init.xavier_normal_(self.attn.weight)
        nn.init.xavier_normal_(self.slot_out.weight)
def attention(self, hidden, encoder_outputs, input_mask):
"""
hidden : 1,B,D
encoder_outputs : B,T,D
input_mask : B,T # ByteTensor
"""
input_mask = input_mask == 0
hidden = hidden.squeeze(0).unsqueeze(2)
# B
batch_size = encoder_outputs.size(0)
# T
max_len = encoder_outputs.size(1)
# B*T,D -> B*T,D
energies = self.attn(encoder_outputs.contiguous().view(batch_size * max_len, -1))
energies = energies.view(batch_size, max_len, -1)
# B,T,D * B,D,1 --> B,1,T
attn_energies = energies.bmm(hidden).transpose(1, 2)
# PAD masking
attn_energies = attn_energies.squeeze(1).masked_fill(input_mask, -1e12)
# B,T
        alpha = functional.softmax(attn_energies, dim=-1)
# B,1,T
alpha = alpha.unsqueeze(1)
# B,1,T * B,T,D => B,1,D
context = alpha.bmm(encoder_outputs)
# B,1,D
return context
def forward_model(self, encoder_outputs, input_mask):
real_context = []
for idx, o in enumerate(encoder_outputs):
real_length = input_mask[idx].sum().cpu().data.tolist()
real_context.append(o[real_length - 1])
context = torch.cat(real_context).view(encoder_outputs.size(0), -1).unsqueeze(1)
batch_size = encoder_outputs.size(0)
input_mask = input_mask == 0
# Get the embedding of the current input word
embedded = Variable(torch.zeros(batch_size, self.embedding_dim))
if self.use_cuda:
embedded = embedded.cuda()
embedded = embedded.unsqueeze(1)
decode = []
aligns = encoder_outputs.transpose(0, 1)
length = encoder_outputs.size(1)
for i in range(length):
# B,1,D
aligned = aligns[i].unsqueeze(1)
# input, context, aligned encoder hidden, hidden
# print(embedded.shape, context.shape, aligned.shape)
_, hidden = self.lstm(torch.cat((embedded, context, aligned), 2))
# print(hidden[0].shape, context.transpose(0, 1).shape)
concated = torch.cat((hidden[0], context.transpose(0, 1)), 2)
score = self.slot_out(concated.squeeze(0))
            softmaxed = functional.log_softmax(score, dim=-1)
decode.append(softmaxed)
_, input = torch.max(softmaxed, 1)
embedded = self.embedding(input.unsqueeze(1))
context = self.attention(hidden[0], encoder_outputs, input_mask)
slot_scores = torch.cat(decode, 1)
# return slot_scores.view(batch_size * length, -1)
return slot_scores.view(batch_size, length, -1)
def forward(self, encoder_outputs, input_mask):
scores = self.forward_model(encoder_outputs, input_mask)
return self.crf.forward(scores, input_mask)
def score(self, encoder_outputs, input_mask, labels_ids):
scores = self.forward_model(encoder_outputs, input_mask)
crf_score = self.crf.score(scores, input_mask, labels_ids)
batch_size = encoder_outputs.shape[0]
len_ = encoder_outputs.shape[1]
return self.loss(scores.view(batch_size * len_, -1), labels_ids.view(-1)) + crf_score
@classmethod
def create(cls, label_size,
embedding_dim=64, hidden_dim=256, rnn_layers=1, dropout_p=0.1, pad_idx=0, use_cuda=True):
crf = CRFDecoder.create(label_size, label_size, input_dropout=dropout_p)
return cls(label_size=label_size, crf=crf,
embedding_dim=embedding_dim, hidden_dim=hidden_dim,
rnn_layers=rnn_layers, dropout_p=dropout_p, pad_idx=pad_idx, use_cuda=use_cuda)
class PoolingLinearClassifier(nn.Module):
"""Create a linear classifier with pooling."""
def __init__(self, input_dim, intent_size, input_dropout=0.5):
super().__init__()
self.input_dim = input_dim
self.intent_size = intent_size
self.input_dropout = input_dropout
self.dropout = nn.Dropout(p=input_dropout)
self.linear = Linears(input_dim * 3, intent_size, [input_dim // 2], activation="relu")
@staticmethod
def pool(x, bs, is_max):
"""Pool the tensor along the seq_len dimension."""
f = functional.adaptive_max_pool1d if is_max else functional.adaptive_avg_pool1d
return f(x.permute(1, 2, 0), (1,)).view(bs, -1)
def forward(self, output):
output = self.dropout(output).transpose(0, 1)
sl, bs, _ = output.size()
avgpool = self.pool(output, bs, False)
mxpool = self.pool(output, bs, True)
x = torch.cat([output[0], mxpool, avgpool], 1)
return self.linear(x)
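The classifier concatenates the first timestep with max- and average-pooled features before the linear head (concat pooling, as popularized by ULMFiT-style classifiers). A minimal list-based sketch of just the feature construction (no torch; a hypothetical `concat_pool` over per-timestep vectors):

```python
def concat_pool(seq):
    """seq: list of per-timestep feature vectors -> [first ; max ; avg]."""
    dims = range(len(seq[0]))
    first = seq[0]
    mx = [max(step[d] for step in seq) for d in dims]    # elementwise max over time
    avg = [sum(step[d] for step in seq) / len(seq) for d in dims]
    return first + mx + avg

feats = concat_pool([[1.0, 0.0], [3.0, -2.0], [2.0, 4.0]])
print(feats)  # first=[1,0], max=[3,4], avg=[2, 2/3], concatenated
```

This is why the `Linears` head above takes `input_dim * 3`: the pooled representation triples the feature width.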
class AttnCRFJointDecoder(nn.Module):
def __init__(self,
crf, label_size, input_dim, intent_size, input_dropout=0.5,
key_dim=64, val_dim=64, num_heads=3):
super(AttnCRFJointDecoder, self).__init__()
self.input_dim = input_dim
self.attn = MultiHeadAttention(key_dim, val_dim, input_dim, num_heads, input_dropout)
self.linear = Linears(in_features=input_dim,
out_features=label_size,
hiddens=[input_dim // 2])
self.crf = crf
self.label_size = label_size
self.intent_size = intent_size
self.intent_out = PoolingLinearClassifier(input_dim, intent_size, input_dropout)
self.intent_loss = nn.CrossEntropyLoss()
def forward_model(self, inputs, labels_mask=None):
batch_size, seq_len, input_dim = inputs.size()
inputs, hidden = self.attn(inputs, inputs, inputs, labels_mask)
intent_output = self.intent_out(inputs)
output = inputs.contiguous().view(-1, self.input_dim)
# Fully-connected layer
output = self.linear.forward(output)
output = output.view(batch_size, seq_len, self.label_size)
return output, intent_output
def forward(self, inputs, labels_mask):
self.eval()
lens = labels_mask.sum(-1)
logits, intent_output = self.forward_model(inputs)
logits = self.crf.pad_logits(logits)
scores, preds = self.crf.viterbi_decode(logits, lens)
self.train()
return preds, intent_output.argmax(-1)
def score(self, inputs, labels_mask, labels, cls_ids):
lens = labels_mask.sum(-1)
logits, intent_output = self.forward_model(inputs)
logits = self.crf.pad_logits(logits)
norm_score = self.crf.calc_norm_score(logits, lens)
gold_score = self.crf.calc_gold_score(logits, labels, lens)
loglik = gold_score - norm_score
return -loglik.mean() + self.intent_loss(intent_output, cls_ids)
@classmethod
def create(cls, label_size, input_dim, intent_size, input_dropout=0.5, key_dim=64, val_dim=64, num_heads=3):
return cls(CRF(label_size + 2), label_size, input_dim, intent_size, input_dropout,
key_dim, val_dim, num_heads)
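The `score` method above trains with the CRF negative log-likelihood: `loglik = gold_score - norm_score`, where the norm score is the log-partition over all label paths. A small pure-Python sketch of that quantity for a toy linear-chain CRF (assumed emission/transition log-score matrices; not the project's CRF implementation):

```python
import math

def logsumexp(xs):
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def crf_nll(emissions, transitions, gold):
    # emissions: T x L log-scores, transitions: L x L, gold: length-T label path.
    L = len(emissions[0])
    # Score of the gold path: emissions plus transitions along it.
    gold_score = emissions[0][gold[0]]
    for t in range(1, len(emissions)):
        gold_score += transitions[gold[t - 1]][gold[t]] + emissions[t][gold[t]]
    # Forward algorithm: log-partition over all paths (the "norm score").
    alpha = list(emissions[0])
    for t in range(1, len(emissions)):
        alpha = [logsumexp([alpha[i] + transitions[i][j] for i in range(L)])
                 + emissions[t][j] for j in range(L)]
    return logsumexp(alpha) - gold_score  # = -(gold_score - norm_score)

# One timestep, two labels: nll = log(1 + 3) - log(3) = log(4/3)
print(crf_nll([[0.0, math.log(3.0)]], [[0.0, 0.0], [0.0, 0.0]], [1]))
```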
class NMTJointDecoder(nn.Module):
def __init__(self,
label_size, intent_size,
embedding_dim=64, hidden_dim=256, rnn_layers=1,
dropout_p=0.1, pad_idx=0, use_cuda=True, nbest=7):
super(NMTJointDecoder, self).__init__()
self.nbest = nbest
self.crf = NCRF(label_size, use_cuda)
self.slot_size = label_size + 2
self.intent_size = intent_size
self.pad_idx = pad_idx
self.embedding_dim = embedding_dim
self.hidden_dim = hidden_dim
self.rnn_layers = rnn_layers
self.dropout_p = dropout_p
self.embedding = nn.Embedding(self.slot_size, self.embedding_dim)
self.lstm = nn.LSTM(self.embedding_dim + self.hidden_dim * 2,
self.hidden_dim, self.rnn_layers,
batch_first=True)
self.attn = nn.Linear(self.hidden_dim, self.hidden_dim)
self.slot_out = nn.Linear(self.hidden_dim * 2, self.slot_size)
# self.loss = nn.CrossEntropyLoss(ignore_index=pad_idx)
self.intent_loss = nn.CrossEntropyLoss()
self.intent_out = Linears(
in_features=self.hidden_dim * 2,
out_features=self.intent_size,
hiddens=[hidden_dim // 2],
activation="relu")
self.use_cuda = use_cuda
if use_cuda:
self.cuda()
self.init_weights()
def init_weights(self):
nn.init.xavier_normal_(self.embedding.weight)
nn.init.xavier_normal_(self.attn.weight)
nn.init.xavier_normal_(self.slot_out.weight)
def attention(self, hidden, encoder_outputs, input_mask):
"""
hidden : 1,B,D
encoder_outputs : B,T,D
input_mask : B,T # ByteTensor
"""
input_mask = input_mask == 0
hidden = hidden.squeeze(0).unsqueeze(2)
# B
batch_size = encoder_outputs.size(0)
# T
max_len = encoder_outputs.size(1)
# B*T,D -> B*T,D
energies = self.attn(encoder_outputs.contiguous().view(batch_size * max_len, -1))
energies = energies.view(batch_size, max_len, -1)
# B,T,D * B,D,1 --> B,1,T
attn_energies = energies.bmm(hidden).transpose(1, 2)
# PAD masking
attn_energies = attn_energies.squeeze(1).masked_fill(input_mask, -1e12)
# B,T
alpha = functional.softmax(attn_energies, dim=-1)
# B,1,T
alpha = alpha.unsqueeze(1)
# B,1,T * B,T,D => B,1,D
context = alpha.bmm(encoder_outputs)
# B,1,D
return context
def forward_model(self, encoder_outputs, input_mask):
real_context = []
for idx, o in enumerate(encoder_outputs):
real_length = input_mask[idx].sum().cpu().data.tolist()
real_context.append(o[real_length - 1])
context = torch.cat(real_context).view(encoder_outputs.size(0), -1).unsqueeze(1)
batch_size = encoder_outputs.size(0)
input_mask = input_mask == 0
# Get the embedding of the current input word
embedded = torch.zeros(batch_size, self.embedding_dim)
if self.use_cuda:
embedded = embedded.cuda()
embedded = embedded.unsqueeze(1)
decode = []
aligns = encoder_outputs.transpose(0, 1)
length = encoder_outputs.size(1)
intent_score = None
for i in range(length):
# B,1,D
aligned = aligns[i].unsqueeze(1)
# input, context, aligned encoder hidden, hidden
_, hidden = self.lstm(torch.cat((embedded, context, aligned), 2))
# for Intent Detection
if i == 0:
intent_hidden = hidden[0].clone()
# 1,B,D
intent_context = self.attention(intent_hidden, encoder_outputs, input_mask)
concated = torch.cat((intent_hidden, intent_context.transpose(0, 1)), 2)
# B,D
intent_score = self.intent_out(concated.squeeze(0))
concated = torch.cat((hidden[0], context.transpose(0, 1)), 2)
score = self.slot_out(concated.squeeze(0))
softmaxed = functional.log_softmax(score, dim=1)
decode.append(score)
_, input = torch.max(softmaxed, 1)
embedded = self.embedding(input.unsqueeze(1))
context = self.attention(hidden[0], encoder_outputs, input_mask)
slot_scores = torch.cat(decode, 1)
return slot_scores.view(batch_size, length, -1), intent_score
def forward(self, encoder_outputs, input_mask):
logits, intent_score = self.forward_model(encoder_outputs, input_mask)
_, preds = self.crf._viterbi_decode_nbest(logits, input_mask, self.nbest)
preds = preds[:, :, 0]
return preds, intent_score.argmax(-1)
def score(self, encoder_outputs, input_mask, labels_ids, cls_ids):
logits, intent_score = self.forward_model(encoder_outputs, input_mask)
crf_score = self.crf.neg_log_likelihood_loss(logits, input_mask, labels_ids) / logits.size(0)
return crf_score + self.intent_loss(intent_score, cls_ids)
@classmethod
def create(cls, label_size, intent_size,
embedding_dim=64, hidden_dim=256, rnn_layers=1, dropout_p=0.1, pad_idx=0, use_cuda=True, nbest=7):
return cls(label_size=label_size, intent_size=intent_size,
embedding_dim=embedding_dim, hidden_dim=hidden_dim,
rnn_layers=rnn_layers, dropout_p=dropout_p, pad_idx=pad_idx, use_cuda=use_cuda, nbest=nbest)
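`NMTJointDecoder.attention` computes dot-product energies, masks PAD positions with a large negative value before the softmax, and returns the attention-weighted context. The same computation on plain Python lists (toy dimensions, no batching; a sketch, not the torch code above):

```python
import math

def masked_attention(hidden, encoder_outputs, pad_mask):
    # hidden: D floats; encoder_outputs: T x D; pad_mask[t] is True at PAD positions.
    energies = [sum(h * e for h, e in zip(hidden, step)) for step in encoder_outputs]
    energies = [-1e12 if pad else s for s, pad in zip(energies, pad_mask)]  # PAD masking
    m = max(energies)
    exps = [math.exp(s - m) for s in energies]
    z = sum(exps)
    alpha = [e / z for e in exps]  # softmax; masked positions underflow to ~0 weight
    dim = len(hidden)
    return [sum(a * step[d] for a, step in zip(alpha, encoder_outputs))
            for d in range(dim)]

ctx = masked_attention([1.0, 1.0], [[1.0, 0.0], [0.0, 1.0], [9.0, 9.0]],
                       [False, False, True])
# The padded third step gets ~zero weight, so ctx ≈ [0.5, 0.5].
```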
class AttnNCRFJointDecoder(nn.Module):
def __init__(self,
crf, label_size, input_dim, intent_size, input_dropout=0.5,
key_dim=64, val_dim=64, num_heads=3, nbest=8):
super(AttnNCRFJointDecoder, self).__init__()
self.input_dim = input_dim
self.attn = MultiHeadAttention(key_dim, val_dim, input_dim, num_heads, input_dropout)
self.linear = Linears(in_features=input_dim,
out_features=label_size,
hiddens=[input_dim // 2])
self.crf = crf
self.label_size = label_size
self.intent_size = intent_size
self.intent_out = PoolingLinearClassifier(input_dim, intent_size, input_dropout)
self.intent_loss = nn.CrossEntropyLoss()
self.nbest = nbest
def forward_model(self, inputs, labels_mask=None):
batch_size, seq_len, input_dim = inputs.size()
inputs, hidden = self.attn(inputs, inputs, inputs, labels_mask)
intent_output = self.intent_out(inputs)
output = inputs.contiguous().view(-1, self.input_dim)
# Fully-connected layer
output = self.linear.forward(output)
output = output.view(batch_size, seq_len, self.label_size)
return output, intent_output
def forward(self, inputs, labels_mask):
self.eval()
logits, intent_output = self.forward_model(inputs)
_, preds = self.crf._viterbi_decode_nbest(logits, labels_mask, self.nbest)
preds = preds[:, :, 0]
self.train()
return preds, intent_output.argmax(-1)
def score(self, inputs, labels_mask, labels, cls_ids):
logits, intent_output = self.forward_model(inputs)
crf_score = self.crf.neg_log_likelihood_loss(logits, labels_mask, labels) / logits.size(0)
return crf_score + self.intent_loss(intent_output, cls_ids)
@classmethod
def create(cls, label_size, input_dim, intent_size, input_dropout=0.5, key_dim=64,
val_dim=64, num_heads=3, use_cuda=True, nbest=8):
return cls(NCRF(label_size, use_cuda), label_size + 2, input_dim, intent_size, input_dropout,
key_dim, val_dim, num_heads, nbest)
class AttnNCRFDecoder(nn.Module):
def __init__(self,
crf, label_size, input_dim, input_dropout=0.5,
key_dim=64, val_dim=64, num_heads=3, nbest=8):
super(AttnNCRFDecoder, self).__init__()
self.input_dim = input_dim
self.attn = MultiHeadAttention(key_dim, val_dim, input_dim, num_heads, input_dropout)
self.linear = Linears(in_features=input_dim,
out_features=label_size,
hiddens=[input_dim // 2])
self.nbest = nbest
self.crf = crf
self.label_size = label_size
def forward_model(self, inputs, labels_mask=None):
batch_size, seq_len, input_dim = inputs.size()
inputs, _ = self.attn(inputs, inputs, inputs, labels_mask)
output = inputs.contiguous().view(-1, self.input_dim)
# Fully-connected layer
output = self.linear.forward(output)
output = output.view(batch_size, seq_len, self.label_size)
return output
def forward(self, inputs, labels_mask):
self.eval()
logits = self.forward_model(inputs)
_, preds = self.crf._viterbi_decode_nbest(logits, labels_mask, self.nbest)
preds = preds[:, :, 0]
self.train()
return preds
def score(self, inputs, labels_mask, labels):
logits = self.forward_model(inputs)
crf_score = self.crf.neg_log_likelihood_loss(logits, labels_mask, labels) / logits.size(0)
return crf_score
@classmethod
def create(cls, label_size, input_dim, input_dropout=0.5, key_dim=64,
val_dim=64, num_heads=3, use_cuda=True, nbest=8):
return cls(NCRF(label_size, use_cuda), label_size + 2, input_dim, input_dropout,
key_dim, val_dim, num_heads, nbest)
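Each decoder above takes the best of the CRF's n-best Viterbi paths (`preds[:, :, 0]`). For reference, a minimal 1-best Viterbi over toy emission/transition log-scores (pure Python; a sketch, not the NCRF implementation):

```python
def viterbi(emissions, transitions):
    # emissions: T x L log-scores; transitions: L x L. Returns (best path, its score).
    L = len(emissions[0])
    score = list(emissions[0])
    backptrs = []
    for t in range(1, len(emissions)):
        new_score, ptr = [], []
        for j in range(L):
            best_i = max(range(L), key=lambda i: score[i] + transitions[i][j])
            ptr.append(best_i)
            new_score.append(score[best_i] + transitions[best_i][j] + emissions[t][j])
        score = new_score
        backptrs.append(ptr)
    last = max(range(L), key=score.__getitem__)
    path = [last]
    for ptr in reversed(backptrs):  # follow back-pointers to recover the path
        path.append(ptr[path[-1]])
    path.reverse()
    return path, score[last]

print(viterbi([[1.0, 0.0], [0.0, 1.0]], [[0.0, 0.0], [0.0, 0.0]]))
# → ([0, 1], 2.0)
```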
class NCRFDecoder(nn.Module):
# TODO: fix the attribute error raised when this decoder is restored from a saved config
def get_config(self):
config = {
"name": "NCRFDecoder",
"params": {
"label_size": self.label_size,
"input_dim": self.input_dim,
"input_dropout": self.dropout.p,
"nbest": self.nbest
}
}
return config
def __init__(self,
crf, label_size, input_dim, input_dropout=0.5, nbest=8):
super(NCRFDecoder, self).__init__()
self.input_dim = input_dim
self.dropout = nn.Dropout(input_dropout)
self.linear = Linears(in_features=input_dim,
out_features=label_size,
hiddens=[input_dim // 2])
self.nbest = nbest
self.crf = crf
self.label_size = label_size
def forward_model(self, inputs, labels_mask=None):
batch_size, seq_len, input_dim = inputs.size()
inputs = self.dropout(inputs)
output = inputs.contiguous().view(-1, self.input_dim)
# Fully-connected layer
output = self.linear.forward(output)
output = output.view(batch_size, seq_len, self.label_size)
return output
def forward(self, inputs, labels_mask):
self.eval()
logits = self.forward_model(inputs)
_, preds = self.crf._viterbi_decode_nbest(logits, labels_mask, self.nbest)
preds = preds[:, :, 0]
self.train()
return preds
def score(self, inputs, labels_mask, labels):
logits = self.forward_model(inputs)
crf_score = self.crf.neg_log_likelihood_loss(logits, labels_mask, labels) / logits.size(0)
return crf_score
@classmethod
def from_config(cls, config):
# get_config() nests the constructor arguments under "params"; unpack that sub-dict.
return cls.create(**config["params"])
@classmethod
def create(cls, label_size, input_dim, input_dropout=0.5, use_cuda=True, nbest=8):
return cls(NCRF(label_size, use_cuda), label_size + 2, input_dim, input_dropout, nbest)
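`NCRFDecoder.get_config` nests the constructor arguments under a `"params"` key, so restoring from a config must unpack that sub-dict. The round-trip pattern in isolation (hypothetical registry and factory names, not this project's API):

```python
def to_config(name, **params):
    # Serialize a component as {"name": ..., "params": {...}}.
    return {"name": name, "params": params}

def build_from_config(registry, config):
    # Unpack the nested "params" dict rather than the whole config.
    return registry[config["name"]](**config["params"])

registry = {"NCRFDecoder": lambda label_size, nbest: ("decoder", label_size, nbest)}
obj = build_from_config(registry, to_config("NCRFDecoder", label_size=5, nbest=8))
# → ("decoder", 5, 8)
```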
| 39.590028 | 113 | 0.622481 | 3,713 | 28,584 | 4.541341 | 0.053057 | 0.036295 | 0.024789 | 0.030008 | 0.893192 | 0.885482 | 0.87368 | 0.861997 | 0.84984 | 0.841656 | 0 | 0.016468 | 0.271341 | 28,584 | 721 | 114 | 39.644938 | 0.793115 | 0.050028 | 0 | 0.787879 | 0 | 0 | 0.002466 | 0 | 0 | 0 | 0 | 0.001387 | 0 | 1 | 0.106061 | false | 0 | 0.013258 | 0.017045 | 0.219697 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
469f5c7f50dd2b249109a97a52c22857153b7419 | 6,878 | py | Python | tests/files/analysis/fastqc_out.py | virtool/virtool-job | db62f2c84e6e3817eaf78ecd03d5f9bbfcb7fc90 | [
"MIT"
] | null | null | null | tests/files/analysis/fastqc_out.py | virtool/virtool-job | db62f2c84e6e3817eaf78ecd03d5f9bbfcb7fc90 | [
"MIT"
] | 1 | 2020-09-02T20:18:45.000Z | 2020-09-02T20:18:45.000Z | tests/files/analysis/fastqc_out.py | virtool/virtool-job | db62f2c84e6e3817eaf78ecd03d5f9bbfcb7fc90 | [
"MIT"
] | null | null | null | expected_output = {
"count": 1960013,
"encoding": "Sanger / Illumina 1.9\n",
"length": [100, 101],
"gc": 49.0,
"bases": [
[33, 34, 33, 34, 31, 34],
[33, 34, 34, 34, 31, 34],
[33, 34, 34, 34, 31, 34],
[36, 37, 37, 37, 35, 37],
[36, 37, 37, 37, 35, 37],
[36, 37, 37, 37, 35, 37],
[36, 37, 37, 37, 35, 37],
[36, 37, 37, 37, 35, 37],
[38, 39, 39, 39, 37, 39],
[38, 39, 39, 39, 37, 39],
[38, 39, 39, 39, 37, 39],
[38, 39, 39, 39, 37, 39],
[38, 39, 39, 39, 37, 39],
[40, 41, 40, 41, 38, 41],
[40, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 38, 41],
[39, 41, 40, 41, 37, 41],
[39, 41, 40, 41, 37, 41],
[39, 41, 40, 41, 37, 41],
[39, 41, 40, 41, 37, 41],
[39, 41, 39, 41, 36, 41],
[39, 41, 39, 41, 36, 41],
[39, 41, 39, 41, 36, 41],
[39, 41, 39, 41, 36, 41],
[39, 41, 39, 41, 35, 41],
[39, 41, 39, 41, 35, 41],
[39, 40, 38, 41, 35, 41],
[39, 40, 38, 41, 35, 41],
[38, 40, 38, 41, 35, 41],
[38, 40, 38, 41, 35, 41],
[38, 40, 38, 41, 35, 41],
[38, 40, 38, 41, 35, 41],
[38, 40, 37, 41, 35, 41],
[38, 40, 37, 41, 35, 41],
[38, 40, 37, 41, 35, 41],
[38, 40, 37, 41, 35, 41],
[38, 40, 36, 41, 34, 41],
[38, 40, 36, 41, 34, 41],
[38, 39, 35, 41, 34, 41],
[38, 39, 35, 41, 34, 41],
[37, 39, 35, 41, 34, 41],
[37, 39, 35, 41, 34, 41],
[37, 39, 35, 41, 33, 41],
[37, 39, 35, 41, 33, 41],
[37, 38, 35, 41, 33, 41],
[37, 38, 35, 41, 33, 41],
[36, 37, 35, 40, 33, 41],
[36, 37, 35, 40, 33, 41],
[36, 36, 35, 40, 33, 41],
[36, 36, 35, 40, 33, 41],
[36, 36, 35, 39, 33, 41],
[36, 36, 35, 39, 33, 41],
[36, 35, 35, 39, 33, 41],
[36, 35, 35, 39, 33, 41],
[35, 35, 35, 39, 33, 41],
[35, 35, 35, 39, 33, 41],
[35, 35, 35, 37, 33, 41],
[35, 35, 35, 37, 33, 41],
[35, 35, 35, 37, 33, 40],
[35, 35, 35, 37, 33, 40],
[35, 35, 35, 36, 33, 39],
[35, 35, 35, 36, 33, 39],
[35, 35, 35, 36, 33, 39],
[35, 35, 35, 36, 33, 39],
[33, 34, 32, 35, 30, 37],
[33, 34, 32, 35, 30, 37],
[34, 35, 34, 35, 32, 37],
[34, 35, 34, 35, 32, 37],
[34, 35, 35, 35, 33, 37],
[34, 35, 35, 35, 33, 37],
[34, 35, 35, 35, 33, 37],
[34, 35, 35, 35, 33, 37],
[34, 35, 35, 35, 33, 36],
[34, 35, 35, 35, 33, 36],
[34, 35, 35, 35, 33, 36],
[34, 35, 35, 35, 33, 36],
[34, 35, 35, 35, 32, 36],
[34, 35, 35, 35, 32, 36],
[34, 35, 35, 35, 32, 36],
[34, 35, 35, 35, 32, 36],
[34, 35, 34, 35, 32, 36],
[34, 35, 34, 35, 32, 36],
[34, 35, 34, 35, 32, 35],
[34, 35, 34, 35, 32, 35],
[34, 35, 34, 35, 32, 35],
[34, 35, 34, 35, 32, 35],
[34, 35, 34, 35, 32, 35],
[34, 35, 34, 35, 32, 35],
[33, 34, 32, 35, 29, 35],
[33, 34, 32, 35, 29, 35],
],
"sequences": [
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
1790,
3313,
5510,
8105,
11624,
16027,
21887,
30357,
43308,
69760,
163214,
401156,
651049,
402076,
130052,
785,
0,
0,
0,
0,
0,
0,
0,
0,
0,
],
"composition": [
[26, 13, 12, 47],
[25, 17, 29, 27],
[24, 20, 23, 31],
[28, 24, 18, 28],
[29, 27, 20, 21],
[25, 32, 20, 22],
[21, 20, 35, 21],
[22, 21, 28, 27],
[24, 20, 31, 23],
[25, 29, 22, 23],
[25, 29, 22, 23],
[24, 23, 26, 25],
[24, 23, 26, 25],
[23, 25, 26, 24],
[23, 25, 26, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[25, 24, 25, 24],
[25, 24, 25, 24],
[24, 25, 26, 24],
[24, 25, 26, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 23],
[24, 25, 25, 23],
[24, 25, 24, 24],
[24, 25, 24, 24],
[23, 25, 25, 24],
[23, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 26, 24, 24],
[24, 26, 24, 24],
[24, 26, 24, 24],
[24, 26, 24, 24],
[24, 25, 24, 24],
[24, 25, 24, 24],
[24, 26, 24, 24],
[24, 26, 24, 24],
[24, 25, 25, 23],
[24, 25, 25, 23],
[24, 25, 26, 24],
[24, 25, 26, 24],
[24, 25, 25, 23],
[24, 25, 25, 23],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 26, 24],
[24, 25, 26, 24],
[24, 25, 25, 23],
[24, 25, 25, 23],
[24, 26, 25, 24],
[24, 26, 25, 24],
[23, 25, 26, 24],
[23, 25, 26, 24],
[24, 25, 25, 23],
[24, 25, 25, 23],
[24, 25, 26, 23],
[24, 25, 26, 23],
[24, 24, 26, 24],
[24, 24, 26, 24],
[23, 26, 25, 24],
[23, 26, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 25],
[24, 25, 25, 25],
[25, 25, 25, 24],
[25, 25, 25, 24],
[24, 26, 25, 23],
[24, 26, 25, 23],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 24, 25, 25],
[24, 24, 25, 25],
[24, 25, 24, 25],
[24, 25, 24, 25],
[24, 25, 25, 24],
[24, 25, 25, 24],
[24, 25, 25, 23],
[24, 25, 25, 23],
[24, 25, 24, 24],
[24, 25, 24, 24],
[24, 25, 25, 25],
[24, 25, 25, 25],
[24, 25, 24, 24],
[24, 25, 24, 24],
[25, 25, 24, 24],
[25, 25, 24, 24],
[24, 25, 25, 24],
[24, 25, 25, 24],
[23, 24, 26, 24],
[23, 24, 26, 24],
[23, 25, 25, 25],
[23, 25, 25, 25],
[23, 24, 26, 25],
[23, 24, 26, 25],
[24, 24, 25, 25],
[24, 24, 25, 25],
],
}
| 25.954717 | 42 | 0.323786 | 1,079 | 6,878 | 2.063021 | 0.057461 | 0.116801 | 0.110512 | 0.104223 | 0.888589 | 0.868823 | 0.839173 | 0.813118 | 0.768194 | 0.688679 | 0 | 0.575595 | 0.456673 | 6,878 | 264 | 43 | 26.05303 | 0.02007 | 0 | 0 | 0.867424 | 0 | 0 | 0.010032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
d3c17590e35236aef05589d3565eb41dd1e0fe56 | 18,401 | py | Python | grow/conversion/content_locale_split_test.py | tabulon-ext/grow | 2929c3e9b467a7768d5b5055fe965fbb5106603f | [
"MIT"
] | 335 | 2016-04-02T20:12:21.000Z | 2022-03-28T18:55:26.000Z | grow/conversion/content_locale_split_test.py | tabulon-ext/grow | 2929c3e9b467a7768d5b5055fe965fbb5106603f | [
"MIT"
] | 784 | 2016-04-01T16:56:41.000Z | 2022-03-05T01:25:34.000Z | grow/conversion/content_locale_split_test.py | tabulon-ext/grow | 2929c3e9b467a7768d5b5055fe965fbb5106603f | [
"MIT"
] | 54 | 2016-05-03T13:06:15.000Z | 2021-09-24T04:46:23.000Z | from grow.pods import pods
from grow import storage
from grow.testing import testing
from . import content_locale_split
import textwrap
import unittest
class ConversionDocumentTestCase(unittest.TestCase):
def setUp(self):
dir_path = testing.create_test_pod_dir()
self.pod = pods.Pod(dir_path, storage=storage.FileStorage)
self.pod.write_yaml('/podspec.yaml', {})
def test_convert_for_locale(self):
input = textwrap.dedent("""
foo@: bar
foo@fr: baz
foo@ja: bam
""")
# Converting for no locale removes all tagged fields.
self.assertEqual(
{'foo@': 'bar'},
content_locale_split.ConversionDocument.convert_for_locale(input, None))
# Converting for not specified locale removes all tagged fields.
self.assertEqual(
{'foo@': 'bar'},
content_locale_split.ConversionDocument.convert_for_locale(input, 'en'))
# Converting for specified locale updates and removes other tagged.
self.assertEqual(
{'foo@': 'baz'},
content_locale_split.ConversionDocument.convert_for_locale(input, 'fr'))
# Converting for specified locale updates and removes other tagged.
self.assertEqual(
{'foo@': 'bam'},
content_locale_split.ConversionDocument.convert_for_locale(input, 'ja'))
def test_determine_default_locale(self):
input = textwrap.dedent("""
name: Julie Yang
""")
self.assertEqual(
None,
content_locale_split.ConversionDocument.determine_default_locale(input))
input = textwrap.dedent("""
$localization:
default_locale: de
""")
self.assertEqual(
'de',
content_locale_split.ConversionDocument.determine_default_locale(input))
input = textwrap.dedent("""
$localization:
locales:
- de
""")
self.assertEqual(
None,
content_locale_split.ConversionDocument.determine_default_locale(input))
def test_determine_locales(self):
# No locales returns empty.
input = textwrap.dedent("""
name: Julie Yang
""")
expected = ([], 'name: Julie Yang')
self.assertEqual(
expected,
content_locale_split.ConversionDocument.determine_locales(
input, default_locale='en'))
# Non-default locale is preserved.
input = textwrap.dedent("""
$locale: de
name: Julie Yang
""")
expected = (['de'], input.strip())
self.assertEqual(
expected,
content_locale_split.ConversionDocument.determine_locales(
input, default_locale='en', remove_locales=False))
# Default locale is removed.
input = textwrap.dedent("""
$locale: de
name: Julie Yang
""")
expected = ([], 'name: Julie Yang')
self.assertEqual(
expected,
content_locale_split.ConversionDocument.determine_locales(
input, default_locale='de'))
# Default locale is removed.
input = textwrap.dedent("""
$locales:
- de
- fr
- es
name: Julie Yang
""")
expected = (['de', 'fr', 'es'], 'name: Julie Yang')
self.assertEqual(
expected,
content_locale_split.ConversionDocument.determine_locales(
input, default_locale='en'))
# Locales are preserved.
input = textwrap.dedent("""
$locales:
- de
- fr
- es
name: Julie Yang
""")
expected = (['de', 'fr', 'es'], input.strip())
self.assertEqual(
expected,
content_locale_split.ConversionDocument.determine_locales(input, remove_locales=False))
# Nothing to see here.
input = None
expected = ([], None)
self.assertEqual(
expected,
content_locale_split.ConversionDocument.determine_locales(input))
def test_format_file(self):
front_matter = textwrap.dedent("""
name: Julie Yang
foo: bar
""").strip()
content = textwrap.dedent("""
Content reigns supreme.
""").strip()
expected = textwrap.dedent("""
name: Julie Yang
foo: bar
""").lstrip()
self.assertEqual(
expected,
content_locale_split.ConversionDocument.format_file(front_matter=front_matter))
expected = textwrap.dedent("""
Content reigns supreme.
""").lstrip()
self.assertEqual(
expected,
content_locale_split.ConversionDocument.format_file(content=content))
expected = textwrap.dedent("""
---
name: Julie Yang
foo: bar
---
Content reigns supreme.
""").lstrip()
self.assertEqual(
expected,
content_locale_split.ConversionDocument.format_file(
front_matter=front_matter, content=content))
def test_gather_for_locale(self):
input = textwrap.dedent("""
foo@: bar
foo@fr: baz
foo@ja: bam
""").strip()
# Gathering for no locale keeps all tagged fields.
self.assertEqual(
(input, {}),
content_locale_split.ConversionDocument.gather_for_locale(input, None))
# Gathering for not specified locale keeps all tagged fields.
self.assertEqual(
(input, {}),
content_locale_split.ConversionDocument.gather_for_locale(input, 'en'))
# Gathering for specified locale removes locale specific and keeps rest.
self.assertEqual(
('foo@: bar\nfoo@ja: bam', {'foo@': 'baz'}),
content_locale_split.ConversionDocument.gather_for_locale(input, 'fr'))
# Gathering for specified locale removes locale specific and keeps rest.
self.assertEqual(
('foo@: bar\nfoo@fr: baz', {'foo@': 'bam'}),
content_locale_split.ConversionDocument.gather_for_locale(input, 'ja'))
def test_default_locale_in_doc(self):
self.pod.write_file('/content/test.md', textwrap.dedent("""
---
name: Julie Yang
foo@: bar
$locales:
- en_us
- en_au
- en_uk
---
Content reigns supreme.
---
$locale: ja
foo@: baz
---
Supreme the content reigns.
---
$locales:
- fr
- ch
foo@: bam
foo@ch: zam
---
Reigning content.
""").lstrip())
expected = {
'/content/test.md': textwrap.dedent("""
---
name: Julie Yang
foo@: bar
$locales:
- en_au
- en_uk
---
Content reigns supreme.
""").lstrip(),
'/content/test@ja.md': textwrap.dedent("""
---
foo@: baz
---
Supreme the content reigns.
""").lstrip(),
'/content/test@fr.md': textwrap.dedent("""
---
foo@: bam
---
Reigning content.
""").lstrip(),
'/content/test@ch.md': textwrap.dedent("""
---
foo@: zam
---
Reigning content.
""").lstrip(),
}
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.md', 'en_us')
doc.convert()
for key, value in expected.items():
self.assertEqual(value, self.pod.read_file(key))
def test_convert_with_empty_front_section(self):
self.pod.write_file('/content/test.yaml', textwrap.dedent("""
---
name: Julie Yang
""").lstrip())
expected = {
'/content/test.yaml': textwrap.dedent("""
name: Julie Yang
""").lstrip(),
}
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.yaml', 'en_us')
doc.convert()
for key, value in expected.items():
self.assertEqual(value, self.pod.read_file(key))
def test_convert_with_existing(self):
self.pod.write_file('/content/test.md', textwrap.dedent("""
---
name: Julie Yang
---
Content reigns supreme.
---
$locale: ja
---
Supreme the content reigns.
""").lstrip())
self.pod.write_file('/content/test@ja.md', textwrap.dedent("""
Supreme the content reigns.
""").lstrip())
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.md', 'en_us')
with self.assertRaises(content_locale_split.LocaleExistsError):
doc.convert()
def test_convert_with_extended(self):
self.pod.write_file('/content/test.yaml', textwrap.dedent("""
name: Julie Yang
---
$locales:
- ja
- fr
foo: bar
---
$locale: fr
bar: faz
""").lstrip())
expected = {
'/content/test.yaml': textwrap.dedent("""
name: Julie Yang
""").lstrip(),
'/content/test@ja.yaml': textwrap.dedent("""
foo: bar
""").lstrip(),
'/content/test@fr.yaml': textwrap.dedent("""
bar: faz
foo: bar
""").lstrip(),
}
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.yaml', 'en_us')
doc.convert()
for key, value in expected.items():
self.assertEqual(value, self.pod.read_file(key))
def test_convert_with_missing_locale(self):
self.pod.write_file('/content/test.yaml', textwrap.dedent("""
name: Julie Yang
---
foo:bar
""").lstrip())
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.yaml', 'en_us')
with self.assertRaises(content_locale_split.LocaleMissingError):
doc.convert()
def test_convert_with_gather(self):
self.pod.write_file('/content/test.md', textwrap.dedent("""
---
name: Julie Yang
bar@: tri
bar@es: pep
bar@fr: pip
bar@ja: tes
---
Content reigns supreme.
---
$locale: ja
---
Supreme the content reigns.
---
$locale: fr
---
Reigning content.
""").lstrip())
expected = {
'/content/test.md': textwrap.dedent("""
---
name: Julie Yang
bar@: tri
bar@es: pep
---
Content reigns supreme.
""").lstrip(),
'/content/test@ja.md': textwrap.dedent("""
---
bar@: tes
---
Supreme the content reigns.
""").lstrip(),
'/content/test@fr.md': textwrap.dedent("""
---
bar@: pip
---
Reigning content.
""").lstrip(),
}
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.md', 'en_us')
doc.convert()
for key, value in expected.items():
self.assertEqual(value, self.pod.read_file(key))
def test_convert_with_gather_array(self):
self.pod.write_file('/content/test.md', textwrap.dedent("""
---
name: Julie Yang
bar:
- title: bar
title@fr: rab
foo: fed
- title: bam
title@fr: mab
foo: dew
- title: baz
foo: tee
---
Content reigns supreme.
---
$locale: ja
---
Supreme the content reigns.
---
$locale: fr
---
Reigning content.
""").lstrip())
expected = {
'/content/test.md': textwrap.dedent("""
---
name: Julie Yang
bar:
- title: bar
foo: fed
- title: bam
foo: dew
- title: baz
foo: tee
---
Content reigns supreme.
""").lstrip(),
'/content/test@ja.md': textwrap.dedent("""
Supreme the content reigns.
""").lstrip(),
'/content/test@fr.md': textwrap.dedent("""
---
bar:
- title: rab
foo: fed
- title: mab
foo: dew
- title: baz
foo: tee
---
Reigning content.
""").lstrip(),
}
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.md', 'en_us')
doc.convert()
for key, value in expected.items():
self.assertEqual(value, self.pod.read_file(key))
def test_convert_with_gather_trailing(self):
self.pod.write_file('/content/test.md', textwrap.dedent("""
---
name: Julie Yang
grr:
foo: tas
foo@es: sep
foo@fr: gli
foo@ja: min
---
Content reigns supreme.
---
$locale: ja
---
Supreme the content reigns.
---
$locale: fr
---
Reigning content.
""").lstrip())
expected = {
'/content/test.md': textwrap.dedent("""
---
name: Julie Yang
grr:
foo: tas
foo@es: sep
---
Content reigns supreme.
""").lstrip(),
'/content/test@ja.md': textwrap.dedent("""
---
grr:
foo: min
---
Supreme the content reigns.
""").lstrip(),
'/content/test@fr.md': textwrap.dedent("""
---
grr:
foo: gli
---
Reigning content.
""").lstrip(),
}
doc = content_locale_split.ConversionDocument(
self.pod, '/content/test.md', 'en_us')
doc.convert()
for key, value in expected.items():
self.assertEqual(value, self.pod.read_file(key))
def test_split(self):
# Two part document.
self.pod.write_file('/content/something.md', textwrap.dedent("""
---
name: Julie Yang
foo@: bar
---
Content reigns supreme.
---
$locale: ja
foo@: baz
---
Supreme the content reigns.
""").lstrip())
expected = [
(textwrap.dedent("""
name: Julie Yang
foo@: bar
""").strip(), 'Content reigns supreme.'),
(textwrap.dedent("""
$locale: ja
foo@: baz
""").strip(), 'Supreme the content reigns.'),
]
doc = content_locale_split.ConversionDocument(
self.pod, '/content/something.md', 'en_us')
self.assertEqual(expected, list(doc.split()))
# Empty front matter document.
self.pod.write_file('/content/something.md', textwrap.dedent("""
---
---
Content reigns supreme.
""").lstrip())
expected = [
(None, 'Content reigns supreme.'),
]
doc = content_locale_split.ConversionDocument(
self.pod, '/content/something.md', 'en_us')
self.assertEqual(expected, list(doc.split()))
# Missing front matter document.
self.pod.write_file('/content/something.md', textwrap.dedent("""
Content reigns supreme.
""").lstrip())
expected = [
(None, 'Content reigns supreme.'),
]
doc = content_locale_split.ConversionDocument(
self.pod, '/content/something.md', 'en_us')
self.assertEqual(expected, list(doc.split()))
# Yaml document.
self.pod.write_file('/content/something.yaml', textwrap.dedent("""
name: Julie Yang
foo: bar
---
$locale: ja
foo: baz
""").lstrip())
expected = [
(textwrap.dedent("""
name: Julie Yang
foo: bar
""").strip(), None),
(textwrap.dedent("""
$locale: ja
foo: baz
""").strip(), None),
]
doc = content_locale_split.ConversionDocument(
self.pod, '/content/something.yaml', 'en_us')
self.assertEqual(expected, list(doc.split()))
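The markdown fixtures above all follow one convention: standalone `---` lines delimit alternating front-matter and content sections. A sketch of that split for the `.md` case (hypothetical helper, not the grow API; the YAML-only case pairs differently):

```python
import re

def split_md_sections(raw):
    # Split on standalone '---' lines, then pair chunks as (front_matter, content).
    chunks = [p.strip() for p in re.split(r'(?m)^---$', raw) if p.strip()]
    return list(zip(chunks[0::2], chunks[1::2]))

doc = ("---\nname: Julie Yang\n---\nContent reigns supreme.\n"
       "---\n$locale: ja\n---\nSupreme the content reigns.\n")
print(split_md_sections(doc))
# → [('name: Julie Yang', 'Content reigns supreme.'),
#    ('$locale: ja', 'Supreme the content reigns.')]
```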
if __name__ == '__main__':
unittest.main()
| 31.08277 | 99 | 0.465301 | 1,567 | 18,401 | 5.331844 | 0.086152 | 0.082107 | 0.075404 | 0.137882 | 0.848833 | 0.840455 | 0.819749 | 0.79629 | 0.745183 | 0.71167 | 0 | 0 | 0.418945 | 18,401 | 591 | 100 | 31.135364 | 0.781425 | 0.04065 | 0 | 0.816047 | 0 | 0 | 0.465067 | 0.012136 | 0 | 0 | 0 | 0 | 0.062622 | 1 | 0.029354 | false | 0 | 0.011742 | 0 | 0.043053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d3dc3aa290de068346b689be4570c19ea4648693 | 179 | py | Python | ddtrace/contrib/__init__.py | p7g/dd-trace-py | 141ac0ab6e9962e3b3bafc9de172076075289a19 | [
"Apache-2.0",
"BSD-3-Clause"
] | 308 | 2016-12-07T16:49:27.000Z | 2022-03-15T10:06:45.000Z | ddtrace/contrib/__init__.py | p7g/dd-trace-py | 141ac0ab6e9962e3b3bafc9de172076075289a19 | [
"Apache-2.0",
"BSD-3-Clause"
] | 1,928 | 2016-11-28T17:13:18.000Z | 2022-03-31T21:43:19.000Z | ddtrace/contrib/__init__.py | p7g/dd-trace-py | 141ac0ab6e9962e3b3bafc9de172076075289a19 | [
"Apache-2.0",
"BSD-3-Clause"
] | 311 | 2016-11-27T03:01:49.000Z | 2022-03-18T21:34:03.000Z | from ..internal.utils.importlib import func_name # noqa
from ..internal.utils.importlib import module_name # noqa
from ..internal.utils.importlib import require_modules # noqa
| 44.75 | 62 | 0.798883 | 24 | 179 | 5.833333 | 0.458333 | 0.257143 | 0.364286 | 0.557143 | 0.8 | 0.571429 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0.117318 | 179 | 3 | 63 | 59.666667 | 0.886076 | 0.078212 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
d3e056c03773e7e3d796a115f911941d4f83a7af | 189,021 | py | Python | src/contrib/thriftfs/gen-py/hadoopfs/ThriftHadoopFileSystem.py | cantbesure/hadoop-20 | 2c4bf4ae6f8d021a946b0afcf4577d9ca1ff861d | [
"Apache-2.0"
] | 194 | 2015-01-07T11:12:52.000Z | 2022-03-14T09:19:24.000Z | src/contrib/thriftfs/gen-py/hadoopfs/ThriftHadoopFileSystem.py | cantbesure/hadoop-20 | 2c4bf4ae6f8d021a946b0afcf4577d9ca1ff861d | [
"Apache-2.0"
] | 1 | 2017-10-19T16:57:05.000Z | 2017-10-19T16:57:05.000Z | src/contrib/thriftfs/gen-py/hadoopfs/ThriftHadoopFileSystem.py | cantbesure/hadoop-20 | 2c4bf4ae6f8d021a946b0afcf4577d9ca1ff861d | [
"Apache-2.0"
] | 172 | 2015-01-14T19:25:48.000Z | 2022-02-24T02:42:02.000Z | #
# Autogenerated by Thrift Compiler (0.7.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
from thrift.Thrift import *
from ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
from thrift.protocol import TBinaryProtocol, TProtocol
try:
from thrift.protocol import fastbinary
except:
fastbinary = None
class Iface:
def setInactivityTimeoutPeriod(self, periodInSeconds):
"""
Parameters:
- periodInSeconds
"""
pass
def shutdown(self, status):
"""
Parameters:
- status
"""
pass
def create(self, path):
"""
Parameters:
- path
"""
pass
def createFile(self, path, mode, overwrite, bufferSize, block_replication, blocksize):
"""
Parameters:
- path
- mode
- overwrite
- bufferSize
- block_replication
- blocksize
"""
pass
def open(self, path):
"""
Parameters:
- path
"""
pass
def append(self, path):
"""
Parameters:
- path
"""
pass
def write(self, handle, data):
"""
Parameters:
- handle
- data
"""
pass
def read(self, handle, offset, size):
"""
Parameters:
- handle
- offset
- size
"""
pass
def close(self, out):
"""
Parameters:
- out
"""
pass
def rm(self, path, recursive):
"""
Parameters:
- path
- recursive
"""
pass
def rename(self, path, dest):
"""
Parameters:
- path
- dest
"""
pass
def mkdirs(self, path):
"""
Parameters:
- path
"""
pass
def exists(self, path):
"""
Parameters:
- path
"""
pass
def stat(self, path):
"""
Parameters:
- path
"""
pass
def listStatus(self, path):
"""
Parameters:
- path
"""
pass
def chmod(self, path, mode):
"""
Parameters:
- path
- mode
"""
pass
def chown(self, path, owner, group):
"""
Parameters:
- path
- owner
- group
"""
pass
def setReplication(self, path, replication):
"""
Parameters:
- path
- replication
"""
pass
def getFileBlockLocations(self, path, start, length):
"""
Parameters:
- path
- start
- length
"""
pass
def hardLink(self, src, dest):
"""
Parameters:
- src
- dest
"""
pass
def concat(self, target, srcs, restricted):
"""
Parameters:
- target
- srcs
- restricted
"""
pass
def reportBadBlocks(self, blocks):
"""
Parameters:
- blocks
"""
pass
def getDataTransferProtocolVersion(self, ):
"""
The following methods are typically used by the native C++ HDFS client and
are not used by HDFS applications themselves.
"""
pass
def renewLease(self, clientName):
"""
Parameters:
- clientName
"""
pass
def recoverLease(self, path, clientName):
"""
Parameters:
- path
- clientName
"""
pass
def closeRecoverLease(self, path, clientName, discardLastBlock):
"""
Parameters:
- path
- clientName
- discardLastBlock
"""
pass
def abandonBlock(self, block, pathname, clientName):
"""
Parameters:
- block
- pathname
- clientName
"""
pass
def abandonFile(self, pathname, clientName):
"""
Parameters:
- pathname
- clientName
"""
pass
def addBlock(self, pathname, clientName, startOffset, lastBlock, excludedNodes, favouredNodes):
"""
Parameters:
- pathname
- clientName
- startOffset
- lastBlock
- excludedNodes
- favouredNodes
"""
pass
def addFirstBlock(self, pathname, clientName, excludedNodes, favouredNodes):
"""
Parameters:
- pathname
- clientName
- excludedNodes
- favouredNodes
"""
pass
def complete(self, pathname, clientName, fileLen, lastBlock):
"""
Parameters:
- pathname
- clientName
- fileLen
- lastBlock
"""
pass
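`Iface` only declares the service contract; a concrete handler supplies the behavior. A toy, self-contained sketch of implementing a small slice of such an interface in memory, e.g. for unit tests without a running namenode (`FsIface`, `InMemoryFs`, and the set-backed storage are illustrative, not part of the generated code):

```python
class FsIface(object):
    """Stand-in for the generated Iface: declares the contract only."""
    def mkdirs(self, path):
        pass

    def exists(self, path):
        pass


class InMemoryFs(FsIface):
    """Toy handler keeping 'directories' in a set instead of talking to HDFS."""
    def __init__(self):
        self._dirs = {"/"}

    def mkdirs(self, path):
        self._dirs.add(path)
        return True

    def exists(self, path):
        return path in self._dirs
```

A handler like this can be passed straight to the service `Processor`, which dispatches each incoming call to the matching method.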
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def setInactivityTimeoutPeriod(self, periodInSeconds):
"""
Parameters:
- periodInSeconds
"""
self.send_setInactivityTimeoutPeriod(periodInSeconds)
self.recv_setInactivityTimeoutPeriod()
def send_setInactivityTimeoutPeriod(self, periodInSeconds):
self._oprot.writeMessageBegin('setInactivityTimeoutPeriod', TMessageType.CALL, self._seqid)
args = setInactivityTimeoutPeriod_args()
args.periodInSeconds = periodInSeconds
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_setInactivityTimeoutPeriod(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = setInactivityTimeoutPeriod_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
return
def shutdown(self, status):
"""
Parameters:
- status
"""
self.send_shutdown(status)
self.recv_shutdown()
def send_shutdown(self, status):
self._oprot.writeMessageBegin('shutdown', TMessageType.CALL, self._seqid)
args = shutdown_args()
args.status = status
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_shutdown(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = shutdown_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
return
def create(self, path):
"""
Parameters:
- path
"""
self.send_create(path)
return self.recv_create()
def send_create(self, path):
self._oprot.writeMessageBegin('create', TMessageType.CALL, self._seqid)
args = create_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_create(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = create_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "create failed: unknown result");
def createFile(self, path, mode, overwrite, bufferSize, block_replication, blocksize):
"""
Parameters:
- path
- mode
- overwrite
- bufferSize
- block_replication
- blocksize
"""
self.send_createFile(path, mode, overwrite, bufferSize, block_replication, blocksize)
return self.recv_createFile()
def send_createFile(self, path, mode, overwrite, bufferSize, block_replication, blocksize):
self._oprot.writeMessageBegin('createFile', TMessageType.CALL, self._seqid)
args = createFile_args()
args.path = path
args.mode = mode
args.overwrite = overwrite
args.bufferSize = bufferSize
args.block_replication = block_replication
args.blocksize = blocksize
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_createFile(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = createFile_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "createFile failed: unknown result");
def open(self, path):
"""
Parameters:
- path
"""
self.send_open(path)
return self.recv_open()
def send_open(self, path):
self._oprot.writeMessageBegin('open', TMessageType.CALL, self._seqid)
args = open_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_open(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = open_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "open failed: unknown result");
def append(self, path):
"""
Parameters:
- path
"""
self.send_append(path)
return self.recv_append()
def send_append(self, path):
self._oprot.writeMessageBegin('append', TMessageType.CALL, self._seqid)
args = append_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_append(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = append_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "append failed: unknown result");
def write(self, handle, data):
"""
Parameters:
- handle
- data
"""
self.send_write(handle, data)
return self.recv_write()
def send_write(self, handle, data):
self._oprot.writeMessageBegin('write', TMessageType.CALL, self._seqid)
args = write_args()
args.handle = handle
args.data = data
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_write(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = write_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "write failed: unknown result");
def read(self, handle, offset, size):
"""
Parameters:
- handle
- offset
- size
"""
self.send_read(handle, offset, size)
return self.recv_read()
def send_read(self, handle, offset, size):
self._oprot.writeMessageBegin('read', TMessageType.CALL, self._seqid)
args = read_args()
args.handle = handle
args.offset = offset
args.size = size
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_read(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = read_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "read failed: unknown result");
def close(self, out):
"""
Parameters:
- out
"""
self.send_close(out)
return self.recv_close()
def send_close(self, out):
self._oprot.writeMessageBegin('close', TMessageType.CALL, self._seqid)
args = close_args()
args.out = out
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_close(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = close_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "close failed: unknown result");
def rm(self, path, recursive):
"""
Parameters:
- path
- recursive
"""
self.send_rm(path, recursive)
return self.recv_rm()
def send_rm(self, path, recursive):
self._oprot.writeMessageBegin('rm', TMessageType.CALL, self._seqid)
args = rm_args()
args.path = path
args.recursive = recursive
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_rm(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = rm_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "rm failed: unknown result");
def rename(self, path, dest):
"""
Parameters:
- path
- dest
"""
self.send_rename(path, dest)
return self.recv_rename()
def send_rename(self, path, dest):
self._oprot.writeMessageBegin('rename', TMessageType.CALL, self._seqid)
args = rename_args()
args.path = path
args.dest = dest
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_rename(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = rename_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "rename failed: unknown result");
def mkdirs(self, path):
"""
Parameters:
- path
"""
self.send_mkdirs(path)
return self.recv_mkdirs()
def send_mkdirs(self, path):
self._oprot.writeMessageBegin('mkdirs', TMessageType.CALL, self._seqid)
args = mkdirs_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_mkdirs(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = mkdirs_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "mkdirs failed: unknown result");
def exists(self, path):
"""
Parameters:
- path
"""
self.send_exists(path)
return self.recv_exists()
def send_exists(self, path):
self._oprot.writeMessageBegin('exists', TMessageType.CALL, self._seqid)
args = exists_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_exists(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = exists_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "exists failed: unknown result");
def stat(self, path):
"""
Parameters:
- path
"""
self.send_stat(path)
return self.recv_stat()
def send_stat(self, path):
self._oprot.writeMessageBegin('stat', TMessageType.CALL, self._seqid)
args = stat_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_stat(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = stat_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "stat failed: unknown result");
def listStatus(self, path):
"""
Parameters:
- path
"""
self.send_listStatus(path)
return self.recv_listStatus()
def send_listStatus(self, path):
self._oprot.writeMessageBegin('listStatus', TMessageType.CALL, self._seqid)
args = listStatus_args()
args.path = path
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_listStatus(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = listStatus_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "listStatus failed: unknown result");
def chmod(self, path, mode):
"""
Parameters:
- path
- mode
"""
self.send_chmod(path, mode)
self.recv_chmod()
def send_chmod(self, path, mode):
self._oprot.writeMessageBegin('chmod', TMessageType.CALL, self._seqid)
args = chmod_args()
args.path = path
args.mode = mode
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_chmod(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = chmod_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def chown(self, path, owner, group):
"""
Parameters:
- path
- owner
- group
"""
self.send_chown(path, owner, group)
self.recv_chown()
def send_chown(self, path, owner, group):
self._oprot.writeMessageBegin('chown', TMessageType.CALL, self._seqid)
args = chown_args()
args.path = path
args.owner = owner
args.group = group
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_chown(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = chown_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def setReplication(self, path, replication):
"""
Parameters:
- path
- replication
"""
self.send_setReplication(path, replication)
self.recv_setReplication()
def send_setReplication(self, path, replication):
self._oprot.writeMessageBegin('setReplication', TMessageType.CALL, self._seqid)
args = setReplication_args()
args.path = path
args.replication = replication
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_setReplication(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = setReplication_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def getFileBlockLocations(self, path, start, length):
"""
Parameters:
- path
- start
- length
"""
self.send_getFileBlockLocations(path, start, length)
return self.recv_getFileBlockLocations()
def send_getFileBlockLocations(self, path, start, length):
self._oprot.writeMessageBegin('getFileBlockLocations', TMessageType.CALL, self._seqid)
args = getFileBlockLocations_args()
args.path = path
args.start = start
args.length = length
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getFileBlockLocations(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = getFileBlockLocations_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "getFileBlockLocations failed: unknown result");
def hardLink(self, src, dest):
"""
Parameters:
- src
- dest
"""
self.send_hardLink(src, dest)
return self.recv_hardLink()
def send_hardLink(self, src, dest):
self._oprot.writeMessageBegin('hardLink', TMessageType.CALL, self._seqid)
args = hardLink_args()
args.src = src
args.dest = dest
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_hardLink(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = hardLink_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "hardLink failed: unknown result");
def concat(self, target, srcs, restricted):
"""
Parameters:
- target
- srcs
- restricted
"""
self.send_concat(target, srcs, restricted)
self.recv_concat()
def send_concat(self, target, srcs, restricted):
self._oprot.writeMessageBegin('concat', TMessageType.CALL, self._seqid)
args = concat_args()
args.target = target
args.srcs = srcs
args.restricted = restricted
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_concat(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = concat_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def reportBadBlocks(self, blocks):
"""
Parameters:
- blocks
"""
self.send_reportBadBlocks(blocks)
self.recv_reportBadBlocks()
def send_reportBadBlocks(self, blocks):
self._oprot.writeMessageBegin('reportBadBlocks', TMessageType.CALL, self._seqid)
args = reportBadBlocks_args()
args.blocks = blocks
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_reportBadBlocks(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = reportBadBlocks_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def getDataTransferProtocolVersion(self, ):
"""
The following methods are typically used by the native C++ HDFS client and
are not used by HDFS applications themselves.
"""
self.send_getDataTransferProtocolVersion()
return self.recv_getDataTransferProtocolVersion()
def send_getDataTransferProtocolVersion(self, ):
self._oprot.writeMessageBegin('getDataTransferProtocolVersion', TMessageType.CALL, self._seqid)
args = getDataTransferProtocolVersion_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getDataTransferProtocolVersion(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = getDataTransferProtocolVersion_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "getDataTransferProtocolVersion failed: unknown result");
def renewLease(self, clientName):
"""
Parameters:
- clientName
"""
self.send_renewLease(clientName)
self.recv_renewLease()
def send_renewLease(self, clientName):
self._oprot.writeMessageBegin('renewLease', TMessageType.CALL, self._seqid)
args = renewLease_args()
args.clientName = clientName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_renewLease(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = renewLease_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def recoverLease(self, path, clientName):
"""
Parameters:
- path
- clientName
"""
self.send_recoverLease(path, clientName)
self.recv_recoverLease()
def send_recoverLease(self, path, clientName):
self._oprot.writeMessageBegin('recoverLease', TMessageType.CALL, self._seqid)
args = recoverLease_args()
args.path = path
args.clientName = clientName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_recoverLease(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = recoverLease_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def closeRecoverLease(self, path, clientName, discardLastBlock):
"""
Parameters:
- path
- clientName
- discardLastBlock
"""
self.send_closeRecoverLease(path, clientName, discardLastBlock)
self.recv_closeRecoverLease()
def send_closeRecoverLease(self, path, clientName, discardLastBlock):
self._oprot.writeMessageBegin('closeRecoverLease', TMessageType.CALL, self._seqid)
args = closeRecoverLease_args()
args.path = path
args.clientName = clientName
args.discardLastBlock = discardLastBlock
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_closeRecoverLease(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = closeRecoverLease_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def abandonBlock(self, block, pathname, clientName):
"""
Parameters:
- block
- pathname
- clientName
"""
self.send_abandonBlock(block, pathname, clientName)
self.recv_abandonBlock()
def send_abandonBlock(self, block, pathname, clientName):
self._oprot.writeMessageBegin('abandonBlock', TMessageType.CALL, self._seqid)
args = abandonBlock_args()
args.block = block
args.pathname = pathname
args.clientName = clientName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_abandonBlock(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = abandonBlock_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def abandonFile(self, pathname, clientName):
"""
Parameters:
- pathname
- clientName
"""
self.send_abandonFile(pathname, clientName)
self.recv_abandonFile()
def send_abandonFile(self, pathname, clientName):
self._oprot.writeMessageBegin('abandonFile', TMessageType.CALL, self._seqid)
args = abandonFile_args()
args.pathname = pathname
args.clientName = clientName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_abandonFile(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = abandonFile_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.ouch is not None:
raise result.ouch
return
def addBlock(self, pathname, clientName, startOffset, lastBlock, excludedNodes, favouredNodes):
"""
Parameters:
- pathname
- clientName
- startOffset
- lastBlock
- excludedNodes
- favouredNodes
"""
self.send_addBlock(pathname, clientName, startOffset, lastBlock, excludedNodes, favouredNodes)
return self.recv_addBlock()
def send_addBlock(self, pathname, clientName, startOffset, lastBlock, excludedNodes, favouredNodes):
self._oprot.writeMessageBegin('addBlock', TMessageType.CALL, self._seqid)
args = addBlock_args()
args.pathname = pathname
args.clientName = clientName
args.startOffset = startOffset
args.lastBlock = lastBlock
args.excludedNodes = excludedNodes
args.favouredNodes = favouredNodes
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_addBlock(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = addBlock_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "addBlock failed: unknown result");
def addFirstBlock(self, pathname, clientName, excludedNodes, favouredNodes):
"""
Parameters:
- pathname
- clientName
- excludedNodes
- favouredNodes
"""
self.send_addFirstBlock(pathname, clientName, excludedNodes, favouredNodes)
return self.recv_addFirstBlock()
def send_addFirstBlock(self, pathname, clientName, excludedNodes, favouredNodes):
self._oprot.writeMessageBegin('addFirstBlock', TMessageType.CALL, self._seqid)
args = addFirstBlock_args()
args.pathname = pathname
args.clientName = clientName
args.excludedNodes = excludedNodes
args.favouredNodes = favouredNodes
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_addFirstBlock(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = addFirstBlock_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "addFirstBlock failed: unknown result");
def complete(self, pathname, clientName, fileLen, lastBlock):
"""
Parameters:
- pathname
- clientName
- fileLen
- lastBlock
"""
self.send_complete(pathname, clientName, fileLen, lastBlock)
return self.recv_complete()
def send_complete(self, pathname, clientName, fileLen, lastBlock):
self._oprot.writeMessageBegin('complete', TMessageType.CALL, self._seqid)
args = complete_args()
args.pathname = pathname
args.clientName = clientName
args.fileLen = fileLen
args.lastBlock = lastBlock
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_complete(self, ):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = complete_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.ouch is not None:
raise result.ouch
raise TApplicationException(TApplicationException.MISSING_RESULT, "complete failed: unknown result");
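Every `Client` method above follows the same synchronous shape: a `send_` half serializes the arguments and flushes the transport, and a `recv_` half blocks on the reply, returning `result.success` or raising the server-side exception. A stripped-down, self-contained sketch of that call pattern (`MiniClient` and `EchoTransport` are illustrative names, not Thrift APIs):

```python
class MiniClient(object):
    """Mimics the generated Client: one round trip per call, tracked by seqid."""
    def __init__(self, transport):
        self._trans = transport  # anything with write() and read_reply()
        self._seqid = 0

    def call(self, name, **kwargs):
        self._seqid += 1
        self._trans.write((name, self._seqid, kwargs))  # the "send_" half
        ok, value = self._trans.read_reply()            # the "recv_" half
        if not ok:
            raise RuntimeError(value)  # server-side exception surfaces here
        return value


class EchoTransport(object):
    """Loopback transport for demonstration: replies with the args it was sent."""
    def write(self, msg):
        self._last = msg

    def read_reply(self):
        name, seqid, kwargs = self._last
        return True, kwargs
```

The real client layers a `TBinaryProtocol` over a socket transport instead of this loopback, but the send/flush/block/raise sequence is the same.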
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["setInactivityTimeoutPeriod"] = Processor.process_setInactivityTimeoutPeriod
self._processMap["shutdown"] = Processor.process_shutdown
self._processMap["create"] = Processor.process_create
self._processMap["createFile"] = Processor.process_createFile
self._processMap["open"] = Processor.process_open
self._processMap["append"] = Processor.process_append
self._processMap["write"] = Processor.process_write
self._processMap["read"] = Processor.process_read
self._processMap["close"] = Processor.process_close
self._processMap["rm"] = Processor.process_rm
self._processMap["rename"] = Processor.process_rename
self._processMap["mkdirs"] = Processor.process_mkdirs
self._processMap["exists"] = Processor.process_exists
self._processMap["stat"] = Processor.process_stat
self._processMap["listStatus"] = Processor.process_listStatus
self._processMap["chmod"] = Processor.process_chmod
self._processMap["chown"] = Processor.process_chown
self._processMap["setReplication"] = Processor.process_setReplication
self._processMap["getFileBlockLocations"] = Processor.process_getFileBlockLocations
self._processMap["hardLink"] = Processor.process_hardLink
self._processMap["concat"] = Processor.process_concat
self._processMap["reportBadBlocks"] = Processor.process_reportBadBlocks
self._processMap["getDataTransferProtocolVersion"] = Processor.process_getDataTransferProtocolVersion
self._processMap["renewLease"] = Processor.process_renewLease
self._processMap["recoverLease"] = Processor.process_recoverLease
self._processMap["closeRecoverLease"] = Processor.process_closeRecoverLease
self._processMap["abandonBlock"] = Processor.process_abandonBlock
self._processMap["abandonFile"] = Processor.process_abandonFile
self._processMap["addBlock"] = Processor.process_addBlock
self._processMap["addFirstBlock"] = Processor.process_addFirstBlock
self._processMap["complete"] = Processor.process_complete
def process(self, iprot, oprot):
(name, mtype, seqid) = iprot.readMessageBegin()
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
  def process_setInactivityTimeoutPeriod(self, seqid, iprot, oprot):
    args = setInactivityTimeoutPeriod_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = setInactivityTimeoutPeriod_result()
    self._handler.setInactivityTimeoutPeriod(args.periodInSeconds)
    oprot.writeMessageBegin("setInactivityTimeoutPeriod", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_shutdown(self, seqid, iprot, oprot):
    args = shutdown_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = shutdown_result()
    self._handler.shutdown(args.status)
    oprot.writeMessageBegin("shutdown", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_create(self, seqid, iprot, oprot):
    args = create_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = create_result()
    try:
      result.success = self._handler.create(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("create", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_createFile(self, seqid, iprot, oprot):
    args = createFile_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = createFile_result()
    try:
      result.success = self._handler.createFile(args.path, args.mode, args.overwrite, args.bufferSize, args.block_replication, args.blocksize)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("createFile", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_open(self, seqid, iprot, oprot):
    args = open_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = open_result()
    try:
      result.success = self._handler.open(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("open", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_append(self, seqid, iprot, oprot):
    args = append_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = append_result()
    try:
      result.success = self._handler.append(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("append", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_write(self, seqid, iprot, oprot):
    args = write_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = write_result()
    try:
      result.success = self._handler.write(args.handle, args.data)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("write", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_read(self, seqid, iprot, oprot):
    args = read_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = read_result()
    try:
      result.success = self._handler.read(args.handle, args.offset, args.size)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("read", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_close(self, seqid, iprot, oprot):
    args = close_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = close_result()
    try:
      result.success = self._handler.close(args.out)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("close", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_rm(self, seqid, iprot, oprot):
    args = rm_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = rm_result()
    try:
      result.success = self._handler.rm(args.path, args.recursive)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("rm", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_rename(self, seqid, iprot, oprot):
    args = rename_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = rename_result()
    try:
      result.success = self._handler.rename(args.path, args.dest)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("rename", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_mkdirs(self, seqid, iprot, oprot):
    args = mkdirs_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = mkdirs_result()
    try:
      result.success = self._handler.mkdirs(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("mkdirs", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_exists(self, seqid, iprot, oprot):
    args = exists_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = exists_result()
    try:
      result.success = self._handler.exists(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("exists", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_stat(self, seqid, iprot, oprot):
    args = stat_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = stat_result()
    try:
      result.success = self._handler.stat(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("stat", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_listStatus(self, seqid, iprot, oprot):
    args = listStatus_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = listStatus_result()
    try:
      result.success = self._handler.listStatus(args.path)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("listStatus", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_chmod(self, seqid, iprot, oprot):
    args = chmod_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = chmod_result()
    try:
      self._handler.chmod(args.path, args.mode)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("chmod", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_chown(self, seqid, iprot, oprot):
    args = chown_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = chown_result()
    try:
      self._handler.chown(args.path, args.owner, args.group)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("chown", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_setReplication(self, seqid, iprot, oprot):
    args = setReplication_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = setReplication_result()
    try:
      self._handler.setReplication(args.path, args.replication)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("setReplication", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_getFileBlockLocations(self, seqid, iprot, oprot):
    args = getFileBlockLocations_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = getFileBlockLocations_result()
    try:
      result.success = self._handler.getFileBlockLocations(args.path, args.start, args.length)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("getFileBlockLocations", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_hardLink(self, seqid, iprot, oprot):
    args = hardLink_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = hardLink_result()
    try:
      result.success = self._handler.hardLink(args.src, args.dest)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("hardLink", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_concat(self, seqid, iprot, oprot):
    args = concat_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = concat_result()
    try:
      self._handler.concat(args.target, args.srcs, args.restricted)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("concat", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_reportBadBlocks(self, seqid, iprot, oprot):
    args = reportBadBlocks_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = reportBadBlocks_result()
    try:
      self._handler.reportBadBlocks(args.blocks)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("reportBadBlocks", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_getDataTransferProtocolVersion(self, seqid, iprot, oprot):
    args = getDataTransferProtocolVersion_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = getDataTransferProtocolVersion_result()
    try:
      result.success = self._handler.getDataTransferProtocolVersion()
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("getDataTransferProtocolVersion", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_renewLease(self, seqid, iprot, oprot):
    args = renewLease_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = renewLease_result()
    try:
      self._handler.renewLease(args.clientName)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("renewLease", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_recoverLease(self, seqid, iprot, oprot):
    args = recoverLease_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = recoverLease_result()
    try:
      self._handler.recoverLease(args.path, args.clientName)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("recoverLease", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_closeRecoverLease(self, seqid, iprot, oprot):
    args = closeRecoverLease_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = closeRecoverLease_result()
    try:
      self._handler.closeRecoverLease(args.path, args.clientName, args.discardLastBlock)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("closeRecoverLease", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_abandonBlock(self, seqid, iprot, oprot):
    args = abandonBlock_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = abandonBlock_result()
    try:
      self._handler.abandonBlock(args.block, args.pathname, args.clientName)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("abandonBlock", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_abandonFile(self, seqid, iprot, oprot):
    args = abandonFile_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = abandonFile_result()
    try:
      self._handler.abandonFile(args.pathname, args.clientName)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("abandonFile", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_addBlock(self, seqid, iprot, oprot):
    args = addBlock_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = addBlock_result()
    try:
      result.success = self._handler.addBlock(args.pathname, args.clientName, args.startOffset, args.lastBlock, args.excludedNodes, args.favouredNodes)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("addBlock", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_addFirstBlock(self, seqid, iprot, oprot):
    args = addFirstBlock_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = addFirstBlock_result()
    try:
      result.success = self._handler.addFirstBlock(args.pathname, args.clientName, args.excludedNodes, args.favouredNodes)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("addFirstBlock", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

  def process_complete(self, seqid, iprot, oprot):
    args = complete_args()
    args.read(iprot)
    iprot.readMessageEnd()
    result = complete_result()
    try:
      result.success = self._handler.complete(args.pathname, args.clientName, args.fileLen, args.lastBlock)
    except ThriftIOException, ouch:
      result.ouch = ouch
    oprot.writeMessageBegin("complete", TMessageType.REPLY, seqid)
    result.write(oprot)
    oprot.writeMessageEnd()
    oprot.trans.flush()

# HELPER FUNCTIONS AND STRUCTURES

class setInactivityTimeoutPeriod_args:
  """
  Attributes:
   - periodInSeconds
  """

  thrift_spec = (
    None, # 0
    (1, TType.I64, 'periodInSeconds', None, None, ), # 1
  )

  def __init__(self, periodInSeconds=None,):
    self.periodInSeconds = periodInSeconds

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.I64:
          self.periodInSeconds = iprot.readI64();
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('setInactivityTimeoutPeriod_args')
    if self.periodInSeconds is not None:
      oprot.writeFieldBegin('periodInSeconds', TType.I64, 1)
      oprot.writeI64(self.periodInSeconds)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class setInactivityTimeoutPeriod_result:

  thrift_spec = (
  )

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('setInactivityTimeoutPeriod_result')
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class shutdown_args:
  """
  Attributes:
   - status
  """

  thrift_spec = (
    None, # 0
    (1, TType.I32, 'status', None, None, ), # 1
  )

  def __init__(self, status=None,):
    self.status = status

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.I32:
          self.status = iprot.readI32();
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('shutdown_args')
    if self.status is not None:
      oprot.writeFieldBegin('status', TType.I32, 1)
      oprot.writeI32(self.status)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class shutdown_result:

  thrift_spec = (
  )

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('shutdown_result')
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class create_args:
  """
  Attributes:
   - path
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
  )

  def __init__(self, path=None,):
    self.path = path

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.path = Pathname()
          self.path.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('create_args')
    if self.path is not None:
      oprot.writeFieldBegin('path', TType.STRUCT, 1)
      self.path.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class create_result:
  """
  Attributes:
   - success
   - ouch
  """

  thrift_spec = (
    (0, TType.STRUCT, 'success', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 0
    (1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
  )

  def __init__(self, success=None, ouch=None,):
    self.success = success
    self.ouch = ouch

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.STRUCT:
          self.success = ThriftHandle()
          self.success.read(iprot)
        else:
          iprot.skip(ftype)
      elif fid == 1:
        if ftype == TType.STRUCT:
          self.ouch = ThriftIOException()
          self.ouch.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('create_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.STRUCT, 0)
      self.success.write(oprot)
      oprot.writeFieldEnd()
    if self.ouch is not None:
      oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
      self.ouch.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class createFile_args:
  """
  Attributes:
   - path
   - mode
   - overwrite
   - bufferSize
   - block_replication
   - blocksize
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
    (2, TType.I16, 'mode', None, None, ), # 2
    (3, TType.BOOL, 'overwrite', None, None, ), # 3
    (4, TType.I32, 'bufferSize', None, None, ), # 4
    (5, TType.I16, 'block_replication', None, None, ), # 5
    (6, TType.I64, 'blocksize', None, None, ), # 6
  )

  def __init__(self, path=None, mode=None, overwrite=None, bufferSize=None, block_replication=None, blocksize=None,):
    self.path = path
    self.mode = mode
    self.overwrite = overwrite
    self.bufferSize = bufferSize
    self.block_replication = block_replication
    self.blocksize = blocksize

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.path = Pathname()
          self.path.read(iprot)
        else:
          iprot.skip(ftype)
      elif fid == 2:
        if ftype == TType.I16:
          self.mode = iprot.readI16();
        else:
          iprot.skip(ftype)
      elif fid == 3:
        if ftype == TType.BOOL:
          self.overwrite = iprot.readBool();
        else:
          iprot.skip(ftype)
      elif fid == 4:
        if ftype == TType.I32:
          self.bufferSize = iprot.readI32();
        else:
          iprot.skip(ftype)
      elif fid == 5:
        if ftype == TType.I16:
          self.block_replication = iprot.readI16();
        else:
          iprot.skip(ftype)
      elif fid == 6:
        if ftype == TType.I64:
          self.blocksize = iprot.readI64();
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('createFile_args')
    if self.path is not None:
      oprot.writeFieldBegin('path', TType.STRUCT, 1)
      self.path.write(oprot)
      oprot.writeFieldEnd()
    if self.mode is not None:
      oprot.writeFieldBegin('mode', TType.I16, 2)
      oprot.writeI16(self.mode)
      oprot.writeFieldEnd()
    if self.overwrite is not None:
      oprot.writeFieldBegin('overwrite', TType.BOOL, 3)
      oprot.writeBool(self.overwrite)
      oprot.writeFieldEnd()
    if self.bufferSize is not None:
      oprot.writeFieldBegin('bufferSize', TType.I32, 4)
      oprot.writeI32(self.bufferSize)
      oprot.writeFieldEnd()
    if self.block_replication is not None:
      oprot.writeFieldBegin('block_replication', TType.I16, 5)
      oprot.writeI16(self.block_replication)
      oprot.writeFieldEnd()
    if self.blocksize is not None:
      oprot.writeFieldBegin('blocksize', TType.I64, 6)
      oprot.writeI64(self.blocksize)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class createFile_result:
  """
  Attributes:
   - success
   - ouch
  """

  thrift_spec = (
    (0, TType.STRUCT, 'success', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 0
    (1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
  )

  def __init__(self, success=None, ouch=None,):
    self.success = success
    self.ouch = ouch

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.STRUCT:
          self.success = ThriftHandle()
          self.success.read(iprot)
        else:
          iprot.skip(ftype)
      elif fid == 1:
        if ftype == TType.STRUCT:
          self.ouch = ThriftIOException()
          self.ouch.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('createFile_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.STRUCT, 0)
      self.success.write(oprot)
      oprot.writeFieldEnd()
    if self.ouch is not None:
      oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
      self.ouch.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class open_args:
  """
  Attributes:
   - path
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
  )

  def __init__(self, path=None,):
    self.path = path

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.path = Pathname()
          self.path.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('open_args')
    if self.path is not None:
      oprot.writeFieldBegin('path', TType.STRUCT, 1)
      self.path.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class open_result:
  """
  Attributes:
   - success
   - ouch
  """

  thrift_spec = (
    (0, TType.STRUCT, 'success', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 0
    (1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
  )

  def __init__(self, success=None, ouch=None,):
    self.success = success
    self.ouch = ouch

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.STRUCT:
          self.success = ThriftHandle()
          self.success.read(iprot)
        else:
          iprot.skip(ftype)
      elif fid == 1:
        if ftype == TType.STRUCT:
          self.ouch = ThriftIOException()
          self.ouch.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('open_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.STRUCT, 0)
      self.success.write(oprot)
      oprot.writeFieldEnd()
    if self.ouch is not None:
      oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
      self.ouch.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class append_args:
  """
  Attributes:
   - path
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
  )

  def __init__(self, path=None,):
    self.path = path

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.path = Pathname()
          self.path.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('append_args')
    if self.path is not None:
      oprot.writeFieldBegin('path', TType.STRUCT, 1)
      self.path.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class append_result:
  """
  Attributes:
   - success
   - ouch
  """

  thrift_spec = (
    (0, TType.STRUCT, 'success', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 0
    (1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
  )

  def __init__(self, success=None, ouch=None,):
    self.success = success
    self.ouch = ouch

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.STRUCT:
          self.success = ThriftHandle()
          self.success.read(iprot)
        else:
          iprot.skip(ftype)
      elif fid == 1:
        if ftype == TType.STRUCT:
          self.ouch = ThriftIOException()
          self.ouch.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('append_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.STRUCT, 0)
      self.success.write(oprot)
      oprot.writeFieldEnd()
    if self.ouch is not None:
      oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
      self.ouch.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class write_args:
  """
  Attributes:
   - handle
   - data
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'handle', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 1
    (2, TType.STRING, 'data', None, None, ), # 2
  )

  def __init__(self, handle=None, data=None,):
    self.handle = handle
    self.data = data

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.handle = ThriftHandle()
self.handle.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.data = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('write_args')
if self.handle is not None:
oprot.writeFieldBegin('handle', TType.STRUCT, 1)
self.handle.write(oprot)
oprot.writeFieldEnd()
if self.data is not None:
oprot.writeFieldBegin('data', TType.STRING, 2)
oprot.writeString(self.data)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class write_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('write_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class read_args:
"""
Attributes:
- handle
- offset
- size
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'handle', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 1
(2, TType.I64, 'offset', None, None, ), # 2
(3, TType.I32, 'size', None, None, ), # 3
)
def __init__(self, handle=None, offset=None, size=None,):
self.handle = handle
self.offset = offset
self.size = size
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.handle = ThriftHandle()
self.handle.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I64:
self.offset = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.size = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('read_args')
if self.handle is not None:
oprot.writeFieldBegin('handle', TType.STRUCT, 1)
self.handle.write(oprot)
oprot.writeFieldEnd()
if self.offset is not None:
oprot.writeFieldBegin('offset', TType.I64, 2)
oprot.writeI64(self.offset)
oprot.writeFieldEnd()
if self.size is not None:
oprot.writeFieldBegin('size', TType.I32, 3)
oprot.writeI32(self.size)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class read_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.STRING, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRING:
self.success = iprot.readString()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('read_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRING, 0)
oprot.writeString(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class close_args:
"""
Attributes:
- out
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'out', (ThriftHandle, ThriftHandle.thrift_spec), None, ), # 1
)
def __init__(self, out=None,):
self.out = out
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.out = ThriftHandle()
self.out.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('close_args')
if self.out is not None:
oprot.writeFieldBegin('out', TType.STRUCT, 1)
self.out.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class close_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('close_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class rm_args:
"""
Attributes:
- path
- recursive
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.BOOL, 'recursive', None, None, ), # 2
)
def __init__(self, path=None, recursive=None,):
self.path = path
self.recursive = recursive
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.BOOL:
self.recursive = iprot.readBool()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('rm_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.recursive is not None:
oprot.writeFieldBegin('recursive', TType.BOOL, 2)
oprot.writeBool(self.recursive)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class rm_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('rm_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class rename_args:
"""
Attributes:
- path
- dest
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRUCT, 'dest', (Pathname, Pathname.thrift_spec), None, ), # 2
)
def __init__(self, path=None, dest=None,):
self.path = path
self.dest = dest
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.dest = Pathname()
self.dest.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('rename_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.dest is not None:
oprot.writeFieldBegin('dest', TType.STRUCT, 2)
self.dest.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class rename_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('rename_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class mkdirs_args:
"""
Attributes:
- path
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
)
def __init__(self, path=None,):
self.path = path
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('mkdirs_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class mkdirs_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('mkdirs_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class exists_args:
"""
Attributes:
- path
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
)
def __init__(self, path=None,):
self.path = path
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('exists_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class exists_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('exists_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class stat_args:
"""
Attributes:
- path
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
)
def __init__(self, path=None,):
self.path = path
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('stat_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class stat_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (FileStatus, FileStatus.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = FileStatus()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('stat_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class listStatus_args:
"""
Attributes:
- path
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
)
def __init__(self, path=None,):
self.path = path
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('listStatus_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class listStatus_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT,(FileStatus, FileStatus.thrift_spec)), None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype24, _size21) = iprot.readListBegin()
for _i25 in xrange(_size21):
_elem26 = FileStatus()
_elem26.read(iprot)
self.success.append(_elem26)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('listStatus_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter27 in self.success:
iter27.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class chmod_args:
"""
Attributes:
- path
- mode
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.I16, 'mode', None, None, ), # 2
)
def __init__(self, path=None, mode=None,):
self.path = path
self.mode = mode
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.mode = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('chmod_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.mode is not None:
oprot.writeFieldBegin('mode', TType.I16, 2)
oprot.writeI16(self.mode)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class chmod_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('chmod_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class chown_args:
"""
Attributes:
- path
- owner
- group
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'owner', None, None, ), # 2
(3, TType.STRING, 'group', None, None, ), # 3
)
def __init__(self, path=None, owner=None, group=None,):
self.path = path
self.owner = owner
self.group = group
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.owner = iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.group = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('chown_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.owner is not None:
oprot.writeFieldBegin('owner', TType.STRING, 2)
oprot.writeString(self.owner)
oprot.writeFieldEnd()
if self.group is not None:
oprot.writeFieldBegin('group', TType.STRING, 3)
oprot.writeString(self.group)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class chown_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('chown_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class setReplication_args:
"""
Attributes:
- path
- replication
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.I16, 'replication', None, None, ), # 2
)
def __init__(self, path=None, replication=None,):
self.path = path
self.replication = replication
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.replication = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('setReplication_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.replication is not None:
oprot.writeFieldBegin('replication', TType.I16, 2)
oprot.writeI16(self.replication)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class setReplication_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('setReplication_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getFileBlockLocations_args:
"""
Attributes:
- path
- start
- length
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.I64, 'start', None, None, ), # 2
(3, TType.I64, 'length', None, None, ), # 3
)
def __init__(self, path=None, start=None, length=None,):
self.path = path
self.start = start
self.length = length
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I64:
self.start = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I64:
self.length = iprot.readI64()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getFileBlockLocations_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.start is not None:
oprot.writeFieldBegin('start', TType.I64, 2)
oprot.writeI64(self.start)
oprot.writeFieldEnd()
if self.length is not None:
oprot.writeFieldBegin('length', TType.I64, 3)
oprot.writeI64(self.length)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getFileBlockLocations_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT,(BlockLocation, BlockLocation.thrift_spec)), None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype31, _size28) = iprot.readListBegin()
for _i32 in xrange(_size28):
_elem33 = BlockLocation()
_elem33.read(iprot)
self.success.append(_elem33)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getFileBlockLocations_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter34 in self.success:
iter34.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class hardLink_args:
"""
Attributes:
- src
- dest
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'src', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRUCT, 'dest', (Pathname, Pathname.thrift_spec), None, ), # 2
)
def __init__(self, src=None, dest=None,):
self.src = src
self.dest = dest
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.src = Pathname()
self.src.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.dest = Pathname()
self.dest.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('hardLink_args')
if self.src is not None:
oprot.writeFieldBegin('src', TType.STRUCT, 1)
self.src.write(oprot)
oprot.writeFieldEnd()
if self.dest is not None:
oprot.writeFieldBegin('dest', TType.STRUCT, 2)
self.dest.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class hardLink_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('hardLink_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class concat_args:
"""
Attributes:
- target
- srcs
- restricted
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'target', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.LIST, 'srcs', (TType.STRUCT,(Pathname, Pathname.thrift_spec)), None, ), # 2
(3, TType.BOOL, 'restricted', None, None, ), # 3
)
def __init__(self, target=None, srcs=None, restricted=None,):
self.target = target
self.srcs = srcs
self.restricted = restricted
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.target = Pathname()
self.target.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.srcs = []
(_etype38, _size35) = iprot.readListBegin()
for _i39 in xrange(_size35):
_elem40 = Pathname()
_elem40.read(iprot)
self.srcs.append(_elem40)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.BOOL:
self.restricted = iprot.readBool()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('concat_args')
if self.target is not None:
oprot.writeFieldBegin('target', TType.STRUCT, 1)
self.target.write(oprot)
oprot.writeFieldEnd()
if self.srcs is not None:
oprot.writeFieldBegin('srcs', TType.LIST, 2)
oprot.writeListBegin(TType.STRUCT, len(self.srcs))
for iter41 in self.srcs:
iter41.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.restricted is not None:
oprot.writeFieldBegin('restricted', TType.BOOL, 3)
oprot.writeBool(self.restricted)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class concat_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('concat_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class reportBadBlocks_args:
"""
Attributes:
- blocks
"""
thrift_spec = (
None, # 0
(1, TType.LIST, 'blocks', (TType.STRUCT,(TLocatedBlock, TLocatedBlock.thrift_spec)), None, ), # 1
)
def __init__(self, blocks=None,):
self.blocks = blocks
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.LIST:
self.blocks = []
(_etype45, _size42) = iprot.readListBegin()
for _i46 in xrange(_size42):
_elem47 = TLocatedBlock()
_elem47.read(iprot)
self.blocks.append(_elem47)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('reportBadBlocks_args')
if self.blocks is not None:
oprot.writeFieldBegin('blocks', TType.LIST, 1)
oprot.writeListBegin(TType.STRUCT, len(self.blocks))
for iter48 in self.blocks:
iter48.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class reportBadBlocks_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('reportBadBlocks_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getDataTransferProtocolVersion_args:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getDataTransferProtocolVersion_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getDataTransferProtocolVersion_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.I32, 'success', None, None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I32:
self.success = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getDataTransferProtocolVersion_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I32, 0)
oprot.writeI32(self.success)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class renewLease_args:
"""
Attributes:
- clientName
"""
thrift_spec = (
None, # 0
(1, TType.STRING, 'clientName', None, None, ), # 1
)
def __init__(self, clientName=None,):
self.clientName = clientName
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('renewLease_args')
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 1)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class renewLease_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('renewLease_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class recoverLease_args:
"""
Attributes:
- path
- clientName
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'clientName', None, None, ), # 2
)
def __init__(self, path=None, clientName=None,):
self.path = path
self.clientName = clientName
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('recoverLease_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 2)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class recoverLease_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('recoverLease_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class closeRecoverLease_args:
"""
Attributes:
- path
- clientName
- discardLastBlock
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'path', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'clientName', None, None, ), # 2
(3, TType.BOOL, 'discardLastBlock', None, None, ), # 3
)
def __init__(self, path=None, clientName=None, discardLastBlock=None,):
self.path = path
self.clientName = clientName
self.discardLastBlock = discardLastBlock
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.path = Pathname()
self.path.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.BOOL:
self.discardLastBlock = iprot.readBool()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('closeRecoverLease_args')
if self.path is not None:
oprot.writeFieldBegin('path', TType.STRUCT, 1)
self.path.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 2)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
if self.discardLastBlock is not None:
oprot.writeFieldBegin('discardLastBlock', TType.BOOL, 3)
oprot.writeBool(self.discardLastBlock)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class closeRecoverLease_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('closeRecoverLease_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class abandonBlock_args:
"""
Attributes:
- block
- pathname
- clientName
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'block', (TBlock, TBlock.thrift_spec), None, ), # 1
(2, TType.STRUCT, 'pathname', (Pathname, Pathname.thrift_spec), None, ), # 2
(3, TType.STRING, 'clientName', None, None, ), # 3
)
def __init__(self, block=None, pathname=None, clientName=None,):
self.block = block
self.pathname = pathname
self.clientName = clientName
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.block = TBlock()
self.block.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.pathname = Pathname()
self.pathname.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('abandonBlock_args')
if self.block is not None:
oprot.writeFieldBegin('block', TType.STRUCT, 1)
self.block.write(oprot)
oprot.writeFieldEnd()
if self.pathname is not None:
oprot.writeFieldBegin('pathname', TType.STRUCT, 2)
self.pathname.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 3)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class abandonBlock_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('abandonBlock_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class abandonFile_args:
"""
Attributes:
- pathname
- clientName
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'pathname', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'clientName', None, None, ), # 2
)
def __init__(self, pathname=None, clientName=None,):
self.pathname = pathname
self.clientName = clientName
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.pathname = Pathname()
self.pathname.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('abandonFile_args')
if self.pathname is not None:
oprot.writeFieldBegin('pathname', TType.STRUCT, 1)
self.pathname.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 2)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class abandonFile_result:
"""
Attributes:
- ouch
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, ouch=None,):
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('abandonFile_result')
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class addBlock_args:
"""
Attributes:
- pathname
- clientName
- startOffset
- lastBlock
- excludedNodes
- favouredNodes
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'pathname', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'clientName', None, None, ), # 2
(3, TType.I64, 'startOffset', None, None, ), # 3
(4, TType.STRUCT, 'lastBlock', (TBlock, TBlock.thrift_spec), None, ), # 4
(5, TType.LIST, 'excludedNodes', (TType.STRUCT,(TDatanodeID, TDatanodeID.thrift_spec)), None, ), # 5
(6, TType.LIST, 'favouredNodes', (TType.STRUCT,(TDatanodeID, TDatanodeID.thrift_spec)), None, ), # 6
)
def __init__(self, pathname=None, clientName=None, startOffset=None, lastBlock=None, excludedNodes=None, favouredNodes=None,):
self.pathname = pathname
self.clientName = clientName
self.startOffset = startOffset
self.lastBlock = lastBlock
self.excludedNodes = excludedNodes
self.favouredNodes = favouredNodes
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.pathname = Pathname()
self.pathname.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I64:
self.startOffset = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRUCT:
self.lastBlock = TBlock()
self.lastBlock.read(iprot)
else:
iprot.skip(ftype)
elif fid == 5:
if ftype == TType.LIST:
self.excludedNodes = []
(_etype52, _size49) = iprot.readListBegin()
for _i53 in xrange(_size49):
_elem54 = TDatanodeID()
_elem54.read(iprot)
self.excludedNodes.append(_elem54)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 6:
if ftype == TType.LIST:
self.favouredNodes = []
(_etype58, _size55) = iprot.readListBegin()
for _i59 in xrange(_size55):
_elem60 = TDatanodeID()
_elem60.read(iprot)
self.favouredNodes.append(_elem60)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('addBlock_args')
if self.pathname is not None:
oprot.writeFieldBegin('pathname', TType.STRUCT, 1)
self.pathname.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 2)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
if self.startOffset is not None:
oprot.writeFieldBegin('startOffset', TType.I64, 3)
oprot.writeI64(self.startOffset)
oprot.writeFieldEnd()
if self.lastBlock is not None:
oprot.writeFieldBegin('lastBlock', TType.STRUCT, 4)
self.lastBlock.write(oprot)
oprot.writeFieldEnd()
if self.excludedNodes is not None:
oprot.writeFieldBegin('excludedNodes', TType.LIST, 5)
oprot.writeListBegin(TType.STRUCT, len(self.excludedNodes))
for iter61 in self.excludedNodes:
iter61.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.favouredNodes is not None:
oprot.writeFieldBegin('favouredNodes', TType.LIST, 6)
oprot.writeListBegin(TType.STRUCT, len(self.favouredNodes))
for iter62 in self.favouredNodes:
iter62.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class addBlock_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (TLocatedBlock, TLocatedBlock.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = TLocatedBlock()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('addBlock_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class addFirstBlock_args:
"""
Attributes:
- pathname
- clientName
- excludedNodes
- favouredNodes
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'pathname', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'clientName', None, None, ), # 2
(3, TType.LIST, 'excludedNodes', (TType.STRUCT,(TDatanodeID, TDatanodeID.thrift_spec)), None, ), # 3
(4, TType.LIST, 'favouredNodes', (TType.STRUCT,(TDatanodeID, TDatanodeID.thrift_spec)), None, ), # 4
)
def __init__(self, pathname=None, clientName=None, excludedNodes=None, favouredNodes=None,):
self.pathname = pathname
self.clientName = clientName
self.excludedNodes = excludedNodes
self.favouredNodes = favouredNodes
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.pathname = Pathname()
self.pathname.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.LIST:
self.excludedNodes = []
(_etype66, _size63) = iprot.readListBegin()
for _i67 in xrange(_size63):
_elem68 = TDatanodeID()
_elem68.read(iprot)
self.excludedNodes.append(_elem68)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.LIST:
self.favouredNodes = []
(_etype72, _size69) = iprot.readListBegin()
for _i73 in xrange(_size69):
_elem74 = TDatanodeID()
_elem74.read(iprot)
self.favouredNodes.append(_elem74)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('addFirstBlock_args')
if self.pathname is not None:
oprot.writeFieldBegin('pathname', TType.STRUCT, 1)
self.pathname.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 2)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
if self.excludedNodes is not None:
oprot.writeFieldBegin('excludedNodes', TType.LIST, 3)
oprot.writeListBegin(TType.STRUCT, len(self.excludedNodes))
for iter75 in self.excludedNodes:
iter75.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.favouredNodes is not None:
oprot.writeFieldBegin('favouredNodes', TType.LIST, 4)
oprot.writeListBegin(TType.STRUCT, len(self.favouredNodes))
for iter76 in self.favouredNodes:
iter76.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class addFirstBlock_result:
"""
Attributes:
- success
- ouch
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (TLocatedBlock, TLocatedBlock.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, ouch=None,):
self.success = success
self.ouch = ouch
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = TLocatedBlock()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.ouch = ThriftIOException()
self.ouch.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('addFirstBlock_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.ouch is not None:
oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
self.ouch.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class complete_args:
"""
Attributes:
- pathname
- clientName
- fileLen
- lastBlock
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'pathname', (Pathname, Pathname.thrift_spec), None, ), # 1
(2, TType.STRING, 'clientName', None, None, ), # 2
(3, TType.I64, 'fileLen', None, None, ), # 3
(4, TType.STRUCT, 'lastBlock', (TBlock, TBlock.thrift_spec), None, ), # 4
)
def __init__(self, pathname=None, clientName=None, fileLen=None, lastBlock=None,):
self.pathname = pathname
self.clientName = clientName
self.fileLen = fileLen
self.lastBlock = lastBlock
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.pathname = Pathname()
self.pathname.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.clientName = iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I64:
self.fileLen = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRUCT:
self.lastBlock = TBlock()
self.lastBlock.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('complete_args')
if self.pathname is not None:
oprot.writeFieldBegin('pathname', TType.STRUCT, 1)
self.pathname.write(oprot)
oprot.writeFieldEnd()
if self.clientName is not None:
oprot.writeFieldBegin('clientName', TType.STRING, 2)
oprot.writeString(self.clientName)
oprot.writeFieldEnd()
if self.fileLen is not None:
oprot.writeFieldBegin('fileLen', TType.I64, 3)
oprot.writeI64(self.fileLen)
oprot.writeFieldEnd()
if self.lastBlock is not None:
oprot.writeFieldBegin('lastBlock', TType.STRUCT, 4)
self.lastBlock.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class complete_result:
  """
  Attributes:
   - success
   - ouch
  """

  thrift_spec = (
    (0, TType.BOOL, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'ouch', (ThriftIOException, ThriftIOException.thrift_spec), None, ),  # 1
  )

  def __init__(self, success=None, ouch=None,):
    self.success = success
    self.ouch = ouch

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.BOOL:
          self.success = iprot.readBool()
        else:
          iprot.skip(ftype)
      elif fid == 1:
        if ftype == TType.STRUCT:
          self.ouch = ThriftIOException()
          self.ouch.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('complete_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.BOOL, 0)
      oprot.writeBool(self.success)
      oprot.writeFieldEnd()
    if self.ouch is not None:
      oprot.writeFieldBegin('ouch', TType.STRUCT, 1)
      self.ouch.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __repr__(self):
    L = ['%s=%r' % (key, value)
         for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)
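The generated `read()` methods above all follow the same field-dispatch loop: read a field header, branch on the field id, decode the fields this version knows about, and `skip()` everything else so an old reader tolerates a newer writer that added fields. A minimal self-contained sketch of that forward-compatibility pattern follows; `ToyStruct`, `KNOWN_FIELDS`, and the list-of-pairs "protocol" are illustrative inventions, not the real Thrift wire format or API.

```python
# Sketch of the field-dispatch loop used by generated Thrift read() methods:
# known field ids are decoded, unknown ids are skipped rather than rejected.
STOP = None  # sentinel playing the role of TType.STOP


class ToyStruct:
    KNOWN_FIELDS = {1: "pathname", 2: "clientName", 3: "fileLen"}

    def __init__(self):
        self.pathname = None
        self.clientName = None
        self.fileLen = None
        self.skipped = []  # records unknown ids, mirroring iprot.skip(ftype)

    def read(self, fields):
        for fid, value in fields:
            if fid is STOP:       # end-of-struct marker
                break
            name = self.KNOWN_FIELDS.get(fid)
            if name is not None:
                setattr(self, name, value)
            else:
                self.skipped.append(fid)  # unknown field: skip, don't fail


s = ToyStruct()
# Field id 99 is "from the future" and gets skipped instead of crashing.
s.read([(1, "/tmp/x"), (2, "client-7"), (3, 4096), (99, "future"), (STOP, None)])
```

This is why both structs above call `iprot.skip(ftype)` in every `else` branch: schema evolution never breaks old readers.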
# Source: pygama/io/lh5.py (repo sweigart/pygama, commit 317058c, Apache-2.0)
from pygama.lh5 import *
print("Warning: pygama.io.lh5 is deprecated and will be removed in a future release. Instead import pygama.lh5.")
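The shim above re-exports the new module location and prints a notice at import time. A common alternative, sketched here with a hypothetical `warn_module_moved` helper that is not part of pygama, is to emit a `DeprecationWarning` through the standard `warnings` module so downstream code can silence, filter, or escalate the notice instead of seeing it on stdout:

```python
import warnings


def warn_module_moved(old_name, new_name):
    # Emit a DeprecationWarning rather than print(), so callers can control
    # it with the warnings filter machinery (-W flags, simplefilter, etc.).
    warnings.warn(
        "%s is deprecated and will be removed in a future release; "
        "import %s instead." % (old_name, new_name),
        DeprecationWarning,
        stacklevel=2,
    )


# A shim module would call this at import time, then re-export the new API:
#   from pygama.lh5 import *
warn_module_moved("pygama.io.lh5", "pygama.lh5")
```

Using `stacklevel=2` points the warning at the importer's line rather than the helper itself.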
# Source: evaluate.py (repo wuzhanghui/postrate, commit 24291ed, Apache-2.0)
import os
import argparse
from douzero.evaluation.simulation import evaluate
if __name__ == '__main__':
    # Test the peasant (farmer) models
    # parser = argparse.ArgumentParser(
    #     'Dou Dizhu Evaluation')
    # parser.add_argument('--landlord', type=str,
    #                     default='baselines/ADP/landlord.ckpt')
    # parser.add_argument('--landlord_up', type=str,
    #                     default='baselines/resnet/landlord_up.ckpt')
    # parser.add_argument('--landlord_down', type=str,
    #                     default='baselines/resnet/landlord_down.ckpt')

    # Test the landlord model
    # parser = argparse.ArgumentParser(
    #     'Dou Dizhu Evaluation')
    # parser.add_argument('--landlord', type=str,
    #                     default='baselines/resnet/landlord.ckpt')
    # parser.add_argument('--landlord_up', type=str,
    #                     default='baselines/ADP/landlord_up.ckpt')
    # parser.add_argument('--landlord_down', type=str,
    #                     default='baselines/ADP/landlord_down.ckpt')

    # New
    parser = argparse.ArgumentParser(
        'Dou Dizhu Evaluation')
    parser.add_argument('--landlord', type=str,
                        default='baselines/resnet/landlord.ckpt')
    parser.add_argument('--landlord_up', type=str,
                        default='baselines/resnet/landlord_up.ckpt')
    parser.add_argument('--landlord_down', type=str,
                        default='baselines/resnet/landlord_down.ckpt')
    parser.add_argument('--eval_data', type=str,
                        default='eval_data.pkl')
    parser.add_argument('--num_workers', type=int, default=5)
    parser.add_argument('--gpu_device', type=str, default='0')
    parser.add_argument('--output', type=bool, default=True)
    parser.add_argument('--bid', type=bool, default=True)
    parser.add_argument('--title', type=str, default='New')
    args = parser.parse_args()
    # args.output = True
    args.output = False
    args.bid = False
    # if args.output or args.bid:
    #     args.num_workers = 1
    # t = 3
    # frame = 3085177900
    # adp_frame = 2511184300
    # args.landlord = 'baselines/resnet_landlord_%i.ckpt' % frame
    # args.landlord_up = 'baselines/resnet_landlord_up_%i.ckpt' % frame
    # args.landlord_down = 'baselines/resnet_landlord_%i.ckpt' % frame
    # args.landlord = 'baselines/douzero_ADP/landlord.ckpt'
    # args.landlord_up = 'baselines/douzero_ADP/landlord_up.ckpt'
    # args.landlord_down = 'baselines/douzero_ADP/landlord_down.ckpt'
    # if t == 1:
    #     args.landlord = 'baselines/resnet_landlord_%i.ckpt' % frame
    #     args.landlord_up = 'baselines/douzero_ADP/landlord_up.ckpt'
    #     args.landlord_down = 'baselines/douzero_ADP/landlord_down.ckpt'
    # elif t == 2:
    #     args.landlord = 'baselines/douzero_ADP/landlord.ckpt'
    #     args.landlord_up = 'baselines/resnet_landlord_up_%i.ckpt' % frame
    #     args.landlord_down = 'baselines/resnet_landlord_down_%i.ckpt' % frame
    # elif t == 3:
    #     args.landlord = 'baselines/resnet_landlord_%i.ckpt' % frame
    #     args.landlord_up = 'baselines/resnet_landlord_up_%i.ckpt' % frame
    #     args.landlord_down = 'baselines/resnet_landlord_down_%i.ckpt' % frame
    # elif t == 4:
    #     args.landlord = 'baselines/douzero_ADP/landlord.ckpt'
    #     args.landlord_up = 'baselines/douzero_ADP/landlord_up.ckpt'
    #     args.landlord_down = 'baselines/douzero_ADP/landlord_down.ckpt'
    # elif t == 5:
    #     args.landlord = 'baselines/douzero_WP/landlord.ckpt'
    #     args.landlord_up = 'baselines/douzero_WP/landlord_up.ckpt'
    #     args.landlord_down = 'baselines/douzero_WP/landlord_down.ckpt'
    # elif t == 6:
    #     args.landlord = 'baselines/resnet_landlord_%i.ckpt' % frame
    #     args.landlord_up = 'baselines/douzero_ADP/landlord_up_weights_%i.ckpt' % adp_frame
    #     args.landlord_down = 'baselines/douzero_ADP/landlord_down_weights_%i.ckpt' % adp_frame
    # elif t == 7:
    #     args.landlord = 'baselines/douzero_ADP/landlord_weights_%i.ckpt' % adp_frame
    #     args.landlord_up = 'baselines/resnet_landlord_up_%i.ckpt' % frame
    #     args.landlord_down = 'baselines/resnet_landlord_down_%i.ckpt' % frame
    # elif t == 8:
    #     args.landlord = 'baselines/douzero_ADP/landlord_weights_%i.ckpt' % adp_frame
    #     args.landlord_up = 'baselines/douzero_ADP/landlord_up_weights_%i.ckpt' % adp_frame
    #     args.landlord_down = 'baselines/douzero_ADP/landlord_down_weights_%i.ckpt' % adp_frame
    # elif t == 9:
    #     args.landlord = 'baselines/resnet_landlord_%i.ckpt' % frame
    #     args.landlord_up = 'baselines/resnet_landlord_up_%i.ckpt' % adp_frame
    #     args.landlord_down = 'baselines/resnet_landlord_down_%i.ckpt' % adp_frame
    # elif t == 10:
    #     # landlord_down_weights_10777798400
    #     args.landlord = 'baselines/douzero_ADP/landlord.ckpt'
    #     args.landlord_up = 'baselines/douzero_ADP/landlord_up_weights_%i.ckpt' % adp_frame
    #     args.landlord_down = 'baselines/douzero_ADP/landlord_down_weights_%i.ckpt' % adp_frame
    # elif t == 11:
    #     args.landlord = 'baselines/douzero_ADP/landlord_weights_%i.ckpt' % adp_frame
    #     args.landlord_up = 'baselines/douzero_ADP/landlord_up.ckpt'
    #     args.landlord_down = 'baselines/douzero_ADP/landlord_down.ckpt'
    os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'
    os.environ["CUDA_VISIBLE_DEVICES"] = args.gpu_device
    evaluate(args.landlord,
             args.landlord_up,
             args.landlord_down,
             args.eval_data,
             args.num_workers,
             args.output,
             args.bid,
             args.title)
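One caveat in the argument parsing above: `type=bool` does not do what it appears to. argparse passes the raw command-line string to `bool()`, and any non-empty string, including `'False'`, is truthy, so `--output False` would still yield `True`. The script sidesteps this by overwriting `args.output` and `args.bid` after parsing, but the usual fix is an explicit converter. The `str2bool` helper below is an illustrative sketch, not part of the original script:

```python
import argparse


def str2bool(value):
    # bool("False") == True, so map the common spellings explicitly.
    if isinstance(value, bool):
        return value
    if value.lower() in ("yes", "true", "t", "1"):
        return True
    if value.lower() in ("no", "false", "f", "0"):
        return False
    raise argparse.ArgumentTypeError("expected a boolean, got %r" % value)


parser = argparse.ArgumentParser("bool-flag demo")
parser.add_argument("--output", type=str2bool, default=True)

# With str2bool, the string "false" actually parses to False.
args = parser.parse_args(["--output", "false"])
```

With plain `type=bool`, the same invocation would have produced `True`.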