hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
9e8817627535df6f0d585998aa24f60ff7d9791c | 365 | py | Python | skp_edu_docker/code/cluster/preprocess/pre_node_feed_fr2cnn.py | TensorMSA/hoyai_docker | 12f0041e6306d8a6421585a4b51666bad30be442 | [
"MIT"
] | 8 | 2017-06-16T00:19:12.000Z | 2020-08-13T03:15:57.000Z | skp_edu_docker/code/cluster/preprocess/pre_node_feed_fr2cnn.py | TensorMSA/tensormsa_docker | 12f0041e6306d8a6421585a4b51666bad30be442 | [
"MIT"
] | 21 | 2017-06-09T10:15:14.000Z | 2018-03-29T07:51:02.000Z | skp_edu_docker/code/cluster/preprocess/pre_node_feed_fr2cnn.py | TensorMSA/hoyai_docker | 12f0041e6306d8a6421585a4b51666bad30be442 | [
"MIT"
] | 4 | 2017-10-25T09:59:53.000Z | 2020-05-07T09:51:11.000Z | from cluster.preprocess.pre_node_feed import PreNodeFeed
class PreNodeFeedFr2Cnn(PreNodeFeed):
"""
"""
def run(self, conf_data):
"""
override init class
"""
super(PreNodeFeedFr2Cnn, self).run(conf_data)
self._init_node_parm(conf_data['node_id'])
def _convert_data_format(self, obj, index):
pass
| 19.210526 | 56 | 0.635616 | 41 | 365 | 5.365854 | 0.609756 | 0.109091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00738 | 0.257534 | 365 | 18 | 57 | 20.277778 | 0.804428 | 0.052055 | 0 | 0 | 0 | 0 | 0.022654 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0.142857 | 0.142857 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
9e922e429bd689190580fd828c6525764ab943c0 | 1,268 | py | Python | tower_cli/compat.py | kedark3/tower-cli | 487a1b9a8e96509798fee108e4f7d2c187177771 | [
"Apache-2.0"
] | 363 | 2015-01-14T17:48:34.000Z | 2022-01-29T06:37:04.000Z | tower_cli/compat.py | kedark3/tower-cli | 487a1b9a8e96509798fee108e4f7d2c187177771 | [
"Apache-2.0"
] | 703 | 2015-01-06T17:17:20.000Z | 2020-09-16T15:54:17.000Z | tower_cli/compat.py | kedark3/tower-cli | 487a1b9a8e96509798fee108e4f7d2c187177771 | [
"Apache-2.0"
] | 203 | 2015-01-18T22:38:23.000Z | 2022-01-28T19:19:05.000Z | # Copyright 2015, Ansible, Inc.
# Luke Sneeringer <lsneeringer@ansible.com>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Import OrderedDict from the standard library if possible, and from
# the ordereddict library (required on Python 2.6) otherwise.
try:
from collections import OrderedDict # NOQA
except ImportError: # Python < 2.7
from ordereddict import OrderedDict # NOQA
# Import simplejson if we have it (Python 2.6), and use json from the
# standard library otherwise.
#
# Note: Python 2.6 does have a JSON library, but it lacks `object_pairs_hook`
# as a keyword argument to `json.loads`, so we still need simplejson on
# Python 2.6.
import sys
if sys.version_info < (2, 7):
import simplejson as json # NOQA
else:
import json # NOQA
| 36.228571 | 77 | 0.746057 | 195 | 1,268 | 4.835897 | 0.528205 | 0.063627 | 0.033934 | 0.033934 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019305 | 0.182965 | 1,268 | 34 | 78 | 37.294118 | 0.890927 | 0.794164 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
9e92a7a7b55047819665042784eb05a1717697c3 | 19,250 | py | Python | py/tests/test_donorfy.py | samuelcolvin/nosht | 9e4d9bea8ff6bfae86cae948cc3028ccc68d0188 | [
"MIT"
] | 26 | 2018-07-28T23:11:27.000Z | 2022-02-09T13:40:33.000Z | py/tests/test_donorfy.py | samuelcolvin/nosht | 9e4d9bea8ff6bfae86cae948cc3028ccc68d0188 | [
"MIT"
] | 336 | 2018-05-25T17:57:00.000Z | 2022-03-11T23:24:36.000Z | py/tests/test_donorfy.py | samuelcolvin/nosht | 9e4d9bea8ff6bfae86cae948cc3028ccc68d0188 | [
"MIT"
] | 4 | 2018-07-18T08:37:19.000Z | 2022-01-31T14:42:48.000Z | import json
import pytest
from buildpg import Values
from pytest import fixture
from pytest_toolbox.comparison import CloseToNow, RegexStr
from shared.actions import ActionTypes
from shared.donorfy import DonorfyActor
from shared.utils import RequestError
from web.utils import encrypt_json
from .conftest import Factory
@fixture(name='donorfy')
async def create_donorfy(settings, db_pool):
settings.donorfy_api_key = 'standard'
settings.donorfy_access_key = 'donorfy-access-key'
don = DonorfyActor(settings=settings, pg=db_pool, concurrency_enabled=False)
await don.startup()
redis = await don.get_redis()
await redis.flushdb()
yield don
await don.close(shutdown=True)
async def test_create_host_existing(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await donorfy.host_signuped(factory.user_id)
await donorfy.host_signuped(factory.user_id)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
]
async def test_create_host_new(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
donorfy.settings.donorfy_api_key = 'new-user'
await donorfy.host_signuped(factory.user_id)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/new-user/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/new-user/constituents/EmailAddress/frank@example.org',
f'POST donorfy_api_root/new-user/constituents',
]
async def test_create_event(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(long_description='test ' * 100)
await donorfy.event_created(factory.event_id)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/standard/constituents/123456',
f'POST donorfy_api_root/standard/constituents/123456/AddActiveTags',
f'POST donorfy_api_root/standard/activities',
]
activity_data = dummy_server.app['data']['/donorfy_api_root/standard/activities 201']
assert activity_data['Code1'] == '/supper-clubs/the-event-name/'
assert activity_data['Code3'] == 'test test test test test test test test test...'
assert 'Code2' not in activity_data
async def test_create_event_no_duration(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(price=10, duration=None)
donorfy.settings.donorfy_api_key = 'no-users'
await donorfy.event_created(factory.event_id)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/no-users/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/no-users/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/no-users/constituents/EmailAddress/frank@example.org',
f'POST donorfy_api_root/no-users/constituents',
f'POST donorfy_api_root/no-users/constituents/456789/AddActiveTags',
f'POST donorfy_api_root/no-users/activities',
]
async def test_book_tickets(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(price=10)
res = await factory.create_reservation()
await factory.buy_tickets(res)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/standard/constituents/123456',
f'POST donorfy_api_root/standard/activities',
f'GET stripe_root_url/balance/history/txn_charge-id',
f'POST donorfy_api_root/standard/transactions',
(
'email_send_endpoint',
'Subject: "The Event Name Ticket Confirmation", To: "Frank Spencer <frank@example.org>"',
),
]
async def test_book_tickets_free(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event()
action_id = await factory.book_free(await factory.create_reservation())
await donorfy.tickets_booked(action_id)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/standard/constituents/123456',
f'POST donorfy_api_root/standard/activities',
]
async def test_book_tickets_multiple(donorfy: DonorfyActor, factory: Factory, dummy_server, db_conn):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(duration=None)
donorfy.settings.donorfy_api_key = 'no-users'
ben = await factory.create_user(first_name='ben', email='ben@example.org')
charlie = await factory.create_user(first_name='charlie', email='charlie@example.org')
danial = await factory.create_user(first_name='danial', email='danial@example.org')
res = await factory.create_reservation(factory.user_id, ben, charlie, danial)
action_id = await factory.book_free(res)
v = await db_conn.execute('update tickets set user_id=null where user_id=$1', danial)
assert v == 'UPDATE 1'
await donorfy.tickets_booked(action_id)
assert set(dummy_server.app['log']) == {
f'GET donorfy_api_root/no-users/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/no-users/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/no-users/constituents/EmailAddress/frank@example.org',
f'POST donorfy_api_root/no-users/constituents',
f'POST donorfy_api_root/no-users/activities',
f'GET donorfy_api_root/no-users/constituents/ExternalKey/nosht_{ben}',
f'GET donorfy_api_root/no-users/constituents/EmailAddress/charlie@example.org',
f'GET donorfy_api_root/no-users/constituents/ExternalKey/nosht_{charlie}',
f'GET donorfy_api_root/no-users/constituents/EmailAddress/ben@example.org',
}
async def test_book_tickets_extra(donorfy: DonorfyActor, factory: Factory, dummy_server, db_conn):
await factory.create_company()
await factory.create_user()
await factory.create_cat(cover_costs_percentage=10)
await factory.create_event(status='published', price=100)
res = await factory.create_reservation()
action_id = await db_conn.fetchval_b(
'INSERT INTO actions (:values__names) VALUES :values RETURNING id',
values=Values(
company=factory.company_id,
user_id=factory.user_id,
type=ActionTypes.buy_tickets,
event=factory.event_id,
extra=json.dumps({'stripe_balance_transaction': 'txn_testing'}),
),
)
await db_conn.execute(
"UPDATE tickets SET status='booked', booked_action=$1 WHERE reserve_action=$2", action_id, res.action_id,
)
await db_conn.execute('update tickets set extra_donated=10')
await donorfy.tickets_booked(action_id)
assert set(dummy_server.app['log']) == {
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/standard/constituents/123456',
f'POST donorfy_api_root/standard/activities',
f'GET stripe_root_url/balance/history/txn_testing',
f'POST donorfy_api_root/standard/transactions',
f'POST donorfy_api_root/standard/transactions/trans_123/AddAllocation',
f'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
}
async def test_book_multiple(donorfy: DonorfyActor, factory: Factory, dummy_server, cli, url, login):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_user(first_name='T', last_name='B', email='ticket.buyer@example.org')
await factory.create_event(status='published', price=10)
await login(email='ticket.buyer@example.org')
data = {
'tickets': [
{'t': True, 'email': 'ticket.buyer@example.org'},
{'t': True, 'email': 'ticket.buyer@example.org'},
],
'ticket_type': factory.ticket_type_id,
}
r = await cli.json_post(url('event-reserve-tickets', id=factory.event_id), data=data)
assert r.status == 200, await r.text()
action_id = (await r.json())['action_id']
await factory.fire_stripe_webhook(action_id)
trans_data = dummy_server.app['post_data']['POST donorfy_api_root/standard/transactions']
assert len(trans_data) == 1
assert trans_data[0] == {
'ExistingConstituentId': '123456',
'Channel': 'nosht-supper-clubs',
'Currency': 'gbp',
'Campaign': 'supper-clubs:the-event-name',
'PaymentMethod': 'Payment Card via Stripe',
'Product': 'Event Ticket(s)',
'Fund': 'Unrestricted General',
'Department': '220 Ticket Sales',
'BankAccount': 'Unrestricted Account',
'DatePaid': CloseToNow(delta=4),
'Amount': 20.0,
'ProcessingCostsAmount': 0.5,
'Quantity': 2,
'Acknowledgement': 'supper-clubs-thanks',
'AcknowledgementText': RegexStr('Ticket ID: .*'),
'Reference': 'Events.HUF:supper-clubs the-event-name',
'AddGiftAidDeclaration': False,
'GiftAidClaimed': False,
}
async def test_book_offline(donorfy: DonorfyActor, factory: Factory, dummy_server, cli, url, login):
await factory.create_company()
await factory.create_cat()
await factory.create_user(role='host')
await factory.create_event(price=10)
await login()
res = await factory.create_reservation()
app = cli.app['main_app']
data = dict(booking_token=encrypt_json(app, res.dict()), book_action='buy-tickets-offline')
r = await cli.json_post(url('event-book-tickets'), data=data)
assert r.status == 200, await r.text()
assert 'POST donorfy_api_root/standard/transactions' not in dummy_server.app['post_data']
async def test_donate(donorfy: DonorfyActor, factory: Factory, dummy_server, db_conn, cli, url, login):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(price=10)
await factory.create_donation_option()
await login()
r = await cli.json_post(
url('donation-after-prepare', don_opt_id=factory.donation_option_id, event_id=factory.event_id)
)
assert r.status == 200, await r.text()
action_id = (await r.json())['action_id']
post_data = dict(
title='Mr',
first_name='Joe',
last_name='Blogs',
address='Testing Street',
city='Testingville',
postcode='TE11 0ST',
)
r = await cli.json_post(url('donation-gift-aid', action_id=action_id), data=post_data)
assert r.status == 200, await r.text()
assert 0 == await db_conn.fetchval('SELECT COUNT(*) FROM donations')
await factory.fire_stripe_webhook(action_id, amount=20_00, purpose='donate')
assert 1 == await db_conn.fetchval('SELECT COUNT(*) FROM donations')
assert dummy_server.app['log'] == [
'POST stripe_root_url/customers',
'POST stripe_root_url/payment_intents',
'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
'GET donorfy_api_root/standard/constituents/123456',
'GET stripe_root_url/balance/history/txn_charge-id',
'POST donorfy_api_root/standard/transactions',
'POST donorfy_api_root/standard/constituents/123456/GiftAidDeclarations',
('email_send_endpoint', 'Subject: "Thanks for your donation", To: "Frank Spencer <frank@example.org>"'),
]
async def test_donate_no_gift_aid(donorfy: DonorfyActor, factory: Factory, dummy_server, db_conn, cli, url, login):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(price=10)
await factory.create_donation_option()
await login()
r = await cli.json_post(
url('donation-after-prepare', don_opt_id=factory.donation_option_id, event_id=factory.event_id)
)
assert r.status == 200, await r.text()
action_id = (await r.json())['action_id']
await factory.fire_stripe_webhook(action_id, amount=20_00, purpose='donate')
assert 1 == await db_conn.fetchval('SELECT COUNT(*) FROM donations')
assert dummy_server.app['log'] == [
'POST stripe_root_url/customers',
'POST stripe_root_url/payment_intents',
'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
'GET donorfy_api_root/standard/constituents/123456',
'GET stripe_root_url/balance/history/txn_charge-id',
'POST donorfy_api_root/standard/transactions',
('email_send_endpoint', 'Subject: "Thanks for your donation", To: "Frank Spencer <frank@example.org>"'),
]
async def test_update_user(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await donorfy.update_user(factory.user_id)
assert set(dummy_server.app['log']) == {
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
'PUT donorfy_api_root/standard/constituents/123456',
'POST donorfy_api_root/standard/constituents/123456/Preferences',
}
async def test_update_user_neither(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
await donorfy.update_user(factory.user_id, update_user=False, update_marketing=False)
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
]
async def test_update_user_no_user(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
donorfy.settings.donorfy_api_key = 'no-user'
await donorfy.update_user(factory.user_id)
assert set(dummy_server.app['log']) == {
f'GET donorfy_api_root/no-user/constituents/ExternalKey/nosht_{factory.user_id}',
'GET donorfy_api_root/no-user/constituents/EmailAddress/frank@example.org',
'POST donorfy_api_root/no-user/constituents',
'POST donorfy_api_root/no-user/constituents/456789/Preferences',
}
async def test_get_user_update(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
donorfy.settings.donorfy_api_key = 'no-ext-id'
const_id = await donorfy._get_constituent(user_id=factory.user_id, email='foobar@example.com')
assert const_id == '456789'
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/no-ext-id/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/no-ext-id/constituents/EmailAddress/foobar@example.com',
'PUT donorfy_api_root/no-ext-id/constituents/456789',
]
async def test_get_user_wrong_id(donorfy: DonorfyActor, factory: Factory, dummy_server):
await factory.create_company()
await factory.create_user()
donorfy.settings.donorfy_api_key = 'wrong-ext-id'
const_id = await donorfy._get_constituent(user_id=factory.user_id, email='foobar@example.com')
assert const_id is None
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/wrong-ext-id/constituents/ExternalKey/nosht_{factory.user_id}',
f'GET donorfy_api_root/wrong-ext-id/constituents/EmailAddress/foobar@example.com',
]
async def test_bad_response(donorfy: DonorfyActor, dummy_server):
with pytest.raises(RequestError):
await donorfy.client.get('/foobar')
assert dummy_server.app['log'] == [
'GET donorfy_api_root/standard/foobar',
]
async def test_campaign_exists(donorfy: DonorfyActor, dummy_server):
await donorfy._get_or_create_campaign('supper-clubs', 'the-event-name')
assert dummy_server.app['log'] == ['GET donorfy_api_root/standard/System/LookUpTypes/Campaigns']
await donorfy._get_or_create_campaign('supper-clubs', 'the-event-name')
assert dummy_server.app['log'] == ['GET donorfy_api_root/standard/System/LookUpTypes/Campaigns'] # cached
async def test_campaign_new(donorfy: DonorfyActor, dummy_server):
await donorfy._get_or_create_campaign('supper-clubs', 'foobar')
assert dummy_server.app['log'] == [
'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
'POST donorfy_api_root/standard/System/LookUpTypes/Campaigns',
]
await donorfy._get_or_create_campaign('supper-clubs', 'foobar')
assert len(dummy_server.app['log']) == 2
async def test_get_constituent_update_campaign(donorfy: DonorfyActor, dummy_server):
donorfy.settings.donorfy_api_key = 'default-campaign'
await donorfy._get_constituent(user_id=123, campaign='foo:bar')
assert dummy_server.app['log'] == [
f'GET donorfy_api_root/default-campaign/constituents/ExternalKey/nosht_123',
'GET donorfy_api_root/default-campaign/constituents/123456',
'PUT donorfy_api_root/default-campaign/constituents/123456',
]
async def test_donate_direct(donorfy: DonorfyActor, factory: Factory, dummy_server, db_conn, cli, url, login):
await factory.create_company()
await factory.create_user()
await factory.create_cat()
await factory.create_event(price=10, allow_donations=True, status='published')
await login()
r = await cli.json_post(
url('donation-direct-prepare', tt_id=factory.donation_ticket_type_id_1), data=dict(custom_amount=123),
)
assert r.status == 200, await r.text()
action_id = (await r.json())['action_id']
assert 0 == await db_conn.fetchval('SELECT COUNT(*) FROM donations')
await factory.fire_stripe_webhook(action_id, amount=20_00, purpose='donate-direct')
assert 1 == await db_conn.fetchval('SELECT COUNT(*) FROM donations')
assert dummy_server.app['log'] == [
'POST stripe_root_url/customers',
'POST stripe_root_url/payment_intents',
'GET donorfy_api_root/standard/System/LookUpTypes/Campaigns',
f'GET donorfy_api_root/standard/constituents/ExternalKey/nosht_{factory.user_id}',
'GET donorfy_api_root/standard/constituents/123456',
'GET stripe_root_url/balance/history/txn_charge-id',
'POST donorfy_api_root/standard/transactions',
('email_send_endpoint', 'Subject: "Thanks for your donation", To: "Frank Spencer <frank@example.org>"'),
]
| 41.75705 | 115 | 0.720883 | 2,505 | 19,250 | 5.299401 | 0.106188 | 0.063277 | 0.080151 | 0.061469 | 0.799021 | 0.757966 | 0.721507 | 0.647458 | 0.634953 | 0.600829 | 0 | 0.012783 | 0.162857 | 19,250 | 460 | 116 | 41.847826 | 0.810984 | 0.000312 | 0 | 0.486413 | 0 | 0.002717 | 0.370648 | 0.257717 | 0 | 0 | 0 | 0 | 0.11413 | 1 | 0 | false | 0 | 0.027174 | 0 | 0.027174 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9e952388ffcae30ff69efaffb734ef8eaddfbd24 | 275 | py | Python | exercicios/ex047.py | mrcbnu/python-_exercicios | 773da52a227f0e9ecce998cb6b50b3fe167a4f1d | [
"MIT"
] | null | null | null | exercicios/ex047.py | mrcbnu/python-_exercicios | 773da52a227f0e9ecce998cb6b50b3fe167a4f1d | [
"MIT"
] | null | null | null | exercicios/ex047.py | mrcbnu/python-_exercicios | 773da52a227f0e9ecce998cb6b50b3fe167a4f1d | [
"MIT"
] | null | null | null | ###########################################
# EXERCISE 047 #
###########################################
'''CREATE A PROGRAM THAT SHOWS ON SCREEN ALL THE EVEN
NUMBERS FROM 1 TO 50'''
for c in range(1, 51):
if c % 2 == 0:
print(c, end=' ') | 30.555556 | 55 | 0.341818 | 30 | 275 | 3.133333 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054455 | 0.265455 | 275 | 9 | 56 | 30.555556 | 0.410891 | 0.349091 | 0 | 0 | 0 | 0 | 0.013699 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9eb6d151f5528cdc1c322f9b0068d09d6d965dbb | 114 | py | Python | pinax/points/signals.py | hacklabr/pinax-points | 1133c6d911e9fe9445ff45aebe118bcaf715aae0 | [
"MIT"
] | 28 | 2015-01-26T07:52:49.000Z | 2022-03-20T17:15:04.000Z | pinax/points/signals.py | hacklabr/pinax-points | 1133c6d911e9fe9445ff45aebe118bcaf715aae0 | [
"MIT"
] | 15 | 2017-12-04T07:35:49.000Z | 2020-05-21T15:53:20.000Z | pinax/points/signals.py | hacklabr/pinax-points | 1133c6d911e9fe9445ff45aebe118bcaf715aae0 | [
"MIT"
] | 14 | 2015-02-10T04:27:12.000Z | 2021-10-05T10:27:21.000Z | from django.dispatch import Signal
points_awarded = Signal(providing_args=["target", "key", "points", "source"])
| 28.5 | 77 | 0.745614 | 14 | 114 | 5.928571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096491 | 114 | 3 | 78 | 38 | 0.805825 | 0 | 0 | 0 | 0 | 0 | 0.184211 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
9eb981e2b99c90652643fa12268a2ebf31a607f2 | 929 | py | Python | hub/sensorhub/models/user_agent.py | kblum/sensor-hub | 6b766ca59be74ae1a8d3d42afe048d04b6a0c546 | [
"MIT"
] | null | null | null | hub/sensorhub/models/user_agent.py | kblum/sensor-hub | 6b766ca59be74ae1a8d3d42afe048d04b6a0c546 | [
"MIT"
] | null | null | null | hub/sensorhub/models/user_agent.py | kblum/sensor-hub | 6b766ca59be74ae1a8d3d42afe048d04b6a0c546 | [
"MIT"
] | null | null | null | from django.db import models
from . import TimestampedModel
class UserAgent(TimestampedModel):
"""
Representation of HTTP user agent string from reading API request.
Exists as a separate model for database normalisation.
"""
user_agent_string = models.TextField(db_index=True, unique=True, null=False, blank=False)
@staticmethod
def get_or_create_user_agent(user_agent_string, save=False):
if not user_agent_string:
return None
try:
# attempt to load user agent from user agent string
return UserAgent.objects.get(user_agent_string=user_agent_string)
except UserAgent.DoesNotExist:
# create new user agent
user_agent = UserAgent(user_agent_string=user_agent_string)
if save:
user_agent.save()
return user_agent
def __str__(self):
return self.user_agent_string
| 32.034483 | 93 | 0.678149 | 114 | 929 | 5.280702 | 0.45614 | 0.239203 | 0.249169 | 0.059801 | 0.099668 | 0.099668 | 0 | 0 | 0 | 0 | 0 | 0 | 0.264801 | 929 | 28 | 94 | 33.178571 | 0.881406 | 0.208827 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.117647 | 0.058824 | 0.588235 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
7b654b9c9b8b039ce736b77aa128a506d29b9452 | 140 | py | Python | Pruebas/prob_14.py | FR98/Cuarto-Compu | 3824d0089562bccfbc839d9979809bc7a0fe4684 | [
"MIT"
] | 1 | 2022-03-20T12:57:04.000Z | 2022-03-20T12:57:04.000Z | Pruebas/prob_14.py | FR98/cuarto-compu | 3824d0089562bccfbc839d9979809bc7a0fe4684 | [
"MIT"
] | null | null | null | Pruebas/prob_14.py | FR98/cuarto-compu | 3824d0089562bccfbc839d9979809bc7a0fe4684 | [
"MIT"
] | null | null | null | def prob_14(n):
if n < 2:
return False
for i in range(2, n):
if n % i == 0:
return False
return True | 20 | 25 | 0.464286 | 23 | 140 | 2.782609 | 0.608696 | 0.09375 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064103 | 0.442857 | 140 | 7 | 26 | 20 | 0.75641 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
7b66b53434936973c793021141732d4d9ee0ccb9 | 1,329 | py | Python | metashare/accounts/urls.py | hpusset/ELRI | c4455cff3adb920627f014f37e740665342e9cee | [
"BSD-3-Clause"
] | 1 | 2017-07-10T08:15:07.000Z | 2017-07-10T08:15:07.000Z | metashare/accounts/urls.py | hpusset/ELRI | c4455cff3adb920627f014f37e740665342e9cee | [
"BSD-3-Clause"
] | null | null | null | metashare/accounts/urls.py | hpusset/ELRI | c4455cff3adb920627f014f37e740665342e9cee | [
"BSD-3-Clause"
] | 1 | 2018-07-03T07:55:56.000Z | 2018-07-03T07:55:56.000Z | from django.conf.urls import patterns, url
from metashare.settings import DJANGO_BASE

urlpatterns = patterns('metashare.accounts.views',
    url(r'create/$',
        'create', name='create'),
    url(r'confirm/(?P<uuid>[0-9a-f]{32})/$',
        'confirm', name='confirm'),
    url(r'contact/$',
        'contact', name='contact'),
    url(r'reset/(?:(?P<uuid>[0-9a-f]{32})/)?$',
        'reset', name='reset'),
    url(r'profile/$',
        'edit_profile', name='edit_profile'),
    url(r'editor_group_application/$',
        'editor_group_application', name='editor_group_application'),
    url(r'organization_application/$',
        'organization_application', name='organization_application'),
    url(r'update_default_editor_groups/$',
        'update_default_editor_groups', name='update_default_editor_groups'),
    url(r'edelivery_membership_application/$',
        'edelivery_application', name='edelivery_application'),
)

urlpatterns += patterns('django.contrib.auth.views',
    url(r'^profile/change_password/$', 'password_change',
        {'post_change_redirect' : '/{0}accounts/profile/change_password/done/'.format(DJANGO_BASE), 'template_name': 'accounts/change_password.html'}, name='password_change'),
    url(r'^profile/change_password/done/$', 'password_change_done',
        {'template_name': 'accounts/change_password_done.html'}, name='password_change_done'),
)
| 42.870968 | 175 | 0.713318 | 162 | 1,329 | 5.592593 | 0.283951 | 0.048565 | 0.036424 | 0.082781 | 0.154525 | 0.024283 | 0 | 0 | 0 | 0 | 0 | 0.007538 | 0.10158 | 1,329 | 30 | 176 | 44.3 | 0.751256 | 0 | 0 | 0 | 0 | 0 | 0.604966 | 0.442438 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.142857 | 0.071429 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
7b882a00a99da3e2e17e41e9f577ca3003e8abd3 | 2,561 | py | Python | app/core/models.py | fxavier/abt-epts | 021a8140db32afba106a7a9e122b98452d88c225 | [
"MIT"
] | null | null | null | app/core/models.py | fxavier/abt-epts | 021a8140db32afba106a7a9e122b98452d88c225 | [
"MIT"
] | null | null | null | app/core/models.py | fxavier/abt-epts | 021a8140db32afba106a7a9e122b98452d88c225 | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import AbstractBaseUser, BaseUserManager, \
PermissionsMixin
from django.conf import settings
class UserManager(BaseUserManager):

    def create_user(self, email, password=None, **extra_fields):
        """Creates and saves a new user"""
        if not email:
            raise ValueError('Users must have an email address')
        user = self.model(email=self.normalize_email(email), **extra_fields)
        user.set_password(password)
        user.save(using=self._db)
        return user

    def create_superuser(self, email, password):
        """Creates and saves a new super user"""
        user = self.create_user(email, password)
        user.is_staff = True
        user.is_superuser = True
        user.save(using=self._db)
        return user


class User(AbstractBaseUser, PermissionsMixin):
    """Custom user model that supports using email instead of username"""
    email = models.EmailField(max_length=255, unique=True)
    name = models.CharField(max_length=255)
    is_active = models.BooleanField(default=True)
    is_staff = models.BooleanField(default=False)

    objects = UserManager()

    USERNAME_FIELD = 'email'


class Provincia(models.Model):
    name = models.CharField(max_length=200)

    def __str__(self):
        return self.name


class Distrito(models.Model):
    """Model definition for District."""

    # TODO: Define fields here
    name = models.CharField(max_length=100)
    provincia = models.ForeignKey('Provincia', on_delete=models.CASCADE)

    def __str__(self):
        """Unicode representation of District."""
        return self.name


class UnidadeSanitaria(models.Model):
    """Model definition for HealthFacility."""
    id = models.CharField(max_length=255, primary_key=True)
    name = models.CharField(max_length=255)
    # openmrs_name = models.CharField(max_length=255, null=True, blank=True)
    distrito = models.ForeignKey('Distrito', on_delete=models.CASCADE)

    class Meta:
        """Meta definition for HealthFacility."""
        verbose_name = 'Unidade Sanitaria'
        verbose_name_plural = 'Unidades Sanitarias'

    def __str__(self):
        """Unicode representation of HealthFacility."""
        return self.name


class Livro(models.Model):
    tipo = models.CharField(max_length=100)
    numero = models.IntegerField()
    pagina = models.IntegerField()
    linha = models.IntegerField()

    def __str__(self):
        return f'{self.tipo} {self.numero}'
| 30.488095 | 76 | 0.673565 | 295 | 2,561 | 5.694915 | 0.359322 | 0.042857 | 0.075 | 0.1 | 0.258333 | 0.133929 | 0.07619 | 0 | 0 | 0 | 0 | 0.012072 | 0.223741 | 2,561 | 84 | 77 | 30.488095 | 0.832998 | 0.158141 | 0 | 0.26 | 0 | 0 | 0.054374 | 0 | 0 | 0 | 0 | 0.011905 | 0 | 1 | 0.12 | false | 0.08 | 0.06 | 0.04 | 0.76 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
7b8d3bfd9dda43412dd61ee3a956e43a5295cf1f | 78 | py | Python | Python Book/12. Complex Loops/05_sequence_2k_plus_one/sequence_2k_plus_one.py | alexanderivanov2/Softuni-Software-Engineering | 8adb96f445f1da17dbb6eded9e9594319154c7e7 | [
"MIT"
] | null | null | null | Python Book/12. Complex Loops/05_sequence_2k_plus_one/sequence_2k_plus_one.py | alexanderivanov2/Softuni-Software-Engineering | 8adb96f445f1da17dbb6eded9e9594319154c7e7 | [
"MIT"
] | null | null | null | Python Book/12. Complex Loops/05_sequence_2k_plus_one/sequence_2k_plus_one.py | alexanderivanov2/Softuni-Software-Engineering | 8adb96f445f1da17dbb6eded9e9594319154c7e7 | [
"MIT"
] | null | null | null | n = int(input())
num = 1
while num <= n:
    print(num)
num = num * 2 + 1 | 13 | 21 | 0.487179 | 14 | 78 | 2.714286 | 0.571429 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057692 | 0.333333 | 78 | 6 | 21 | 13 | 0.673077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
7bb9a05e4b4df3445a16a9d49bf23b734a000bdc | 1,718 | py | Python | test/espnet2/tts/feats_extract/test_energy.py | texpomru13/espnet | 7ef005e832e2fb033f356c16f54e0f08762fb4b0 | [
"Apache-2.0"
] | 5,053 | 2017-12-13T06:21:41.000Z | 2022-03-31T13:38:29.000Z | test/espnet2/tts/feats_extract/test_energy.py | texpomru13/espnet | 7ef005e832e2fb033f356c16f54e0f08762fb4b0 | [
"Apache-2.0"
] | 3,666 | 2017-12-14T05:58:50.000Z | 2022-03-31T22:11:49.000Z | test/espnet2/tts/feats_extract/test_energy.py | texpomru13/espnet | 7ef005e832e2fb033f356c16f54e0f08762fb4b0 | [
"Apache-2.0"
] | 1,709 | 2017-12-13T01:02:42.000Z | 2022-03-31T11:57:45.000Z | import pytest
import torch

from espnet2.tts.feats_extract.energy import Energy


@pytest.mark.parametrize(
    "use_token_averaged_energy, reduction_factor", [(False, None), (True, 1), (True, 3)]
)
def test_forward(use_token_averaged_energy, reduction_factor):
    layer = Energy(
        n_fft=128,
        hop_length=64,
        fs="16k",
        use_token_averaged_energy=use_token_averaged_energy,
        reduction_factor=reduction_factor,
    )
    xs = torch.randn(2, 384)
    if not use_token_averaged_energy:
        es, elens = layer(xs, torch.LongTensor([384, 128]))
        assert es.shape[1] == max(elens)
    else:
        ds = torch.LongTensor([[3, 3, 1], [3, 0, 0]]) // reduction_factor
        dlens = torch.LongTensor([3, 1])
        es, _ = layer(
            xs, torch.LongTensor([384, 128]), durations=ds, durations_lengths=dlens
        )
    assert torch.isnan(es).sum() == 0


@pytest.mark.parametrize(
    "use_token_averaged_energy, reduction_factor", [(False, None), (True, 1), (True, 3)]
)
def test_output_size(use_token_averaged_energy, reduction_factor):
    layer = Energy(
        n_fft=4,
        hop_length=1,
        fs="16k",
        use_token_averaged_energy=use_token_averaged_energy,
        reduction_factor=reduction_factor,
    )
    print(layer.output_size())


@pytest.mark.parametrize(
    "use_token_averaged_energy, reduction_factor", [(False, None), (True, 1), (True, 3)]
)
def test_get_parameters(use_token_averaged_energy, reduction_factor):
    layer = Energy(
        n_fft=4,
        hop_length=1,
        fs="16k",
        use_token_averaged_energy=use_token_averaged_energy,
        reduction_factor=reduction_factor,
    )
    print(layer.get_parameters())
| 30.140351 | 88 | 0.661816 | 220 | 1,718 | 4.859091 | 0.268182 | 0.097287 | 0.194574 | 0.26754 | 0.695042 | 0.695042 | 0.642657 | 0.642657 | 0.642657 | 0.642657 | 0 | 0.035688 | 0.217113 | 1,718 | 56 | 89 | 30.678571 | 0.759108 | 0 | 0 | 0.44898 | 0 | 0 | 0.080326 | 0.045402 | 0 | 0 | 0 | 0 | 0.040816 | 1 | 0.061224 | false | 0 | 0.061224 | 0 | 0.122449 | 0.040816 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c879174dc589e41a31be3771fbf140871339c500 | 151 | py | Python | setup.py | Will-Robin/NorthNet | 343238afbefd02b7255ef6013cbfb0e801bc2b3b | [
"BSD-3-Clause"
] | null | null | null | setup.py | Will-Robin/NorthNet | 343238afbefd02b7255ef6013cbfb0e801bc2b3b | [
"BSD-3-Clause"
] | 2 | 2022-02-23T12:03:32.000Z | 2022-02-23T14:27:29.000Z | setup.py | Will-Robin/NorthNet | 343238afbefd02b7255ef6013cbfb0e801bc2b3b | [
"BSD-3-Clause"
] | null | null | null | from setuptools import setup, version
setup(
    name="NorthNet",
    version="0.0",
    author="William E. Robinson",
    packages = ["NorthNet"],
)
| 16.777778 | 37 | 0.635762 | 17 | 151 | 5.647059 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016807 | 0.211921 | 151 | 8 | 38 | 18.875 | 0.789916 | 0 | 0 | 0 | 0 | 0 | 0.251656 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c88407b58490b10ee7b7b9dec303ca0721d6f4c4 | 281 | py | Python | timesheet/forms.py | pincoin/windmill | fe373e5ca27c775a926e9a5538931f9394196d90 | [
"MIT"
] | null | null | null | timesheet/forms.py | pincoin/windmill | fe373e5ca27c775a926e9a5538931f9394196d90 | [
"MIT"
] | 7 | 2020-02-12T01:22:46.000Z | 2021-06-10T18:43:01.000Z | timesheet/forms.py | pincoin/windmill | fe373e5ca27c775a926e9a5538931f9394196d90 | [
"MIT"
] | null | null | null | from django import forms
from . import models
class PunchLogForm(forms.ModelForm):
    latitude = forms.DecimalField(widget=forms.HiddenInput())
    longitude = forms.DecimalField(widget=forms.HiddenInput())

    class Meta:
        model = models.PunchLog
        fields = ()
| 20.071429 | 62 | 0.701068 | 29 | 281 | 6.793103 | 0.586207 | 0.172589 | 0.233503 | 0.284264 | 0.395939 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202847 | 281 | 13 | 63 | 21.615385 | 0.879464 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c887c627a5de312187bb987f26d6bea4c3b72084 | 733 | py | Python | polls/views.py | druss16/danslist | ad06f8fa8df5936db7a60e9820f0c89a77f8879a | [
"MIT"
] | null | null | null | polls/views.py | druss16/danslist | ad06f8fa8df5936db7a60e9820f0c89a77f8879a | [
"MIT"
] | null | null | null | polls/views.py | druss16/danslist | ad06f8fa8df5936db7a60e9820f0c89a77f8879a | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.http import HttpResponse
from django.template import RequestContext, loader
from .models import Question
# Create your views here.
def index(request):
    latest_question_list = Question.objects.order_by('-pub_date')[:5]
    context = {'latest_question_list': latest_question_list}
    return render(request, 'polls/index.html', context)


def detail(request, question_id):
    return HttpResponse("You're looking at question %s." % question_id)


def results(request, question_id):
    response = "You're looking at the results of the question %s."
    return HttpResponse(response % question_id)


def vote(request, question_id):
    return HttpResponse("You're voting on question %s." % question_id)
| 29.32 | 68 | 0.777626 | 102 | 733 | 5.45098 | 0.441176 | 0.107914 | 0.097122 | 0.082734 | 0.143885 | 0.143885 | 0.143885 | 0 | 0 | 0 | 0 | 0.001558 | 0.124147 | 733 | 24 | 69 | 30.541667 | 0.864486 | 0.031378 | 0 | 0 | 0 | 0 | 0.217021 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0.266667 | 0.133333 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
c8c3d449685f28e78f767aafb617c4bfc465febb | 2,779 | py | Python | emerald/database_operations.py | femmerling/EmeraldBox | 68f5776577f0c929ca1f5ba23f1dfe480f813037 | [
"MIT"
] | 17 | 2015-01-15T21:41:16.000Z | 2021-01-10T15:34:09.000Z | emerald/database_operations.py | femmerling/EmeraldBox | 68f5776577f0c929ca1f5ba23f1dfe480f813037 | [
"MIT"
] | null | null | null | emerald/database_operations.py | femmerling/EmeraldBox | 68f5776577f0c929ca1f5ba23f1dfe480f813037 | [
"MIT"
] | 5 | 2015-02-07T02:41:18.000Z | 2016-11-11T02:50:21.000Z | import imp
import os.path
from app import db
from migrate.versioning import api
from config import SQLALCHEMY_DATABASE_URI
from config import SQLALCHEMY_MIGRATE_REPO
def db_create():
    # This creates the new database.
    db.create_all()
    # If no repo existed, the creation will prepare for the first migration.
    if not os.path.exists(SQLALCHEMY_MIGRATE_REPO):
        api.create(SQLALCHEMY_MIGRATE_REPO, 'database repository')
        api.version_control(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)
        print('\nDatabase creation completed\n')
    else:
        api.version_control(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO, api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO))


def db_migrate():
    # This is used for database migration. Newly created databases should go through this as well.
    migration = SQLALCHEMY_MIGRATE_REPO + '/versions/%03d_migration.py' % (api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO) + 1)
    tmp_module = imp.new_module('old_model')
    old_model = api.create_model(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)
    exec(old_model, tmp_module.__dict__)
    script = api.make_update_script_for_model(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO, tmp_module.meta, db.metadata)
    open(migration, "wt").write(script)
    api.upgrade(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)
    print('New migration saved as ' + migration)
    print('Current database version: ' + str(api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)) + '\n')


def db_upgrade():
    # This is used for database migration upgrade.
    api.upgrade(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)
    print('Database upgrade completed!')
    print('Current database version is: ' + str(api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)))


def db_downgrade(version=None):
    # This is used to downgrade the database schema to a certain version or to one version before.
    # If you know exactly the version you wish to use then you can directly downgrade to that version.
    if not version:
        current_version = api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)
        downgrade_version = current_version - 1
    else:
        downgrade_version = version
    api.downgrade(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO, downgrade_version)
    print('Database downgrade completed!')
    print('Current database version: ' + str(api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)))


def db_version():
    # this is used to get the latest version in the database
    current_version = api.db_version(SQLALCHEMY_DATABASE_URI, SQLALCHEMY_MIGRATE_REPO)
    print('The current database version is ' + str(current_version))
# end of file | 41.477612 | 144 | 0.77366 | 375 | 2,779 | 5.453333 | 0.256 | 0.149633 | 0.184841 | 0.212225 | 0.515403 | 0.466504 | 0.437164 | 0.386308 | 0.350122 | 0.210269 | 0 | 0.001712 | 0.15905 | 2,779 | 67 | 145 | 41.477612 | 0.873342 | 0.178122 | 0 | 0.15 | 0 | 0 | 0.123902 | 0.011863 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.15 | null | null | 0.2 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c8c808427fd949238223a24b72518b4c7f83bcd8 | 1,190 | py | Python | mall/serializers.py | turing0/mallProject | cc56d25c51fa03584f99a633a6f606622cfb1e5d | [
"MIT"
] | null | null | null | mall/serializers.py | turing0/mallProject | cc56d25c51fa03584f99a633a6f606622cfb1e5d | [
"MIT"
] | null | null | null | mall/serializers.py | turing0/mallProject | cc56d25c51fa03584f99a633a6f606622cfb1e5d | [
"MIT"
] | null | null | null | from rest_framework import serializers
from .models import User
from .models import Product
from django.contrib.auth import get_user_model
class UserSerializer(serializers.ModelSerializer):
    class Meta:
        model = User
        fields = ('id', 'username', 'password', 'balance')


class ProductSerializer(serializers.ModelSerializer):
    # owner_name = serializers.ReadOnlyField(source="owner.username")

    class Meta:
        model = Product
        # fields = '__all__'
        fields = ('id', 'name', 'price', 'owner', 'buyer', 'sell_date')

    # def update(self, instance, validated_data):
    #     if validated_data.get("owner"):
    #         owner = validated_data.pop('owner')
    #         owner = Product.objects.get(id=self.initial_data["id"])
    #         owner_task = super(ProductSerializer, self, ).update(instance, validated_data)
    #         owner_task.owner = owner
    #         owner_task.save()
    #         return owner_task
    #     return super(ProductSerializer, self, ).update(instance, validated_data)

    # handle the foreign-key field
    # def create(self, validated_data):
    #     return Product.objects.create(seller=self.context["seller"], **validated_data)
| 36.060606 | 92 | 0.657143 | 127 | 1,190 | 5.992126 | 0.393701 | 0.11958 | 0.082786 | 0.0841 | 0.13929 | 0.13929 | 0.13929 | 0 | 0 | 0 | 0 | 0 | 0.223529 | 1,190 | 32 | 93 | 37.1875 | 0.823593 | 0.540336 | 0 | 0.166667 | 0 | 0 | 0.103383 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.083333 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 3 |
c8ce16cc98ba530c9d0d89640e062797670ba6af | 275 | py | Python | thywill_apps/src/thywill_apps/test/proof_of_concept/__init__.py | exratione/thywill-python | 2078d6f6fc12034eac60a7cc30bf2bc0d27a8732 | [
"MIT"
] | 1 | 2015-04-26T19:49:35.000Z | 2015-04-26T19:49:35.000Z | thywill_apps/src/thywill_apps/test/proof_of_concept/__init__.py | exratione/thywill-python | 2078d6f6fc12034eac60a7cc30bf2bc0d27a8732 | [
"MIT"
] | null | null | null | thywill_apps/src/thywill_apps/test/proof_of_concept/__init__.py | exratione/thywill-python | 2078d6f6fc12034eac60a7cc30bf2bc0d27a8732 | [
"MIT"
] | null | null | null | '''
A very simple test application to exercise a round trip of messages through the thywill system.
This also illustrates the bare, bare minimum implementation of the 'thywill_interface.py' module -
all it does is echo back incoming messages to the client who sent them.
''' | 45.833333 | 98 | 0.789091 | 44 | 275 | 4.909091 | 0.795455 | 0.092593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163636 | 275 | 6 | 99 | 45.833333 | 0.93913 | 0.970909 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c8d09ce36295ecfe93aeeecfaa8a003ce925b428 | 6,979 | py | Python | src/jk_sysinfo/get_proc_cpu_info.py | jkpubsrc/python-module-jk-sysinfo | 583c9e5d10f64a722ffa794d081aaf94354ba4fb | [
"Apache-1.1"
] | null | null | null | src/jk_sysinfo/get_proc_cpu_info.py | jkpubsrc/python-module-jk-sysinfo | 583c9e5d10f64a722ffa794d081aaf94354ba4fb | [
"Apache-1.1"
] | null | null | null | src/jk_sysinfo/get_proc_cpu_info.py | jkpubsrc/python-module-jk-sysinfo | 583c9e5d10f64a722ffa794d081aaf94354ba4fb | [
"Apache-1.1"
] | null | null | null |
import typing
from jk_cachefunccalls import cacheCalls
from jk_cmdoutputparsinghelper import ValueParser_ByteWithUnit
from .parsing_utils import *
from .invoke_utils import run
#import jk_json
_parserColonKVP = ParseAtFirstDelimiter(delimiter=":", valueCanBeWrappedInDoubleQuotes=False, keysReplaceSpacesWithUnderscores=True)
#
# Returns:
#
# [
# {
# "<key>": "<value>",
# ...
# },
# ...
# ]
#
def parse_proc_cpu_info(stdout:str, stderr:str, exitcode:int) -> typing.Tuple[list,dict]:
"""
processor : 0
vendor_id : GenuineIntel
cpu family : 6
model : 92
model name : Intel(R) Pentium(R) CPU J4205 @ 1.50GHz
stepping : 9
microcode : 0x38
cpu MHz : 1000.000
cache size : 1024 KB
physical id : 0
siblings : 4
core id : 0
cpu cores : 4
apicid : 0
initial apicid : 0
fpu : yes
fpu_exception : yes
cpuid level : 21
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch intel_pt ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust smep erms mpx rdseed smap clflushopt sha_ni xsaveopt xsavec xgetbv1 dtherm ida arat pln pts md_clear arch_capabilities
bugs : monitor spectre_v1 spectre_v2
bogomips : 2995.20
clflush size : 64
cache_alignment : 64
address sizes : 39 bits physical, 48 bits virtual
power management:
processor : 1
vendor_id : GenuineIntel
cpu family : 6
model : 92
model name : Intel(R) Pentium(R) CPU J4205 @ 1.50GHz
stepping : 9
microcode : 0x38
cpu MHz : 800.000
cache size : 1024 KB
physical id : 0
siblings : 4
core id : 1
cpu cores : 4
apicid : 2
initial apicid : 2
fpu : yes
fpu_exception : yes
cpuid level : 21
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch intel_pt ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust smep erms mpx rdseed smap clflushopt sha_ni xsaveopt xsavec xgetbv1 dtherm ida arat pln pts md_clear arch_capabilities
bugs : monitor spectre_v1 spectre_v2
bogomips : 2995.20
clflush size : 64
cache_alignment : 64
address sizes : 39 bits physical, 48 bits virtual
power management:
processor : 2
vendor_id : GenuineIntel
cpu family : 6
model : 92
model name : Intel(R) Pentium(R) CPU J4205 @ 1.50GHz
stepping : 9
microcode : 0x38
cpu MHz : 800.000
cache size : 1024 KB
physical id : 0
siblings : 4
core id : 2
cpu cores : 4
apicid : 4
initial apicid : 4
fpu : yes
fpu_exception : yes
cpuid level : 21
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch intel_pt ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust smep erms mpx rdseed smap clflushopt sha_ni xsaveopt xsavec xgetbv1 dtherm ida arat pln pts md_clear arch_capabilities
bugs : monitor spectre_v1 spectre_v2
bogomips : 2995.20
clflush size : 64
cache_alignment : 64
address sizes : 39 bits physical, 48 bits virtual
power management:
processor : 3
vendor_id : GenuineIntel
cpu family : 6
model : 92
model name : Intel(R) Pentium(R) CPU J4205 @ 1.50GHz
stepping : 9
microcode : 0x38
cpu MHz : 1100.000
cache size : 1024 KB
physical id : 0
siblings : 4
core id : 3
cpu cores : 4
apicid : 6
initial apicid : 6
fpu : yes
fpu_exception : yes
cpuid level : 21
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave rdrand lahf_lm 3dnowprefetch intel_pt ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust smep erms mpx rdseed smap clflushopt sha_ni xsaveopt xsavec xgetbv1 dtherm ida arat pln pts md_clear arch_capabilities
bugs : monitor spectre_v1 spectre_v2
bogomips : 2995.20
clflush size : 64
cache_alignment : 64
address sizes : 39 bits physical, 48 bits virtual
power management:
"""
    if exitcode != 0:
        raise Exception()

    cpuInfos = splitAtEmptyLines(stdout.split("\n"))

    retExtra = {}
    ret = []
    for group in cpuInfos:
        d = _parserColonKVP.parseLines(group)

        if "processor" not in d:
            for k, v in d.items():
                retExtra[k.lower()] = v
            continue

        if "cache_size" in d:
            d["cache_size_kb"] = ValueParser_ByteWithUnit.parse(d["cache_size"]) // 1024
            del d["cache_size"]

        if "bogomips" in d:
            d["bogomips"] = float(d["bogomips"])
        elif "BogoMIPS" in d:
            d["bogomips"] = float(d["BogoMIPS"])
            del d["BogoMIPS"]

        if "bugs" in d:
            d["bugs"] = d["bugs"].split()

        if "flags" in d:
            d["flags"] = sorted(d["flags"].split())
        elif "Features" in d:
            d["flags"] = sorted(d["Features"].split())
            del d["Features"]

        # bool
        for key in [ "fpu", "fpu_exception", "wp" ]:
            if key in d:
                d[key.lower()] = d[key] == "yes"
                if key != key.lower():
                    del d[key]

        # int
        for key in [ "CPU_architecture", "CPU_revision", "physical_id", "initial_apicid", "cpu_cores", "core_id", "clflush_size", "cache_alignment", "apicid" ]:
            if key in d:
                d[key.lower()] = int(d[key])
                if key != key.lower():
                    del d[key]

        # float
        for key in [ "cpu_MHz" ]:
            if key in d:
                d[key.lower()] = float(d[key])
                if key != key.lower():
                    del d[key]

        # str
        for key in [ "CPU_implementer", "CPU_part", "CPU_variant" ]:
            if key in d:
                d[key.lower()] = d[key]
                if key != key.lower():
                    del d[key]

        d["processor"] = int(d["processor"])
        if "siblings" in d:
            d["siblings"] = int(d["siblings"])

        #jk_json.prettyPrint(d)
        ret.append(d)

    return ret, retExtra
#
#
# Returns:
#
# [
# {
# "<key>": "<value>",
# ...
# },
# ...
# ]
#
@cacheCalls(seconds=3, dependArgs=[0])
def get_proc_cpu_info(c = None) -> typing.Tuple[list,dict]:
    stdout, stderr, exitcode = run(c, "cat /proc/cpuinfo")
    return parse_proc_cpu_info(stdout, stderr, exitcode)
#
| 29.572034 | 612 | 0.71271 | 1,110 | 6,979 | 4.372072 | 0.216216 | 0.008036 | 0.009067 | 0.018957 | 0.732124 | 0.723058 | 0.716464 | 0.694622 | 0.694622 | 0.671955 | 0 | 0.049793 | 0.202894 | 6,979 | 235 | 613 | 29.697872 | 0.822578 | 0.745379 | 0 | 0.193548 | 0 | 0 | 0.174539 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032258 | false | 0 | 0.080645 | 0 | 0.145161 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c8d9772ef30de66f59d67a0dc784ccc67d52e59f | 94 | py | Python | python3/binary.py | eiadshahtout/Python | b2406b0806bc55a9d8f5482a304a8d6968249018 | [
"MIT"
] | null | null | null | python3/binary.py | eiadshahtout/Python | b2406b0806bc55a9d8f5482a304a8d6968249018 | [
"MIT"
] | null | null | null | python3/binary.py | eiadshahtout/Python | b2406b0806bc55a9d8f5482a304a8d6968249018 | [
"MIT"
] | null | null | null | def count_ones(num):
    binary = str(bin(num))[2:]
    print(binary)
    return binary
count_ones(20) | 15.666667 | 27 | 0.712766 | 16 | 94 | 4.0625 | 0.6875 | 0.276923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036585 | 0.12766 | 94 | 6 | 28 | 15.666667 | 0.756098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0.2 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c8e4d42dd8ef4d4d14c2794784ca0f4e4747b37c | 278 | py | Python | miner/config.py | czhang-nbai/swan | 03a6ade93d9b8b193bd05bf851779784eb2ffde5 | [
"MIT"
] | 6 | 2021-02-19T02:36:06.000Z | 2021-03-20T09:38:17.000Z | miner/config.py | czhang-nbai/swan | 03a6ade93d9b8b193bd05bf851779784eb2ffde5 | [
"MIT"
] | 27 | 2021-01-13T06:43:44.000Z | 2021-05-12T04:55:28.000Z | miner/config.py | czhang-nbai/swan | 03a6ade93d9b8b193bd05bf851779784eb2ffde5 | [
"MIT"
] | 7 | 2021-01-26T04:50:11.000Z | 2021-03-04T22:26:59.000Z | import toml
def read_config(_config_path=None):
    if _config_path is None:
        _config_path = './config.toml'
    # script_dir = os.path.dirname(__file__)
    # file_path = os.path.join(script_dir, config_path)
    _config = toml.load(_config_path)
    return _config
| 21.384615 | 55 | 0.694245 | 39 | 278 | 4.461538 | 0.435897 | 0.287356 | 0.183908 | 0.229885 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205036 | 278 | 12 | 56 | 23.166667 | 0.78733 | 0.316547 | 0 | 0 | 0 | 0 | 0.069519 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c8e8ef9bc1df23fffd3b87a416935aa12a7c1e19 | 214 | py | Python | app/database/pronto_soccorso.py | nyxgear/PSD-e-service-pronto-soccorso | 92eb0586c2cfb12a844a106b71911c80e8e3e57b | [
"MIT"
] | null | null | null | app/database/pronto_soccorso.py | nyxgear/PSD-e-service-pronto-soccorso | 92eb0586c2cfb12a844a106b71911c80e8e3e57b | [
"MIT"
] | null | null | null | app/database/pronto_soccorso.py | nyxgear/PSD-e-service-pronto-soccorso | 92eb0586c2cfb12a844a106b71911c80e8e3e57b | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .tables.pronto_soccorsi import table
class ProntoSoccorso:
    _table = table

    def __init__(self, ps_dict):
        # entity dict
        self.e_d = ps_dict

    def to_dict(self):
        return self.e_d
| 14.266667 | 41 | 0.696262 | 33 | 214 | 4.181818 | 0.636364 | 0.086957 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00578 | 0.191589 | 214 | 14 | 42 | 15.285714 | 0.791908 | 0.154206 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.142857 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
c8efd5f50e23a88b242e0e5832ddd548e4a5108c | 1,809 | py | Python | src/entitykb/pipeline/filterers.py | genomoncology/entitykb | 61cf346a24f52fd8c1edea8827a816284ed6ecaf | [
"MIT"
] | 25 | 2020-06-30T16:46:43.000Z | 2022-01-04T15:27:49.000Z | src/entitykb/pipeline/filterers.py | genomoncology/entitykb | 61cf346a24f52fd8c1edea8827a816284ed6ecaf | [
"MIT"
] | 3 | 2020-11-25T15:09:33.000Z | 2021-05-08T11:25:14.000Z | src/entitykb/pipeline/filterers.py | genomoncology/entitykb | 61cf346a24f52fd8c1edea8827a816284ed6ecaf | [
"MIT"
] | 2 | 2021-06-17T11:21:49.000Z | 2021-12-02T13:07:15.000Z | from typing import Iterator
from entitykb import Span, interfaces, Doc


class KeepExactNameOnly(interfaces.IFilterer):
    """ Only keep spans that are an exact match. """

    def is_keep(self, span: Span):
        return span.name == span.text


class RemoveInexactSynonyms(interfaces.IFilterer):
    """ Remove if not exact synonyms. """

    def is_keep(self, span):
        is_keep = span.name and (span.name.lower() == span.text.lower())
        return is_keep or (span.text in span.synonyms)


class DedupeByKeyOffset(interfaces.IFilterer):
    """ Keeps longest overlapping span sharing same key. """

    def __init__(self, doc: Doc = None):
        super().__init__(doc)
        self.seen = set()

    def span_tuple(self, span: Span, offset: int):
        return span.entity_key, offset

    def is_unique(self, span: Span) -> bool:
        keys = {self.span_tuple(span, offset) for offset in span.offsets}
        is_unique = self.seen.isdisjoint(keys)
        if is_unique:
            self.seen.update(keys)
        return is_unique

    @classmethod
    def sort_key(cls, span: Span):
        return (
            -span.num_tokens,
            span.match_type(),
            span.offset,
            span.label,
        )

    def filter(self, spans: Iterator[Span]) -> Iterator[Span]:
        spans = sorted(spans, key=self.sort_key)
        if len(spans) > 1:
            spans = filter(self.is_unique, spans)
        return spans


class DedupeByLabelOffset(DedupeByKeyOffset):
    """ Keeps longest overlapping span sharing same label. """

    def span_tuple(self, span: Span, offset: int):
        return span.label, offset


class DedupeByOffset(DedupeByKeyOffset):
    """ Keeps longest overlapping spans. """

    def span_tuple(self, span: Span, offset: int):
        return offset
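The dedupe idea can be demonstrated with a minimal stand-in span type (the hypothetical `FakeSpan` below is reduced to the two fields the key/offset check actually reads; the real entitykb `Span` has many more), assuming spans arrive longest-first as `sort_key` arranges:

```python
from dataclasses import dataclass

# Hypothetical stand-in for entitykb's Span, for illustration only.
@dataclass
class FakeSpan:
    entity_key: str
    offsets: tuple

seen = set()

def is_unique(span):
    # Same logic as DedupeByKeyOffset.is_unique: a span survives only if none
    # of its (key, offset) pairs were claimed by an earlier (longer) span.
    keys = {(span.entity_key, offset) for offset in span.offsets}
    unique = seen.isdisjoint(keys)
    if unique:
        seen.update(keys)
    return unique

spans = [
    FakeSpan("CITY|new_york", (0, 1)),  # two-token span, sorted first
    FakeSpan("CITY|new_york", (1,)),    # shorter overlap with same key: dropped
    FakeSpan("CITY|boston", (5,)),      # different key, no overlap: kept
]
kept = [s for s in spans if is_unique(s)]
```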
# src/dbobjects.py (211tbc/synthesis, Unlicense)
from sqlalchemy import create_engine, Column, Integer, BigInteger, String, Boolean, MetaData, ForeignKey
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, relationship
from sqlalchemy.types import DateTime, Date, Interval
from sqlalchemy.pool import NullPool
from .conf import settings
from logging import Logger
print("loaded dbobjects module")
class DB:
    #print "loaded DB Class"
    database_string = (
        'postgresql+psycopg2://'
        + settings.DB_USER + ':' + settings.DB_PASSWD
        + '@' + settings.DB_HOST + ':' + str(settings.DB_PORT)
        + '/' + settings.DB_DATABASE
    )
    pg_db_engine = create_engine(database_string, poolclass=NullPool, echo=settings.DEBUG_ALCHEMY)
    mymetadata = MetaData(bind=pg_db_engine)
    Base = declarative_base(metadata=mymetadata)

    def __init__(self):
        # postgresql[+driver]://<user>:<pass>@<host>/<dbname>  #, server_side_cursors=True)
        self.Session = sessionmaker()  # Was
        #self.Session = sessionmaker(bind=self.pg_db_engine)  # JCS
        loglevel = 'DEBUG'
        self.log = Logger(settings.LOGGING_INI, loglevel)
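Because `database_string` is built by plain concatenation, a password containing `@` or `:` would corrupt the DSN. A minimal sketch of a safer variant (the function name and sample values below are illustrative, not part of the module):

```python
from urllib.parse import quote_plus

def build_dsn(user, password, host, port, database):
    # Percent-encode the credentials so reserved characters such as '@'
    # and ':' cannot be mistaken for DSN delimiters.
    return (
        "postgresql+psycopg2://"
        f"{quote_plus(user)}:{quote_plus(password)}@{host}:{port}/{database}"
    )

dsn = build_dsn("hmis", "p@ss:word", "localhost", 5432, "synthesis")
# -> "postgresql+psycopg2://hmis:p%40ss%3Aword@localhost:5432/synthesis"
```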
class MapBase():
    def __init__(self, field_dict):
        if settings.DEBUG:
            print("Base Class created: %s" % self.__class__.__name__)
        if settings.DEBUG:
            print(field_dict)
        for x, y in field_dict.items():  # dict.iteritems() is Python 2 only
            self.__setattr__(x, y)

    def __repr__(self):
        field_dict = vars(self)
        out = ''
        if len(field_dict) > 0:
            for x, y in field_dict.items():
                if x[0] != "_":
                    out = out + "%s = %s, " % (x, y)
            return "<%s(%s)>" % (self.__class__.__name__, out)
        else:
            return ''
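In isolation, the dict-to-attribute pattern behaves like this (a standalone sketch without the `settings.DEBUG` branches; the `Row` name is illustrative):

```python
class Row:
    # Same pattern as MapBase: promote every dict entry to an attribute,
    # then render the non-underscore attributes in __repr__.
    def __init__(self, field_dict):
        for x, y in field_dict.items():
            setattr(self, x, y)

    def __repr__(self):
        out = "".join(
            "%s = %s, " % (x, y)
            for x, y in vars(self).items()
            if not x.startswith("_")
        )
        return "<%s(%s)>" % (self.__class__.__name__, out)

row = Row({"id": 7, "city": "Austin"})
print(repr(row))  # -> <Row(id = 7, city = Austin, )>
```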
class SiteServiceParticipation(DB.Base, MapBase):
    __tablename__ = 'site_service_participation'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    household_index_id = Column(Integer, ForeignKey('household.id'))
    site_service_participation_idid_num = Column(String(32))
    site_service_participation_idid_num_date_collected = Column(DateTime(timezone=False))
    site_service_participation_idid_str = Column(String(32))
    site_service_participation_idid_str_date_collected = Column(DateTime(timezone=False))
    site_service_idid_num = Column(String(32))  # JCS
    #site_service_idid_num_date_collected = Column(DateTime(timezone=False))  # JCS
    destination = Column(String(32))
    destination_date_collected = Column(DateTime(timezone=False))
    destination_other = Column(String(32))
    destination_other_date_collected = Column(DateTime(timezone=False))
    destination_tenure = Column(String(32))
    destination_tenure_date_collected = Column(DateTime(timezone=False))
    disabling_condition = Column(String(32))
    disabling_condition_date_collected = Column(DateTime(timezone=False))
    participation_dates_start_date = Column(DateTime(timezone=False))
    participation_dates_start_date_date_collected = Column(DateTime(timezone=False))
    participation_dates_end_date = Column(DateTime(timezone=False))
    participation_dates_end_date_date_collected = Column(DateTime(timezone=False))
    veteran_status = Column(String(32))
    veteran_status_date_collected = Column(DateTime(timezone=False))
    # adding a reported column; this should append the column to the table def
    reported = Column(Boolean)
    site_service_participation_id_delete = Column(String(32))
    site_service_participation_id_delete_occurred_date = Column(DateTime(timezone=False))
    site_service_participation_id_delete_effective_date = Column(DateTime(timezone=False))
    fk_participation_to_need = relationship('Need', backref='fk_need_to_participation')
    fk_participation_to_serviceevent = relationship('ServiceEvent')
    fk_participation_to_personhistorical = relationship('PersonHistorical')
    fk_participation_to_person = Column(Integer, ForeignKey('person.id'))
    useexisting = True


class Need(DB.Base, MapBase):
    __tablename__ = 'need'
    id = Column(Integer, primary_key=True)
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))  # JCS
    site_service_participation_index_id = Column(Integer, ForeignKey('site_service_participation.id'))  # JCS
    export_index_id = Column(Integer, ForeignKey('export.id'))
    need_idid_num = Column(String(32))
    need_idid_num_date_collected = Column(DateTime(timezone=False))
    need_idid_str = Column(String(32))
    need_idid_str_date_collected = Column(DateTime(timezone=False))
    site_service_idid_num = Column(String(32))
    site_service_idid_num_date_collected = Column(DateTime(timezone=False))
    site_service_idid_str = Column(String(32))
    site_service_idid_str_date_collected = Column(DateTime(timezone=False))
    service_event_idid_num = Column(String(32))
    service_event_idid_num_date_collected = Column(DateTime(timezone=False))
    service_event_idid_str = Column(String(32))
    service_event_idid_str_date_collected = Column(DateTime(timezone=False))
    need_status = Column(String(32))
    need_status_date_collected = Column(DateTime(timezone=False))
    taxonomy = Column(String(32))
    reported = Column(Boolean)
    ## HUD 3.0
    person_index_id = Column(Integer, ForeignKey('person.id'))
    need_id_delete = Column(String(32))
    need_id_delete_occurred_date = Column(DateTime(timezone=False))
    need_id_delete_delete_effective_date = Column(DateTime(timezone=False))
    need_effective_period_start_date = Column(DateTime(timezone=False))
    need_effective_period_end_date = Column(DateTime(timezone=False))
    need_recorded_date = Column(DateTime(timezone=False))
    useexisting = True


class Races(DB.Base, MapBase):
    __tablename__ = 'races'
    id = Column(Integer, primary_key=True)
    person_index_id = Column(Integer, ForeignKey('person.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    race_unhashed = Column(Integer)
    race_hashed = Column(String(32))
    race_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    ## HUD 3.0
    race_data_collection_stage = Column(String(32))
    race_date_effective = Column(DateTime(timezone=False))
    useexisting = True


class OtherNames(DB.Base, MapBase):
    __tablename__ = 'other_names'
    id = Column(Integer, primary_key=True)
    person_index_id = Column(Integer, ForeignKey('person.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    other_first_name_unhashed = Column(String(50))
    other_first_name_hashed = Column(String(50))
    other_first_name_date_collected = Column(DateTime(timezone=False))
    other_first_name_date_effective = Column(DateTime(timezone=False))
    other_first_name_data_collection_stage = Column(String(32))
    other_middle_name_unhashed = Column(String(50))
    other_middle_name_hashed = Column(String(50))
    other_middle_name_date_collected = Column(DateTime(timezone=False))
    other_middle_name_date_effective = Column(DateTime(timezone=False))
    other_middle_name_data_collection_stage = Column(String(32))
    other_last_name_unhashed = Column(String(50))
    other_last_name_hashed = Column(String(50))
    other_last_name_date_collected = Column(DateTime(timezone=False))
    other_last_name_date_effective = Column(DateTime(timezone=False))
    other_last_name_data_collection_stage = Column(String(32))
    other_suffix_unhashed = Column(String(50))
    other_suffix_hashed = Column(String(50))
    other_suffix_date_collected = Column(DateTime(timezone=False))
    other_suffix_date_effective = Column(DateTime(timezone=False))
    other_suffix_data_collection_stage = Column(String(32))
    useexisting = True


class HUDHomelessEpisodes(DB.Base, MapBase):
    __tablename__ = 'hud_homeless_episodes'
    id = Column(Integer, primary_key=True)
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    start_date = Column(String(32))
    start_date_date_collected = Column(DateTime(timezone=False))
    end_date = Column(String(32))
    end_date_date_collected = Column(DateTime(timezone=False))
    useexisting = True
class Veteran(DB.Base, MapBase):
    __tablename__ = 'veteran'
    id = Column(Integer, primary_key=True)
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    service_era = Column(Integer)
    service_era_date_collected = Column(DateTime(timezone=False))
    military_service_duration = Column(Integer)
    military_service_duration_date_collected = Column(DateTime(timezone=False))
    served_in_war_zone = Column(Integer)
    served_in_war_zone_date_collected = Column(DateTime(timezone=False))
    war_zone = Column(Integer)
    war_zone_date_collected = Column(DateTime(timezone=False))
    war_zone_other = Column(String(50))
    war_zone_other_date_collected = Column(DateTime(timezone=False))
    months_in_war_zone = Column(Integer)
    months_in_war_zone_date_collected = Column(DateTime(timezone=False))
    received_fire = Column(Integer)
    received_fire_date_collected = Column(DateTime(timezone=False))
    military_branch = Column(Integer)
    military_branch_date_collected = Column(DateTime(timezone=False))
    military_branch_other = Column(String(50))
    military_branch_other_date_collected = Column(DateTime(timezone=False))
    discharge_status = Column(Integer)
    discharge_status_date_collected = Column(DateTime(timezone=False))
    discharge_status_other = Column(String(50))
    discharge_status_other_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    useexisting = True


class DrugHistory(DB.Base, MapBase):
    __tablename__ = 'drug_history'
    id = Column(Integer, primary_key=True)
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    drug_history_id = Column(String(32))
    drug_history_id_date_collected = Column(DateTime(timezone=False))
    drug_code = Column(Integer)
    drug_code_date_collected = Column(DateTime(timezone=False))
    drug_use_frequency = Column(Integer)
    drug_use_frequency_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    useexisting = True


class EmergencyContact(DB.Base, MapBase):
    __tablename__ = 'emergency_contact'
    id = Column(Integer, primary_key=True)
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    emergency_contact_id = Column(String(32))
    emergency_contact_id_date_collected = Column(DateTime(timezone=False))
    emergency_contact_name = Column(String(32))
    emergency_contact_name_date_collected = Column(DateTime(timezone=False))
    emergency_contact_phone_number_0 = Column(String(32))
    emergency_contact_phone_number_date_collected_0 = Column(DateTime(timezone=False))
    emergency_contact_phone_number_type_0 = Column(String(32))
    emergency_contact_phone_number_1 = Column(String(32))
    emergency_contact_phone_number_date_collected_1 = Column(DateTime(timezone=False))
    emergency_contact_phone_number_type_1 = Column(String(32))
    emergency_contact_address_date_collected = Column(DateTime(timezone=False))
    emergency_contact_address_start_date = Column(DateTime(timezone=False))
    emergency_contact_address_start_date_date_collected = Column(DateTime(timezone=False))
    emergency_contact_address_end_date = Column(DateTime(timezone=False))
    emergency_contact_address_end_date_date_collected = Column(DateTime(timezone=False))
    emergency_contact_address_line1 = Column(String(32))
    emergency_contact_address_line1_date_collected = Column(DateTime(timezone=False))
    emergency_contact_address_line2 = Column(String(32))
    emergency_contact_address_line2_date_collected = Column(DateTime(timezone=False))
    emergency_contact_address_city = Column(String(32))
    emergency_contact_address_city_date_collected = Column(DateTime(timezone=False))
    emergency_contact_address_state = Column(String(32))
    emergency_contact_address_state_date_collected = Column(DateTime(timezone=False))
    emergency_contact_relation_to_client = Column(String(32))
    emergency_contact_relation_to_client_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    useexisting = True


class PersonAddress(DB.Base, MapBase):
    __tablename__ = 'person_address'
    id = Column(Integer, primary_key=True)
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    address_period_start_date = Column(DateTime(timezone=False))
    address_period_start_date_date_collected = Column(DateTime(timezone=False))
    address_period_end_date = Column(DateTime(timezone=False))
    address_period_end_date_date_collected = Column(DateTime(timezone=False))
    pre_address_line = Column(String(100))
    pre_address_line_date_collected = Column(DateTime(timezone=False))
    pre_address_line_date_effective = Column(DateTime(timezone=False))
    pre_address_line_data_collection_stage = Column(String(32))
    line1 = Column(String(100))
    line1_date_collected = Column(DateTime(timezone=False))
    line1_date_effective = Column(DateTime(timezone=False))
    line1_data_collection_stage = Column(String(32))
    line2 = Column(String(100))
    line2_date_collected = Column(DateTime(timezone=False))
    line2_date_effective = Column(DateTime(timezone=False))
    line2_data_collection_stage = Column(String(32))
    city = Column(String(100))
    city_date_collected = Column(DateTime(timezone=False))
    city_date_effective = Column(DateTime(timezone=False))
    city_data_collection_stage = Column(String(32))
    county = Column(String(32))
    county_date_collected = Column(DateTime(timezone=False))
    county_date_effective = Column(DateTime(timezone=False))
    county_data_collection_stage = Column(String(32))
    state = Column(String(32))
    state_date_collected = Column(DateTime(timezone=False))
    state_date_effective = Column(DateTime(timezone=False))
    state_data_collection_stage = Column(String(32))
    zipcode = Column(String(10))
    zipcode_date_collected = Column(DateTime(timezone=False))
    zipcode_date_effective = Column(DateTime(timezone=False))
    zipcode_data_collection_stage = Column(String(32))
    country = Column(String(32))
    country_date_collected = Column(DateTime(timezone=False))
    country_date_effective = Column(DateTime(timezone=False))
    country_data_collection_stage = Column(String(32))
    is_last_permanent_zip = Column(Integer)
    is_last_permanent_zip_date_collected = Column(DateTime(timezone=False))
    is_last_permanent_zip_date_effective = Column(DateTime(timezone=False))
    is_last_permanent_zip_data_collection_stage = Column(String(32))
    zip_quality_code = Column(Integer)
    zip_quality_code_date_collected = Column(DateTime(timezone=False))
    zip_quality_code_date_effective = Column(DateTime(timezone=False))
    zip_quality_code_data_collection_stage = Column(String(32))
    reported = Column(Boolean)
    ## HUD 3.0
    person_address_delete = Column(String(32))
    person_address_delete_occurred_date = Column(DateTime(timezone=False))
    person_address_delete_effective_date = Column(DateTime(timezone=False))
    useexisting = True
class PersonHistorical(DB.Base, MapBase):
    __tablename__ = 'person_historical'
    id = Column(Integer, primary_key=True)
    call_index_id = Column(Integer, ForeignKey('call.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_index_id = Column(Integer, ForeignKey('person.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))  # JCS
    site_service_participation_index_id = Column(Integer, ForeignKey('site_service_participation.id'))  # JCS
    person_historical_id_id_num = Column(String(32))
    person_historical_id_id_str = Column(String(32))
    person_historical_id_delete_effective_date = Column(DateTime(timezone=False))
    person_historical_id_delete = Column(Integer)
    person_historical_id_delete_occurred_date = Column(DateTime(timezone=False))
    barrier_code = Column(String(32))
    barrier_code_date_collected = Column(DateTime(timezone=False))
    barrier_other = Column(String(32))
    barrier_other_date_collected = Column(DateTime(timezone=False))
    child_currently_enrolled_in_school = Column(String(32))
    child_currently_enrolled_in_school_date_collected = Column(DateTime(timezone=False))
    currently_employed = Column(String(32))
    currently_employed_date_collected = Column(DateTime(timezone=False))
    currently_in_school = Column(String(32))
    currently_in_school_date_collected = Column(DateTime(timezone=False))
    degree_code = Column(String(32))
    degree_code_date_collected = Column(DateTime(timezone=False))
    degree_other = Column(String(32))
    degree_other_date_collected = Column(DateTime(timezone=False))
    developmental_disability = Column(String(32))
    developmental_disability_date_collected = Column(DateTime(timezone=False))
    domestic_violence = Column(String(32))
    domestic_violence_date_collected = Column(DateTime(timezone=False))
    domestic_violence_how_long = Column(String(32))
    domestic_violence_how_long_date_collected = Column(DateTime(timezone=False))
    due_date = Column(String(32))
    due_date_date_collected = Column(DateTime(timezone=False))
    employment_tenure = Column(String(32))
    employment_tenure_date_collected = Column(DateTime(timezone=False))
    health_status = Column(String(32))
    health_status_date_collected = Column(DateTime(timezone=False))
    highest_school_level = Column(String(32))
    highest_school_level_date_collected = Column(DateTime(timezone=False))
    hivaids_status = Column(String(32))
    hivaids_status_date_collected = Column(DateTime(timezone=False))
    hours_worked_last_week = Column(String(32))
    hours_worked_last_week_date_collected = Column(DateTime(timezone=False))
    hud_chronic_homeless = Column(String(32))
    hud_chronic_homeless_date_collected = Column(DateTime(timezone=False))
    hud_homeless = Column(String(32))
    hud_homeless_date_collected = Column(DateTime(timezone=False))
    site_service_id = Column(Integer)
    ### HUDHomelessEpisodes (subtable)
    ### IncomeAndSources (subtable)
    length_of_stay_at_prior_residence = Column(String(32))
    length_of_stay_at_prior_residence_date_collected = Column(DateTime(timezone=False))
    looking_for_work = Column(String(32))
    looking_for_work_date_collected = Column(DateTime(timezone=False))
    mental_health_indefinite = Column(String(32))
    mental_health_indefinite_date_collected = Column(DateTime(timezone=False))
    mental_health_problem = Column(String(32))
    mental_health_problem_date_collected = Column(DateTime(timezone=False))
    non_cash_source_code = Column(String(32))
    non_cash_source_code_date_collected = Column(DateTime(timezone=False))
    non_cash_source_other = Column(String(32))
    non_cash_source_other_date_collected = Column(DateTime(timezone=False))
    ### PersonAddress (subtable)
    person_email = Column(String(32))
    person_email_date_collected = Column(DateTime(timezone=False))
    person_phone_number = Column(String(32))
    person_phone_number_date_collected = Column(DateTime(timezone=False))
    physical_disability = Column(String(32))
    physical_disability_date_collected = Column(DateTime(timezone=False))
    pregnancy_status = Column(String(32))
    pregnancy_status_date_collected = Column(DateTime(timezone=False))
    prior_residence = Column(String(32))
    prior_residence_date_collected = Column(DateTime(timezone=False))
    prior_residence_other = Column(String(32))
    prior_residence_other_date_collected = Column(DateTime(timezone=False))
    reason_for_leaving = Column(String(32))
    reason_for_leaving_date_collected = Column(DateTime(timezone=False))
    reason_for_leaving_other = Column(String(32))
    reason_for_leaving_other_date_collected = Column(DateTime(timezone=False))
    school_last_enrolled_date = Column(String(32))
    school_last_enrolled_date_date_collected = Column(DateTime(timezone=False))
    school_name = Column(String(32))
    school_name_date_collected = Column(DateTime(timezone=False))
    school_type = Column(String(32))
    school_type_date_collected = Column(DateTime(timezone=False))
    subsidy_other = Column(String(32))
    subsidy_other_date_collected = Column(DateTime(timezone=False))
    subsidy_type = Column(String(32))
    subsidy_type_date_collected = Column(DateTime(timezone=False))
    substance_abuse_indefinite = Column(String(32))
    substance_abuse_indefinite_date_collected = Column(DateTime(timezone=False))
    substance_abuse_problem = Column(String(32))
    substance_abuse_problem_date_collected = Column(DateTime(timezone=False))
    total_income = Column(String(32))
    total_income_date_collected = Column(DateTime(timezone=False))
    ### Veteran (subtable)
    vocational_training = Column(String(32))
    vocational_training_date_collected = Column(DateTime(timezone=False))
    annual_personal_income = Column(Integer)
    annual_personal_income_date_collected = Column(DateTime(timezone=False))
    employment_status = Column(Integer)
    employment_status_date_collected = Column(DateTime(timezone=False))
    family_size = Column(Integer)
    family_size_date_collected = Column(DateTime(timezone=False))
    hearing_impaired = Column(Integer)
    hearing_impaired_date_collected = Column(DateTime(timezone=False))
    marital_status = Column(Integer)
    marital_status_date_collected = Column(DateTime(timezone=False))
    non_ambulatory = Column(Integer)
    non_ambulatory_date_collected = Column(DateTime(timezone=False))
    residential_status = Column(Integer)
    residential_status_date_collected = Column(DateTime(timezone=False))
    visually_impaired = Column(Integer)
    visually_impaired_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    fk_person_historical_to_income_and_sources = relationship(
        'IncomeAndSources', backref='fk_income_and_sources_to_person_historical')
    fk_person_historical_to_veteran = relationship(
        'Veteran', backref='fk_veteran_to_person_historical')
    fk_person_historical_to_hud_homeless_episodes = relationship(
        'HUDHomelessEpisodes', backref='fk_hud_homeless_episodes_to_person_historical')
    fk_person_historical_to_person_address = relationship(
        'PersonAddress', backref='fk_person_address_to_person_historical')
    useexisting = True
class IncomeAndSources(DB.Base, MapBase):
    __tablename__ = 'income_and_sources'
    id = Column(Integer, primary_key=True)
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    amount = Column(Integer)
    amount_date_collected = Column(DateTime(timezone=False))
    income_source_code = Column(Integer)
    income_source_code_date_collected = Column(DateTime(timezone=False))
    income_source_other = Column(String(32))
    income_source_other_date_collected = Column(DateTime(timezone=False))
    ## HUD 3.0
    income_and_source_id_id_num = Column(String(32))
    income_and_source_id_id_str = Column(String(32))
    income_and_source_id_id_delete_occurred_date = Column(DateTime(timezone=False))
    income_and_source_id_id_delete_effective_date = Column(DateTime(timezone=False))
    income_source_code_date_effective = Column(DateTime(timezone=False))
    income_source_other_date_effective = Column(DateTime(timezone=False))
    receiving_income_source_date_collected = Column(DateTime(timezone=False))
    receiving_income_source_date_effective = Column(DateTime(timezone=False))
    income_source_amount_date_effective = Column(DateTime(timezone=False))
    income_and_source_id_id_delete = Column(Integer)
    income_source_code_data_collection_stage = Column(String(32))
    income_source_other_data_collection_stage = Column(String(32))
    receiving_income_source = Column(Integer)
    receiving_income_source_data_collection_stage = Column(String(32))
    income_source_amount_data_collection_stage = Column(String(32))
    useexisting = True


class Members(DB.Base, MapBase):
    __tablename__ = 'members'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    household_index_id = Column(Integer, ForeignKey('household.id'))
    person_index_id = Column(Integer, ForeignKey('person.id'))
    relationship_to_head_of_household = Column(String(32))
    relationship_to_head_of_household_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    useexisting = True


class ReleaseOfInformation(DB.Base, MapBase):
    __tablename__ = 'release_of_information'
    id = Column(Integer, primary_key=True)
    person_index_id = Column(Integer, ForeignKey('person.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    release_of_information_idid_num = Column(String(32))
    release_of_information_idid_num_date_collected = Column(DateTime(timezone=False))
    release_of_information_idid_str = Column(String(32))
    release_of_information_idid_str_date_collected = Column(DateTime(timezone=False))
    site_service_idid_num = Column(String(32))
    site_service_idid_num_date_collected = Column(DateTime(timezone=False))
    site_service_idid_str = Column(String(32))
    site_service_idid_str_date_collected = Column(DateTime(timezone=False))
    documentation = Column(String(32))
    documentation_date_collected = Column(DateTime(timezone=False))
    # EffectivePeriod (subtable)
    start_date = Column(String(32))
    start_date_date_collected = Column(DateTime(timezone=False))
    end_date = Column(String(32))
    end_date_date_collected = Column(DateTime(timezone=False))
    release_granted = Column(String(32))
    release_granted_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    ## HUD 3.0
    release_of_information_id_data_collection_stage = Column(String(32))
    release_of_information_id_date_effective = Column(DateTime(timezone=False))
    documentation_data_collection_stage = Column(String(32))
    documentation_date_effective = Column(DateTime(timezone=False))
    release_granted_data_collection_stage = Column(String(32))
    release_granted_date_effective = Column(DateTime(timezone=False))
    useexisting = True


class SourceExportLink(DB.Base, MapBase):
    __tablename__ = 'source_export_link'
    id = Column(Integer, primary_key=True)
    source_index_id = Column(Integer, ForeignKey('source.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_index_id = Column(String(50), ForeignKey('report.report_id'))
    useexisting = True


class Region(DB.Base, MapBase):
    __tablename__ = 'region'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_index_id = Column(String(50), ForeignKey('report.report_id'))
    region_id_id_num = Column(String(50))
    region_id_id_str = Column(String(32))
    site_service_id = Column(String(50))
    region_type = Column(String(50))
    region_type_date_collected = Column(DateTime(timezone=False))
    region_type_date_effective = Column(DateTime(timezone=False))
    region_type_data_collection_stage = Column(String(32))
    region_description = Column(String(30))
    region_description_date_collected = Column(DateTime(timezone=False))
    region_description_date_effective = Column(DateTime(timezone=False))
    region_description_data_collection_stage = Column(String(32))
    useexisting = True


class Agency(DB.Base, MapBase):
    __tablename__ = 'agency'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_index_id = Column(String(50), ForeignKey('report.report_id'))
    agency_delete = Column(Integer)
    agency_delete_occurred_date = Column(DateTime(timezone=False))
    agency_delete_effective_date = Column(DateTime(timezone=False))
    airs_key = Column(String(50))
    airs_name = Column(String(50))
    agency_description = Column(String(50))
    irs_status = Column(String(50))
    source_of_funds = Column(String(50))
    record_owner = Column(String(50))
    fein = Column(String(50))
    year_inc = Column(String(50))
    annual_budget_total = Column(String(50))
    legal_status = Column(String(50))
    exclude_from_website = Column(String(50))
    exclude_from_directory = Column(String(50))
    useexisting = True


class AgencyChild(DB.Base, MapBase):
    __tablename__ = 'agency_child'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_index_id = Column(String(50), ForeignKey('report.report_id'))
    agency_index_id = Column(Integer, ForeignKey('agency.id'))
    useexisting = True


class Service(DB.Base, MapBase):
    __tablename__ = 'service'
    id = Column(Integer, primary_key=True)
    service_id = Column(String(50))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_index_id = Column(String(50), ForeignKey('report.report_id'))
    service_delete = Column(Integer)
    service_delete_occurred_date = Column(DateTime(timezone=False))
    service_delete_effective_date = Column(DateTime(timezone=False))
    airs_key = Column(String(50))
    airs_name = Column(String(50))
    coc_code = Column(String(5))
    configuration = Column(String(50))
    direct_service_code = Column(String(50))
    grantee_identifier = Column(String(10))
    individual_family_code = Column(String(50))
    residential_tracking_method = Column(String(50))
    service_type = Column(String(50))
    jfcs_service_type = Column(String(50))
    service_effective_period_start_date = Column(DateTime(timezone=False))
    service_effective_period_end_date = Column(DateTime(timezone=False))
    service_recorded_date = Column(DateTime(timezone=False))
    target_population_a = Column(String(50))
    target_population_b = Column(String(50))
    useexisting = True
class Site(DB.Base, MapBase):
__tablename__ = 'site'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
report_index_id = Column(String(50), ForeignKey('report.report_id'))
agency_index_id = Column(Integer, ForeignKey('agency.id'))
#agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
site_delete = Column(Integer)
site_delete_occurred_date = Column(DateTime(timezone=False))
site_delete_effective_date = Column(DateTime(timezone=False))
airs_key = Column(String(50))
airs_name = Column(String(50))
site_description = Column(String(50))
physical_address_pre_address_line = Column(String(100))
physical_address_line_1 = Column(String(100))
physical_address_line_2 = Column(String(100))
physical_address_city = Column(String(50))
physical_address_country = Column(String(50))
physical_address_state = Column(String(50))
physical_address_zip_code = Column(String(50))
physical_address_country = Column(String(50))
physical_address_reason_withheld = Column(String(50))
physical_address_confidential = Column(String(50))
physical_address_description = Column(String(50))
mailing_address_pre_address_line = Column(String(100))
mailing_address_line_1 = Column(String(100))
mailing_address_line_2 = Column(String(100))
mailing_address_city = Column(String(50))
mailing_address_country = Column(String(50))
mailing_address_state = Column(String(50))
mailing_address_zip_code = Column(String(50))
mailing_address_country = Column(String(50))
mailing_address_reason_withheld = Column(String(50))
mailing_address_confidential = Column(String(50))
mailing_address_description = Column(String(50))
no_physical_address_description = Column(String(50))
no_physical_address_explanation = Column(String(50))
disabilities_access = Column(String(50))
physical_location_description = Column(String(50))
bus_service_access = Column(String(50))
public_access_to_transportation = Column(String(50))
year_inc = Column(String(50))
annual_budget_total = Column(String(50))
legal_status = Column(String(50))
exclude_from_website = Column(String(50))
exclude_from_directory = Column(String(50))
agency_key = Column(String(50))
useexisting = True
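A hazard worth noting in column-per-line class bodies like the one above: Python raises no error when the same name is assigned twice in a class body. The later binding silently replaces the earlier one, so a duplicated `Column` line would leave only a single mapped column and the schema would quietly lose a field. A minimal, hypothetical illustration of the rebinding behavior:

```python
# Hypothetical demo class, not part of the schema: shows that a repeated
# class-body assignment silently discards the first binding.
class Demo:
    value = "first"
    value = "second"  # rebinds the name; no error, no warning

print(Demo.value)  # only the second binding survives
```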
class SiteService(DB.Base, MapBase):
__tablename__ = 'site_service'
id = Column(Integer, primary_key=True)
site_service_id = Column(String(50))
export_index_id = Column(Integer, ForeignKey('export.id'))
report_index_id = Column(String(50), ForeignKey('report.report_id'))
site_index_id = Column(Integer, ForeignKey('site.id'))
service_index_id = Column(Integer, ForeignKey(Service.id))
agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
site_service_delete = Column(Integer)
site_service_delete_occurred_date = Column(DateTime(timezone=False))
site_service_delete_effective_date = Column(DateTime(timezone=False))
name = Column(String(50))
key = Column(String(50))
description = Column(String(50))
fee_structure = Column(String(50))
gender_requirements = Column(String(50))
area_flexibility = Column(String(50))
service_not_always_available = Column(String(50))
service_group_key = Column(String(50))
site_id = Column(String(50))
geographic_code = Column(String(50))
geographic_code_date_collected = Column(DateTime(timezone=False))
geographic_code_date_effective = Column(DateTime(timezone=False))
geographic_code_data_collection_stage = Column(String(50))
housing_type = Column(String(50))
housing_type_date_collected = Column(DateTime(timezone=False))
housing_type_date_effective = Column(DateTime(timezone=False))
housing_type_data_collection_stage = Column(String(50))
principal = Column(String(50))
site_service_effective_period_start_date = Column(DateTime(timezone=False))
site_service_effective_period_end_date = Column(DateTime(timezone=False))
site_service_recorded_date = Column(DateTime(timezone=False))
site_service_type = Column(String(50))
useexisting = True
class FundingSource(DB.Base, MapBase):
__tablename__ = 'funding_source'
id = Column(Integer, primary_key=True)
service_index_id = Column(Integer, ForeignKey('service.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
service_event_index_id = Column(Integer, ForeignKey('service_event.id'))
funding_source_id_id_num = Column(String(50))
funding_source_id_id_str = Column(String(32))
funding_source_id_delete = Column(String(50))
funding_source_id_delete_occurred_date = Column(DateTime(timezone=False))
funding_source_id_delete_effective_date = Column(DateTime(timezone=False))
federal_cfda_number = Column(String(50))
receives_mckinney_funding = Column(String(50))
advance_or_arrears = Column(String(50))
financial_assistance_amount = Column(String(50))
useexisting = True
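The classes above mix two `ForeignKey` spellings: a string target such as `ForeignKey('service.id')`, resolved by table name when the mappers are configured, and a direct column reference such as `ForeignKey(Service.id)`. Both emit the same constraint; the string form additionally works before the target class is defined. A minimal sketch under the assumption of SQLAlchemy 1.4+, with hypothetical `Parent`/`Child` names:

```python
from sqlalchemy import Column, Integer, ForeignKey
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)

class ChildA(Base):
    __tablename__ = 'child_a'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))  # target named as a string

class ChildB(Base):
    __tablename__ = 'child_b'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey(Parent.id))    # target as the mapped column

# Both spellings resolve to the same REFERENCES parent (id) constraint.
for cls in (ChildA, ChildB):
    fk = list(cls.__table__.foreign_keys)[0]
    print(cls.__name__, fk.column.table.name)
```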
class ResourceInfo(DB.Base, MapBase):
__tablename__ = 'resource_info'
id = Column(Integer, primary_key=True)
agency_index_id = Column(Integer, ForeignKey('agency.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
resource_specialist = Column(String(50))
available_for_directory = Column(String(50))
available_for_referral = Column(String(50))
available_for_research = Column(String(50))
date_added = Column(DateTime(timezone=False))
date_last_verified = Column(DateTime(timezone=False))
date_of_last_action = Column(DateTime(timezone=False))
last_action_type = Column(String(50))
useexisting = True
class Inventory(DB.Base, MapBase):
__tablename__ = 'inventory'
id = Column(Integer, primary_key=True)
service_index_id = Column(Integer, ForeignKey(Service.id))
export_index_id = Column(Integer, ForeignKey('export.id'))
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
inventory_delete = Column(Integer)
inventory_delete_occurred_date = Column(DateTime(timezone=False))
inventory_delete_effective_date = Column(DateTime(timezone=False))
hmis_participation_period_start_date = Column(DateTime(timezone=False))
hmis_participation_period_end_date = Column(DateTime(timezone=False))
inventory_id_id_num = Column(String(50))
inventory_id_id_str = Column(String(32))
bed_inventory = Column(String(50))
bed_availability = Column(String(50))
bed_type = Column(String(50))
bed_individual_family_type = Column(String(50))
chronic_homeless_bed = Column(String(50))
domestic_violence_shelter_bed = Column(String(50))
household_type = Column(String(50))
hmis_participating_beds = Column(String(50))
inventory_effective_period_start_date = Column(DateTime(timezone=False))
inventory_effective_period_end_date = Column(DateTime(timezone=False))
inventory_recorded_date = Column(DateTime(timezone=False))
unit_inventory = Column(String(50))
useexisting = True
class AgeRequirements(DB.Base, MapBase):
__tablename__ = 'age_requirements'
id = Column(Integer, primary_key=True)
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
gender = Column(String(50))
minimum_age = Column(String(50))
maximum_age = Column(String(50))
useexisting = True
class AidRequirements(DB.Base, MapBase):
__tablename__ = 'aid_requirements'
id = Column(Integer, primary_key=True)
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
aid_requirements = Column(String(50))
useexisting = True
class Aka(DB.Base, MapBase):
__tablename__ = 'aka'
id = Column(Integer, primary_key=True)
agency_index_id = Column(Integer, ForeignKey('agency.id'))
site_index_id = Column(Integer, ForeignKey('site.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
# SBB20100914 Added Agency Location foreign key
agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
name = Column(String(50))
confidential = Column(String(50))
description = Column(String(50))
useexisting = True
class ApplicationProcess(DB.Base, MapBase):
__tablename__ = 'application_process'
id = Column(Integer, primary_key=True)
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
step = Column(String(50))
description = Column(String(50))
useexisting = True
class Assignment(DB.Base, MapBase):
__tablename__ = 'assignment'
id = Column(Integer, primary_key=True)
hmis_asset_index_id = Column(Integer, ForeignKey('hmis_asset.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
assignment_id_id_num = Column(String(50))
assignment_id_id_str = Column(String(32))
assignment_id_delete = Column(Integer)
assignment_id_delete_occurred_date = Column(DateTime(timezone=False))
assignment_id_delete_effective_date = Column(DateTime(timezone=False))
person_id_id_num = Column(String(50))
person_id_id_str = Column(String(32))
household_id_id_num = Column(String(50))
household_id_id_str = Column(String(32))
useexisting = True
class AssignmentPeriod(DB.Base, MapBase):
__tablename__ = 'assignment_period'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
assignment_index_id = Column(Integer, ForeignKey(Assignment.id))
assignment_period_start_date = Column(DateTime(timezone=False))
assignment_period_end_date = Column(DateTime(timezone=False))
useexisting = True
class Call(DB.Base, MapBase):
__tablename__ = 'call'
id = Column(Integer, primary_key=True)
site_service_id = Column(String(50))
call_id_id_num = Column(String(50))
call_id_id_str = Column(String(32))
call_time = Column(DateTime(timezone=False))
call_duration = Column(Interval())
caseworker_id_id_num = Column(String(50))
caseworker_id_id_str = Column(String(32))
# FBY : TBC requested|required fields
caller_zipcode = Column(String(10))
caller_city = Column(String(128))
caller_state = Column(String(2))
caller_home_phone = Column(String(10))
useexisting = True
class ChildEnrollmentStatus(DB.Base, MapBase):
__tablename__ = 'child_enrollment_status'
id = Column(Integer, primary_key=True)
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
child_enrollment_status_id_id_num = Column(String(50))
child_enrollment_status_id_id_str = Column(String(32))
child_enrollment_status_id_delete = Column(Integer)
child_enrollment_status_id_delete_occurred_date = Column(DateTime(timezone=False))
child_enrollment_status_id_delete_effective_date = Column(DateTime(timezone=False))
child_currently_enrolled_in_school = Column(String(50))
child_currently_enrolled_in_school_date_effective = Column(DateTime(timezone=False))
child_currently_enrolled_in_school_date_collected = Column(DateTime(timezone=False))
child_currently_enrolled_in_school_data_collection_stage = Column(String(50))
child_school_name = Column(String(50))
child_school_name_date_effective = Column(DateTime(timezone=False))
child_school_name_date_collected = Column(DateTime(timezone=False))
child_school_name_data_collection_stage = Column(String(50))
child_mckinney_vento_liaison = Column(String(50))
child_mckinney_vento_liaison_date_effective = Column(DateTime(timezone=False))
child_mckinney_vento_liaison_date_collected = Column(DateTime(timezone=False))
child_mckinney_vento_liaison_data_collection_stage = Column(String(50))
child_school_type = Column(String(50))
child_school_type_date_effective = Column(DateTime(timezone=False))
child_school_type_date_collected = Column(DateTime(timezone=False))
child_school_type_data_collection_stage = Column(String(50))
child_school_last_enrolled_date = Column(DateTime(timezone=False))
child_school_last_enrolled_date_date_collected = Column(DateTime(timezone=False))
child_school_last_enrolled_date_data_collection_stage = Column(String(50))
useexisting = True
class ChildEnrollmentStatusBarrier(DB.Base, MapBase):
__tablename__ = 'child_enrollment_status_barrier'
id = Column(Integer, primary_key=True)
child_enrollment_status_index_id = Column(Integer, ForeignKey(ChildEnrollmentStatus.id))
export_index_id = Column(Integer, ForeignKey('export.id'))
barrier_id_id_num = Column(String(50))
barrier_id_id_str = Column(String(32))
barrier_id_delete = Column(Integer)
barrier_id_delete_occurred_date = Column(DateTime(timezone=False))
barrier_id_delete_effective_date = Column(DateTime(timezone=False))
barrier_code = Column(String(50))
barrier_code_date_collected = Column(DateTime(timezone=False))
barrier_code_date_effective = Column(DateTime(timezone=False))
barrier_code_data_collection_stage = Column(String(50))
barrier_other = Column(String(50))
barrier_other_date_collected = Column(DateTime(timezone=False))
barrier_other_date_effective = Column(DateTime(timezone=False))
barrier_other_data_collection_stage = Column(String(50))
useexisting = True
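The `child_enrollment_status_index_id` column above is the usual parent/child pattern in this schema: the child row carries an integer FK back to the parent's surrogate `id`, and lookups filter on that index column. A self-contained sketch of the round trip, assuming SQLAlchemy 1.4+ and using an in-memory SQLite database with hypothetical stand-in classes (not the real schema):

```python
from sqlalchemy import Column, Integer, String, ForeignKey, create_engine
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class EnrollmentStatus(Base):
    __tablename__ = 'enrollment_status'
    id = Column(Integer, primary_key=True)

class EnrollmentBarrier(Base):
    __tablename__ = 'enrollment_barrier'
    id = Column(Integer, primary_key=True)
    status_index_id = Column(Integer, ForeignKey(EnrollmentStatus.id))
    barrier_code = Column(String(50))

engine = create_engine('sqlite://')   # throwaway in-memory database
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Insert a parent row, then a child row pointing at it via the index FK.
    session.add(EnrollmentStatus(id=1))
    session.add(EnrollmentBarrier(status_index_id=1,
                                  barrier_code='transportation'))
    session.commit()
    # Children of a given parent are fetched by filtering on the FK column.
    row = session.query(EnrollmentBarrier).filter_by(status_index_id=1).one()
    print(row.barrier_code)
```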
class ChronicHealthCondition(DB.Base, MapBase):
__tablename__ = 'chronic_health_condition'
id = Column(Integer, primary_key=True)
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
has_chronic_health_condition = Column(String(50))
has_chronic_health_condition_date_collected = Column(DateTime(timezone=False))
has_chronic_health_condition_date_effective = Column(DateTime(timezone=False))
has_chronic_health_condition_data_collection_stage = Column(String(50))
receive_chronic_health_services = Column(String(50))
receive_chronic_health_services_date_collected = Column(DateTime(timezone=False))
receive_chronic_health_services_date_effective = Column(DateTime(timezone=False))
receive_chronic_health_services_data_collection_stage = Column(String(50))
useexisting = True
class Contact(DB.Base, MapBase):
__tablename__ = 'contact'
id = Column(Integer, primary_key=True)
agency_index_id = Column(Integer, ForeignKey('agency.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
resource_info_index_id = Column(Integer, ForeignKey('resource_info.id'))
site_index_id = Column(Integer, ForeignKey('site.id'))
agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
title = Column(String(50))
name = Column(String(50))
type = Column(String(50))
useexisting = True
class ContactMade(DB.Base, MapBase):
__tablename__ = 'contact_made'
id = Column(Integer, primary_key=True)
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
contact_id_id_num = Column(String(50))
contact_id_id_str = Column(String(32))
contact_id_delete = Column(Integer)
contact_id_delete_occurred_date = Column(DateTime(timezone=False))
contact_id_delete_effective_date = Column(DateTime(timezone=False))
contact_date = Column(DateTime(timezone=False))
contact_date_data_collection_stage = Column(String(50))
contact_location = Column(String(50))
contact_location_data_collection_stage = Column(String(50))
useexisting = True
class CrossStreet(DB.Base, MapBase):
__tablename__ = 'cross_street'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
site_index_id = Column(Integer, ForeignKey('site.id'))
agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
cross_street = Column(String(50))
useexisting = True
class CurrentlyInSchool(DB.Base, MapBase):
__tablename__ = 'currently_in_school'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
currently_in_school = Column(String(50))
currently_in_school_date_collected = Column(DateTime(timezone=False))
currently_in_school_date_effective = Column(DateTime(timezone=False))
currently_in_school_data_collection_stage = Column(String(50))
useexisting = True
class LicenseAccreditation(DB.Base, MapBase):
__tablename__ = 'license_accreditation'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
agency_index_id = Column(Integer, ForeignKey('agency.id'))
license = Column(String(50))
licensed_by = Column(String(50))
useexisting = True
class MentalHealthProblem(DB.Base, MapBase):
__tablename__ = 'mental_health_problem'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
has_mental_health_problem = Column(String(50))
has_mental_health_problem_date_collected = Column(DateTime(timezone=False))
has_mental_health_problem_date_effective = Column(DateTime(timezone=False))
has_mental_health_problem_data_collection_stage = Column(String(50))
mental_health_indefinite = Column(String(50))
mental_health_indefinite_date_collected = Column(DateTime(timezone=False))
mental_health_indefinite_date_effective = Column(DateTime(timezone=False))
mental_health_indefinite_data_collection_stage = Column(String(50))
receive_mental_health_services = Column(String(50))
receive_mental_health_services_date_collected = Column(DateTime(timezone=False))
receive_mental_health_services_date_effective = Column(DateTime(timezone=False))
receive_mental_health_services_data_collection_stage = Column(String(50))
useexisting = True
class NonCashBenefits(DB.Base, MapBase):
__tablename__ = 'non_cash_benefits'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
non_cash_benefit_id_id_num = Column(String(50))
non_cash_benefit_id_id_str = Column(String(32))
non_cash_benefit_id_id_delete = Column(Integer)
non_cash_benefit_id_id_delete_occurred_date = Column(DateTime(timezone=False))
non_cash_benefit_id_id_delete_effective_date = Column(DateTime(timezone=False))
non_cash_source_code = Column(String(50))
non_cash_source_code_date_collected = Column(DateTime(timezone=False))
non_cash_source_code_date_effective = Column(DateTime(timezone=False))
non_cash_source_code_data_collection_stage = Column(String(50))
non_cash_source_other = Column(String(50))
non_cash_source_other_date_collected = Column(DateTime(timezone=False))
non_cash_source_other_date_effective = Column(DateTime(timezone=False))
non_cash_source_other_data_collection_stage = Column(String(50))
receiving_non_cash_source = Column(String(50))
receiving_non_cash_source_date_collected = Column(DateTime(timezone=False))
receiving_non_cash_source_date_effective = Column(DateTime(timezone=False))
receiving_non_cash_source_data_collection_stage = Column(String(50))
useexisting = True
class AgencyLocation(DB.Base, MapBase):
__tablename__ = 'agency_location'
id = Column(Integer, primary_key=True)
agency_index_id = Column(Integer, ForeignKey('agency.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
key = Column(String(50))
name = Column(String(50))
site_description = Column(String(50))
physical_address_pre_address_line = Column(String(100))
physical_address_line_1 = Column(String(100))
physical_address_line_2 = Column(String(100))
physical_address_city = Column(String(50))
physical_address_country = Column(String(50))
physical_address_state = Column(String(50))
physical_address_zip_code = Column(String(50))
physical_address_county = Column(String(50))
physical_address_reason_withheld = Column(String(50))
physical_address_confidential = Column(String(50))
physical_address_description = Column(String(50))
mailing_address_pre_address_line = Column(String(100))
mailing_address_line_1 = Column(String(100))
mailing_address_line_2 = Column(String(100))
mailing_address_city = Column(String(50))
mailing_address_county = Column(String(50))
mailing_address_state = Column(String(50))
mailing_address_zip_code = Column(String(50))
mailing_address_country = Column(String(50))
mailing_address_reason_withheld = Column(String(50))
mailing_address_confidential = Column(String(50))
mailing_address_description = Column(String(50))
no_physical_address_description = Column(String(50))
no_physical_address_explanation = Column(String(50))
disabilities_access = Column(String(50))
physical_location_description = Column(String(50))
bus_service_access = Column(String(50))
public_access_to_transportation = Column(String(50))
year_inc = Column(String(50))
annual_budget_total = Column(String(50))
legal_status = Column(String(50))
exclude_from_website = Column(String(50))
exclude_from_directory = Column(String(50))
useexisting = True
class AgencyService(DB.Base, MapBase):
__tablename__ = 'agency_service'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
agency_index_id = Column(Integer, ForeignKey('agency.id'))
key = Column(String(50))
agency_key = Column(String(50))
name = Column(String(50))
useexisting = True
class NonCashBenefitsLast30Days(DB.Base, MapBase):
__tablename__ = 'non_cash_benefits_last_30_days'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
income_last_30_days = Column(String(50))
income_last_30_days_date_collected = Column(DateTime(timezone=False))
income_last_30_days_date_effective = Column(DateTime(timezone=False))
income_last_30_days_data_collection_stage = Column(String(50))
useexisting = True
class OtherAddress(DB.Base, MapBase):
__tablename__ = 'other_address'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
site_index_id = Column(Integer, ForeignKey('site.id'))
agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
pre_address_line = Column(String(100))
line_1 = Column(String(100))
line_2 = Column(String(100))
city = Column(String(50))
county = Column(String(50))
state = Column(String(50))
zip_code = Column(String(50))
country = Column(String(50))
reason_withheld = Column(String(50))
confidential = Column(String(50))
description = Column(String(50))
useexisting = True
class OtherRequirements(DB.Base, MapBase):
__tablename__ = 'other_requirements'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
other_requirements = Column(String(50))
useexisting = True
class Phone(DB.Base, MapBase):
__tablename__ = 'phone'
id = Column(Integer, primary_key=True)
agency_index_id = Column(Integer, ForeignKey('agency.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
contact_index_id = Column(Integer, ForeignKey(Contact.id))
resource_info_index_id = Column(Integer, ForeignKey('resource_info.id'))
site_index_id = Column(Integer, ForeignKey('site.id'))
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
phone_number = Column(String(50))
reason_withheld = Column(String(50))
extension = Column(String(50))
description = Column(String(50))
type = Column(String(50))
function = Column(String(50))
toll_free = Column(String(50))
confidential = Column(String(50))
person_phone_number = Column(String(50))
person_phone_number_date_collected = Column(DateTime(timezone=False))
person_phone_number_date_effective = Column(DateTime(timezone=False))
person_phone_number_data_collection_stage = Column(String(50))
useexisting = True
class PhysicalDisability(DB.Base, MapBase):
__tablename__ = 'physical_disability'
id = Column(Integer, primary_key=True)
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
has_physical_disability = Column(String(50))
has_physical_disability_date_collected = Column(DateTime(timezone=False))
has_physical_disability_date_effective = Column(DateTime(timezone=False))
has_physical_disability_data_collection_stage = Column(String(50))
receive_physical_disability_services = Column(String(50))
receive_physical_disability_services_date_collected = Column(DateTime(timezone=False))
receive_physical_disability_services_date_effective = Column(DateTime(timezone=False))
receive_physical_disability_services_data_collection_stage = Column(String(50))
useexisting = True
class PitCountSet(DB.Base, MapBase):
__tablename__ = 'pit_count_set'
id = Column(Integer, primary_key=True)
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
export_index_id = Column(Integer, ForeignKey('export.id'))
pit_count_set_id_id_num = Column(String(50))
pit_count_set_id_id_str = Column(String(32))
pit_count_set_id_delete = Column(Integer)
pit_count_set_id_delete_occurred_date = Column(DateTime(timezone=False))
pit_count_set_id_delete_effective_date = Column(DateTime(timezone=False))
hud_waiver_received = Column(String(50))
hud_waiver_date = Column(DateTime(timezone=False))
hud_waiver_effective_period_start_date = Column(DateTime(timezone=False))
hud_waiver_effective_period_end_date = Column(DateTime(timezone=False))
last_pit_sheltered_count_date = Column(DateTime(timezone=False))
last_pit_unsheltered_count_date = Column(DateTime(timezone=False))
useexisting = True
class PitCounts(DB.Base, MapBase):
__tablename__ = 'pit_counts'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
pit_count_set_index_id = Column(Integer, ForeignKey(PitCountSet.id))
pit_count_value = Column(String(50))
pit_count_effective_period_start_date = Column(DateTime(timezone=False))
pit_count_effective_period_end_date = Column(DateTime(timezone=False))
pit_count_recorded_date = Column(DateTime(timezone=False))
pit_count_household_type = Column(String(50))
useexisting = True
class Pregnancy(DB.Base, MapBase):
__tablename__ = 'pregnancy'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
pregnancy_id_id_num = Column(String(50))
pregnancy_id_id_str = Column(String(32))
pregnancy_id_id_delete = Column(Integer)
pregnancy_id_id_delete_occurred_date = Column(DateTime(timezone=False))
pregnancy_id_id_delete_effective_date = Column(DateTime(timezone=False))
pregnancy_status = Column(String(50))
pregnancy_status_date_collected = Column(DateTime(timezone=False))
pregnancy_status_date_effective = Column(DateTime(timezone=False))
pregnancy_status_data_collection_stage = Column(String(50))
due_date = Column(DateTime(timezone=False))
due_date_date_collected = Column(DateTime(timezone=False))
due_date_data_collection_stage = Column(String(50))
useexisting = True
class Degree(DB.Base, MapBase):
__tablename__ = 'degree'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
degree_id_id_num = Column(String(50))
degree_id_id_str = Column(String(32))
degree_id_delete = Column(Integer)
degree_id_delete_occurred_date = Column(DateTime(timezone=False))
degree_id_delete_effective_date = Column(DateTime(timezone=False))
degree_other = Column(String(50))
degree_other_date_collected = Column(DateTime(timezone=False))
degree_other_date_effective = Column(DateTime(timezone=False))
degree_other_data_collection_stage = Column(String(50))
useexisting = True
class PriorResidence(DB.Base, MapBase):
__tablename__ = 'prior_residence'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
prior_residence_id_id_num = Column(String(50))
prior_residence_id_id_str = Column(String(32))
prior_residence_id_delete = Column(Integer)
prior_residence_id_delete_occurred_date = Column(DateTime(timezone=False))
prior_residence_id_delete_effective_date = Column(DateTime(timezone=False))
prior_residence_code = Column(String(50))
prior_residence_code_date_collected = Column(DateTime(timezone=False))
prior_residence_code_date_effective = Column(DateTime(timezone=False))
prior_residence_code_data_collection_stage = Column(String(50))
prior_residence_other = Column(String(50))
prior_residence_other_date_collected = Column(DateTime(timezone=False))
prior_residence_other_date_effective = Column(DateTime(timezone=False))
prior_residence_other_data_collection_stage = Column(String(50))
useexisting = True
class DegreeCode(DB.Base, MapBase):
__tablename__ = 'degree_code'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
degree_index_id = Column(Integer, ForeignKey(Degree.id))
degree_code = Column(String(50))
degree_date_collected = Column(DateTime(timezone=False))
degree_date_effective = Column(DateTime(timezone=False))
degree_data_collection_stage = Column(String(50))
useexisting = True
class Destinations(DB.Base, MapBase):
__tablename__ = 'destinations'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
destination_id_id_num = Column(String(50))
destination_id_id_str = Column(String(32))
destination_id_delete = Column(Integer)
destination_id_delete_occurred_date = Column(DateTime(timezone=False))
destination_id_delete_effective_date = Column(DateTime(timezone=False))
destination_code = Column(String(50))
destination_code_date_collected = Column(DateTime(timezone=False))
destination_code_date_effective = Column(DateTime(timezone=False))
destination_code_data_collection_stage = Column(String(50))
destination_other = Column(String(50))
destination_other_date_collected = Column(DateTime(timezone=False))
destination_other_date_effective = Column(DateTime(timezone=False))
destination_other_data_collection_stage = Column(String(50))
useexisting = True
class ReasonsForLeaving(DB.Base, MapBase):
__tablename__ = 'reasons_for_leaving'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
site_service_participation_index_id = Column(Integer, ForeignKey('site_service_participation.id'))
reason_for_leaving_id_id_num = Column(String(50))
reason_for_leaving_id_id_str = Column(String(32))
reason_for_leaving_id_delete = Column(Integer)
reason_for_leaving_id_delete_occurred_date = Column(DateTime(timezone=False))
reason_for_leaving_id_delete_effective_date = Column(DateTime(timezone=False))
reason_for_leaving = Column(String(50))
reason_for_leaving_date_collected = Column(DateTime(timezone=False))
reason_for_leaving_date_effective = Column(DateTime(timezone=False))
reason_for_leaving_data_collection_stage = Column(String(50))
reason_for_leaving_other = Column(String(50))
reason_for_leaving_other_date_collected = Column(DateTime(timezone=False))
reason_for_leaving_other_date_effective = Column(DateTime(timezone=False))
reason_for_leaving_other_data_collection_stage = Column(String(50))
useexisting = True
class DevelopmentalDisability(DB.Base, MapBase):
__tablename__ = 'developmental_disability'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
has_developmental_disability = Column(String(50))
has_developmental_disability_date_collected = Column(DateTime(timezone=False))
has_developmental_disability_date_effective = Column(DateTime(timezone=False))
has_developmental_disability_data_collection_stage = Column(String(50))
receive_developmental_disability = Column(String(50))
receive_developmental_disability_date_collected = Column(DateTime(timezone=False))
receive_developmental_disability_date_effective = Column(DateTime(timezone=False))
receive_developmental_disability_data_collection_stage = Column(String(50))
useexisting = True
class DisablingCondition(DB.Base, MapBase):
__tablename__ = 'disabling_condition'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
disabling_condition = Column(String(50))
disabling_condition_date_collected = Column(DateTime(timezone=False))
disabling_condition_date_effective = Column(DateTime(timezone=False))
disabling_condition_data_collection_stage = Column(String(50))
useexisting = True
class DocumentsRequired(DB.Base, MapBase):
__tablename__ = 'documents_required'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
documents_required = Column(String(50))
description = Column(String(50))
useexisting = True
class ResidencyRequirements(DB.Base, MapBase):
__tablename__ = 'residency_requirements'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
residency_requirements = Column(String(50))
useexisting = True
class DomesticViolence(DB.Base, MapBase):
__tablename__ = 'domestic_violence'
id = Column(Integer, primary_key=True)
export_index_id = Column(Integer, ForeignKey('export.id'))
person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
domestic_violence_survivor = Column(String(50))
domestic_violence_survivor_date_collected = Column(DateTime(timezone=False))
domestic_violence_survivor_date_effective = Column(DateTime(timezone=False))
domestic_violence_survivor_data_collection_stage = Column(String(50))
dv_occurred = Column(String(50))
dv_occurred_date_collected = Column(DateTime(timezone=False))
dv_occurred_date_effective = Column(DateTime(timezone=False))
dv_occurred_data_collection_stage = Column(String(50))
useexisting = True
class Email(DB.Base, MapBase):
    __tablename__ = 'email'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    agency_index_id = Column(Integer, ForeignKey('agency.id'))
    contact_index_id = Column(Integer, ForeignKey(Contact.id))
    resource_info_index_id = Column(Integer, ForeignKey('resource_info.id'))
    site_index_id = Column(Integer, ForeignKey('site.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
    address = Column(String(100))
    note = Column(String(50))
    person_email = Column(String(50))
    person_email_date_collected = Column(DateTime(timezone=False))
    person_email_date_effective = Column(DateTime(timezone=False))
    person_email_data_collection_stage = Column(String(50))
    useexisting = True

class Seasonal(DB.Base, MapBase):
    __tablename__ = 'seasonal'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    description = Column(String(50))
    start_date = Column(String(50))
    end_date = Column(String(50))
    useexisting = True

class Employment(DB.Base, MapBase):
    __tablename__ = 'employment'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    employment_id_id_num = Column(String(50))
    employment_id_id_str = Column(String(32))
    employment_id_id_delete = Column(Integer)
    employment_id_id_delete_occurred_date = Column(DateTime(timezone=False))
    employment_id_id_delete_effective_date = Column(DateTime(timezone=False))
    currently_employed = Column(String(50))
    currently_employed_date_collected = Column(DateTime(timezone=False))
    currently_employed_date_effective = Column(DateTime(timezone=False))
    currently_employed_data_collection_stage = Column(String(50))
    hours_worked_last_week = Column(String(50))
    hours_worked_last_week_date_collected = Column(DateTime(timezone=False))
    hours_worked_last_week_date_effective = Column(DateTime(timezone=False))
    hours_worked_last_week_data_collection_stage = Column(String(50))
    employment_tenure = Column(String(50))
    employment_tenure_date_collected = Column(DateTime(timezone=False))
    employment_tenure_date_effective = Column(DateTime(timezone=False))
    employment_tenure_data_collection_stage = Column(String(50))
    looking_for_work = Column(String(50))
    looking_for_work_date_collected = Column(DateTime(timezone=False))
    looking_for_work_date_effective = Column(DateTime(timezone=False))
    looking_for_work_data_collection_stage = Column(String(50))
    useexisting = True
class EngagedDate(DB.Base, MapBase):
    __tablename__ = 'engaged_date'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    engaged_date = Column(DateTime(timezone=False))
    engaged_date_date_collected = Column(DateTime(timezone=False))
    engaged_date_data_collection_stage = Column(String(50))
    useexisting = True

class ServiceEventNotes(DB.Base, MapBase):
    __tablename__ = 'service_event_notes'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    service_event_index_id = Column(Integer, ForeignKey('service_event.id'))
    note_id_id_num = Column(String(50))
    note_id_id_str = Column(String(32))
    note_delete = Column(Integer)
    note_delete_occurred_date = Column(DateTime(timezone=False))
    note_delete_effective_date = Column(DateTime(timezone=False))
    note_text = Column(String(255))
    note_text_date_collected = Column(DateTime(timezone=False))
    note_text_date_effective = Column(DateTime(timezone=False))
    note_text_data_collection_stage = Column(String(50))
    useexisting = True

class FamilyRequirements(DB.Base, MapBase):
    __tablename__ = 'family_requirements'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    family_requirements = Column(String(50))
    useexisting = True

class ServiceGroup(DB.Base, MapBase):
    __tablename__ = 'service_group'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    agency_index_id = Column(Integer, ForeignKey('agency.id'))
    key = Column(String(50))
    name = Column(String(50))
    program_name = Column(String(50))
    useexisting = True

class GeographicAreaServed(DB.Base, MapBase):
    __tablename__ = 'geographic_area_served'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    zipcode = Column(String(50))
    census_track = Column(String(50))
    city = Column(String(50))
    county = Column(String(50))
    state = Column(String(50))
    country = Column(String(50))
    description = Column(String(50))
    useexisting = True
class HealthStatus(DB.Base, MapBase):
    __tablename__ = 'health_status'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    health_status = Column(String(50))
    health_status_date_collected = Column(DateTime(timezone=False))
    health_status_date_effective = Column(DateTime(timezone=False))
    health_status_data_collection_stage = Column(String(50))
    useexisting = True

class HighestSchoolLevel(DB.Base, MapBase):
    __tablename__ = 'highest_school_level'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    highest_school_level = Column(String(50))
    highest_school_level_date_collected = Column(DateTime(timezone=False))
    highest_school_level_date_effective = Column(DateTime(timezone=False))
    highest_school_level_data_collection_stage = Column(String(50))
    useexisting = True

class HivAidsStatus(DB.Base, MapBase):
    __tablename__ = 'hiv_aids_status'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    has_hiv_aids = Column(String(50))
    has_hiv_aids_date_collected = Column(DateTime(timezone=False))
    has_hiv_aids_date_effective = Column(DateTime(timezone=False))
    has_hiv_aids_data_collection_stage = Column(String(50))
    receive_hiv_aids_services = Column(String(50))
    receive_hiv_aids_services_date_collected = Column(DateTime(timezone=False))
    receive_hiv_aids_services_date_effective = Column(DateTime(timezone=False))
    receive_hiv_aids_services_data_collection_stage = Column(String(50))
    useexisting = True

class SpatialLocation(DB.Base, MapBase):
    __tablename__ = 'spatial_location'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_index_id = Column(Integer, ForeignKey('site.id'))
    agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
    description = Column(String(50))
    datum = Column(String(50))
    latitude = Column(String(50))
    longitude = Column(String(50))
    useexisting = True

class HmisAsset(DB.Base, MapBase):
    __tablename__ = 'hmis_asset'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_index_id = Column(Integer, ForeignKey('site.id'))
    asset_id_id_num = Column(String(50))
    asset_id_id_str = Column(String(32))
    asset_id_delete = Column(Integer)
    asset_id_delete_occurred_date = Column(DateTime(timezone=False))
    asset_id_delete_effective_date = Column(DateTime(timezone=False))
    asset_count = Column(String(50))
    asset_count_bed_availability = Column(String(50))
    asset_count_bed_type = Column(String(50))
    asset_count_bed_individual_family_type = Column(String(50))
    asset_count_chronic_homeless_bed = Column(String(50))
    asset_count_domestic_violence_shelter_bed = Column(String(50))
    asset_count_household_type = Column(String(50))
    asset_type = Column(String(50))
    asset_effective_period_start_date = Column(DateTime(timezone=False))
    asset_effective_period_end_date = Column(DateTime(timezone=False))
    asset_recorded_date = Column(DateTime(timezone=False))
    useexisting = True
class SubstanceAbuseProblem(DB.Base, MapBase):
    __tablename__ = 'substance_abuse_problem'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    has_substance_abuse_problem = Column(String(50))
    has_substance_abuse_problem_date_collected = Column(DateTime(timezone=False))
    has_substance_abuse_problem_date_effective = Column(DateTime(timezone=False))
    has_substance_abuse_problem_data_collection_stage = Column(String(50))
    substance_abuse_indefinite = Column(String(50))
    substance_abuse_indefinite_date_collected = Column(DateTime(timezone=False))
    substance_abuse_indefinite_date_effective = Column(DateTime(timezone=False))
    substance_abuse_indefinite_data_collection_stage = Column(String(50))
    receive_substance_abuse_services = Column(String(50))
    receive_substance_abuse_services_date_collected = Column(DateTime(timezone=False))
    receive_substance_abuse_services_date_effective = Column(DateTime(timezone=False))
    receive_substance_abuse_services_data_collection_stage = Column(String(50))
    useexisting = True

class HousingStatus(DB.Base, MapBase):
    __tablename__ = 'housing_status'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    housing_status = Column(String(50))
    housing_status_date_collected = Column(DateTime(timezone=False))
    housing_status_date_effective = Column(DateTime(timezone=False))
    housing_status_data_collection_stage = Column(String(50))
    useexisting = True

class Taxonomy(DB.Base, MapBase):
    __tablename__ = 'taxonomy'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    need_index_id = Column(Integer, ForeignKey('need.id'))
    code = Column(String(300))
    useexisting = True

class HudChronicHomeless(DB.Base, MapBase):
    __tablename__ = 'hud_chronic_homeless'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    hud_chronic_homeless = Column(String(50))
    hud_chronic_homeless_date_collected = Column(DateTime(timezone=False))
    hud_chronic_homeless_date_effective = Column(DateTime(timezone=False))
    hud_chronic_homeless_data_collection_stage = Column(String(50))
    useexisting = True

class TimeOpen(DB.Base, MapBase):
    __tablename__ = 'time_open'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_index_id = Column(Integer, ForeignKey('site.id'))
    languages_index_id = Column(Integer, ForeignKey('languages.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
    notes = Column(String(50))
    useexisting = True

class TimeOpenDays(DB.Base, MapBase):
    __tablename__ = 'time_open_days'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    time_open_index_id = Column(Integer, ForeignKey(TimeOpen.id))
    day_of_week = Column(String(50))
    from_time = Column(String(50))
    to_time = Column(String(50))
    useexisting = True

class Url(DB.Base, MapBase):
    __tablename__ = 'url'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    agency_index_id = Column(Integer, ForeignKey('agency.id'))
    site_index_id = Column(Integer, ForeignKey('site.id'))
    agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
    address = Column(String(50))
    note = Column(String(50))
    useexisting = True
class VeteranMilitaryBranches(DB.Base, MapBase):
    __tablename__ = 'veteran_military_branches'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    military_branch_id_id_num = Column(String(50))
    military_branch_id_id_str = Column(String(32))
    military_branch_id_id_delete = Column(Integer)
    military_branch_id_id_delete_occurred_date = Column(DateTime(timezone=False))
    military_branch_id_id_delete_effective_date = Column(DateTime(timezone=False))
    discharge_status = Column(String(50))
    discharge_status_date_collected = Column(DateTime(timezone=False))
    discharge_status_date_effective = Column(DateTime(timezone=False))
    discharge_status_data_collection_stage = Column(String(50))
    discharge_status_other = Column(String(50))
    discharge_status_other_date_collected = Column(DateTime(timezone=False))
    discharge_status_other_date_effective = Column(DateTime(timezone=False))
    discharge_status_other_data_collection_stage = Column(String(50))
    military_branch = Column(String(50))
    military_branch_date_collected = Column(DateTime(timezone=False))
    military_branch_date_effective = Column(DateTime(timezone=False))
    military_branch_data_collection_stage = Column(String(50))
    military_branch_other = Column(String(50))
    military_branch_other_date_collected = Column(DateTime(timezone=False))
    military_branch_other_date_effective = Column(DateTime(timezone=False))
    military_branch_other_data_collection_stage = Column(String(50))
    useexisting = True

class IncomeLast30Days(DB.Base, MapBase):
    __tablename__ = 'income_last_30_days'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    income_last_30_days = Column(String(50))
    income_last_30_days_date_collected = Column(DateTime(timezone=False))
    income_last_30_days_date_effective = Column(DateTime(timezone=False))
    income_last_30_days_data_collection_stage = Column(String(50))
    useexisting = True

class VeteranMilitaryServiceDuration(DB.Base, MapBase):
    __tablename__ = 'veteran_military_service_duration'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    military_service_duration = Column(String(50))
    military_service_duration_date_collected = Column(DateTime(timezone=False))
    military_service_duration_date_effective = Column(DateTime(timezone=False))
    military_service_duration_data_collection_stage = Column(String(50))
    useexisting = True

class IncomeRequirements(DB.Base, MapBase):
    __tablename__ = 'income_requirements'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    income_requirements = Column(String(50))
    useexisting = True

class VeteranServedInWarZone(DB.Base, MapBase):
    __tablename__ = 'veteran_served_in_war_zone'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    served_in_war_zone = Column(String(50))
    served_in_war_zone_date_collected = Column(DateTime(timezone=False))
    served_in_war_zone_date_effective = Column(DateTime(timezone=False))
    served_in_war_zone_data_collection_stage = Column(String(50))
    useexisting = True

class IncomeTotalMonthly(DB.Base, MapBase):
    __tablename__ = 'income_total_monthly'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    income_total_monthly = Column(String(50))
    income_total_monthly_date_collected = Column(DateTime(timezone=False))
    income_total_monthly_date_effective = Column(DateTime(timezone=False))
    income_total_monthly_data_collection_stage = Column(String(50))
    useexisting = True

class VeteranServiceEra(DB.Base, MapBase):
    __tablename__ = 'veteran_service_era'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    service_era = Column(String(50))
    service_era_date_collected = Column(DateTime(timezone=False))
    service_era_date_effective = Column(DateTime(timezone=False))
    service_era_data_collection_stage = Column(String(50))
    useexisting = True

class VeteranVeteranStatus(DB.Base, MapBase):
    __tablename__ = 'veteran_veteran_status'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    veteran_status = Column(String(50))
    veteran_status_date_collected = Column(DateTime(timezone=False))
    veteran_status_date_effective = Column(DateTime(timezone=False))
    veteran_status_data_collection_stage = Column(String(50))
    useexisting = True
class Languages(DB.Base, MapBase):
    __tablename__ = 'languages'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_index_id = Column(Integer, ForeignKey('site.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    agency_location_index_id = Column(Integer, ForeignKey('agency_location.id'))
    name = Column(String(50))
    notes = Column(String(50))
    useexisting = True

class VeteranWarzonesServed(DB.Base, MapBase):
    __tablename__ = 'veteran_warzones_served'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    war_zone_id_id_num = Column(String(50))
    war_zone_id_id_str = Column(String(32))
    war_zone_id_id_delete = Column(Integer)
    war_zone_id_id_delete_occurred_date = Column(DateTime(timezone=False))
    war_zone_id_id_delete_effective_date = Column(DateTime(timezone=False))
    months_in_war_zone = Column(String(50))
    months_in_war_zone_date_collected = Column(DateTime(timezone=False))
    months_in_war_zone_date_effective = Column(DateTime(timezone=False))
    months_in_war_zone_data_collection_stage = Column(String(50))
    received_fire = Column(String(50))
    received_fire_date_collected = Column(DateTime(timezone=False))
    received_fire_date_effective = Column(DateTime(timezone=False))
    received_fire_data_collection_stage = Column(String(50))
    war_zone = Column(String(50))
    war_zone_date_collected = Column(DateTime(timezone=False))
    war_zone_date_effective = Column(DateTime(timezone=False))
    war_zone_data_collection_stage = Column(String(50))
    war_zone_other = Column(String(50))
    war_zone_other_date_collected = Column(DateTime(timezone=False))
    war_zone_other_date_effective = Column(DateTime(timezone=False))
    war_zone_other_data_collection_stage = Column(String(50))
    useexisting = True

class LengthOfStayAtPriorResidence(DB.Base, MapBase):
    __tablename__ = 'length_of_stay_at_prior_residence'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    length_of_stay_at_prior_residence = Column(String(50))
    length_of_stay_at_prior_residence_date_collected = Column(DateTime(timezone=False))
    length_of_stay_at_prior_residence_date_effective = Column(DateTime(timezone=False))
    length_of_stay_at_prior_residence_data_collection_stage = Column(String(50))
    useexisting = True
    def __repr__(self):
        """Generic repr: list every public (non-underscore) attribute."""
        field_dict = vars(self)
        if not field_dict:
            return ''
        out = ''
        for key, value in field_dict.items():  # Python 3: items(), not iteritems()
            if not key.startswith('_'):
                out += "%s = %s, " % (key, value)
        return "<%s(%s)>" % (self.__class__.__name__, out)
class VocationalTraining(DB.Base, MapBase):
    __tablename__ = 'vocational_training'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    vocational_training = Column(String(50))
    vocational_training_date_collected = Column(DateTime(timezone=False))
    vocational_training_date_effective = Column(DateTime(timezone=False))
    vocational_training_data_collection_stage = Column(String(50))
    useexisting = True

class Export(DB.Base, MapBase):
    __tablename__ = 'export'
    id = Column(Integer, primary_key=True)
    export_id = Column(String(50))
    export_id_date_collected = Column(DateTime(timezone=False))
    export_date = Column(DateTime(timezone=False))
    export_date_date_collected = Column(DateTime(timezone=False))
    export_period_start_date = Column(DateTime(timezone=False))
    export_period_start_date_date_collected = Column(DateTime(timezone=False))
    export_period_end_date = Column(DateTime(timezone=False))
    export_period_end_date_date_collected = Column(DateTime(timezone=False))
    export_software_vendor = Column(String(50))
    export_software_vendor_date_collected = Column(DateTime(timezone=False))
    export_software_version = Column(String(10))
    export_software_version_date_collected = Column(DateTime(timezone=False))
    # HUD 3.0
    export_id_id_num = Column(String(50))
    export_id_id_str = Column(String(50))
    export_id_delete_occurred_date = Column(DateTime(timezone=False))
    export_id_delete_effective_date = Column(DateTime(timezone=False))
    export_id_delete = Column(String(32))
    fk_export_to_person = relationship('Person', backref='fk_person_to_export')
    #fk_export_to_household = relationship('Household', backref='fk_household_to_export')
    #fk_export_to_database = relationship('Source', backref='fk_database_to_export')
    useexisting = True
class Report(DB.Base, MapBase):
    __tablename__ = 'report'
    report_id = Column(String(50), primary_key=True, unique=True)
    report_id_date_collected = Column(DateTime(timezone=False))
    report_date = Column(DateTime(timezone=False))
    report_date_date_collected = Column(DateTime(timezone=False))
    report_period_start_date = Column(DateTime(timezone=False))
    report_period_start_date_date_collected = Column(DateTime(timezone=False))
    report_period_end_date = Column(DateTime(timezone=False))
    report_period_end_date_date_collected = Column(DateTime(timezone=False))
    report_software_vendor = Column(String(50))
    report_software_vendor_date_collected = Column(DateTime(timezone=False))
    report_software_version = Column(String(10))
    report_software_version_date_collected = Column(DateTime(timezone=False))
    # HUD 3.0
    report_id_id_num = Column(String(50))
    report_id_id_str = Column(String(50))
    report_id_id_delete_occurred_date = Column(DateTime(timezone=False))
    report_id_id_delete_effective_date = Column(DateTime(timezone=False))
    report_id_id_delete = Column(String(32))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    #fk_report_to_person = relationship('Person', backref='fk_person_to_report')
    #fk_report_to_household = relationship('Household', backref='fk_household_to_report')
    #fk_report_to_database = relationship('Source', backref='fk_database_to_report')
    useexisting = True

class FosterChildEver(DB.Base, MapBase):
    __tablename__ = 'foster_child_ever'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_historical_index_id = Column(Integer, ForeignKey('person_historical.id'))
    foster_child_ever = Column(Integer)
    foster_child_ever_date_collected = Column(DateTime(timezone=False))
    foster_child_ever_date_effective = Column(DateTime(timezone=False))
    useexisting = True

class Household(DB.Base, MapBase):
    __tablename__ = 'household'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_id = Column(String(50), ForeignKey('report.report_id'))
    household_id_num = Column(String(32))
    household_id_num_date_collected = Column(DateTime(timezone=False))
    household_id_str = Column(String(32))
    household_id_str_date_collected = Column(DateTime(timezone=False))
    head_of_household_id_unhashed = Column(String(32))
    head_of_household_id_unhashed_date_collected = Column(DateTime(timezone=False))
    head_of_household_id_hashed = Column(String(32))
    head_of_household_id_hashed_date_collected = Column(DateTime(timezone=False))
    reported = Column(Boolean)
    useexisting = True
    fk_household_to_members = relationship('Members', backref='fk_members_to_household')
class Person(DB.Base, MapBase):
    """Client record; most identifying fields are stored both hashed and unhashed."""
    __tablename__ = 'person'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    report_id = Column(String(50), ForeignKey('report.report_id'))
    person_id_hashed = Column(String(32))
    person_id_unhashed = Column(String(50))
    person_id_date_collected = Column(DateTime(timezone=False))
    person_date_of_birth_hashed = Column(String(32))
    person_date_of_birth_hashed_date_collected = Column(DateTime(timezone=False))
    person_date_of_birth_unhashed = Column(DateTime(timezone=False))
    person_date_of_birth_unhashed_date_collected = Column(DateTime(timezone=False))
    person_ethnicity_hashed = Column(String(32))
    person_ethnicity_unhashed = Column(Integer)
    person_ethnicity_hashed_date_collected = Column(DateTime(timezone=False))
    person_ethnicity_unhashed_date_collected = Column(DateTime(timezone=False))
    person_gender_hashed = Column(String(32))
    person_gender_unhashed = Column(Integer)
    person_gender_hashed_date_collected = Column(DateTime(timezone=False))
    person_gender_unhashed_date_collected = Column(DateTime(timezone=False))
    person_gender_unhashed_date_effective = Column(DateTime(timezone=False))
    person_gender_hashed_date_effective = Column(DateTime(timezone=False))
    person_legal_first_name_hashed = Column(String(32))
    person_legal_first_name_unhashed = Column(String(50))
    person_legal_first_name_hashed_date_collected = Column(DateTime(timezone=False))
    person_legal_first_name_hashed_date_effective = Column(DateTime(timezone=False))
    person_legal_first_name_unhashed_date_collected = Column(DateTime(timezone=False))
    person_legal_first_name_unhashed_date_effective = Column(DateTime(timezone=False))  # JCS added
    person_legal_last_name_hashed = Column(String(32))
    person_legal_last_name_unhashed = Column(String(50))
    person_legal_last_name_unhashed_date_collected = Column(DateTime(timezone=False))
    person_legal_last_name_unhashed_date_effective = Column(DateTime(timezone=False))
    person_legal_last_name_hashed_date_collected = Column(DateTime(timezone=False))
    person_legal_middle_name_hashed = Column(String(32))
    person_legal_middle_name_unhashed = Column(String(50))
    person_legal_middle_name_unhashed_date_collected = Column(DateTime(timezone=False))
    person_legal_middle_name_hashed_date_collected = Column(DateTime(timezone=False))
    person_legal_suffix_hashed = Column(String(32))
    person_legal_suffix_unhashed = Column(String(50))
    person_legal_suffix_unhashed_date_collected = Column(DateTime(timezone=False))
    person_legal_suffix_hashed_date_collected = Column(DateTime(timezone=False))
    # OtherNames is in its own table, as there can be multiple OtherNames
    # Races is in its own table, as there can be multiple races
    person_social_security_number_hashed = Column(String(32))
    person_social_security_number_unhashed = Column(String(9))
    person_social_security_number_unhashed_date_collected = Column(DateTime(timezone=False))
    person_social_security_number_hashed_date_effective = Column(DateTime(timezone=False))
    person_social_security_number_unhashed_date_effective = Column(DateTime(timezone=False))
    person_social_security_number_hashed_date_collected = Column(DateTime(timezone=False))
    person_social_security_number_quality_code = Column(String(2))
    person_social_security_number_quality_code_date_collected = Column(DateTime(timezone=False))
    person_social_security_number_quality_code_date_effective = Column(DateTime(timezone=False))
    # PersonHistorical has its own table
    # SiteServiceParticipation has its own table
    # ReleaseOfInformation has its own table
    reported = Column(Boolean)
    # HUD 3.0
    person_id_id_num = Column(String(50))
    person_id_id_str = Column(String(50))
    person_id_delete = Column(String(32))
    person_id_delete_occurred_date = Column(DateTime(timezone=False))
    person_id_delete_effective_date = Column(DateTime(timezone=False))
    person_date_of_birth_type = Column(Integer)
    person_date_of_birth_type_date_collected = Column(DateTime(timezone=False))
    fk_person_to_other_names = relationship('OtherNames', backref='fk_other_names_to_person')
    site_service_participations = relationship("SiteServiceParticipation", backref="person")
    fk_person_to_person_historical = relationship('PersonHistorical', backref='fk_person_historical_to_person')
    fk_person_to_release_of_information = relationship('ReleaseOfInformation', backref='fk_release_of_information_to_person')
    fk_person_to_races = relationship('Races', backref='fk_races_to_person')
    useexisting = True
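Person pairs each identifying field with a `*_hashed` twin sized at String(32), the width of a hex MD5 digest. A minimal sketch of producing such a value with the standard library (MD5 is an assumption inferred from the column width; this module does not specify the actual hashing scheme):

```python
import hashlib

def hash_field(value):
    # Hex MD5 digest: always 32 characters, matching the String(32)
    # *_hashed columns. The real scheme used upstream may differ.
    return hashlib.md5(value.encode('utf-8')).hexdigest()

# Example: hash an SSN-shaped string before storing it in
# person_social_security_number_hashed.
ssn_hashed = hash_field('123456789')
print(len(ssn_hashed))  # 32
```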
#class DeduplicationLink(DB.Base, MapBase):

class ServiceEvent(DB.Base, MapBase):
    __tablename__ = 'service_event'
    id = Column(Integer, primary_key=True)
    export_index_id = Column(Integer, ForeignKey('export.id'))
    site_service_index_id = Column(Integer, ForeignKey('site_service.id'))
    household_index_id = Column(Integer, ForeignKey('household.id'))
    person_index_id = Column(Integer, ForeignKey('person.id'))
    need_index_id = Column(Integer, ForeignKey('need.id'))
    site_service_participation_index_id = Column(Integer, ForeignKey('site_service_participation.id'))
    service_event_idid_num = Column(String(32))
    service_event_idid_num_date_collected = Column(DateTime(timezone=False))
    service_event_idid_str = Column(String(32))
    service_event_idid_str_date_collected = Column(DateTime(timezone=False))
    household_idid_num = Column(String(32))
    is_referral = Column(String(32))
    is_referral_date_collected = Column(DateTime(timezone=False))
    quantity_of_service = Column(String(32))
    quantity_of_service_date_collected = Column(DateTime(timezone=False))
    quantity_of_service_measure = Column(String(32))
    quantity_of_service_measure_date_collected = Column(DateTime(timezone=False))
    service_airs_code = Column(String(300))
    service_airs_code_date_collected = Column(DateTime(timezone=False))
    service_period_start_date = Column(DateTime(timezone=False))
    service_period_start_date_date_collected = Column(DateTime(timezone=False))
    service_period_end_date = Column(DateTime(timezone=False))
    service_period_end_date_date_collected = Column(DateTime(timezone=False))
    service_unit = Column(String(32))
    service_unit_date_collected = Column(DateTime(timezone=False))
    type_of_service = Column(String(32))
    type_of_service_date_collected = Column(DateTime(timezone=False))
    type_of_service_other = Column(String(32))
    type_of_service_other_date_collected = Column(DateTime(timezone=False))
    type_of_service_par = Column(Integer)
    # Added a reported column; this appends the column to the table definition.
    reported = Column(Boolean)
    service_event_id_delete = Column(String(32))
    service_event_ind_fam = Column(Integer)
    site_service_id = Column(String(50))
    hmis_service_event_code_type_of_service = Column(String(50))
    hmis_service_event_code_type_of_service_other = Column(String(50))
    hprp_financial_assistance_service_event_code = Column(String(50))
    hprp_relocation_stabilization_service_event_code = Column(String(50))
    service_event_id_delete_occurred_date = Column(DateTime(timezone=False))
    service_event_id_delete_effective_date = Column(DateTime(timezone=False))
    service_event_provision_date = Column(DateTime(timezone=False))
    service_event_recorded_date = Column(DateTime(timezone=False))
    useexisting = True
class Referral(DB.Base, MapBase):
    __tablename__ = 'referral'
    id = Column(Integer, primary_key=True)
    service_event_index_id = Column(Integer, ForeignKey('service_event.id'))
    export_index_id = Column(Integer, ForeignKey('export.id'))
    person_index_id = Column(Integer, ForeignKey('person.id'))
    need_index_id = Column(Integer, ForeignKey('need.id'))  # ??
    #referral_id_date_effective = Column(DateTime(timezone=False))
    referral_idid_num = Column(String(50))
    referral_idid_str = Column(String(32))
    referral_delete = Column(Integer)
    referral_delete_occurred_date = Column(DateTime(timezone=False))
    referral_delete_effective_date = Column(DateTime(timezone=False))
    referral_agency_referred_to_idid_num = Column(String(50))
    referral_agency_referred_to_idid_str = Column(String(50))
    referral_agency_referred_to_name = Column(String(50))
    referral_agency_referred_to_name_data_collection_stage = Column(String(50))
    referral_agency_referred_to_name_date_collected = Column(DateTime(timezone=False))
    referral_agency_referred_to_name_date_effective = Column(DateTime(timezone=False))
    referral_call_idid_num = Column(String(50))
    referral_call_idid_str = Column(String(50))
    referral_need_idid_num = Column(String(50))  # In TBC, these refer to an already defined Need
    referral_need_idid_str = Column(String(50))
    # FBY: TBC requested/required field
    referral_need_notes = Column(String)
    useexisting = True
class Source(DB.Base, MapBase):
__tablename__ = 'source'
id = Column(Integer, primary_key=True)
report_id = Column(String(50), ForeignKey('report.report_id'))
source_id = Column(String(50))
source_id_date_collected = Column(DateTime(timezone=False))
source_email = Column(String(255))
source_email_date_collected = Column(DateTime(timezone=False))
source_contact_extension = Column(String(10))
source_contact_extension_date_collected = Column(DateTime(timezone=False))
source_contact_first = Column(String(20))
source_contact_first_date_collected = Column(DateTime(timezone=False))
source_contact_last = Column(String(20))
source_contact_last_date_collected = Column(DateTime(timezone=False))
source_contact_phone = Column(String(20))
source_contact_phone_date_collected = Column(DateTime(timezone=False))
source_name = Column(String(50))
source_name_date_collected = Column(DateTime(timezone=False))
#HUD 3.0
schema_version = Column(String(50))
source_id_id_num = Column(String(50))
source_id_id_str = Column(String(50))
source_id_delete = Column(Integer)
source_id_delete_occurred_date = Column(DateTime(timezone=False))
source_id_delete_effective_date = Column(DateTime(timezone=False))
software_vendor = Column(String(50))
software_version = Column(String(50))
source_contact_email = Column(String(255))
useexisting = True
#properties={'fk_source_to_export': relation(Export, backref='fk_export_to_source')})
class SystemConfiguration(DB.Base, MapBase):
__tablename__ = 'system_configuration_table'
id = Column(Integer, primary_key=True)
vendor_name = Column(String(50))
processing_mode = Column(String(4)) # TEST or PROD
source_id = Column(String(50))
odbid = Column(Integer)
providerid = Column(Integer)
userid = Column(Integer)
useexisting = True
class LastDateTime(DB.Base, MapBase):
# FBY: This table is used to record the document lifecycle: received, shredded, transmitted via SOAP
__tablename__ = 'last_date_time'
id = Column(Integer, primary_key=True)
event = Column(String(50))
event_date_time = Column(DateTime(timezone=False))
useexisting = True
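The comment above describes `LastDateTime` as a simple lifecycle log: one row per event name plus a timestamp. A minimal self-contained sketch of the same idea, using stdlib `sqlite3` rather than this project's SQLAlchemy `DB` wrapper (the table and event names mirror the model above; everything else is illustrative):

```python
import datetime
import sqlite3

# In-memory stand-in for the last_date_time table defined above.
conn = sqlite3.connect(':memory:')
conn.execute("""CREATE TABLE last_date_time (
    id INTEGER PRIMARY KEY,
    event TEXT,
    event_date_time TEXT)""")

# Record one row per document-lifecycle event.
for event in ('received', 'shredded', 'transmitted'):
    conn.execute(
        "INSERT INTO last_date_time (event, event_date_time) VALUES (?, ?)",
        (event, datetime.datetime.now().isoformat()))
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM last_date_time").fetchone()[0]
print(count)  # 3
```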
def test():
from . import postgresutils
utils = postgresutils.Utils()
utils.blank_database()
print("instantiating db")
db = DB()
session = db.Session()
db.Base.metadata.create_all(db.pg_db_engine)
new = Source(source_id_id_num='1', source_name='Orange County Corrections')
session.add(new)
session.commit()
print("done")
if __name__ == "__main__":
import sys
sys.exit(test())
#The MIT License
#
#Copyright (c) 2011, Alexandria Consulting LLC
#
#Permission is hereby granted, free of charge, to any person obtaining a copy
#of this software and associated documentation files (the "Software"), to deal
#in the Software without restriction, including without limitation the rights
#to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
#copies of the Software, and to permit persons to whom the Software is
#furnished to do so, subject to the following conditions:
#
#The above copyright notice and this permission notice shall be included in
#all copies or substantial portions of the Software.
#
#THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
#IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
#FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
#AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
#LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
#OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
#THE SOFTWARE.
# problem/01000~09999/01058/1058.py3.py (njw1204/BOJ-AC, MIT)
n=int(input())
link=[[100]*n for i in range(n)]
for i in range(n):
x=input()
for j in range(n):
if x[j]=='Y': link[i][j]=1
for i in range(n):
for j in range(n):
for k in range(n):
if link[j][i]+link[i][k]<link[j][k]:
link[j][k]=link[j][i]+link[i][k]
link[k][j]=link[j][k]
ans=0
for i in range(n):
t=0
for j in range(n):
if link[i][j]<=2 and i!=j: t+=1
ans=max(t,ans)
print(ans)

# aitoolbox/cloud/__init__.py (mv1388/AIToolbox, MIT)
s3_available_options = ['s3', 'aws_s3', 'aws']
gcs_available_options = ['gcs', 'google_storage', 'google storage']
#!/usr/bin/env python
# backup.py (BigBlueHat/copy-couch, Apache-2.0)
"""copy-couch makes copies of couches. no joke.
License: Apache 2.0 - http://opensource.org/licenses/Apache-2.0
"""
import argparse
import base64
import ConfigParser
import datetime
import json
import requests
argparser = argparse.ArgumentParser()
argparser.add_argument('config_file', type=file,
help="Config INI file. See `config.sample.ini` for info.")
args = argparser.parse_args()
config = ConfigParser.RawConfigParser({
'protocol': 143,
'host': 'localhost:5984'
})
config.readfp(args.config_file)
local_couch = config._sections['local']
local_couch['password'] = base64.b64decode(local_couch['password'])
local_url = local_couch['protocol'] + '://' + local_couch['host'] + '/'
remote_couch = config._sections['remote']
remote_couch['password'] = base64.b64decode(remote_couch['password'])
remote_url = remote_couch['protocol'] + '://' + remote_couch['host'] + '/'
# setup local db session
local_db = requests.Session()
local_db.auth = (local_couch['user'], local_couch['password'])
# setup remote db session
remote_db = requests.Session()
remote_db.auth = (remote_couch['user'], remote_couch['password'])
rv = local_db.get(local_url).json()
uuid = rv['uuid']
rv = local_db.get(local_url + '_all_dbs').json()
# TODO: make which DB's configurable
dbs = [db for db in rv if db[0] != '_']
# create & store one rep_doc per database
for db in dbs:
# create _replicator docs for each DB on local; target remote
rep_doc = {
"_id": "backup~" + datetime.datetime.now().isoformat(),
"source": local_url,
"target": remote_couch['protocol'] + '://' \
+ remote_couch['user'] + ':' + remote_couch['password'] \
+ '@' + remote_couch['host'] + '/backup%2F' + uuid + '%2F',
"create_target": True
}
rep_doc['source'] += db;
rep_doc['target'] += db;
# TODO: make the backup db name configurable / reusable
print 'Copying ' + db
print ' from: ' + local_url
print ' to: ' + remote_url + 'backup%2F' + uuid + '%2F' + db
rv = local_db.post(local_url + '_replicate', json=rep_doc, headers = {
'Content-Type': 'application/json'})
print rv.json()
# data-detective-airflow/tests/dag_generator/test_tdag.py (dmitriy-e/metadata-governance, Apache-2.0)
import pytest
import allure
from data_detective_airflow.constants import PG_CONN_ID, S3_CONN_ID
from data_detective_airflow.dag_generator.results import PgResult, PickleResult
from data_detective_airflow.dag_generator import ResultType, WorkType
@allure.feature('Dag results')
@allure.story('Create Pickle')
def test_dag_file_create_result(test_dag, context):
dag_result = test_dag.get_result(operator=None,
result_name='test',
context=context,
result_type=test_dag.result_type,
work_type=test_dag.work_type)
assert isinstance(dag_result, PickleResult)
test_dag.clear_all_works(context)
@allure.feature('Dag results')
@allure.story('Create S3')
@pytest.mark.parametrize('test_dag',
[(ResultType.RESULT_PICKLE.value,
WorkType.WORK_S3.value,
S3_CONN_ID)],
indirect=True)
def test_dag_s3_create_result(test_dag, context):
dag_result = test_dag.get_result(operator=None, result_name='test', context=context,
result_type=test_dag.result_type,
work_type=test_dag.work_type)
assert isinstance(dag_result, PickleResult)
test_dag.clear_all_works(context)
@allure.feature('Dag results')
@allure.story('Create PG')
@pytest.mark.parametrize('test_dag',
[(ResultType.RESULT_PG.value,
WorkType.WORK_PG.value,
PG_CONN_ID)],
indirect=True)
def test_dag_pg_create_result(test_dag, context):
dag_result = test_dag.get_result(operator=None, result_name='test', context=context,
result_type=test_dag.result_type,
work_type=test_dag.work_type)
assert isinstance(dag_result, PgResult)
test_dag.clear_all_works(context)
# magic_markdown/__init__.py (transfluxus/magic_markdown, MIT)
name = "magic_markdown"
from magic_markdown.MagicMarkdown import MagicMarkdown
def load_ipython_extension(ipython):
ipython.register_magics(MagicMarkdown)
# math_and_algorithm/024.py (tonko2/AtCoder)
N = int(input())
ans = 0
for _ in range(N):
p, q = map(int, input().split())
ans += (1 / p) * q
print(ans)

# utils/HCA/a_star.py (proroklab/magat_pathplanning, MIT)
'''
This module provides grid A* search (AStarSearch), a time-expanded variant
(AStarTime), and hierarchical cooperative A* (hca).
Thanks to Binyu Wang for providing the code.
'''
import numpy as np
class SearchEntry():
def __init__(self, x, y, g_cost, f_cost=0, pre_entry=None):
self.x = x
self.y = y
# cost move form start entry to this entry
self.g_cost = g_cost
self.f_cost = f_cost
self.pre_entry = pre_entry
def getPos(self):
return (self.x, self.y)
def AStarSearch(img, source, dest):
def getNewPosition(img, location, offset):
x, y = (location.x + offset[0], location.y + offset[1])
if x < 0 or x >= img.shape[0] or y < 0 or y >= img.shape[1] or img[x, y] == 1 or img[x, y] == 3:
return None
return (x, y)
def getPositions(img, location):
# use four ways or eight ways to move
offsets = [(-1, 0), (0, -1), (1, 0), (0, 1)]
# offsets = [(-1,0), (0, -1), (1, 0), (0, 1), (-1,-1), (1, -1), (-1, 1), (1, 1)]
poslist = []
for offset in offsets:
pos = getNewPosition(img, location, offset)
if pos is not None:
poslist.append(pos)
return poslist
    # TODO: improve the heuristic distance estimate in the future
def calHeuristic(pos, dest):
return abs(dest.x - pos[0]) + abs(dest.y - pos[1])
def getMoveCost(location, pos):
if location.x != pos[0] and location.y != pos[1]:
return 1.4
else:
return 1
# check if the position is in list
def isInList(list, pos):
if pos in list:
return list[pos]
return None
# add available adjacent positions
def addAdjacentPositions(img, location, dest, openlist, closedlist):
poslist = getPositions(img, location)
for pos in poslist:
# if position is already in closedlist, do nothing
if isInList(closedlist, pos) is None:
findEntry = isInList(openlist, pos)
h_cost = calHeuristic(pos, dest)
g_cost = location.g_cost + getMoveCost(location, pos)
if findEntry is None:
# if position is not in openlist, add it to openlist
openlist[pos] = SearchEntry(pos[0], pos[1], g_cost, g_cost + h_cost, location)
elif findEntry.g_cost > g_cost:
# if position is in openlist and cost is larger than current one,
# then update cost and previous position
findEntry.g_cost = g_cost
findEntry.f_cost = g_cost + h_cost
findEntry.pre_entry = location
# find a least cost position in openlist, return None if openlist is empty
def getFastPosition(openlist):
fast = None
for entry in openlist.values():
if fast is None:
fast = entry
elif fast.f_cost > entry.f_cost:
fast = entry
return fast
all_path = []
openlist = {}
closedlist = {}
location = SearchEntry(source[0], source[1], 0.0)
dest = SearchEntry(dest[0], dest[1], 0.0)
openlist[source] = location
while True:
location = getFastPosition(openlist)
if location is None:
# not found valid path
# print("can't find valid path")
return ([source])
if location.x == dest.x and location.y == dest.y:
break
closedlist[location.getPos()] = location
openlist.pop(location.getPos())
addAdjacentPositions(img, location, dest, openlist, closedlist)
while location is not None:
all_path.append([location.x, location.y])
# img[location.x][location.y] = 2
location = location.pre_entry
return all_path[::-1]
def hca(img, all_start, all_end, steps=100):
all_path = []
robot_loc = np.where(img == 3)
for i in range(img.shape[0]):
for j in range(img.shape[1]):
if img[i, j] == 3:
img[i, j] = 0
res_imgs = np.expand_dims(img, axis=0).repeat(steps, axis=0)
for i in range(len(robot_loc[0])):
res_imgs[0, robot_loc[0][i], robot_loc[1][i]] = 3
for i in range(len(all_start)):
robot_path = AStarTime(res_imgs, (all_start[i][0], all_start[i][1]), (all_end[i][0], all_end[i][1]))
# print(i)
if len(robot_path) == 1:
new_path = []
for j in range(steps - 1):
res_imgs[j, all_start[i][0], all_start[i][1]] = 3
new_path.append([all_start[i][0], all_start[i][1], j])
all_path.append(new_path)
continue
else:
for loc in robot_path:
res_imgs[loc[2], loc[0], loc[1]] = 3
all_path.append(robot_path)
return all_path
class SearchEntryTime():
def __init__(self, x, y, z, g_cost, f_cost=0, pre_entry=None):
self.x = x
self.y = y
self.z = z
# cost move form start entry to this entry
self.g_cost = g_cost
self.f_cost = f_cost
self.pre_entry = pre_entry
def getPos(self):
return (self.x, self.y, self.z)
def AStarTime(imgs, source, dest, total_steps=80):
def getNewPosition(img, location, offset, step=0):
x, y = (location.x + offset[0], location.y + offset[1])
if x < 0 or x >= img.shape[0] or y < 0 or y >= img.shape[1] or img[x, y] == 1 or img[x, y] == 3:
return None
return (x, y, step)
def getPositions(img, location, step=0):
# use four ways or eight ways to move
offsets = [(-1, 0), (0, -1), (1, 0), (0, 1)]
# offsets = [(-1,0), (0, -1), (1, 0), (0, 1), (-1,-1), (1, -1), (-1, 1), (1, 1)]
poslist = []
for offset in offsets:
pos = getNewPosition(img, location, offset, step)
if pos is not None:
poslist.append(pos)
return poslist
    # TODO: improve the heuristic distance estimate in the future
def calHeuristic(pos, dest):
return abs(dest.x - pos[0]) + abs(dest.y - pos[1])
def getMoveCost(location, pos):
if location.x != pos[0] and location.y != pos[1]:
return 1.4
else:
return 1
# check if the position is in list
def isInList(list, pos):
if pos in list:
return list[pos]
return None
# add available adjacent positions
def addAdjacentPositions(imgs, location, dest, openlist, closedlist, steps):
img = imgs[int(steps + 1), :, :]
poslist = getPositions(img, location, steps)
for pos in poslist:
# if position is already in closedlist, do nothing
if isInList(closedlist, pos) is None:
findEntry = isInList(openlist, pos)
h_cost = calHeuristic(pos, dest)
g_cost = location.g_cost + getMoveCost(location, pos)
if findEntry is None:
# if position is not in openlist, add it to openlist
steps = int(g_cost)
openlist[(pos[0], pos[1], steps)] = SearchEntryTime(pos[0], pos[1], steps, g_cost, g_cost + h_cost,
location)
elif findEntry.g_cost > g_cost:
# if position is in openlist and cost is larger than current one,
# then update cost and previous position
findEntry.g_cost = g_cost
findEntry.f_cost = g_cost + h_cost
findEntry.z = int(g_cost)
findEntry.pre_entry = location
# find a least cost position in openlist, return None if openlist is empty
def getFastPosition(openlist):
fast = None
for entry in openlist.values():
if fast is None:
fast = entry
elif fast.f_cost > entry.f_cost:
fast = entry
return fast
all_path = []
openlist = {}
closedlist = {}
location = SearchEntryTime(source[0], source[1], 0, 0.0)
dest = SearchEntryTime(dest[0], dest[1], 0, 0.0)
openlist[(source[0], source[1], 0)] = location
steps = 0
while steps < total_steps:
location = getFastPosition(openlist)
if location is None:
# not found valid path
# print("can't find valid path")
return ([source])
if location.x == dest.x and location.y == dest.y:
break
closedlist[location.getPos()] = location
openlist.pop(location.getPos())
steps = int(location.g_cost)
addAdjacentPositions(imgs, location, dest, openlist, closedlist, steps)
while location is not None:
all_path.append([location.x, location.y, location.z])
# img[location.x][location.y] = 2
location = location.pre_entry
return all_path[::-1]
# img = np.zeros((20,20))
# source = (0,0)
# dest = (img.shape[0]-1, img.shape[1]-1)
# path = AStarSearch(img, source, dest)
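The commented-out lines above sketch how `AStarSearch` is invoked. As a compact, self-contained illustration of the same pattern (4-way moves, Manhattan heuristic, 1 marks an obstacle), here is a standalone grid A* that swaps the linear open-list scan used by `getFastPosition` for a `heapq` priority queue; the grid and helper are illustrative, not part of the module:

```python
import heapq

def astar(grid, source, dest):
    rows, cols = len(grid), len(grid[0])
    open_heap = [(0, source)]        # entries are (f_cost, position)
    g_cost = {source: 0}
    pre = {source: None}             # predecessor map for path reconstruction
    while open_heap:
        _, pos = heapq.heappop(open_heap)
        if pos == dest:
            path = []
            while pos is not None:   # walk the predecessor chain back to source
                path.append(pos)
                pos = pre[pos]
            return path[::-1]
        x, y = pos
        for dx, dy in ((-1, 0), (0, -1), (1, 0), (0, 1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < rows and 0 <= ny < cols and grid[nx][ny] != 1:
                g = g_cost[pos] + 1
                if (nx, ny) not in g_cost or g < g_cost[(nx, ny)]:
                    g_cost[(nx, ny)] = g
                    h = abs(dest[0] - nx) + abs(dest[1] - ny)
                    heapq.heappush(open_heap, (g + h, (nx, ny)))
                    pre[(nx, ny)] = pos
    return [source]                  # no valid path, mirroring AStarSearch

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)  # [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

Using a heap makes selecting the next node O(log n) per expansion instead of the O(n) scan in `getFastPosition`.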
# reference/old/distance-simple.py (Art31/trekking-pro-cefetrj, Apache-1.1)
from gpiozero import DistanceSensor
from time import sleep
sensor = DistanceSensor(echo=23, trigger=22)
while True:
print('Distance: ', sensor.distance * 100)
sleep(1)
# cbsettings/exceptions.py (matthewwithanm/django-classbasedsettings, MIT)
class SettingsFactoryDoesNotExist(Exception):
pass
class InvalidSettingsFactory(Exception):
pass
class NoMatchingSettings(Exception):
"""Raised when a suitable settings class cannot be found."""
pass
class InvalidCondition(Exception):
pass
# pctiler/pctiler/colormaps/mtbs.py (hobu/planetary-computer-apis, MIT)
from typing import Dict, List
mtbs_colormaps: Dict[str, Dict[int, List[int]]] = {
"mtbs-severity": {
0: [0, 0, 0, 0],
1: [0, 100, 0, 255],
2: [127, 255, 212, 255],
3: [255, 255, 0, 255],
4: [255, 0, 0, 255],
5: [127, 255, 0, 255],
6: [255, 255, 255, 255],
},
}
# fluentcheck/tests/tests_is/test_basic_checks_is.py (jstoebel/fluentcheck, MIT)
import unittest
from fluentcheck import Is
from fluentcheck.exceptions import CheckError
# noinspection PyStatementEffect
class TestIsBasicChecks(unittest.TestCase):
def test_is_none_pass(self):
self.assertIsInstance(Is(None).none, Is)
def test_is_none_fail(self):
with self.assertRaises(CheckError):
Is("I am not none").none
def test_is_not_none_pass(self):
self.assertIsInstance(Is("I am not none").not_none, Is)
def test_is_not_none_fail(self):
with self.assertRaises(CheckError):
Is(None).not_none
# website/website/apps/statistics/urls.py (SimonGreenhill/Language5, MIT)
from django.conf.urls import url
from website.apps.statistics.views import statistics
urlpatterns = [
url(r'^$', statistics, name="statistics"),
]
# chapter03/array.py (gothedistance/python-book, MIT)
array = [1,2,3]
for v in array:
print(v)
# wine/apps.py (zenith0/vinum-db, Apache-2.0)
from django.apps import AppConfig
class WineConfig(AppConfig):
name = 'wine'
# listing_manager/utils.py (mikezareno/listing-manager, MIT)
from flask_wtf import FlaskForm
from __future__ import unicode_literals
import frappe, erpnext
from frappe import _
import json
from frappe.utils import flt, cstr, nowdate, nowtime
from six import string_types
class InvalidWarehouseCompany(frappe.ValidationError): pass
@frappe.whitelist()
def get_item_code(scancode=None):
    if scancode:
        # try barcode lookup
        item_code = frappe.db.get_value("Item Barcode", {"barcode": scancode}, fieldname=["parent"])
        if not item_code:
            # try supplier code lookup
            item_code = frappe.db.get_value("Item Supplier", {"supplier_part_no": scancode}, fieldname=["parent"])
        if not item_code:
            frappe.throw("No Item Found with code: " + scancode)
        return item_code
@frappe.whitelist()
def hello():
    return "Hello Mike"
| 26.285714 | 107 | 0.754076 | 100 | 736 | 5.37 | 0.45 | 0.089385 | 0.104283 | 0.074488 | 0.260708 | 0.260708 | 0.260708 | 0.126629 | 0 | 0 | 0 | 0 | 0.141304 | 736 | 27 | 108 | 27.259259 | 0.849684 | 0.057065 | 0 | 0.210526 | 0 | 0 | 0.137482 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0.052632 | 0.315789 | 0.052632 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 3 |
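The `get_item_code` helper above falls through from a barcode lookup to a supplier-part-number lookup and raises only when both miss. Outside a Frappe site, the same fallback-chain pattern can be sketched in plain Python; the `barcodes` and `supplier_codes` dicts below are hypothetical stand-ins for the `Item Barcode` and `Item Supplier` doctypes queried via `frappe.db.get_value`:

```python
def chained_lookup(code, sources):
    """Return the first truthy result from a chain of lookup callables."""
    for lookup in sources:
        result = lookup(code)
        if result:
            return result
    raise ValueError("No Item Found with code: " + code)

# Hypothetical in-memory tables standing in for the doctypes.
barcodes = {"4006381333931": "ITEM-0001"}
supplier_codes = {"SUP-42": "ITEM-0002"}

# Barcode misses, supplier code hits, mirroring the two-step fallback.
item = chained_lookup("SUP-42", [barcodes.get, supplier_codes.get])
```

Keeping each source behind a uniform callable (`dict.get` here, a database query in the original) is what lets the fallback chain grow without nesting further `if not item_code:` blocks.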
a95cd5050cc5338cb7021667243f069114685055 | 2,293 | py | Python | utils/forms.py | JakubBialoskorski/notes | 1016581cbf7d2024df42f85df039c7e2a5b03205 | [
"MIT"
] | null | null | null | utils/forms.py | JakubBialoskorski/notes | 1016581cbf7d2024df42f85df039c7e2a5b03205 | [
"MIT"
] | 1 | 2021-06-22T20:26:20.000Z | 2021-06-22T20:26:20.000Z | utils/forms.py | JakubBialoskorski/notes | 1016581cbf7d2024df42f85df039c7e2a5b03205 | [
"MIT"
] | null | null | null | from flask_wtf import FlaskForm
from wtforms import StringField, PasswordField, SubmitField, SelectMultipleField, HiddenField
from flask_pagedown.fields import PageDownField
from wtforms import validators
class LoginForm(FlaskForm):
    username = StringField('Username*', [validators.InputRequired("Please enter your name.")])
    password = PasswordField('Password*', [validators.InputRequired("Please enter your password.")])
    submit = SubmitField('Login')
class SignUpForm(FlaskForm):
    username = StringField('Username*', [validators.InputRequired("Please enter your username")])
    email = StringField('Email*', [validators.InputRequired("Please enter your email"), validators.Email('Email format incorrect')])
    password = PasswordField('Password*', [validators.InputRequired("Please enter your password"), validators.EqualTo('confirm_password', message='Passwords must match'), validators.Length(min=8, max=32, message='Password must contain 8 digits minimum, with 32 being maximum')])
    confirm_password = PasswordField('Confirm your password*', [validators.InputRequired("Confirm your password")])
    submit = SubmitField('Signup')
class AddNoteForm(FlaskForm):
    note_id = HiddenField("Note ID:")
    note_title = StringField('Note Title:', [validators.InputRequired("Please enter a note title.")])
    note = PageDownField('Your Note:')
    tags = SelectMultipleField('Note Tags:')
    submit = SubmitField('Add Note')
class AddTagForm(FlaskForm):
    tag = StringField('Enter tag:', [validators.InputRequired("Please enter the tag")])
    submit = SubmitField('Add Tag')
class ChangeEmailForm(FlaskForm):
    email = StringField('Email*', [validators.InputRequired("Please enter your email"), validators.Email('Email format incorrect')])
    submit = SubmitField('Update Email')
class ChangePasswordForm(FlaskForm):
    password = PasswordField('Set new password*', [validators.InputRequired("Please enter your password"), validators.EqualTo('confirm_password', message='Passwords must match'), validators.Length(min=8, max=32, message='Password must contain 8 digits minimum, with 32 being maximum')])
    confirm_password = PasswordField('Confirm new password*', [validators.InputRequired("Confirm your password")])
    submit = SubmitField('Update Password')
| 61.972973 | 286 | 0.756651 | 245 | 2,293 | 7.04898 | 0.261224 | 0.146497 | 0.151129 | 0.177186 | 0.59641 | 0.594094 | 0.547771 | 0.484076 | 0.406485 | 0.2652 | 0 | 0.005944 | 0.119494 | 2,293 | 36 | 287 | 63.694444 | 0.84943 | 0 | 0 | 0 | 0 | 0 | 0.309202 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.266667 | 0.133333 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
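`SignUpForm` and `ChangePasswordForm` enforce the same two password rules through `validators.EqualTo('confirm_password', ...)` and `validators.Length(min=8, max=32, ...)`. A minimal plain-Python sketch of just those rules, useful for seeing what the validators check without spinning up a Flask app (the function name and return shape here are illustrative, not part of WTForms):

```python
def validate_password(password, confirm_password):
    """Collect error messages mirroring the form's EqualTo and Length validators."""
    errors = []
    if password != confirm_password:
        errors.append('Passwords must match')
    if not 8 <= len(password) <= 32:
        errors.append('Password must contain 8 digits minimum, with 32 being maximum')
    return errors
```

In the real form, WTForms runs each validator in the field's list in order and attaches any messages to `field.errors`; this sketch just returns the accumulated messages directly.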
a97f5a52d2112340dd02628abcf36314406fa57c | 338 | py | Python | random-py/app.py | traian-mihali/publishing-py | fa050b1169258b50678f00b97958499bc0210ca3 | [
"MIT"
] | null | null | null | random-py/app.py | traian-mihali/publishing-py | fa050b1169258b50678f00b97958499bc0210ca3 | [
"MIT"
] | null | null | null | random-py/app.py | traian-mihali/publishing-py | fa050b1169258b50678f00b97958499bc0210ca3 | [
"MIT"
] | null | null | null | """ This module provides a method to generate a random number between 0 and the specified number """
import random
import math
def random_num(max):
    """
    Generates a random number

    Parameters:
        max (int): the range upper limit

    Returns:
        int: the random number
    """
    return math.floor(random.random() * max)
| 19.882353 | 100 | 0.668639 | 46 | 338 | 4.891304 | 0.608696 | 0.16 | 0.115556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003937 | 0.248521 | 338 | 16 | 101 | 21.125 | 0.88189 | 0.585799 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
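`random_num` floors a scaled `random.random()`, so it yields an integer in `[0, max)` — `max` itself is never returned. A quick sanity check of that range (for integer ranges, the stdlib's `random.randrange(max)` is the idiomatic equivalent):

```python
import math
import random

def random_num(max):
    """Generate a random integer between 0 (inclusive) and max (exclusive)."""
    return math.floor(random.random() * max)

# Sample many draws and confirm every value stays inside [0, 10).
samples = [random_num(10) for _ in range(1000)]
```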
8d212f11594f7ae449b95c565655219888507326 | 511 | py | Python | Python/toLowerCase.py | dianeyeo/LeetCode | b814831e7a4296a4e95785b75ea5c540a3fca63d | [
"MIT"
] | null | null | null | Python/toLowerCase.py | dianeyeo/LeetCode | b814831e7a4296a4e95785b75ea5c540a3fca63d | [
"MIT"
] | null | null | null | Python/toLowerCase.py | dianeyeo/LeetCode | b814831e7a4296a4e95785b75ea5c540a3fca63d | [
"MIT"
] | null | null | null | """
https://leetcode.com/problems/to-lower-case/
Difficulty: Easy
Given a string s, return the string after replacing every uppercase letter with the same lowercase letter.
Example 1:
Input: s = "Hello"
Output: "hello"
Example 2:
Input: s = "here"
Output: "here"
Example 3:
Input: s = "LOVELY"
Output: "lovely"
Constraints:
1 <= s.length <= 100
s consists of printable ASCII characters.
"""
class Solution:
    def toLowerCase(self, str: str) -> str:
        return str.lower()
| 17.033333 | 106 | 0.661448 | 69 | 511 | 4.898551 | 0.666667 | 0.053254 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017588 | 0.221135 | 511 | 29 | 107 | 17.62069 | 0.831658 | 0.810176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
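The solution simply defers to Python's built-in `str.lower()`, which already covers all three examples from the problem statement:

```python
class Solution:
    def toLowerCase(self, str: str) -> str:
        return str.lower()

s = Solution()
print(s.toLowerCase("Hello"))   # hello
print(s.toLowerCase("LOVELY"))  # lovely
```

Shadowing the built-in name `str` with a parameter is LeetCode's signature choice, not a recommended Python practice.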
8d2771d9640e1def0fa9d63283dfdac05afbee62 | 25,468 | py | Python | nova/pci/stats.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/pci/stats.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/pci/stats.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright (c) 2013 Intel, Inc.'
nl|'\n'
comment|'# Copyright (c) 2013 OpenStack Foundation'
nl|'\n'
comment|'# All Rights Reserved.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'copy'
newline|'\n'
nl|'\n'
name|'from'
name|'oslo_config'
name|'import'
name|'cfg'
newline|'\n'
name|'from'
name|'oslo_log'
name|'import'
name|'log'
name|'as'
name|'logging'
newline|'\n'
name|'import'
name|'six'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'i18n'
name|'import'
name|'_LE'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'objects'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'fields'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'pci_device_pool'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'pci'
name|'import'
name|'utils'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'pci'
name|'import'
name|'whitelist'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|variable|CONF
name|'CONF'
op|'='
name|'cfg'
op|'.'
name|'CONF'
newline|'\n'
DECL|variable|LOG
name|'LOG'
op|'='
name|'logging'
op|'.'
name|'getLogger'
op|'('
name|'__name__'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|PciDeviceStats
name|'class'
name|'PciDeviceStats'
op|'('
name|'object'
op|')'
op|':'
newline|'\n'
nl|'\n'
indent|' '
string|'"""PCI devices summary information.\n\n According to the PCI SR-IOV spec, a PCI physical function can have up to\n 256 PCI virtual functions, thus the number of assignable PCI functions in\n a cloud can be big. The scheduler needs to know all device availability\n information in order to determine which compute hosts can support a PCI\n request. Passing individual virtual device information to the scheduler\n does not scale, so we provide summary information.\n\n Usually the virtual functions provided by a host PCI device have the same\n value for most properties, like vendor_id, product_id and class type.\n The PCI stats class summarizes this information for the scheduler.\n\n The pci stats information is maintained exclusively by compute node\n resource tracker and updated to database. The scheduler fetches the\n information and selects the compute node accordingly. If a compute\n node is selected, the resource tracker allocates the devices to the\n instance and updates the pci stats information.\n\n This summary information will be helpful for cloud management also.\n """'
newline|'\n'
nl|'\n'
DECL|variable|pool_keys
name|'pool_keys'
op|'='
op|'['
string|"'product_id'"
op|','
string|"'vendor_id'"
op|','
string|"'numa_node'"
op|','
string|"'dev_type'"
op|']'
newline|'\n'
nl|'\n'
DECL|member|__init__
name|'def'
name|'__init__'
op|'('
name|'self'
op|','
name|'stats'
op|'='
name|'None'
op|','
name|'dev_filter'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'PciDeviceStats'
op|','
name|'self'
op|')'
op|'.'
name|'__init__'
op|'('
op|')'
newline|'\n'
comment|'# NOTE(sbauza): Stats are a PCIDevicePoolList object'
nl|'\n'
name|'self'
op|'.'
name|'pools'
op|'='
op|'['
name|'pci_pool'
op|'.'
name|'to_dict'
op|'('
op|')'
nl|'\n'
name|'for'
name|'pci_pool'
name|'in'
name|'stats'
op|']'
name|'if'
name|'stats'
name|'else'
op|'['
op|']'
newline|'\n'
name|'self'
op|'.'
name|'pools'
op|'.'
name|'sort'
op|'('
name|'key'
op|'='
name|'lambda'
name|'item'
op|':'
name|'len'
op|'('
name|'item'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'dev_filter'
op|'='
name|'dev_filter'
name|'or'
name|'whitelist'
op|'.'
name|'Whitelist'
op|'('
nl|'\n'
name|'CONF'
op|'.'
name|'pci_passthrough_whitelist'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_equal_properties
dedent|''
name|'def'
name|'_equal_properties'
op|'('
name|'self'
op|','
name|'dev'
op|','
name|'entry'
op|','
name|'matching_keys'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'all'
op|'('
name|'dev'
op|'.'
name|'get'
op|'('
name|'prop'
op|')'
op|'=='
name|'entry'
op|'.'
name|'get'
op|'('
name|'prop'
op|')'
nl|'\n'
name|'for'
name|'prop'
name|'in'
name|'matching_keys'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_find_pool
dedent|''
name|'def'
name|'_find_pool'
op|'('
name|'self'
op|','
name|'dev_pool'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return the first pool that matches dev."""'
newline|'\n'
name|'for'
name|'pool'
name|'in'
name|'self'
op|'.'
name|'pools'
op|':'
newline|'\n'
indent|' '
name|'pool_keys'
op|'='
name|'pool'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'del'
name|'pool_keys'
op|'['
string|"'count'"
op|']'
newline|'\n'
name|'del'
name|'pool_keys'
op|'['
string|"'devices'"
op|']'
newline|'\n'
name|'if'
op|'('
name|'len'
op|'('
name|'pool_keys'
op|'.'
name|'keys'
op|'('
op|')'
op|')'
op|'=='
name|'len'
op|'('
name|'dev_pool'
op|'.'
name|'keys'
op|'('
op|')'
op|')'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'_equal_properties'
op|'('
name|'dev_pool'
op|','
name|'pool_keys'
op|','
name|'dev_pool'
op|'.'
name|'keys'
op|'('
op|')'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'pool'
newline|'\n'
nl|'\n'
DECL|member|_create_pool_keys_from_dev
dedent|''
dedent|''
dedent|''
name|'def'
name|'_create_pool_keys_from_dev'
op|'('
name|'self'
op|','
name|'dev'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""create a stats pool dict that this dev is supposed to be part of\n\n Note that this pool dict contains the stats pool\'s keys and their\n values. \'count\' and \'devices\' are not included.\n """'
newline|'\n'
comment|"# Don't add a device that doesn't have a matching device spec."
nl|'\n'
comment|'# This can happen during initial sync up with the controller'
nl|'\n'
name|'devspec'
op|'='
name|'self'
op|'.'
name|'dev_filter'
op|'.'
name|'get_devspec'
op|'('
name|'dev'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'devspec'
op|':'
newline|'\n'
indent|' '
name|'return'
newline|'\n'
dedent|''
name|'tags'
op|'='
name|'devspec'
op|'.'
name|'get_tags'
op|'('
op|')'
newline|'\n'
name|'pool'
op|'='
op|'{'
name|'k'
op|':'
name|'getattr'
op|'('
name|'dev'
op|','
name|'k'
op|')'
name|'for'
name|'k'
name|'in'
name|'self'
op|'.'
name|'pool_keys'
op|'}'
newline|'\n'
name|'if'
name|'tags'
op|':'
newline|'\n'
indent|' '
name|'pool'
op|'.'
name|'update'
op|'('
name|'tags'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'pool'
newline|'\n'
nl|'\n'
DECL|member|add_device
dedent|''
name|'def'
name|'add_device'
op|'('
name|'self'
op|','
name|'dev'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Add a device to its matching pool."""'
newline|'\n'
name|'dev_pool'
op|'='
name|'self'
op|'.'
name|'_create_pool_keys_from_dev'
op|'('
name|'dev'
op|')'
newline|'\n'
name|'if'
name|'dev_pool'
op|':'
newline|'\n'
indent|' '
name|'pool'
op|'='
name|'self'
op|'.'
name|'_find_pool'
op|'('
name|'dev_pool'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'pool'
op|':'
newline|'\n'
indent|' '
name|'dev_pool'
op|'['
string|"'count'"
op|']'
op|'='
number|'0'
newline|'\n'
name|'dev_pool'
op|'['
string|"'devices'"
op|']'
op|'='
op|'['
op|']'
newline|'\n'
name|'self'
op|'.'
name|'pools'
op|'.'
name|'append'
op|'('
name|'dev_pool'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'pools'
op|'.'
name|'sort'
op|'('
name|'key'
op|'='
name|'lambda'
name|'item'
op|':'
name|'len'
op|'('
name|'item'
op|')'
op|')'
newline|'\n'
name|'pool'
op|'='
name|'dev_pool'
newline|'\n'
dedent|''
name|'pool'
op|'['
string|"'count'"
op|']'
op|'+='
number|'1'
newline|'\n'
name|'pool'
op|'['
string|"'devices'"
op|']'
op|'.'
name|'append'
op|'('
name|'dev'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_decrease_pool_count
name|'def'
name|'_decrease_pool_count'
op|'('
name|'pool_list'
op|','
name|'pool'
op|','
name|'count'
op|'='
number|'1'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Decrement pool\'s size by count.\n\n If pool becomes empty, remove pool from pool_list.\n """'
newline|'\n'
name|'if'
name|'pool'
op|'['
string|"'count'"
op|']'
op|'>'
name|'count'
op|':'
newline|'\n'
indent|' '
name|'pool'
op|'['
string|"'count'"
op|']'
op|'-='
name|'count'
newline|'\n'
name|'count'
op|'='
number|'0'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'-='
name|'pool'
op|'['
string|"'count'"
op|']'
newline|'\n'
name|'pool_list'
op|'.'
name|'remove'
op|'('
name|'pool'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'count'
newline|'\n'
nl|'\n'
DECL|member|remove_device
dedent|''
name|'def'
name|'remove_device'
op|'('
name|'self'
op|','
name|'dev'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Remove one device from the first pool that it matches."""'
newline|'\n'
name|'dev_pool'
op|'='
name|'self'
op|'.'
name|'_create_pool_keys_from_dev'
op|'('
name|'dev'
op|')'
newline|'\n'
name|'if'
name|'dev_pool'
op|':'
newline|'\n'
indent|' '
name|'pool'
op|'='
name|'self'
op|'.'
name|'_find_pool'
op|'('
name|'dev_pool'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'pool'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'PciDevicePoolEmpty'
op|'('
nl|'\n'
name|'compute_node_id'
op|'='
name|'dev'
op|'.'
name|'compute_node_id'
op|','
name|'address'
op|'='
name|'dev'
op|'.'
name|'address'
op|')'
newline|'\n'
dedent|''
name|'pool'
op|'['
string|"'devices'"
op|']'
op|'.'
name|'remove'
op|'('
name|'dev'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_decrease_pool_count'
op|'('
name|'self'
op|'.'
name|'pools'
op|','
name|'pool'
op|')'
newline|'\n'
nl|'\n'
DECL|member|get_free_devs
dedent|''
dedent|''
name|'def'
name|'get_free_devs'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'free_devs'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'pool'
name|'in'
name|'self'
op|'.'
name|'pools'
op|':'
newline|'\n'
indent|' '
name|'free_devs'
op|'.'
name|'extend'
op|'('
name|'pool'
op|'['
string|"'devices'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'free_devs'
newline|'\n'
nl|'\n'
DECL|member|consume_requests
dedent|''
name|'def'
name|'consume_requests'
op|'('
name|'self'
op|','
name|'pci_requests'
op|','
name|'numa_cells'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'alloc_devices'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'request'
name|'in'
name|'pci_requests'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
name|'request'
op|'.'
name|'count'
newline|'\n'
name|'spec'
op|'='
name|'request'
op|'.'
name|'spec'
newline|'\n'
comment|'# For now, keep the same algorithm as during scheduling:'
nl|'\n'
comment|'# a spec may be able to match multiple pools.'
nl|'\n'
name|'pools'
op|'='
name|'self'
op|'.'
name|'_filter_pools_for_spec'
op|'('
name|'self'
op|'.'
name|'pools'
op|','
name|'spec'
op|')'
newline|'\n'
name|'if'
name|'numa_cells'
op|':'
newline|'\n'
indent|' '
name|'pools'
op|'='
name|'self'
op|'.'
name|'_filter_pools_for_numa_cells'
op|'('
name|'pools'
op|','
name|'numa_cells'
op|')'
newline|'\n'
dedent|''
name|'pools'
op|'='
name|'self'
op|'.'
name|'_filter_non_requested_pfs'
op|'('
name|'request'
op|','
name|'pools'
op|')'
newline|'\n'
comment|'# Failed to allocate the required number of devices'
nl|'\n'
comment|'# Return the devices already allocated back to their pools'
nl|'\n'
name|'if'
name|'sum'
op|'('
op|'['
name|'pool'
op|'['
string|"'count'"
op|']'
name|'for'
name|'pool'
name|'in'
name|'pools'
op|']'
op|')'
op|'<'
name|'count'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'error'
op|'('
name|'_LE'
op|'('
string|'"Failed to allocate PCI devices for instance."'
nl|'\n'
string|'" Unassigning devices back to pools."'
nl|'\n'
string|'" This should not happen, since the scheduler"'
nl|'\n'
string|'" should have accurate information, and allocation"'
nl|'\n'
string|'" during claims is controlled via a hold"'
nl|'\n'
string|'" on the compute node semaphore"'
op|')'
op|')'
newline|'\n'
name|'for'
name|'d'
name|'in'
name|'range'
op|'('
name|'len'
op|'('
name|'alloc_devices'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'add_device'
op|'('
name|'alloc_devices'
op|'.'
name|'pop'
op|'('
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'None'
newline|'\n'
dedent|''
name|'for'
name|'pool'
name|'in'
name|'pools'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'pool'
op|'['
string|"'count'"
op|']'
op|'>='
name|'count'
op|':'
newline|'\n'
indent|' '
name|'num_alloc'
op|'='
name|'count'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'num_alloc'
op|'='
name|'pool'
op|'['
string|"'count'"
op|']'
newline|'\n'
dedent|''
name|'count'
op|'-='
name|'num_alloc'
newline|'\n'
name|'pool'
op|'['
string|"'count'"
op|']'
op|'-='
name|'num_alloc'
newline|'\n'
name|'for'
name|'d'
name|'in'
name|'range'
op|'('
name|'num_alloc'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pci_dev'
op|'='
name|'pool'
op|'['
string|"'devices'"
op|']'
op|'.'
name|'pop'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_handle_device_dependents'
op|'('
name|'pci_dev'
op|')'
newline|'\n'
name|'pci_dev'
op|'.'
name|'request_id'
op|'='
name|'request'
op|'.'
name|'request_id'
newline|'\n'
name|'alloc_devices'
op|'.'
name|'append'
op|'('
name|'pci_dev'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'break'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'return'
name|'alloc_devices'
newline|'\n'
nl|'\n'
DECL|member|_handle_device_dependents
dedent|''
name|'def'
name|'_handle_device_dependents'
op|'('
name|'self'
op|','
name|'pci_dev'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Remove device dependents or a parent from pools.\n\n In case the device is a PF, all of it\'s dependent VFs should\n be removed from pools count, if these are present.\n When the device is a VF, it\'s parent PF pool count should be\n decreased, unless it is no longer in a pool.\n """'
newline|'\n'
name|'if'
name|'pci_dev'
op|'.'
name|'dev_type'
op|'=='
name|'fields'
op|'.'
name|'PciDeviceType'
op|'.'
name|'SRIOV_PF'
op|':'
newline|'\n'
indent|' '
name|'vfs_list'
op|'='
name|'objects'
op|'.'
name|'PciDeviceList'
op|'.'
name|'get_by_parent_address'
op|'('
nl|'\n'
name|'pci_dev'
op|'.'
name|'_context'
op|','
nl|'\n'
name|'pci_dev'
op|'.'
name|'compute_node_id'
op|','
nl|'\n'
name|'pci_dev'
op|'.'
name|'address'
op|')'
newline|'\n'
name|'if'
name|'vfs_list'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'vf'
name|'in'
name|'vfs_list'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'remove_device'
op|'('
name|'vf'
op|')'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'elif'
name|'pci_dev'
op|'.'
name|'dev_type'
op|'=='
name|'fields'
op|'.'
name|'PciDeviceType'
op|'.'
name|'SRIOV_VF'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'parent'
op|'='
name|'pci_dev'
op|'.'
name|'get_by_dev_addr'
op|'('
name|'pci_dev'
op|'.'
name|'_context'
op|','
nl|'\n'
name|'pci_dev'
op|'.'
name|'compute_node_id'
op|','
nl|'\n'
name|'pci_dev'
op|'.'
name|'parent_addr'
op|')'
newline|'\n'
comment|'# Make sure not to decrease PF pool count if this parent has'
nl|'\n'
comment|'# been already removed from pools'
nl|'\n'
name|'if'
name|'parent'
name|'in'
name|'self'
op|'.'
name|'get_free_devs'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'remove_device'
op|'('
name|'parent'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'exception'
op|'.'
name|'PciDeviceNotFound'
op|':'
newline|'\n'
indent|' '
name|'return'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_filter_pools_for_spec
name|'def'
name|'_filter_pools_for_spec'
op|'('
name|'pools'
op|','
name|'request_specs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
name|'pool'
name|'for'
name|'pool'
name|'in'
name|'pools'
nl|'\n'
name|'if'
name|'utils'
op|'.'
name|'pci_device_prop_match'
op|'('
name|'pool'
op|','
name|'request_specs'
op|')'
op|']'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_filter_pools_for_numa_cells
name|'def'
name|'_filter_pools_for_numa_cells'
op|'('
name|'pools'
op|','
name|'numa_cells'
op|')'
op|':'
newline|'\n'
comment|"# Some systems don't report numa node info for pci devices, in"
nl|'\n'
comment|'# that case None is reported in pci_device.numa_node, by adding None'
nl|'\n'
comment|'# to numa_cells we allow assigning those devices to instances with'
nl|'\n'
comment|'# numa topology'
nl|'\n'
indent|' '
name|'numa_cells'
op|'='
op|'['
name|'None'
op|']'
op|'+'
op|'['
name|'cell'
op|'.'
name|'id'
name|'for'
name|'cell'
name|'in'
name|'numa_cells'
op|']'
newline|'\n'
comment|'# filter out pools which numa_node is not included in numa_cells'
nl|'\n'
name|'return'
op|'['
name|'pool'
name|'for'
name|'pool'
name|'in'
name|'pools'
name|'if'
name|'any'
op|'('
name|'utils'
op|'.'
name|'pci_device_prop_match'
op|'('
nl|'\n'
name|'pool'
op|','
op|'['
op|'{'
string|"'numa_node'"
op|':'
name|'cell'
op|'}'
op|']'
op|')'
nl|'\n'
name|'for'
name|'cell'
name|'in'
name|'numa_cells'
op|')'
op|']'
newline|'\n'
nl|'\n'
DECL|member|_filter_non_requested_pfs
dedent|''
name|'def'
name|'_filter_non_requested_pfs'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'matching_pools'
op|')'
op|':'
newline|'\n'
comment|'# Remove SRIOV_PFs from pools, unless it has been explicitly requested'
nl|'\n'
comment|'# This is especially needed in cases where PFs and VFs has the same'
nl|'\n'
comment|'# product_id.'
nl|'\n'
indent|' '
name|'if'
name|'all'
op|'('
name|'spec'
op|'.'
name|'get'
op|'('
string|"'dev_type'"
op|')'
op|'!='
name|'fields'
op|'.'
name|'PciDeviceType'
op|'.'
name|'SRIOV_PF'
name|'for'
nl|'\n'
name|'spec'
name|'in'
name|'request'
op|'.'
name|'spec'
op|')'
op|':'
newline|'\n'
indent|' '
name|'matching_pools'
op|'='
name|'self'
op|'.'
name|'_filter_pools_for_pfs'
op|'('
name|'matching_pools'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'matching_pools'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_filter_pools_for_pfs
name|'def'
name|'_filter_pools_for_pfs'
op|'('
name|'pools'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
name|'pool'
name|'for'
name|'pool'
name|'in'
name|'pools'
nl|'\n'
name|'if'
name|'not'
name|'pool'
op|'.'
name|'get'
op|'('
string|"'dev_type'"
op|')'
op|'=='
name|'fields'
op|'.'
name|'PciDeviceType'
op|'.'
name|'SRIOV_PF'
op|']'
newline|'\n'
nl|'\n'
DECL|member|_apply_request
dedent|''
name|'def'
name|'_apply_request'
op|'('
name|'self'
op|','
name|'pools'
op|','
name|'request'
op|','
name|'numa_cells'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(vladikr): This code maybe open to race conditions.'
nl|'\n'
comment|'# Two concurrent requests may succeed when called support_requests'
nl|'\n'
comment|'# because this method does not remove related devices from the pools'
nl|'\n'
indent|' '
name|'count'
op|'='
name|'request'
op|'.'
name|'count'
newline|'\n'
name|'matching_pools'
op|'='
name|'self'
op|'.'
name|'_filter_pools_for_spec'
op|'('
name|'pools'
op|','
name|'request'
op|'.'
name|'spec'
op|')'
newline|'\n'
name|'if'
name|'numa_cells'
op|':'
newline|'\n'
indent|' '
name|'matching_pools'
op|'='
name|'self'
op|'.'
name|'_filter_pools_for_numa_cells'
op|'('
name|'matching_pools'
op|','
nl|'\n'
name|'numa_cells'
op|')'
newline|'\n'
dedent|''
name|'matching_pools'
op|'='
name|'self'
op|'.'
name|'_filter_non_requested_pfs'
op|'('
name|'request'
op|','
nl|'\n'
name|'matching_pools'
op|')'
newline|'\n'
name|'if'
name|'sum'
op|'('
op|'['
name|'pool'
op|'['
string|"'count'"
op|']'
name|'for'
name|'pool'
name|'in'
name|'matching_pools'
op|']'
op|')'
op|'<'
name|'count'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'False'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'pool'
name|'in'
name|'matching_pools'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
name|'self'
op|'.'
name|'_decrease_pool_count'
op|'('
name|'pools'
op|','
name|'pool'
op|','
name|'count'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'count'
op|':'
newline|'\n'
indent|' '
name|'break'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'return'
name|'True'
newline|'\n'
nl|'\n'
DECL|member|support_requests
dedent|''
name|'def'
name|'support_requests'
op|'('
name|'self'
op|','
name|'requests'
op|','
name|'numa_cells'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Check if the pci requests can be met.\n\n Scheduler checks compute node\'s PCI stats to decide if an\n instance can be scheduled into the node. Support does not\n mean real allocation.\n If numa_cells is provided then only devices contained in\n those nodes are considered.\n """'
newline|'\n'
comment|'# note (yjiang5): this function has high possibility to fail,'
nl|'\n'
comment|'# so no exception should be triggered for performance reason.'
nl|'\n'
name|'pools'
op|'='
name|'copy'
op|'.'
name|'deepcopy'
op|'('
name|'self'
op|'.'
name|'pools'
op|')'
newline|'\n'
name|'return'
name|'all'
op|'('
op|'['
name|'self'
op|'.'
name|'_apply_request'
op|'('
name|'pools'
op|','
name|'r'
op|','
name|'numa_cells'
op|')'
nl|'\n'
name|'for'
name|'r'
name|'in'
name|'requests'
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|apply_requests
dedent|''
name|'def'
name|'apply_requests'
op|'('
name|'self'
op|','
name|'requests'
op|','
name|'numa_cells'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Apply PCI requests to the PCI stats.\n\n This is used in multiple instance creation, when the scheduler has to\n maintain how the resources are consumed by the instances.\n If numa_cells is provided then only devices contained in\n those nodes are considered.\n """'
newline|'\n'
name|'if'
name|'not'
name|'all'
op|'('
op|'['
name|'self'
op|'.'
name|'_apply_request'
op|'('
name|'self'
op|'.'
name|'pools'
op|','
name|'r'
op|','
name|'numa_cells'
op|')'
nl|'\n'
name|'for'
name|'r'
name|'in'
name|'requests'
op|']'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'PciDeviceRequestFailed'
op|'('
name|'requests'
op|'='
name|'requests'
op|')'
newline|'\n'
nl|'\n'
DECL|member|__iter__
dedent|''
dedent|''
name|'def'
name|'__iter__'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|"# 'devices' shouldn't be part of stats"
nl|'\n'
indent|' '
name|'pools'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'pool'
name|'in'
name|'self'
op|'.'
name|'pools'
op|':'
newline|'\n'
indent|' '
name|'tmp'
op|'='
op|'{'
name|'k'
op|':'
name|'v'
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'pool'
op|')'
name|'if'
name|'k'
op|'!='
string|"'devices'"
op|'}'
newline|'\n'
name|'pools'
op|'.'
name|'append'
op|'('
name|'tmp'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'iter'
op|'('
name|'pools'
op|')'
newline|'\n'
nl|'\n'
DECL|member|clear
dedent|''
name|'def'
name|'clear'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Clear all the stats maintained."""'
newline|'\n'
name|'self'
op|'.'
name|'pools'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
DECL|member|__eq__
dedent|''
name|'def'
name|'__eq__'
op|'('
name|'self'
op|','
name|'other'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'cmp'
op|'('
name|'self'
op|'.'
name|'pools'
op|','
name|'other'
op|'.'
name|'pools'
op|')'
op|'=='
number|'0'
newline|'\n'
nl|'\n'
DECL|member|__ne__
dedent|''
name|'def'
name|'__ne__'
op|'('
name|'self'
op|','
name|'other'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'not'
op|'('
name|'self'
op|'=='
name|'other'
op|')'
newline|'\n'
nl|'\n'
DECL|member|to_device_pools_obj
dedent|''
name|'def'
name|'to_device_pools_obj'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return the contents of the pools as a PciDevicePoolList object."""'
newline|'\n'
name|'stats'
op|'='
op|'['
name|'x'
name|'for'
name|'x'
name|'in'
name|'self'
op|']'
newline|'\n'
name|'return'
name|'pci_device_pool'
op|'.'
name|'from_pci_stats'
op|'('
name|'stats'
op|')'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 14.37246 | 1,148 | 0.61167 | 3,855 | 25,468 | 3.949935 | 0.102464 | 0.125304 | 0.079464 | 0.055691 | 0.637026 | 0.56518 | 0.494319 | 0.425954 | 0.346884 | 0.31477 | 0 | 0.001088 | 0.133736 | 25,468 | 1,771 | 1,149 | 14.380576 | 0.689103 | 0 | 0 | 0.923207 | 0 | 0.002259 | 0.540953 | 0.030116 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.001129 | 0.006211 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
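De-tokenized, the `_decrease_pool_count` static method in the token stream above reduces a pool's size by `count`, removes the pool from the list once it empties, and returns how many devices still remain to be consumed from other pools. A hedged plain-Python reconstruction of that logic (function name de-underscored for standalone use):

```python
def decrease_pool_count(pool_list, pool, count=1):
    """Decrement pool's size by count; if the pool becomes empty, drop it
    from pool_list. Return the remainder still to be consumed elsewhere."""
    if pool['count'] > count:
        pool['count'] -= count
        count = 0
    else:
        count -= pool['count']
        pool_list.remove(pool)
    return count

# Asking for 3 devices from a pool of 2 drains and removes that pool,
# leaving 1 device to be taken from the next matching pool.
pools = [{'count': 2}, {'count': 5}]
leftover = decrease_pool_count(pools, pools[0], count=3)
```

This is exactly how `_apply_request` in the same file walks the filtered pools: it keeps calling the decrement until the returned remainder hits zero.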
8d58f2b0959a8386b4c708d7cc38bd2e9f103bb6 | 1,321 | py | Python | pyesasky/__init__.py | pierfra-ro/pyesasky | a9342efcaa5cca088ed9a5afa2c98d3e9aa4bd0f | [
"BSD-3-Clause"
] | 13 | 2019-05-30T19:57:37.000Z | 2021-09-10T09:43:49.000Z | pyesasky/__init__.py | pierfra-ro/pyesasky | a9342efcaa5cca088ed9a5afa2c98d3e9aa4bd0f | [
"BSD-3-Clause"
] | 21 | 2019-06-21T18:55:25.000Z | 2022-02-27T14:48:13.000Z | pyesasky/__init__.py | pierfra-ro/pyesasky | a9342efcaa5cca088ed9a5afa2c98d3e9aa4bd0f | [
"BSD-3-Clause"
] | 8 | 2019-05-30T12:20:48.000Z | 2022-03-04T04:01:20.000Z | from ._version import __version__ # noqa
from .pyesasky import ESASkyWidget # noqa
from .catalogue import Catalogue # noqa
from .catalogueDescriptor import CatalogueDescriptor # noqa
from .cooFrame import CooFrame # noqa
from .footprintSet import FootprintSet # noqa
from .footprintSetDescriptor import FootprintSetDescriptor # noqa
from .HiPS import HiPS # noqa
from .imgFormat import ImgFormat # noqa
from .jupyter_server import load_jupyter_server_extension # noqa
from .metadataDescriptor import MetadataDescriptor # noqa
from .metadataType import MetadataType # noqa
import json
from pathlib import Path
HERE = Path(__file__).parent.resolve()
with (HERE / "labextension" / "package.json").open() as fid:
data = json.load(fid)
# Jupyter Extension points
def _jupyter_nbextension_paths():
return [{'section': 'notebook',
# the path is relative to the `pyesasky` directory
'src': 'nbextension/static',
# directory in the `nbextension/` namespace
'dest': 'pyesasky',
# _also_ in the `nbextension/` namespace
'require': 'pyesasky/extension'}]
def _jupyter_server_extension_paths():
return [{"module": "pyesasky"}]
def _jupyter_labextension_paths():
return [{
"src": "labextension",
"dest": data["name"]
}]
| 33.025 | 65 | 0.711582 | 141 | 1,321 | 6.489362 | 0.382979 | 0.096175 | 0.048087 | 0.054645 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19455 | 1,321 | 39 | 66 | 33.871795 | 0.859962 | 0.161998 | 0 | 0 | 0 | 0 | 0.123049 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103448 | false | 0 | 0.482759 | 0.103448 | 0.689655 | 0.068966 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 3 |
8d98eec2f752514e211b3f9e607274f2de78ffd9 | 3,543 | py | Python | physprog/tests/test_sample_problem.py | partofthething/physprog | 8bbeb8d84697469417577c76c924dcb3a855cd2d | [
"Apache-2.0"
] | 3 | 2018-03-25T16:13:53.000Z | 2021-06-29T14:30:20.000Z | physprog/tests/test_sample_problem.py | partofthething/physprog | 8bbeb8d84697469417577c76c924dcb3a855cd2d | [
"Apache-2.0"
] | null | null | null | physprog/tests/test_sample_problem.py | partofthething/physprog | 8bbeb8d84697469417577c76c924dcb3a855cd2d | [
"Apache-2.0"
] | 2 | 2021-09-18T08:38:32.000Z | 2022-03-01T07:43:52.000Z | """Run a sample problem to test full system."""
# pylint: disable=invalid-name,missing-docstring
import unittest
from collections import namedtuple
import math
import os
from physprog import classfunctions
from physprog import optimize
THIS_DIR = os.path.dirname(__file__)
SAMPLE_INPUT = os.path.join(THIS_DIR, 'sample-input.yaml')
class TestInput(unittest.TestCase):
"""Test that input can be read."""
def test_read_class_functions(self):
functions = classfunctions.from_input(SAMPLE_INPUT)
self.assertTrue('frequency' in functions)
class Test_Sample_Problem(unittest.TestCase):
"""Test by optimizing a beam problem from the literature."""
def test_optimization(self):
beam = SampleProblemBeam()
# check initial conditions
self.assertAlmostEqual(beam.frequency(), 113.0, delta=0.5)
self.assertAlmostEqual(beam.cost(), 1060.0)
self.assertAlmostEqual(beam.mass(), 2230.0)
prefs = classfunctions.from_input(SAMPLE_INPUT)
optimize.optimize(beam, prefs, plot=False)
# not rigorous, but happens in this problem
self.assertLess(beam.cost(), 1060.0)
SampleDesign = namedtuple('SampleDesign', ['d1', 'd2', 'd3', 'b', 'L'])
class SampleProblemBeam(object):
"""Sample beam design problem from Messac, 1996."""
E1 = 1.6e9
C1 = 500.0
RHO1 = 100.0
E2 = 70e9
C2 = 1500.0
RHO2 = 2770.0
E3 = 200e9
C3 = 800.0
RHO3 = 7780.0
def __init__(self):
self._design = SampleDesign(0.3, 0.35, 0.40, 0.40, 5.0) # initial
def evaluate(self, x=None):
"""Convert input design into output design parameters."""
if x is not None:
self.design = x
return [self.frequency(), self.cost(), self.width(), self.length(),
self.mass(), self.semiheight(), self.width_layer1(),
self.width_layer2(), self.width_layer3()]
@property
def design(self):
return self._design
@design.setter
def design(self, val):
self._design = SampleDesign(*val)
@property
def ei(self):
ds = self.design
return 2.0 / 3.0 * ds.b * (self.E1 * ds.d1 ** 3 +
self.E2 * (ds.d2 ** 3 - ds.d1 ** 3) +
self.E3 * (ds.d3 ** 3 - ds.d2 ** 3))
@property
def mu(self):
ds = self.design
return 2 * ds.b * (self.RHO1 * ds.d1 +
self.RHO2 * (ds.d2 - ds.d1) +
self.RHO3 * (ds.d3 - ds.d2))
def frequency(self):
return math.pi / (2 * self.design.L ** 2) * math.sqrt(self.ei / self.mu)
def cost(self):
ds = self.design
# cost in the paper says 1060 but I'm getting 212, exactly a
# factor of 5 off. But why?? Ah, because cost should have L in it!
# That's a typo in the paper.
return 2 * ds.b * ds.L * (self.C1 * ds.d1 +
self.C2 * (ds.d2 - ds.d1) +
self.C3 * (ds.d3 - ds.d2))
def width(self):
return self.design.b
def length(self):
return self.design.L
def mass(self):
return self.mu * self.design.L
def semiheight(self):
return self.design.d3
def width_layer1(self):
return self.design.d1
def width_layer2(self):
return self.design.d2 - self.design.d1
def width_layer3(self):
return self.design.d3 - self.design.d2
if __name__ == '__main__':
unittest.main()
| 28.804878 | 80 | 0.582275 | 470 | 3,543 | 4.308511 | 0.317021 | 0.083951 | 0.055309 | 0.069136 | 0.120494 | 0.022716 | 0 | 0 | 0 | 0 | 0 | 0.057623 | 0.294666 | 3,543 | 122 | 81 | 29.040984 | 0.752701 | 0.140559 | 0 | 0.074074 | 0 | 0 | 0.017922 | 0 | 0 | 0 | 0 | 0 | 0.061728 | 1 | 0.209877 | false | 0 | 0.074074 | 0.111111 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
a5d00dc3b88e76c00327d591e70ffe150f4013d2 | 1,946 | py | Python | esercizio_1/untitled1.py | navyzigz420/python_lab | a3496d8b170e334abfb5099bf6ee03df5e226b78 | [
"Apache-2.0"
] | null | null | null | esercizio_1/untitled1.py | navyzigz420/python_lab | a3496d8b170e334abfb5099bf6ee03df5e226b78 | [
"Apache-2.0"
] | null | null | null | esercizio_1/untitled1.py | navyzigz420/python_lab | a3496d8b170e334abfb5099bf6ee03df5e226b78 | [
"Apache-2.0"
] | null | null | null | bits = '110'
def turnBitsIntoInteger(listOfBits):
valore = 0
lunghezza = len(listOfBits)
for x in range(lunghezza):
if listOfBits[x] != '0' and listOfBits[x] != '1':
raise Exception('Not a combination of bits!')
valore = valore + 2**(lunghezza -1 - x) * int(listOfBits[x])
if valore != int(listOfBits,2):
raise Exception('Mismatch: {} differs from {}!'.format(valore, int(listOfBits, 2)))
return valore
#print(str(valore))
print(turnBitsIntoInteger(bits))
| 1.3637 | 92 | 0.167009 | 64 | 1,946 | 5.078125 | 0.515625 | 0.101538 | 0.116923 | 0.123077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024814 | 0.792909 | 1,946 | 1,426 | 93 | 1.364656 | 0.781638 | 0.00925 | 0 | 0 | 0 | 0 | 0.076775 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.166667 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
5701301232a4492ca4d517aea40acc01301fe2f8 | 329 | py | Python | aoc2020/d01_report_repair/methods.py | sflis/aoc2020 | ef6ee81c18b6ec8332b150638b3d78772fe8327a | [
"Unlicense"
] | null | null | null | aoc2020/d01_report_repair/methods.py | sflis/aoc2020 | ef6ee81c18b6ec8332b150638b3d78772fe8327a | [
"Unlicense"
] | null | null | null | aoc2020/d01_report_repair/methods.py | sflis/aoc2020 | ef6ee81c18b6ec8332b150638b3d78772fe8327a | [
"Unlicense"
] | null | null | null | import numpy as np
def sum_pair_equals(values, val_eq):
values = np.array(values)
return np.where(values + values[:, None] == val_eq)[0]
def sum_triad_equals(values, val_eq):
values = np.array(values)
sum_ = values + values[:, None] + values[:, None, None]
return np.array(np.where(sum_ == val_eq))[:, 0]
| 25.307692 | 59 | 0.653495 | 51 | 329 | 4.019608 | 0.333333 | 0.097561 | 0.146341 | 0.165854 | 0.35122 | 0.35122 | 0.35122 | 0.35122 | 0 | 0 | 0 | 0.007491 | 0.18845 | 329 | 12 | 60 | 27.416667 | 0.7603 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
57071627a3f7ead2f2e5161d076288e623b02921 | 160 | py | Python | src/utilities/grammar.py | sonishreyas/news_scraper | 7cd1bd9eb14fb903fc7b190b04191237da0a1d23 | [
"MIT"
] | null | null | null | src/utilities/grammar.py | sonishreyas/news_scraper | 7cd1bd9eb14fb903fc7b190b04191237da0a1d23 | [
"MIT"
] | null | null | null | src/utilities/grammar.py | sonishreyas/news_scraper | 7cd1bd9eb14fb903fc7b190b04191237da0a1d23 | [
"MIT"
] | null | null | null | from gingerit.gingerit import GingerIt
def check_grammar(text):
parser = GingerIt()
correct_text = parser.parse(text)
return correct_text['result'] | 26.666667 | 38 | 0.74375 | 20 | 160 | 5.8 | 0.6 | 0.172414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1625 | 160 | 6 | 39 | 26.666667 | 0.865672 | 0 | 0 | 0 | 0 | 0 | 0.037267 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
570f8d367a6c727fc6ef795d72a90ef7bea75141 | 2,024 | py | Python | graph.py | mrpatiwi/k-walk-py | a800f64079024716b26c0ebb9c3a2c5b6a935b78 | [
"MIT"
] | null | null | null | graph.py | mrpatiwi/k-walk-py | a800f64079024716b26c0ebb9c3a2c5b6a935b78 | [
"MIT"
] | null | null | null | graph.py | mrpatiwi/k-walk-py | a800f64079024716b26c0ebb9c3a2c5b6a935b78 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from index_matrix import Matrix
__author__ = 'Patricio Lopez Juri'
class Graph:
def __init__(self, V, E, K):
self.V = V
self.E = E
self.K = K
self.A = Matrix.square(items=V)
for a, b in self.E:
self.A[a, b] = 1
@property
def order(self):
return len(self.V)
@property
def n(self):
return self.order
@property
def k(self):
return len(self.K)
def d(self, i):
return self.A.row(i).sum()
def D(self):
D = Matrix.square(items=self.V)
for item in self.V:
D[item, item] = self.d(item)
return D
def p(self, i, j):
return self.A[i, j] / self.d(i)
def p_star(self, start, i, j):
absorbents = [k for k in self.K if k != start]
if i in absorbents and i == j:
return 1
elif i in absorbents and i != j:
return 0
else:
return self.p(i, j)
def P(self):
return self.D().inverse() * self.A
def P_star(self, start):
def function(i, j):
return self.p_star(start, i, j)
P_star = self.A.map(function=function)
absorbents = [k for k in self.K if k != start]
for i, absorbent in enumerate(absorbents):
P_star.swap_columns(P_star.horizontal_items[-(1 + i)], absorbent)
return P_star
def Q_star(self, start):
size = self.n - self.k + 1
return self.P_star(start)[0:size, 0:size]
def R_star(self, start):
size = self.n - self.k + 1
return self.P_star(start)[0:size, size:(size + self.k - 1)]
def Zero_star(self, start):
pass
def I_star(self, start):
size = self.n - self.k + 1
range = size + self.k - 1
return self.P_star(start)[size:range, size:range]
def N_star(self, start):
Q_star = self.Q_star(start)
identity = Q_star.clone_identity()
return (identity - Q_star).inverse()
| 23.264368 | 77 | 0.535079 | 312 | 2,024 | 3.371795 | 0.198718 | 0.047529 | 0.086502 | 0.057034 | 0.306084 | 0.254753 | 0.254753 | 0.209125 | 0.184411 | 0.157795 | 0 | 0.009709 | 0.338439 | 2,024 | 86 | 78 | 23.534884 | 0.775952 | 0.010375 | 0 | 0.129032 | 0 | 0 | 0.009495 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.258065 | false | 0.016129 | 0.016129 | 0.112903 | 0.548387 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
9394ac8b332dbc27f6671e32b2abfcd0890092b3 | 117 | py | Python | web_scraping/ec2files/ec2file78.py | nikibhatt/Groa | fc2d4ae87cb825e6d54a0831c72be16541eebe61 | [
"MIT"
] | 1 | 2020-04-08T20:11:48.000Z | 2020-04-08T20:11:48.000Z | web_scraping/ec2files/ec2file78.py | cmgospod/Groa | 31b3624bfe61e772b55f8175b4e95d63c9e67966 | [
"MIT"
] | null | null | null | web_scraping/ec2files/ec2file78.py | cmgospod/Groa | 31b3624bfe61e772b55f8175b4e95d63c9e67966 | [
"MIT"
] | 1 | 2020-09-12T07:07:41.000Z | 2020-09-12T07:07:41.000Z | from scraper import *
s = Scraper(start=138996, end=140777, max_iter=30, scraper_instance=78)
s.scrape_letterboxd() | 39 | 72 | 0.777778 | 18 | 117 | 4.888889 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152381 | 0.102564 | 117 | 3 | 73 | 39 | 0.685714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
93cb4419d9691b2ed3418c709e86de6b48657ce2 | 122 | py | Python | Day_2_Software_engineering_best_practices/solutions/06_07_08_full_package/spectra_analysis/__init__.py | Morisset/python-workshop | ec8b0c4f08a24833e53a22f6b52566a08715c9d0 | [
"BSD-3-Clause"
] | null | null | null | Day_2_Software_engineering_best_practices/solutions/06_07_08_full_package/spectra_analysis/__init__.py | Morisset/python-workshop | ec8b0c4f08a24833e53a22f6b52566a08715c9d0 | [
"BSD-3-Clause"
] | null | null | null | Day_2_Software_engineering_best_practices/solutions/06_07_08_full_package/spectra_analysis/__init__.py | Morisset/python-workshop | ec8b0c4f08a24833e53a22f6b52566a08715c9d0 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Spectra analysis utilities
"""
from ._version import __version__
__all__ = ['__version__']
| 12.2 | 33 | 0.663934 | 12 | 122 | 5.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009804 | 0.163934 | 122 | 9 | 34 | 13.555556 | 0.656863 | 0.401639 | 0 | 0 | 0 | 0 | 0.171875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
93db9daeaca176a0d9639c9a8adf4162b78f5785 | 52 | py | Python | list_ebs.py | willfong/aws-helper | 21708044fbf95b76393e9b5f0e86c5e74ff11c77 | [
"MIT"
] | null | null | null | list_ebs.py | willfong/aws-helper | 21708044fbf95b76393e9b5f0e86c5e74ff11c77 | [
"MIT"
] | null | null | null | list_ebs.py | willfong/aws-helper | 21708044fbf95b76393e9b5f0e86c5e74ff11c77 | [
"MIT"
] | null | null | null | import boto3
aws_ebs_client = boto3.client('ebs')
| 10.4 | 36 | 0.75 | 8 | 52 | 4.625 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 0.134615 | 52 | 4 | 37 | 13 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
93e251e9378f58f91368189ca0f98d7e9d184630 | 173 | py | Python | Demos/Demo-4.2 Modules/script_3.py | Josverl/MicroPython-Bootcamp | 29f5ccc9768fbea621029dcf6eea9c91ff84c1d5 | [
"MIT"
] | 4 | 2018-04-28T13:43:20.000Z | 2021-03-11T16:10:35.000Z | Demos/Demo-4.2 Modules/script_3.py | Josverl/MicroPython-Bootcamp | 29f5ccc9768fbea621029dcf6eea9c91ff84c1d5 | [
"MIT"
] | null | null | null | Demos/Demo-4.2 Modules/script_3.py | Josverl/MicroPython-Bootcamp | 29f5ccc9768fbea621029dcf6eea9c91ff84c1d5 | [
"MIT"
] | null | null | null | # import just one function from a module
# to save memory
from module import dowork
# now we can use a different name to get to the imported function
#
dowork(13,45)
dir() | 19.222222 | 63 | 0.745665 | 31 | 173 | 4.16129 | 0.741935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028986 | 0.202312 | 173 | 9 | 64 | 19.222222 | 0.905797 | 0.676301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
93e9f8a0d79848804615cc209c301de7b2ddbead | 151 | py | Python | chap01/07.py | knuu/nlp100 | 5008d678d7c8d15057ac67fe68b0667657c39b29 | [
"MIT"
] | 1 | 2015-09-11T10:33:42.000Z | 2015-09-11T10:33:42.000Z | chap01/07.py | knuu/nlp100 | 5008d678d7c8d15057ac67fe68b0667657c39b29 | [
"MIT"
] | null | null | null | chap01/07.py | knuu/nlp100 | 5008d678d7c8d15057ac67fe68b0667657c39b29 | [
"MIT"
] | null | null | null | def makeSentence(x, y, z):
return '{0}時の{1}は{2}'.format(x, y, z)
if __name__ == '__main__':
ans = makeSentence(12, '気温', 22.4)
print(ans)
| 21.571429 | 41 | 0.576159 | 26 | 151 | 3.038462 | 0.807692 | 0.050633 | 0.075949 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 0.205298 | 151 | 6 | 42 | 25.166667 | 0.591667 | 0 | 0 | 0 | 0 | 0 | 0.145695 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0.2 | 0.4 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
93f2524a7c6f2e836f91bdc023c3abf9b271eb96 | 56 | py | Python | mikaponics/foundation/__init__.py | mikaponics/mikaponics-back | 98e1ff8bab7dda3492e5ff637bf5aafd111c840c | [
"BSD-3-Clause"
] | 2 | 2019-04-30T23:51:41.000Z | 2019-05-04T00:35:52.000Z | mikaponics/foundation/__init__.py | mikaponics/mikaponics-back | 98e1ff8bab7dda3492e5ff637bf5aafd111c840c | [
"BSD-3-Clause"
] | 27 | 2019-04-30T20:22:28.000Z | 2022-02-10T08:10:32.000Z | mikaponics/foundation/__init__.py | mikaponics/mikaponics-back | 98e1ff8bab7dda3492e5ff637bf5aafd111c840c | [
"BSD-3-Clause"
] | 1 | 2019-03-08T18:24:23.000Z | 2019-03-08T18:24:23.000Z | default_app_config = 'foundation.apps.FoundationConfig'
| 28 | 55 | 0.857143 | 6 | 56 | 7.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053571 | 56 | 1 | 56 | 56 | 0.867925 | 0 | 0 | 0 | 0 | 0 | 0.571429 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
93fcc6644a7bd3a91ddcfdaa15c6e3faf2dbec83 | 137 | py | Python | tests/views/test_healthcheck.py | oliveryuen/python-flask | a53c9ed823fc2f63c416e5a3b47e91f5c9d91604 | [
"Apache-2.0"
] | null | null | null | tests/views/test_healthcheck.py | oliveryuen/python-flask | a53c9ed823fc2f63c416e5a3b47e91f5c9d91604 | [
"Apache-2.0"
] | null | null | null | tests/views/test_healthcheck.py | oliveryuen/python-flask | a53c9ed823fc2f63c416e5a3b47e91f5c9d91604 | [
"Apache-2.0"
] | null | null | null | """Test health check"""
def test_healthcheck(client):
response = client.get("/healthcheck")
assert response.status_code == 200
| 19.571429 | 41 | 0.70073 | 16 | 137 | 5.875 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026087 | 0.160584 | 137 | 6 | 42 | 22.833333 | 0.791304 | 0.124088 | 0 | 0 | 0 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9e08eda6cab019bc0097ad8470c08bcc09a74c92 | 5,959 | py | Python | code/geometry/sector.py | Prometheus3375/inno-thesis | 72245706fa25b49f333e08d6074d421b5becfdb5 | [
"BSD-3-Clause"
] | null | null | null | code/geometry/sector.py | Prometheus3375/inno-thesis | 72245706fa25b49f333e08d6074d421b5becfdb5 | [
"BSD-3-Clause"
] | null | null | null | code/geometry/sector.py | Prometheus3375/inno-thesis | 72245706fa25b49f333e08d6074d421b5becfdb5 | [
"BSD-3-Clause"
] | null | null | null | from io import StringIO
from math import atan2, ceil
from typing import Literal, Union, overload
from common import PI, Real, TWOPI, deg, real, reduce_angle
from functions import qbezeir_svg_given_middle
from .circle import CircleBase, FixedCircle
from .point import PointBase, Polar
def check_arc(value: float, /):
if not (0 < value < TWOPI):
raise ValueError(f'arc should be in range (0°, 360°), got {deg(value):.0g}°')
class SectorBase:
__slots__ = '_circle', '_arc', '_arm'
def __init__(self, circle: FixedCircle, arc: float, arm: float, /):
self._circle = circle
self._arc = arc
self._arm = arm
@property
def circle(self, /):
return self._circle
@property
def arc(self, /):
return self._arc
@property
def start_arm(self, /):
return self._arm
@property
def end_arm(self, /):
return self._arm - self._arc
@property
def end_arm_reduced(self, /):
return reduce_angle(self.end_arm)
def __repr__(self, /):
return (
f'{self.__class__.__name__}('
f'{self.circle}, '
f'arc={deg(self.arc):.0f}°, '
f'start_arm={deg(self.start_arm):.0f}°'
f')'
)
def copy(self, /):
return self.__class__(self.circle.copy(), self.arc, self.start_arm)
def __getnewargs__(self, /):
return self._circle, self._arc, self._arm
def fix(self, /) -> 'FixedSector':
raise NotImplementedError
def unfix(self, /) -> 'MutableSector':
raise NotImplementedError
def __eq__(self, other, /):
if isinstance(other, SectorBase):
return self.arc == other.arc and self.start_arm == self.start_arm and self.circle == other.circle
return NotImplemented
def __ne__(self, other, /):
if isinstance(other, SectorBase):
return self.arc != other.arc or self.start_arm != self.start_arm or self.circle != other.circle
return NotImplemented
def is_angle_inside(self, fi: Real, /) -> bool:
fi = reduce_angle(fi)
start = self.start_arm
end = self.end_arm_reduced
if end > start:
return end <= fi <= PI or -PI < fi <= start
return end <= fi <= start
def is_point_inside(self, p: PointBase, /) -> bool:
# Another way https://stackoverflow.com/a/13675772
x = p.x - self.circle.center.x
y = p.y - self.circle.center.y
r2 = x * x + y * y
if r2 == 0:
return True
if r2 > self.circle.r2:
return False
return self.is_angle_inside(atan2(y, x))
def __contains__(self, item, /):
if isinstance(item, real):
return self.is_angle_inside(item)
if isinstance(item, PointBase):
return self.is_point_inside(item)
return False
def as_plotly_shape(self, step_angle: Real = PI / 6, /) -> dict:
# Simulate circle arc with quadratic Bezier curves
center = self.circle.center
r = self.circle.radius
n = ceil(self.arc / step_angle) - 1
p0 = Polar(r, self.start_arm) + center
path = StringIO()
path.write(
f'M {center.x} {center.y} '
f'L {p0.x} {p0.y} '
)
arm = self.start_arm
for _ in range(n):
pm = Polar(r, arm - step_angle / 2) + center
arm -= step_angle
p2 = Polar(r, arm) + center
path.write(f'{qbezeir_svg_given_middle(p0, p2, pm)} ')
p0 = p2
p2 = Polar(r, self.end_arm) + center
pm = Polar(r, (arm + self.end_arm) / 2) + center
path.write(f'{qbezeir_svg_given_middle(p0, p2, pm)} Z')
return dict(
type='path',
path=path.getvalue()
)
class FixedSector(SectorBase):
__slots__ = '_hash',
def __init__(self, circle: FixedCircle, arc: float, arm: float, /):
super().__init__(circle, arc, arm)
self._hash = hash(frozenset((circle, arc, arm)))
def fix(self, /):
return self
def unfix(self, /):
return MutableSector(self.circle, self.arc, self.start_arm)
def __hash__(self, /):
return self._hash
class MutableSector(SectorBase):
__slots__ = ()
# TODO: add circle changing
@property
def arc(self, /):
return self._arc
@arc.setter
def arc(self, value: Real, /):
check_arc(value)
self._arc = float(value)
@property
def start_arm(self, /):
return self._arm
@start_arm.setter
def start_arm(self, value: Real, /):
self._arm = reduce_angle(float(value))
@property
def end_arm(self, /):
return self._arm - self._arc
@end_arm.setter
def end_arm(self, value: Real, /):
self._arm = reduce_angle(value + self._arc)
def fix(self, /):
return FixedSector(self.circle, self.arc, self.start_arm)
def unfix(self, /):
return self
def rotate(self, angle: Real, /):
"""
Rotates the sector by the given angle clockwise
"""
self.start_arm -= angle
@overload
def Sector(circle: CircleBase, arc: Real, start_arm: Real = PI, /) -> FixedSector: ...
@overload
def Sector(circle: CircleBase, arc: Real, start_arm: Real = PI, /, *, fix: Literal[True]) -> FixedSector: ...
@overload
def Sector(circle: CircleBase, arc: Real, start_arm: Real = PI, /, *, fix: Literal[False]) -> MutableSector: ...
@overload
def Sector(circle: CircleBase, arc: Real, start_arm: Real = PI, /, *,
fix: bool) -> Union[FixedSector, MutableSector]: ...
def Sector(circle: CircleBase, arc: Real, start_arm: Real = PI, /, *, fix: bool = True) -> SectorBase:
check_arc(arc)
arc = float(arc)
start_arm = float(start_arm)
if fix:
return FixedSector(circle.fix(), arc, start_arm)
return MutableSector(circle.fix(), arc, start_arm)
| 26.721973 | 112 | 0.587515 | 757 | 5,959 | 4.425363 | 0.179657 | 0.06209 | 0.050149 | 0.037313 | 0.363881 | 0.33194 | 0.311045 | 0.266269 | 0.205373 | 0.179104 | 0 | 0.008939 | 0.286625 | 5,959 | 222 | 113 | 26.842342 | 0.777935 | 0.028864 | 0 | 0.258065 | 0 | 0 | 0.056761 | 0.025169 | 0 | 0 | 0 | 0.004505 | 0 | 1 | 0.232258 | false | 0 | 0.045161 | 0.103226 | 0.516129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
f50a95d4fbb66571658a68aa0a66854f9c5c4220 | 437 | py | Python | src/my_package/todelete/modules/MotionSymmetryModule.py | laomao0/AIM_DAIN | 8322569498d675d3b2c1f35475c1299cad580bde | [
"MIT"
] | 3 | 2020-05-08T20:45:57.000Z | 2021-01-18T11:32:38.000Z | src/my_package/todelete/modules/MotionSymmetryModule.py | laomao0/AIM_DAIN | 8322569498d675d3b2c1f35475c1299cad580bde | [
"MIT"
] | null | null | null | src/my_package/todelete/modules/MotionSymmetryModule.py | laomao0/AIM_DAIN | 8322569498d675d3b2c1f35475c1299cad580bde | [
"MIT"
] | null | null | null | # modules/InterpolationLayer.py
from torch.nn import Module
from functions.MotionSymmetryLayer import MotionSymmetryLayer
class MotionSymmetryModule(Module):
def __init__(self):
super(MotionSymmetryModule, self).__init__()
self.f = MotionSymmetryLayer()
def forward(self, input1, input2):
return self.f(input1, input2)
# we actually don't need to write the backward code for a module, since the wrapped Function already defines it
| 29.133333 | 81 | 0.741419 | 51 | 437 | 6.196078 | 0.666667 | 0.050633 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011236 | 0.185355 | 437 | 14 | 82 | 31.214286 | 0.876404 | 0.240275 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.125 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
f528d3c7d1c051d306cd7f8c1738faafc34bc81c | 125 | py | Python | Mundo 1/Exercicios/Desafio005.py | yWolfBR/Python-CursoEmVideo | 17bab8ad3c4293daf8377c5d49242942845b3577 | [
"MIT"
] | null | null | null | Mundo 1/Exercicios/Desafio005.py | yWolfBR/Python-CursoEmVideo | 17bab8ad3c4293daf8377c5d49242942845b3577 | [
"MIT"
] | null | null | null | Mundo 1/Exercicios/Desafio005.py | yWolfBR/Python-CursoEmVideo | 17bab8ad3c4293daf8377c5d49242942845b3577 | [
"MIT"
] | null | null | null | n = int(input('Digite um número: '))
print('Seu número é {}. O antecessor é {} e seu sucessor é {}'.format(n, n - 1, n + 1))
| 41.666667 | 87 | 0.592 | 23 | 125 | 3.217391 | 0.652174 | 0.054054 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02 | 0.2 | 125 | 2 | 88 | 62.5 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0.576 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
f532177c4078c1e01572de399b2bc77a18421da8 | 14,159 | py | Python | blender/2.79/scripts/addons/io_coat3D/tex.py | uzairakbar/bpy2.79 | 3a3e0004ac6783c4e4b89d939e4432de99026a85 | [
"MIT"
] | 2 | 2019-11-27T09:05:42.000Z | 2020-02-20T01:25:23.000Z | io_coat3D/tex.py | 1-MillionParanoidTterabytes/blender-addons-master | acc8fc23a38e6e89099c3e5079bea31ce85da06a | [
"Unlicense"
] | null | null | null | io_coat3D/tex.py | 1-MillionParanoidTterabytes/blender-addons-master | acc8fc23a38e6e89099c3e5079bea31ce85da06a | [
"Unlicense"
] | 4 | 2020-02-19T20:02:26.000Z | 2022-02-11T18:47:56.000Z | # ***** BEGIN GPL LICENSE BLOCK *****
#
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ***** END GPL LICENCE BLOCK *****
import bpy
import os
def find_index(objekti):
luku = 0
for tex in objekti.active_material.texture_slots:
if(not(hasattr(tex,'texture'))):
break
luku = luku +1
return luku
def gettex(mat_list, objekti, scene,export):
coat3D = bpy.context.scene.coat3D
coa = objekti.coat3D
if(bpy.context.scene.render.engine == 'VRAY_RENDER' or bpy.context.scene.render.engine == 'VRAY_RENDER_PREVIEW'):
vray = True
else:
vray = False
take_color = 0
take_spec = 0
take_normal = 0
take_disp = 0
bring_color = 1
bring_spec = 1
bring_normal = 1
bring_disp = 1
texcoat = {}
texcoat['color'] = []
texcoat['specular'] = []
texcoat['nmap'] = []
texcoat['disp'] = []
texu = []
if(export):
objekti.coat3D.objpath = export
nimi = os.path.split(export)[1]
osoite = os.path.dirname(export) + os.sep  # may need to change
for mate in objekti.material_slots:
for tex_slot in mate.material.texture_slots:
if(hasattr(tex_slot,'texture')):
if(tex_slot.texture.type == 'IMAGE'):
if tex_slot.texture.image is not None:
tex_slot.texture.image.reload()
else:
if(os.sys.platform == 'win32'):
osoite = os.path.expanduser("~") + os.sep + 'Documents' + os.sep + '3DC2Blender' + os.sep + 'Textures' + os.sep
else:
osoite = os.path.expanduser("~") + os.sep + '3DC2Blender' + os.sep + 'Textures' + os.sep
ki = os.path.split(coa.applink_name)[1]
ko = os.path.splitext(ki)[0]
just_nimi = ko + '_'
just_nimi_len = len(just_nimi)
print('hello: ' + coa.applink_name)
if(len(objekti.material_slots) != 0):
for obj_tex in objekti.active_material.texture_slots:
if(hasattr(obj_tex,'texture')):
if(obj_tex.texture.type == 'IMAGE'):
if(obj_tex.use_map_color_diffuse):
bring_color = 0;
if(obj_tex.use_map_specular):
bring_spec = 0;
if(obj_tex.use_map_normal):
bring_normal = 0;
if(obj_tex.use_map_displacement):
bring_disp = 0;
files = os.listdir(osoite)
for i in files:
tui = i[:just_nimi_len]
if(tui == just_nimi):
texu.append(i)
for yy in texu:
minimi = (yy.rfind('_'))+1
maksimi = (yy.rfind('.'))
tex_name = yy[minimi:maksimi]
koko = ''
koko += osoite
koko += yy
texcoat[tex_name].append(koko)
if((texcoat['color'] or texcoat['nmap'] or texcoat['disp'] or texcoat['specular']) and (len(objekti.material_slots)) == 0):
materials_old = bpy.data.materials.keys()
bpy.ops.material.new()
materials_new = bpy.data.materials.keys()
new_ma = list(set(materials_new).difference(set(materials_old)))
new_mat = new_ma[0]
ki = bpy.data.materials[new_mat]
objekti.data.materials.append(ki)
if(bring_color == 1 and texcoat['color']):
index = find_index(objekti)
tex = bpy.ops.Texture
objekti.active_material.texture_slots.create(index)
total_mat = len(objekti.active_material.texture_slots.items())
useold = ''
for seekco in bpy.data.textures:
if((seekco.name[:5] == 'Color') and (seekco.users_material == ())):
useold = seekco
if(useold == ''):
textures_old = bpy.data.textures.keys()
bpy.data.textures.new('Color',type='IMAGE')
textures_new = bpy.data.textures.keys()
name_te = list(set(textures_new).difference(set(textures_old)))
name_tex = name_te[0]
bpy.ops.image.new(name=name_tex)
bpy.data.images[name_tex].filepath = texcoat['color'][0]
bpy.data.images[name_tex].source = 'FILE'
objekti.active_material.texture_slots[index].texture = bpy.data.textures[name_tex]
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[name_tex]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].texture.image.reload()
elif(useold != ''):
objekti.active_material.texture_slots[index].texture = useold
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[useold.name]
objekti.active_material.texture_slots[index].texture.image.filepath = texcoat['color'][0]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
if(bring_normal == 1 and texcoat['nmap']):
index = find_index(objekti)
tex = bpy.ops.Texture
objekti.active_material.texture_slots.create(index)
total_mat = len(objekti.active_material.texture_slots.items())
useold = ''
for seekco in bpy.data.textures:
if((seekco.name[:6] == 'Normal') and (seekco.users_material == ())):
useold = seekco
if(useold == ''):
textures_old = bpy.data.textures.keys()
bpy.data.textures.new('Normal',type='IMAGE')
textures_new = bpy.data.textures.keys()
name_te = list(set(textures_new).difference(set(textures_old)))
name_tex = name_te[0]
bpy.ops.image.new(name=name_tex)
bpy.data.images[name_tex].filepath = texcoat['nmap'][0]
bpy.data.images[name_tex].source = 'FILE'
objekti.active_material.texture_slots[index].texture = bpy.data.textures[name_tex]
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[name_tex]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].use_map_color_diffuse = False
objekti.active_material.texture_slots[index].use_map_normal = True
objekti.active_material.texture_slots[index].texture.image.reload()
if(vray):
bpy.data.textures[name_tex].vray_slot.BRDFBump.map_type = 'TANGENT'
else:
bpy.data.textures[name_tex].use_normal_map = True
objekti.active_material.texture_slots[index].normal_map_space = 'TANGENT'
objekti.active_material.texture_slots[index].normal_factor = 1
elif(useold != ''):
objekti.active_material.texture_slots[index].texture = useold
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[useold.name]
objekti.active_material.texture_slots[index].texture.image.filepath = texcoat['nmap'][0]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].use_map_color_diffuse = False
objekti.active_material.texture_slots[index].use_map_normal = True
objekti.active_material.texture_slots[index].normal_factor = 1
if(bring_spec == 1 and texcoat['specular']):
index = find_index(objekti)
objekti.active_material.texture_slots.create(index)
useold = ''
for seekco in bpy.data.textures:
if((seekco.name[:8] == 'Specular') and (seekco.users_material == ())):
useold = seekco
if(useold == ''):
textures_old = bpy.data.textures.keys()
bpy.data.textures.new('Specular',type='IMAGE')
textures_new = bpy.data.textures.keys()
name_te = list(set(textures_new).difference(set(textures_old)))
name_tex = name_te[0]
bpy.ops.image.new(name=name_tex)
bpy.data.images[name_tex].filepath = texcoat['specular'][0]
bpy.data.images[name_tex].source = 'FILE'
objekti.active_material.texture_slots[index].texture = bpy.data.textures[name_tex]
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[name_tex]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].use_map_color_diffuse = False
objekti.active_material.texture_slots[index].use_map_specular = True
objekti.active_material.texture_slots[index].texture.image.reload()
elif(useold != ''):
objekti.active_material.texture_slots[index].texture = useold
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[useold.name]
objekti.active_material.texture_slots[index].texture.image.filepath = texcoat['specular'][0]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].use_map_color_diffuse = False
objekti.active_material.texture_slots[index].use_map_specular = True
if(bring_disp == 1 and texcoat['disp']):
index = find_index(objekti)
objekti.active_material.texture_slots.create(index)
useold = ''
for seekco in bpy.data.textures:
if((seekco.name[:12] == 'Displacement') and (seekco.users_material == ())):
useold = seekco
if useold == "":
textures_old = bpy.data.textures.keys()
bpy.data.textures.new('Displacement',type='IMAGE')
textures_new = bpy.data.textures.keys()
name_te = list(set(textures_new).difference(set(textures_old)))
name_tex = name_te[0]
bpy.ops.image.new(name=name_tex)
bpy.data.images[name_tex].filepath = texcoat['disp'][0]
bpy.data.images[name_tex].source = 'FILE'
objekti.active_material.texture_slots[index].texture = bpy.data.textures[name_tex]
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[name_tex]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].use_map_color_diffuse = False
objekti.active_material.texture_slots[index].use_map_displacement = True
objekti.active_material.texture_slots[index].texture.image.reload()
elif(useold != ''):
objekti.active_material.texture_slots[index].texture = useold
objekti.active_material.texture_slots[index].texture.image = bpy.data.images[useold.name]
objekti.active_material.texture_slots[index].texture.image.filepath = texcoat['disp'][0]
if(objekti.data.uv_textures.active):
objekti.active_material.texture_slots[index].texture_coords = 'UV'
objekti.active_material.texture_slots[index].uv_layer = objekti.data.uv_textures.active.name
objekti.active_material.texture_slots[index].use_map_color_diffuse = False
objekti.active_material.texture_slots[index].use_map_displacement = True
if(vray):
objekti.active_material.texture_slots[index].texture.use_interpolation = False
objekti.active_material.texture_slots[index].displacement_factor = 0.05
else:
disp_modi = ''
for seek_modi in objekti.modifiers:
if(seek_modi.type == 'DISPLACE'):
disp_modi = seek_modi
break
if(disp_modi):
disp_modi.texture = objekti.active_material.texture_slots[index].texture
if(objekti.data.uv_textures.active):
disp_modi.texture_coords = 'UV'
disp_modi.uv_layer = objekti.data.uv_textures.active.name
else:
objekti.modifiers.new('Displace',type='DISPLACE')
objekti.modifiers['Displace'].texture = objekti.active_material.texture_slots[index].texture
if(objekti.data.uv_textures.active):
objekti.modifiers['Displace'].texture_coords = 'UV'
objekti.modifiers['Displace'].uv_layer = objekti.data.uv_textures.active.name
return('FINISHED')
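The matching loops above pick up textures exported by 3D-Coat purely by filename convention: files starting with the object prefix are kept, and the channel key (`color`, `specular`, `nmap`, `disp`) is the substring between the last underscore and the extension. A standalone sketch of that convention (helper names are mine, not part of the addon, and it collects bare names rather than full paths):

```python
def parse_channel(filename, prefix):
    """Return the channel key between the last '_' and the last '.',
    or None when the file does not start with the object prefix."""
    if filename[:len(prefix)] != prefix:
        return None
    return filename[filename.rfind('_') + 1:filename.rfind('.')]


def collect_channels(files, prefix):
    # Mirror of the texcoat dict filled by the loops above, but it
    # ignores unknown channel names instead of raising KeyError.
    texcoat = {'color': [], 'specular': [], 'nmap': [], 'disp': []}
    for name in files:
        channel = parse_channel(name, prefix)
        if channel in texcoat:
            texcoat[channel].append(name)
    return texcoat
```

Used this way, `collect_channels(['head_color.png', 'head_nmap.png'], 'head_')` groups the files per channel exactly as the addon does before wiring them into texture slots.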
# === Python/kruskal.py (AtilioA/algoritmos-teoria-dos-grafos, Unlicense) ===
# Supposedly doesn't work
from aresta import Aresta
from insert import insert_sort
from collections import defaultdict


def kruskal(arestas):
    arestas, vertices = insert_sort(arestas, defaultdict())

    # Initialize the actual tree
    arvore = list()
    # The tree size limit is the number of keys in the dict returned by insert_sort
    tamanhoArvore = len(vertices.keys())
    i = 0

    # While the tree is smaller than the vertex dictionary,
    while len(arvore) < tamanhoArvore - 1:
        # consume the edges in order
        aresta = arestas[i]
        i += 1
        # and check each edge's endpoints against the dictionary
        if vertices[aresta.first] < 2 and vertices[aresta.second] < 2:
            vertices[aresta.first] += 1
            vertices[aresta.second] += 1
            arvore.append(aresta)

    # Not every edge is consumed, since the tree size breaks the while first
    return arvore


if __name__ == "__main__":
    arestas = list()

    # arestas.append(Aresta(1, 'a', 'b'))
    # arestas.append(Aresta(8, 'a', 'c'))
    # arestas.append(Aresta(3, 'c', 'b'))
    # arestas.append(Aresta(4, 'b', 'd'))
    # arestas.append(Aresta(2, 'd', 'e'))
    # arestas.append(Aresta(3, 'b', 'e'))
    # arestas.append(Aresta(-1, 'c', 'd'))

    # arestas.append(Aresta(13, '0', '3'))
    # arestas.append(Aresta(24, '0', '1'))
    # arestas.append(Aresta(13, '0', '2'))
    # arestas.append(Aresta(22, '0', '4'))
    # arestas.append(Aresta(13, '1', '3'))
    # arestas.append(Aresta(22, '1', '2'))
    # arestas.append(Aresta(13, '1', '4'))
    # arestas.append(Aresta(19, '2', '3'))
    # arestas.append(Aresta(14, '2', '4'))
    # arestas.append(Aresta(19, '3', '4'))

    arestas.append(Aresta(2, "0", "1"))
    arestas.append(Aresta(-10, "0", "3"))
    arestas.append(Aresta(3, "0", "2"))
    arestas.append(Aresta(5, "1", "2"))
    arestas.append(Aresta(0, "1", "3"))
    arestas.append(Aresta(4, "2", "3"))

    grafo = kruskal(arestas)

    print("Printing the minimum spanning tree:")
    for aresta in grafo:
        print(f"Weight {aresta.peso:2}: {aresta.first:1} to {aresta.second:2}")
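For reference, the loop in kruskal.py is not actually Kruskal's algorithm: capping each vertex degree at 2 neither prevents cycles nor guarantees a spanning tree, which matches the file's own "supposedly doesn't work" note. The standard fix is cycle detection via union-find. A minimal sketch, using plain `(weight, u, v)` tuples with integer vertices instead of the repo's `Aresta` class:

```python
def kruskal_mst(n, edges):
    """Minimum spanning tree of a connected graph.

    n: number of vertices (labelled 0..n-1)
    edges: iterable of (weight, u, v) tuples
    Returns the chosen edges as a list of (weight, u, v).
    """
    parent = list(range(n))

    def find(x):
        # Path halving keeps the forest shallow
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:  # joining two distinct components can never form a cycle
            parent[ru] = rv
            mst.append((w, u, v))
    return mst
```

On the graph built in the file's `__main__` block, this selects the three edges of total weight -7.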
# === psmate/apps/blog/admin.py (vgrivtsov/psmate, MIT) ===
from django.contrib import admin
from psmate.models import News
class BlogAdmin(admin.ModelAdmin):
    prepopulated_fields = {'slug': ('title',)}


admin.site.register(News, BlogAdmin)
# === dmm/dmm_data/__init__.py (voghoei/Different-Models, MIT) ===
all=['load']
# === 29.operacoes_com_lista/17.exercicio4.py (robinson-1985/python-zero-dnc, MIT) ===
# 4. Create a list that goes from 1 to 100.
numbers = [cont for cont in range(1, 101)]  # renamed from 'list' to avoid shadowing the builtin
print(numbers)
# === examples/03additionals/legacy_score.py (dotness/swagger-marshmallow-codegen, MIT) ===
# -*- coding:utf-8 -*-
# this is auto-generated by swagger-marshmallow-codegen
from swagger_marshmallow_codegen.schema.legacy import (
    AdditionalPropertiesSchema,
    LegacySchema
)
from marshmallow import fields


class Score(AdditionalPropertiesSchema):
    name = fields.String(required=True)

    class Meta:
        additional_field = fields.Integer()
# === utest/test_iinekoko_db.py (MizunagiKB/IIneKoKo, MIT) ===
import sys
import unittest
import configparser

sys.path.append("./svc")


class CIIneKoKo_DB(unittest.TestCase):
    def setUp(self):
        import iinekoko_db

        self.o_conf = configparser.ConfigParser()
        self.o_conf.read("./svc/config.ini")
        self.o_conn = iinekoko_db.CDatabase(self.o_conf)

    def test_database(self):
        pass

    """
    def test_image_encdec(self):
        import iinekoko_db

        with open("utest/image/test.jpg", "rb") as f:
            raw_data = f.read()

        enc_data1 = iinekoko_db.dict_image_b64enc("image/jpeg", raw_data)
        mime, dec_data = iinekoko_db.dict_image_b64dec(enc_data1)
        iinekoko_db.dict_image_b64enc(mime, dec_data)

        self.assertTrue(True)

    def test_doc_session(self):
        o_doc1 = self.o_conn.new_session("1", "username_1")
        self.assertEqual(o_doc1.tw_id, "1")
        self.assertEqual(o_doc1.tw_username, "username_1")

        o_doc2 = self.o_conn.get_session(o_doc1.document_id)
        self.assertEqual(o_doc1.document_id, o_doc2.document_id)

        self.o_conn.del_session(o_doc1.document_id)
        o_doc = self.o_conn.get_session(o_doc1.document_id)
        self.assertEqual(o_doc, None)

        o_doc = self.o_conn.get_session("X")
        self.assertEqual(o_doc, None)
        self.assertFalse(self.o_conn.del_session("X"))

    def test_doc_image_ref(self):
        import iinekoko_db

        with open("utest/image/test.jpg", "rb") as f:
            raw_data = f.read()

        enc_data = iinekoko_db.dict_image_b64enc("image/jpeg", raw_data)

        TW_ID = "1"
        TW_USERNAME = "username_1"

        o_doc = self.o_conn.new_image_ref(enc_data, TW_ID, TW_USERNAME, [])
        self.assertEqual(o_doc.tw_id, TW_ID)

        o_doc1 = self.o_conn.get_image_ref(o_doc.get_document_id())
        self.assertEqual(o_doc.get_document_id(), o_doc1.get_document_id())

        self.o_conn.del_image_ref(o_doc.get_document_id())
        o_doc = self.o_conn.get_image_ref(o_doc.get_document_id())
        self.assertEqual(o_doc, None)

    def test_doc_image_mrk(self):
        import iinekoko_db

        ID_IMAGE_REF = "1"
        TW_ID = "1"
        TW_USERNAME = "username_1"

        o_doc = self.o_conn.append_image_mrk(ID_IMAGE_REF, TW_ID, TW_USERNAME,
                                             [])
        self.o_conn.remove_image_mrk(o_doc.get_document_id())

    def test_doc_image_ref_list(self):
        import iinekoko_db

        TW_ID = "431236837"
        self.o_conn.get_image_ref_list(TW_ID)
    """


# [EOF]
# === services/login_service.py (EderBevacqua/OPE_ADS_3C, Apache-2.0) ===
from infra.usuario_dao import \
    listar as dao_listar, \
    consultar as dao_consultar, \
    cadastrar as dao_cadastrar, \
    alterar as dao_alterar, \
    remover as dao_remover, \
    loadUserEmail as dao_loadUserEmail, \
    validarLogin as dao_validarLogin, \
    validaMatriculaUsuario as dao_validaMatriculaUsuario, \
    carregarUsuario as dao_carregarUsuario, \
    cadastrarNovoLogin as dao_cadastrarNovoLogin, \
    ativarConta as dao_ativarConta


def loadUserEmail(email):
    return dao_loadUserEmail(email)


def validarLogin(email):
    return dao_validarLogin(email)


def validaMatriculaUsuario(nMatricula):
    return dao_validaMatriculaUsuario(nMatricula)


def carregarUsuario(user_id):
    return dao_carregarUsuario(user_id)


def cadastrarNovoLogin(novoLogin):
    return dao_cadastrarNovoLogin(novoLogin)


def ativarConta(numeroMatricula, senha):
    return dao_ativarConta(numeroMatricula, senha)
# === server/domain/datasets/exceptions.py (multi-coop/catalogage-donnees, MIT) ===
from ..common.exceptions import DoesNotExist
class DatasetDoesNotExist(DoesNotExist):
    entity_name = "Dataset"
# === marsyas-vamp/marsyas/src/django/birdsong/application/birdsong/onsets/admin.py (jaouahbi/VampPlugins, Apache-2.0) ===
from django.contrib import admin
from calls.onsets.models import Recording
class RecordingAdmin(admin.ModelAdmin):
    list_display = ('audio', 'image', 'length')


admin.site.register(Recording, RecordingAdmin)
# === workSpace/boot.py (khutson/macequilt, MIT) ===
# This file is executed on every boot (including wake-boot from deepsleep)
import esp
esp.osdebug(None)
import wifi
wifi.connect(repl=False)
import gc
gc.collect()
# === setup.py (messa/bloom, MIT) ===
#!/usr/bin/env python3
from setuptools import setup, Extension
setup(
    ext_modules=[
        Extension('bloom._hashc', ['bloom/_hashcmodule.c'])
    ])
# === resume/display/views.py (Varun789/Profile, BSD-3-Clause) ===
from django.shortcuts import render
from .models import Profile
# Create your views here.
def home(request):
    profile = Profile.objects
    return render(request, 'home.html', {'profile': profile})
# === modules/wxpy-index/wxpy_index/version.py (john04047210/mira_wepy_server, MIT) ===
# -*- coding: utf-8 -*-
#
# This file is part of Invenio.
# Copyright (C) 2018 QiaoPeng.
#
# Invenio is free software; you can redistribute it
# and/or modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; either version 2 of the
# License, or (at your option) any later version.
#
# Invenio is distributed in the hope that it will be
# useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Invenio; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307, USA.
"""Version information for Wxpy-Index.
This file is imported by ``wxpy_index.__init__``,
and parsed by ``setup.py``.
"""
from __future__ import absolute_import, print_function
__version__ = '0.1.0.dev20180000'
# === pychonet/HomeSolarPower.py (mochipon/pychonet, MIT) ===
from pychonet.EchonetInstance import EchonetInstance
class HomeSolarPower(EchonetInstance):
    def __init__(self, netif, instance=0x1):
        self.eojgc = 0x02
        self.eojcc = 0x79
        EchonetInstance.__init__(self, self.eojgc, self.eojcc, instance, netif)

    def getMeasuredInstantPower(self):
        return int.from_bytes(self.getSingleMessageResponse(0xE0), 'big')

    def getMeasuredCumulPower(self):
        return int.from_bytes(self.getSingleMessageResponse(0xE1), 'big')
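Both getters decode the raw property payload (EPC 0xE0 for instantaneous power, 0xE1 for cumulative) as a big-endian unsigned integer via `int.from_bytes`. That decoding step, isolated from the network layer (the helper name is mine, not part of pychonet):

```python
def decode_power(payload: bytes) -> int:
    # ECHONET Lite numeric properties here arrive as big-endian,
    # unsigned byte strings of varying width.
    return int.from_bytes(payload, 'big')
```

For example, a two-byte payload `b'\x01\xf4'` decodes to 500, and a four-byte `b'\x00\x00\x0b\xb8'` to 3000, regardless of payload width.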
# === src/Screen.py (D3r3k23/CaveRun, MIT) ===
import Colors
import pygame
screen = None
def init(width, height):
global screen
screen = pygame.display.set_mode((width, height))
def width():
return screen.get_width()
def height():
return screen.get_height()
def res():
return (screen.get_width(), screen.get_height())
def rect():
return screen.get_rect()
def clear():
screen.fill(Colors.BLACK)
def draw_to_screen(img, rect=(0, 0)):
screen.blit(img, rect)
def display():
pygame.display.update()
clear()
# Base class for drawable objects
# Created from image and coordinates, stores image and rect
class Drawable:
def __init__(self, img, x, y, center=False): # x, y: center or (left, top) coordinates
width = img.get_width()
height = img.get_height()
origin = (x - (width // 2), y - (height // 2)) if center else (x, y)
self.img = img
self.rect = pygame.Rect(origin, (width, height))
def draw(self):
draw_to_screen(self.img, self.rect)
| 21.333333 | 90 | 0.640625 | 143 | 1,024 | 4.475524 | 0.34965 | 0.070313 | 0.09375 | 0.0625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00507 | 0.229492 | 1,024 | 47 | 91 | 21.787234 | 0.806084 | 0.125977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.322581 | false | 0 | 0.096774 | 0.129032 | 0.580645 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
7e0968601bb493a7e6ab7c62ca33e94de63a37f6 | 123 | py | Python | src/apps/buttons/apps.py | GoddessEyes/info_tbot | c7c5c818dc0c0c72aa15e6e4a85e7e28b4a7660d | [
"MIT"
] | null | null | null | src/apps/buttons/apps.py | GoddessEyes/info_tbot | c7c5c818dc0c0c72aa15e6e4a85e7e28b4a7660d | [
"MIT"
] | 4 | 2021-03-19T02:42:10.000Z | 2021-09-22T19:08:09.000Z | src/apps/buttons/apps.py | GoddessEyes/info_tbot | c7c5c818dc0c0c72aa15e6e4a85e7e28b4a7660d | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class ButtonsConfig(AppConfig):
    name = 'apps.buttons'
    verbose_name = 'Клавиши'  # Russian for "Buttons"/"Keys"
| 17.571429 | 33 | 0.731707 | 14 | 123 | 6.357143 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178862 | 123 | 6 | 34 | 20.5 | 0.881188 | 0 | 0 | 0 | 0 | 0 | 0.154472 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
fd69e06856c7f3a481475985f97cf69bf7d1965f | 127 | py | Python | satchmo/projects/skeleton/localsite/urls.py | predatell/satchmo | 6ced1f845aadec240c7e433c3cbf4caca96e0d92 | [
"BSD-3-Clause"
] | 1 | 2019-10-08T16:19:59.000Z | 2019-10-08T16:19:59.000Z | satchmo/projects/skeleton/localsite/urls.py | predatell/satchmo | 6ced1f845aadec240c7e433c3cbf4caca96e0d92 | [
"BSD-3-Clause"
] | null | null | null | satchmo/projects/skeleton/localsite/urls.py | predatell/satchmo | 6ced1f845aadec240c7e433c3cbf4caca96e0d92 | [
"BSD-3-Clause"
] | null | null | null | from django.conf.urls import url
from simple.localsite.views import example
urlpatterns = [
    url(r'example/', example),
]
| 15.875 | 42 | 0.732283 | 17 | 127 | 5.470588 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15748 | 127 | 7 | 43 | 18.142857 | 0.869159 | 0 | 0 | 0 | 0 | 0 | 0.062992 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
fd84a04d43460db0ba028f9e178dd3ce7cffe504 | 2,650 | py | Python | eland/tests/dataframe/test_aggs_pytest.py | redNixon/eland | 1b9cb1db6d30f0662fe3679c7bb31e2c0865f0c3 | [
"Apache-2.0"
] | null | null | null | eland/tests/dataframe/test_aggs_pytest.py | redNixon/eland | 1b9cb1db6d30f0662fe3679c7bb31e2c0865f0c3 | [
"Apache-2.0"
] | null | null | null | eland/tests/dataframe/test_aggs_pytest.py | redNixon/eland | 1b9cb1db6d30f0662fe3679c7bb31e2c0865f0c3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Elasticsearch BV
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# File called _pytest for PyCharm compatibility
import numpy as np
from pandas.util.testing import assert_almost_equal

from eland.tests.common import TestData

class TestDataFrameAggs(TestData):
    def test_basic_aggs(self):
        pd_flights = self.pd_flights()
        ed_flights = self.ed_flights()

        pd_sum_min = pd_flights.select_dtypes(include=[np.number]).agg(["sum", "min"])
        ed_sum_min = ed_flights.select_dtypes(include=[np.number]).agg(["sum", "min"])

        # Eland returns all float values for all metric aggs, pandas can return int
        # TODO - investigate this more
        pd_sum_min = pd_sum_min.astype("float64")
        assert_almost_equal(pd_sum_min, ed_sum_min)

        pd_sum_min_std = pd_flights.select_dtypes(include=[np.number]).agg(
            ["sum", "min", "std"]
        )
        ed_sum_min_std = ed_flights.select_dtypes(include=[np.number]).agg(
            ["sum", "min", "std"]
        )

        print(pd_sum_min_std.dtypes)
        print(ed_sum_min_std.dtypes)

        assert_almost_equal(pd_sum_min_std, ed_sum_min_std, check_less_precise=True)

    def test_terms_aggs(self):
        pd_flights = self.pd_flights()
        ed_flights = self.ed_flights()

        pd_sum_min = pd_flights.select_dtypes(include=[np.number]).agg(["sum", "min"])
        ed_sum_min = ed_flights.select_dtypes(include=[np.number]).agg(["sum", "min"])

        # Eland returns all float values for all metric aggs, pandas can return int
        # TODO - investigate this more
        pd_sum_min = pd_sum_min.astype("float64")
        assert_almost_equal(pd_sum_min, ed_sum_min)

        pd_sum_min_std = pd_flights.select_dtypes(include=[np.number]).agg(
            ["sum", "min", "std"]
        )
        ed_sum_min_std = ed_flights.select_dtypes(include=[np.number]).agg(
            ["sum", "min", "std"]
        )

        print(pd_sum_min_std.dtypes)
        print(ed_sum_min_std.dtypes)

        assert_almost_equal(pd_sum_min_std, ed_sum_min_std, check_less_precise=True)
| 37.323944 | 86 | 0.672075 | 381 | 2,650 | 4.404199 | 0.309711 | 0.114422 | 0.085816 | 0.123957 | 0.624553 | 0.624553 | 0.624553 | 0.624553 | 0.624553 | 0.624553 | 0 | 0.005848 | 0.22566 | 2,650 | 70 | 87 | 37.857143 | 0.811891 | 0.320755 | 0 | 0.722222 | 0 | 0 | 0.041573 | 0 | 0 | 0 | 0 | 0.014286 | 0.138889 | 1 | 0.055556 | false | 0 | 0.083333 | 0 | 0.166667 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
fd88c78266fc4209a289fbc25268f76bce338838 | 150 | py | Python | frontends/python/tests/analysis/constant_attribute.py | aardwolf-sfl/aardwolf | 33bfe3e0649a73aec7efa0fa80bff8077b550bd0 | [
"MIT"
] | 2 | 2020-08-15T08:55:39.000Z | 2020-11-09T17:31:16.000Z | frontends/python/tests/analysis/constant_attribute.py | aardwolf-sfl/aardwolf | 33bfe3e0649a73aec7efa0fa80bff8077b550bd0 | [
"MIT"
] | null | null | null | frontends/python/tests/analysis/constant_attribute.py | aardwolf-sfl/aardwolf | 33bfe3e0649a73aec7efa0fa80bff8077b550bd0 | [
"MIT"
] | null | null | null | # AARD: function: __main__
# AARD: #1:1 -> :: defs: %1 / uses: [@1 4:1-4:22] { call }
'value: {}'.format(3)
# AARD: @1 = constant_attribute.py
| 21.428571 | 63 | 0.546667 | 23 | 150 | 3.347826 | 0.652174 | 0.12987 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09322 | 0.213333 | 150 | 6 | 64 | 25 | 0.559322 | 0.786667 | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
fdaa4e938b8821b9a6f605b0dbe6cbeea3d62940 | 25 | py | Python | version.py | vignesh1793/gmail_reader | 8bcf8dbc4e839e8eb736c1ae2fef9fd4f9f77ded | [
"MIT"
] | 1 | 2020-07-29T03:35:26.000Z | 2020-07-29T03:35:26.000Z | version.py | thevickypedia/gmail_reader | 8bcf8dbc4e839e8eb736c1ae2fef9fd4f9f77ded | [
"MIT"
] | null | null | null | version.py | thevickypedia/gmail_reader | 8bcf8dbc4e839e8eb736c1ae2fef9fd4f9f77ded | [
"MIT"
] | null | null | null | version_info = (0, 5, 2)
| 12.5 | 24 | 0.6 | 5 | 25 | 2.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 0.2 | 25 | 1 | 25 | 25 | 0.55 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
fdcd1b4c925a7d033b9a81f8136657b169b6bcf9 | 1,603 | py | Python | example_convert_dataset.py | fadamsyah/cv_utils | 487fc65fe4a71f05dd03df31cde21d866968c0b4 | [
"MIT"
] | null | null | null | example_convert_dataset.py | fadamsyah/cv_utils | 487fc65fe4a71f05dd03df31cde21d866968c0b4 | [
"MIT"
] | 1 | 2021-11-01T06:10:29.000Z | 2021-11-09T12:47:48.000Z | example_convert_dataset.py | fadamsyah/cv_utils | 487fc65fe4a71f05dd03df31cde21d866968c0b4 | [
"MIT"
] | null | null | null | from cv_utils.object_detection.dataset.converter import coco_to_yolo
from cv_utils.object_detection.dataset.converter import yolo_to_coco
''' COCO --> YOLO
This code uses the ultralytics/yolov5 format.
The converted dataset will be saved as follow:
- {output_folder}
- images
- {output_set_name}
- {image_1}
- {image_2}
- ...
- labels
- {output_set_name}
- {image_1}
- {image_2}
- ...
- classes.txt
'''
for set_name in ["train", "test", "val"]:
coco_to_yolo(
coco_annotation_path = f"demo/dataset/fasciola_ori/annotations/instances_{set_name}.json",
coco_image_dir = f"demo/dataset/fasciola_ori/{set_name}",
output_image_dir = f"demo/dataset/fasciola_yolo/images/{set_name}",
output_label_dir = f"demo/dataset/fasciola_yolo/labels/{set_name}",
output_category_path = f"demo/dataset/fasciola_yolo/classes.txt"
)
'''YOLO --> COCO
This code uses the ultralytics/yolov5 format:
- {yolo_image_dir}
- {image_1}
- {image_2}
- ...
- {yolo_label_dir}
- {image_1}
- {image_2}
- ...
'''
for set_name in ["train", "test", "val"]:
yolo_to_coco(
yolo_image_dir = f"demo/dataset/fasciola_yolo/images/{set_name}",
yolo_label_dir = f"demo/dataset/fasciola_yolo/labels/{set_name}",
yolo_class_file = "demo/dataset/fasciola_yolo/classes.txt",
coco_image_dir = f"demo/dataset/fasciola_coco/{set_name}",
coco_annotation_path = f"demo/dataset/fasciola_coco/annotations/instances_{set_name}.json"
) | 32.714286 | 98 | 0.651903 | 211 | 1,603 | 4.616114 | 0.255924 | 0.086242 | 0.195072 | 0.184805 | 0.787474 | 0.673511 | 0.605749 | 0.283368 | 0.184805 | 0.184805 | 0 | 0.007981 | 0.218341 | 1,603 | 49 | 99 | 32.714286 | 0.769354 | 0 | 0 | 0.111111 | 0 | 0 | 0.452471 | 0.429658 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e31e1e564d0eb470b1f222fdeb2e2e5813305ea2 | 28,531 | py | Python | src/pte_decode/decoding/decoder_factory.py | richardkoehler/pte-decode | d1a466c166e5c3dd5e2c0caf1b12492f0e93bc57 | [
"MIT"
] | null | null | null | src/pte_decode/decoding/decoder_factory.py | richardkoehler/pte-decode | d1a466c166e5c3dd5e2c0caf1b12492f0e93bc57 | [
"MIT"
] | null | null | null | src/pte_decode/decoding/decoder_factory.py | richardkoehler/pte-decode | d1a466c166e5c3dd5e2c0caf1b12492f0e93bc57 | [
"MIT"
] | null | null | null | """Module for machine learning models."""
from dataclasses import dataclass
from typing import Any, Optional, Union
import numpy as np
import pandas as pd
from bayes_opt import BayesianOptimization
from catboost import CatBoostClassifier
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score, log_loss
from sklearn.model_selection import GroupKFold, GroupShuffleSplit
# from sklearn.svm import SVC
from xgboost import XGBClassifier
from pte_decode.decoding.decoder_base import Decoder
def get_decoder(
    classifier: str = "lda",
    scoring: str = "balanced_accuracy",
    balancing: Optional[str] = None,
    optimize: bool = False,
) -> Decoder:
    """Create and return Decoder of desired type.

    Parameters
    ----------
    classifier : str, default="lda"
        Classifier to use. Allowed values:
        ["catboost", "dummy", "lda", "lr", "qda", "xgb"].
    scoring : str, default="balanced_accuracy"
        Score to be calculated. Possible values:
        ["balanced_accuracy", "log_loss"].
    balancing : str | None, default=None
        Method for balancing skewed datasets. Possible values:
        ["oversample", "undersample", "balance_weights"].
    optimize : bool, default=False
        If True, optimize hyperparameters using Bayesian optimization.

    Returns
    -------
    Decoder
        Instance of Decoder given `classifier` and `balancing` method.
    """
    classifiers = {
        "catboost": CATB,
        "dummy": Dummy,
        "lda": LDA,
        "lr": LR,
        "qda": QDA,
        # "svm_lin": SVC_Lin,
        # "svm_poly": SVC_Poly,
        # "svm_rbf": SVC_RBF,
        "xgb": XGB,
    }
    scoring_methods = {
        "balanced_accuracy": _get_balanced_accuracy,
        "log_loss": _get_log_loss,
    }
    classifier = classifier.lower()
    balancing = balancing.lower() if isinstance(balancing, str) else balancing
    scoring = scoring.lower()
    if classifier not in classifiers:
        raise DecoderNotFoundError(classifier, classifiers.keys())
    if scoring not in scoring_methods:
        raise ScoringMethodNotFoundError(scoring, scoring_methods.keys())
    return classifiers[classifier](
        balancing=balancing,
        optimize=optimize,
        scoring=scoring_methods[scoring],
    )
def _get_balanced_accuracy(model, data_test, label_test) -> Any:
    """Calculate balanced accuracy score."""
    return balanced_accuracy_score(label_test, model.predict(data_test))

def _get_log_loss(model, data_test, label_test) -> Any:
    """Calculate log loss score."""
    return log_loss(label_test, model.predict_proba(data_test))
class ScoringMethodNotFoundError(Exception):
    """Exception raised when an invalid scoring method is passed.

    Attributes:
        input_value -- input value which caused the error
        allowed -- allowed input values
        message -- explanation of the error
    """

    def __init__(
        self,
        input_value,
        allowed,
        message="Input scoring method is not an allowed value.",
    ) -> None:
        self.input_value = input_value
        self.allowed = allowed
        self.message = message
        super().__init__(self.message)

    def __str__(self):
        return (
            f"{self.message} Allowed values: {self.allowed}. Got:"
            f" {self.input_value}."
        )
class DecoderNotFoundError(Exception):
    """Exception raised when an invalid Decoder is passed.

    Attributes:
        input_value -- input which caused the error
        allowed -- allowed input values
        message -- explanation of the error
    """

    def __init__(
        self,
        input_value,
        allowed,
        message="Input decoding model is not an allowed value.",
    ) -> None:
        self.input_value = input_value
        self.allowed = allowed
        self.message = message
        super().__init__(self.message)

    def __str__(self):
        return (
            f"{self.message} Allowed values: {self.allowed}."
            f" Got: {self.input_value}."
        )
@dataclass
class CATB(Decoder):
    """Class for CatBoostClassifier implementation."""

    def __post_init__(self):
        self.model = CatBoostClassifier(
            loss_function="MultiClass",
            verbose=False,
            use_best_model=True,
            eval_metric="MultiClass",
        )

    def fit(
        self,
        data: Union[pd.DataFrame, pd.Series],
        labels: np.ndarray,
        groups: np.ndarray,
    ) -> None:
        """Fit model to given training data and training labels."""
        self.data_train = data
        self.labels_train = labels
        self.groups_train = groups
        if self.optimize:
            self.model = self._bayesian_optimization()
        # Train outer model
        (
            self.data_train,
            self.labels_train,
            eval_set,
        ) = self._get_validation_split(
            self.data_train,
            self.labels_train,
            self.groups_train,
            train_size=0.8,
        )
        (
            self.data_train,
            self.labels_train,
            sample_weight,
        ) = self._balance_samples(
            self.data_train, self.labels_train, self.balancing
        )
        self.model.fit(
            self.data_train,
            self.labels_train,
            eval_set=eval_set,
            early_stopping_rounds=25,
            sample_weight=sample_weight,
            verbose=False,
        )

    def _bayesian_optimization(self):
        """Estimate optimal model parameters using bayesian optimization."""
        optimizer = BayesianOptimization(
            self._bo_tune,
            {
                "max_depth": (4, 10),
                "learning_rate": (0.003, 0.3),
                "bagging_temperature": (0.0, 1.0),
                "l2_leaf_reg": (1, 30),
                "random_strength": (0.01, 1.0),
            },
        )
        optimizer.maximize(init_points=10, n_iter=20, acq="ei")
        params = optimizer.max["params"]
        params["max_depth"] = round(params["max_depth"])
        return CatBoostClassifier(
            iterations=200,
            loss_function="MultiClass",
            verbose=False,
            use_best_model=True,
            eval_metric="MultiClass",
            max_depth=params["max_depth"],
            learning_rate=params["learning_rate"],
            random_strength=params["random_strength"],
            bagging_temperature=params["bagging_temperature"],
            l2_leaf_reg=params["l2_leaf_reg"],
        )

    def _bo_tune(
        self,
        max_depth,
        learning_rate,
        bagging_temperature,
        l2_leaf_reg,
        random_strength,
    ):
        # Cross-validate with the specified parameters in 3 splits
        cv_inner = GroupShuffleSplit(
            n_splits=3, train_size=0.66, random_state=42
        )
        scores = []
        for train_index, test_index in cv_inner.split(
            self.data_train, self.labels_train, self.groups_train
        ):
            data_train_, data_test_ = (
                self.data_train[train_index],
                self.data_train[test_index],
            )
            y_tr, y_te = (
                self.labels_train[train_index],
                self.labels_train[test_index],
            )
            groups_tr = self.groups_train[train_index]
            (data_train_, y_tr, eval_set_inner) = self._get_validation_split(
                data=data_train_,
                labels=y_tr,
                groups=groups_tr,
                train_size=0.8,
            )
            data_train_, y_tr, sample_weight = self._balance_samples(
                data_train_, y_tr, self.balancing
            )
            inner_model = CatBoostClassifier(
                iterations=100,
                loss_function="MultiClass",
                verbose=False,
                eval_metric="MultiClass",
                max_depth=round(max_depth),
                learning_rate=learning_rate,
                bagging_temperature=bagging_temperature,
                l2_leaf_reg=l2_leaf_reg,
                random_strength=random_strength,
            )
            inner_model.fit(
                data_train_,
                y_tr,
                eval_set=eval_set_inner,
                early_stopping_rounds=25,
                sample_weight=sample_weight,
                verbose=False,
            )
            y_probs = inner_model.predict_proba(data_test_)
            score = log_loss(y_te, y_probs, labels=[0, 1])
            scores.append(score)
        # Return the negative mean log loss
        return -1.0 * np.mean(scores)
@dataclass
class LDA(Decoder):
    """Class for applying Linear Discriminant Analysis using scikit-learn."""

    def __post_init__(self):
        if self.balancing == "balance_weights":
            raise ValueError(
                "Sample weights cannot be balanced for Linear"
                " Discriminant Analysis. Please set `balancing` to"
                " either `oversample`, `undersample` or `None`."
            )
        if self.optimize:
            raise ValueError(
                "Hyperparameter optimization cannot be performed for this"
                " implementation of Linear Discriminant Analysis. Please"
                " set `optimize` to False."
            )

    def fit(
        self, data: np.ndarray, labels: np.ndarray, groups: np.ndarray
    ) -> None:
        """Fit model to given training data and training labels."""
        self.data_train, self.labels_train, _ = self._balance_samples(
            data, labels, self.balancing
        )
        self.model = LinearDiscriminantAnalysis(
            solver="lsqr", shrinkage="auto"
        )
        self.model.fit(self.data_train, self.labels_train)
@dataclass
class LR(Decoder):
    """Class for applying Logistic Regression using scikit-learn."""

    def fit(self, data: np.ndarray, labels: np.ndarray, groups) -> None:
        """Fit model to given training data and training labels."""
        self.data_train = data
        self.labels_train = labels
        self.groups_train = groups
        if self.optimize:
            self.model = self._bayesian_optimization()
        else:
            self.model = LogisticRegression(solver="newton-cg")
        self.data_train, self.labels_train, _ = self._balance_samples(
            data, labels, self.balancing
        )
        self.model.fit(self.data_train, self.labels_train)

    def _bayesian_optimization(self):
        """Estimate optimal model parameters using bayesian optimization."""
        optimizer = BayesianOptimization(
            self._bo_tune,
            {"C": (0.01, 1.0)},  # pylint: disable=invalid-name
        )
        optimizer.maximize(init_points=10, n_iter=20, acq="ei")
        # Train outer model with optimized parameters
        params = optimizer.max["params"]
        # params['max_iter'] = int(params['max_iter'])
        return LogisticRegression(
            solver="newton-cg", max_iter=500, C=params["C"]
        )

    def _bo_tune(self, C: float):  # pylint: disable=invalid-name
        # Cross-validate with the specified parameters in 3 splits
        cv_inner = GroupShuffleSplit(
            n_splits=3, train_size=0.66, random_state=42
        )
        scores = []
        for train_index, test_index in cv_inner.split(
            self.data_train, self.labels_train, self.groups_train
        ):
            data_train_, data_test_ = (
                self.data_train[train_index],
                self.data_train[test_index],
            )
            y_tr, y_te = (
                self.labels_train[train_index],
                self.labels_train[test_index],
            )
            data_train_, y_tr, sample_weight = self._balance_samples(
                data_train_, y_tr, self.balancing
            )
            inner_model = LogisticRegression(
                solver="newton-cg", C=C, max_iter=500
            )
            inner_model.fit(data_train_, y_tr, sample_weight=sample_weight)
            y_probs = inner_model.predict_proba(data_test_)
            score = log_loss(y_te, y_probs, labels=[0, 1])
            scores.append(score)
        # Return the negative mean log loss
        return -1.0 * np.mean(scores)
@dataclass
class Dummy(Decoder):
    """Dummy classifier implementation from scikit-learn."""

    def fit(self, data: np.ndarray, labels: np.ndarray, groups) -> None:
        """Fit model to given training data and training labels."""
        self.data_train, self.labels_train, _ = self._balance_samples(
            data, labels, self.balancing
        )
        self.model = DummyClassifier(strategy="uniform")
        self.model.fit(self.data_train, self.labels_train)

    def get_score(self, data_test: np.ndarray, label_test: np.ndarray):
        """Calculate score averaged over 100 random predictions."""
        scores = [
            self.scoring(self.model, data_test, label_test)
            for _ in range(0, 100)
        ]
        return np.mean(scores)
@dataclass
class QDA(Decoder):
    """Class for applying Quadratic Discriminant Analysis using scikit-learn."""

    def __post_init__(self):
        if self.balancing == "balance_weights":
            raise ValueError(
                "Sample weights cannot be balanced for Quadratic"
                " Discriminant Analysis. Please set `balancing` to"
                " either `oversample`, `undersample` or `None`."
            )
        if self.optimize:
            raise ValueError(
                "Hyperparameter optimization cannot be performed for this"
                " implementation of Quadratic Discriminant Analysis. Please"
                " set `optimize` to False."
            )

    def fit(self, data: np.ndarray, labels: np.ndarray, groups) -> None:
        """Fit model to given training data and training labels."""
        self.data_train, self.labels_train, _ = self._balance_samples(
            data, labels, self.balancing
        )
        self.model = QuadraticDiscriminantAnalysis()
        self.model.fit(self.data_train, self.labels_train)
@dataclass
class XGB(Decoder):
    """Class for applying XGBoost classification."""

    def _bayesian_optimization(self):
        """Estimate optimal model parameters using bayesian optimization."""
        optimizer = BayesianOptimization(
            self._bo_tune,
            {
                "learning_rate": (0.003, 0.3),
                "max_depth": (4, 10),
                "gamma": (0, 1),
                "colsample_bytree": (0.4, 1),
                "subsample": (0.4, 1),
            },
        )
        optimizer.maximize(init_points=10, n_iter=20, acq="ei")
        # Train outer model with optimized parameters
        params = optimizer.max["params"]
        return XGBClassifier(
            objective="binary:logistic",
            use_label_encoder=False,
            n_estimators=200,
            eval_metric="logloss",
            learning_rate=params["learning_rate"],
            gamma=params["gamma"],
            max_depth=int(params["max_depth"]),
            subsample=params["subsample"],
            colsample_bytree=params["colsample_bytree"],
        )

    def _bo_tune(
        self, learning_rate, gamma, max_depth, subsample, colsample_bytree
    ):
        cv_inner = GroupKFold(
            n_splits=3,
        )
        scores = []
        for train_index, test_index in cv_inner.split(
            self.data_train, self.labels_train, self.groups_train
        ):
            data_train_, data_test_ = (
                self.data_train.iloc[train_index],
                self.data_train.iloc[test_index],
            )
            y_tr, y_te = (
                self.labels_train[train_index],
                self.labels_train[test_index],
            )
            groups_tr = self.groups_train[train_index]
            (data_train_, y_tr, eval_set_inner) = self._get_validation_split(
                data=data_train_,
                labels=y_tr,
                groups=groups_tr,
                train_size=0.8,
            )
            (data_train_, y_tr, sample_weight) = self._balance_samples(
                data=data_train_, labels=y_tr, method=self.balancing
            )
            inner_model = XGBClassifier(
                objective="binary:logistic",
                booster="gbtree",
                use_label_encoder=False,
                eval_metric="logloss",
                n_estimators=100,
                learning_rate=learning_rate,
                gamma=gamma,
                max_depth=int(max_depth),
                colsample_bytree=colsample_bytree,
                subsample=subsample,
            )
            inner_model.fit(
                X=data_train_,
                y=y_tr,
                eval_set=eval_set_inner,
                early_stopping_rounds=20,
                sample_weight=sample_weight,
                verbose=False,
            )
            y_probs = inner_model.predict_proba(X=data_test_)
            score = log_loss(y_te, y_probs, labels=[0, 1])
            scores.append(score)
        # Return the negative mean log loss
        return -1.0 * np.mean(scores)

    def fit(
        self, data: pd.DataFrame, labels: np.ndarray, groups: np.ndarray
    ) -> None:
        """Fit model to given training data and training labels."""
        self.data_train = data
        self.labels_train = labels
        self.groups_train = groups
        if self.optimize:
            self.model = self._bayesian_optimization()
        else:
            self.model = XGBClassifier(
                objective="binary:logistic",
                booster="gbtree",
                use_label_encoder=False,
                n_estimators=200,
                eval_metric="logloss",
            )
        # Train outer model
        (
            self.data_train,
            self.labels_train,
            eval_set,
        ) = self._get_validation_split(
            self.data_train,
            self.labels_train,
            self.groups_train,
            train_size=0.8,
        )
        (
            self.data_train,
            self.labels_train,
            sample_weight,
        ) = self._balance_samples(
            data=data, labels=labels, method=self.balancing
        )
        self.model.fit(
            self.data_train,
            self.labels_train,
            eval_set=eval_set,
            early_stopping_rounds=20,
            sample_weight=sample_weight,
            verbose=False,
        )
# @dataclass
# class SVC_Lin(Decoder):
# """"""
# @dataclass
# class SVC_Poly(Decoder):
# """"""
# @dataclass
# class SVC_RBF(Decoder):
# """"""
# @dataclass
# class SVC_Sig(Decoder):
# """"""
# def classify_svm_lin(data_train, y_train, group_train, optimize,
# balance):
# """"""
# def bo_tune(C, tol):
# # Cross validating with the specified parameters in 5 folds
# cv_inner = GroupShuffleSplit(
# n_splits=3, train_size=0.66, random_state=42
# )
# scores = []
# for train_index, test_index in cv_inner.split(
# data_train, y_train, group_train
# ):
# data_train_, data_test_ = data_train[train_index],
# data_train[test_index]
# y_tr, y_te = y_train[train_index], y_train[test_index]
# inner_model = SVC(
# kernel="linear",
# C=C,
# max_iter=500,
# tol=tol,
# gamma="scale",
# shrinking=True,
# class_weight=None,
# probability=True,
# verbose=False,
# )
# inner_model.fit(data_train_, y_tr,
# sample_weight=sample_weight)
# y_probs = inner_model.predict_proba(data_test_)
# score = log_loss(y_te, y_probs, labels=[0, 1])
# scores.append(score)
# # Return the negative MLOGLOSS
# return -1.0 * np.mean(scores)
# if optimize:
# # Perform Bayesian Optimization
# bo = BayesianOptimization(
# bo_tune, {"C": (pow(10, -1), pow(10, 1)),
# "tol": (1e-4, 1e-2)}
# )
# bo.maximize(init_points=10, n_iter=20, acq="ei")
# # Train outer model with optimized parameters
# params = bo.max["params"]
# # params['max_iter'] = 500
# model = SVC(
# kernel="linear",
# C=params["C"],
# max_iter=500,
# tol=params["tol"],
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# else:
# # Use default values
# model = SVC(
# kernel="linear",
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# model.fit(data_train, y_train, sample_weight=sample_weight)
# return model
# def classify_svm_rbf(data_train, y_train, group_train, optimize,
# balance):
# """"""
# def bo_tune(C, tol):
# # Cross validating with the specified parameters in 5 folds
# cv_inner = GroupShuffleSplit(
# n_splits=3, train_size=0.66, random_state=42
# )
# scores = []
# for train_index, test_index in cv_inner.split(
# data_train, y_train, group_train
# ):
# data_train_, data_test_ = data_train[train_index],
# data_train[test_index]
# y_tr, y_te = y_train[train_index], y_train[test_index]
# inner_model = SVC(
# kernel="rbf",
# C=C,
# max_iter=500,
# tol=tol,
# gamma="scale",
# shrinking=True,
# class_weight=None,
# probability=True,
# verbose=False,
# )
# inner_model.fit(data_train_, y_tr,
# sample_weight=sample_weight)
# y_probs = inner_model.predict_proba(data_test_)
# score = log_loss(y_te, y_probs, labels=[0, 1])
# scores.append(score)
# # Return the negative MLOGLOSS
# return -1.0 * np.mean(scores)
# if optimize:
# # Perform Bayesian Optimization
# bo = BayesianOptimization(
# bo_tune, {"C": (pow(10, -1), pow(10, 1)),
# "tol": (1e-4, 1e-2)}
# )
# bo.maximize(init_points=10, n_iter=20, acq="ei")
# # Train outer model with optimized parameters
# params = bo.max["params"]
# model = SVC(
# kernel="rbf",
# C=params["C"],
# max_iter=500,
# tol=params["tol"],
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# else:
# # Use default values
# model = SVC(
# kernel="rbf",
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# model.fit(data_train, y_train, sample_weight=sample_weight)
# return model
# def classify_svm_poly(data_train, y_train, group_train):
# """"""
# def bo_tune(C, tol):
# # Cross validating with the specified parameters in 5 folds
# cv_inner = GroupShuffleSplit(
# n_splits=3, train_size=0.66, random_state=42
# )
# scores = []
# for train_index, test_index in cv_inner.split(
# data_train, y_train, group_train
# ):
# data_train_, data_test_ = data_train[train_index],
# data_train[test_index]
# y_tr, y_te = y_train[train_index], y_train[test_index]
# inner_model = SVC(
# kernel="poly",
# C=C,
# max_iter=500,
# tol=tol,
# gamma="scale",
# shrinking=True,
# class_weight=None,
# probability=True,
# verbose=False,
# )
# inner_model.fit(data_train_, y_tr,
# sample_weight=sample_weight)
# y_probs = inner_model.predict_proba(data_test_)
# score = log_loss(y_te, y_probs, labels=[0, 1])
# scores.append(score)
# # Return the negative MLOGLOSS
# return -1.0 * np.mean(scores)
# if optimize:
# # Perform Bayesian Optimization
# bo = BayesianOptimization(
# bo_tune, {"C": (pow(10, -1), pow(10, 1)),
# "tol": (1e-4, 1e-2)}
# )
# bo.maximize(init_points=10, n_iter=20, acq="ei")
# # Train outer model with optimized parameters
# params = bo.max["params"]
# model = SVC(
# kernel="poly",
# C=params["C"],
# max_iter=500,
# tol=params["tol"],
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# else:
# # Use default values
# model = SVC(
# kernel="poly",
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# model.fit(data_train, y_train, sample_weight=sample_weight)
# return model
# def classify_svm_sig(data_train, y_train, group_train, optimize,
# balance):
# """"""
# def bo_tune(C, tol):
# # Cross validating with the specified parameters in 5 folds
# cv_inner = GroupShuffleSplit(
# n_splits=3, train_size=0.66, random_state=42
# )
# scores = []
# for train_index, test_index in cv_inner.split(
# data_train, y_train, group_train
# ):
# data_train_, data_test_ = data_train[train_index],
# data_train[test_index]
# y_tr, y_te = y_train[train_index], y_train[test_index]
# inner_model = SVC(
# kernel="sigmoid",
# C=C,
# max_iter=500,
# tol=tol,
# gamma="auto",
# shrinking=True,
# class_weight=None,
# probability=True,
# verbose=False,
# )
# inner_model.fit(data_train_, y_tr, sample_weight=sample_weight)
# y_probs = inner_model.predict_proba(data_test_)
# score = log_loss(y_te, y_probs, labels=[0, 1])
# scores.append(score)
# # Return the negative MLOGLOSS
# return -1.0 * np.mean(scores)
# if optimize:
# # Perform Bayesian Optimization
# bo = BayesianOptimization(
# bo_tune, {"C": (pow(10, -1), pow(10, 1)), "tol": (1e-4, 1e-2)}
# )
# bo.maximize(init_points=10, n_iter=20, acq="ei")
# # Train outer model with optimized parameters
# params = bo.max["params"]
# model = SVC(
# kernel="sigmoid",
# C=params["C"],
# max_iter=500,
# tol=params["tol"],
# gamma="auto",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# else:
# # Use default values
# model = SVC(
# kernel="sigmoid",
# gamma="scale",
# shrinking=True,
# class_weight=None,
# verbose=False,
# )
# model.fit(data_train, y_train, sample_weight=sample_weight)
# return model
| 33.68477 | 78 | 0.537836 | 2,938 | 28,531 | 4.975494 | 0.097686 | 0.044945 | 0.02579 | 0.023259 | 0.767342 | 0.726912 | 0.704269 | 0.694828 | 0.692708 | 0.692708 | 0 | 0.014043 | 0.361046 | 28,531 | 846 | 79 | 33.724586 | 0.787822 | 0.375872 | 0 | 0.553531 | 0 | 0 | 0.078827 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052392 | false | 0 | 0.029613 | 0.004556 | 0.127563 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e34a90751c66f311b5912bb8c6a8d1a8ad0deae9 | 449 | py | Python | 23/03/0.py | pylangstudy/201709 | 53d868786d7327a83bfa7f4149549c6f9855a6c6 | [
"CC0-1.0"
] | null | null | null | 23/03/0.py | pylangstudy/201709 | 53d868786d7327a83bfa7f4149549c6f9855a6c6 | [
"CC0-1.0"
] | 32 | 2017-09-01T00:52:17.000Z | 2017-10-01T00:30:02.000Z | 23/03/0.py | pylangstudy/201709 | 53d868786d7327a83bfa7f4149549c6f9855a6c6 | [
"CC0-1.0"
] | null | null | null | import json
import pprint
from urllib.request import urlopen
with urlopen('http://pypi.python.org/pypi/Twisted/json') as url:
    http_info = url.info()
    raw_data = url.read().decode(http_info.get_content_charset())

project_info = json.loads(raw_data)
pprint.pprint(project_info)
print('------------------------------')
pprint.pprint(project_info, depth=2)
print('------------------------------')
pprint.pprint(project_info, depth=2, width=50)
| 29.933333 | 65 | 0.657016 | 60 | 449 | 4.75 | 0.5 | 0.154386 | 0.2 | 0.242105 | 0.238596 | 0.238596 | 0.238596 | 0 | 0 | 0 | 0 | 0.009756 | 0.08686 | 449 | 14 | 66 | 32.071429 | 0.685366 | 0 | 0 | 0.166667 | 0 | 0 | 0.222717 | 0.13363 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.5 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
e3528306b7bd15d5fdcba5e58d4d0fd69b1294a4 | 349 | py | Python | sps/api/v1/socket/message.py | tantexian/sps-2014-12-4 | 0cdab186cb3bf148656c4c214a18215643b4969c | [
"Apache-2.0"
] | 1 | 2018-07-27T15:16:14.000Z | 2018-07-27T15:16:14.000Z | sps/api/v1/socket/message.py | tantexian/sps-2014-12-4 | 0cdab186cb3bf148656c4c214a18215643b4969c | [
"Apache-2.0"
] | null | null | null | sps/api/v1/socket/message.py | tantexian/sps-2014-12-4 | 0cdab186cb3bf148656c4c214a18215643b4969c | [
"Apache-2.0"
] | null | null | null | class MessageType(object):
    REG = 1


class Message(object):
    def __init__(self):
        self.type = None
        self.body = None

    def get_type(self):
        return self.type

    def get_body(self):
        return self.body

    def set_type(self, type):
        self.type = type

    def set_body(self, body):
        self.body = body
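The get_/set_ accessor pairs above mirror Java style; in Python the same container is usually written with plain attributes, for example as a dataclass. A minimal equivalent sketch (a rewrite for illustration, not the module's actual API):

```python
from dataclasses import dataclass
from typing import Any, Optional


class MessageType:
    REG = 1


@dataclass
class Message:
    # Plain attributes replace the get_type/set_type accessor pairs.
    type: Optional[int] = None
    body: Any = None


msg = Message(type=MessageType.REG, body={'user': 'alice'})
print(msg.type, msg.body)
```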
| 17.45 | 29 | 0.581662 | 47 | 349 | 4.148936 | 0.319149 | 0.164103 | 0.14359 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004219 | 0.320917 | 349 | 19 | 30 | 18.368421 | 0.818565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.357143 | false | 0 | 0 | 0.142857 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
e366e2fcb39a47a49999de24a158d9a70e017103 | 277 | py | Python | app/admin.py | hbuiOnline/AMS | d9118aee7b5ddd90d54bf4cf7f5cdd11c8e4a511 | [
"MIT"
] | null | null | null | app/admin.py | hbuiOnline/AMS | d9118aee7b5ddd90d54bf4cf7f5cdd11c8e4a511 | [
"MIT"
] | null | null | null | app/admin.py | hbuiOnline/AMS | d9118aee7b5ddd90d54bf4cf7f5cdd11c8e4a511 | [
"MIT"
] | null | null | null | from django.contrib import admin
# Register your models here.
from .models import *  # Import every model from .models, then register each one below
admin.site.register(Customer)
admin.site.register(Staff)
admin.site.register(Service)
admin.site.register(Appointment) | 27.7 | 93 | 0.794224 | 40 | 277 | 5.5 | 0.55 | 0.163636 | 0.309091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119134 | 277 | 10 | 94 | 27.7 | 0.901639 | 0.34296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
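Registering via a wildcard import hides which models exist; an explicit form of the same registrations looks like this (a Django config fragment — it assumes the app's models.py defines these four models and only runs inside a Django project):

```python
from django.contrib import admin

from .models import Appointment, Customer, Service, Staff

admin.site.register(Customer)
admin.site.register(Staff)
admin.site.register(Service)
admin.site.register(Appointment)
```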
e36726ae66238ed268d95c0acb72decccf95ea5d | 17,263 | py | Python | static/brythonlib/cs1robots/worlds_data.py | pythonpad/vue-pythonpad-runner | 52decba9607b3b7b050ee0bf6dd4ef07ae644587 | [
"MIT"
] | 3 | 2021-01-26T16:18:45.000Z | 2021-09-15T00:57:12.000Z | static/brythonlib/cs1robots/worlds_data.py | pythonpad/vue-pythonpad-runner | 52decba9607b3b7b050ee0bf6dd4ef07ae644587 | [
"MIT"
] | null | null | null | static/brythonlib/cs1robots/worlds_data.py | pythonpad/vue-pythonpad-runner | 52decba9607b3b7b050ee0bf6dd4ef07ae644587 | [
"MIT"
] | 2 | 2021-01-26T16:18:47.000Z | 2021-10-21T20:45:20.000Z | def conv_world(kaist_world_dict):
pieces = {}
for (sx, sy), count in kaist_world_dict['beepers'].items():
for i in range(count):
beeper_id = len(pieces)
pieces[beeper_id] = {
'type': 'beeper',
'piece_type': 'marker',
'id': beeper_id,
'position': {
'type': 'position',
'x': sx - 1,
'y': sy - 1,
},
}
walls = []
for sx, sy in kaist_world_dict['walls']:
x1, y1 = (sx - 1) // 2, (sy - 1) // 2
if sx % 2 == 0:
x2 = x1 + 1
y2 = y1
else:
x2 = x1
y2 = y1 + 1
walls.append({
'type': 'wall',
'position_1': {
'type': 'position',
'x': x1,
'y': y1,
},
'position_2': {
'type': 'position',
'x': x2,
'y': y2,
},
})
return {
'type': 'world',
'width': kaist_world_dict['avenues'],
'height': kaist_world_dict['streets'],
'pieces': pieces,
'walls': walls
}
def get_world_dict(title):
global worlds_data
if title not in worlds_data:
raise ValueError('Unknown world name: "%s"' % title)
return conv_world(worlds_data[title])
worlds_data = {
'around': {'avenues': 10, 'streets': 10, 'walls': [], 'beepers': {(1, 9): 1, (2, 10): 1, (8, 10): 1, (10, 10): 1, (9, 10): 1, (5, 10): 1, (10, 8): 1, (10, 4): 1, (10, 1): 1, (8, 1): 1, (7, 1): 1, (6, 1): 1, (5, 1): 1, (3, 1): 1, (1, 6): 1, (1, 5): 1, (1, 3): 1}},
'around2': {'avenues': 10, 'streets': 10, 'walls': [], 'beepers': {(2, 1): 2, (3, 1): 3, (5, 1): 2, (7, 1): 1, (10, 1): 1, (10, 4): 3, (10, 3): 1, (10, 7): 2, (10, 6): 1, (10, 10): 4, (10, 9): 3, (9, 10): 1, (7, 10): 2, (5, 10): 1, (4, 10): 1, (3, 10): 1, (2, 10): 1, (1, 10): 2, (1, 8): 1, (1, 6): 4, (1, 5): 1, (1, 3): 3, (1, 2): 1}},
'around3': {'avenues': 6, 'streets': 6, 'walls': [], 'beepers': {(2, 1): 2, (3, 1): 1, (6, 1): 1, (6, 2): 3, (6, 3): 1, (6, 6): 2, (4, 6): 3, (1, 6): 1, (1, 4): 2, (1, 3): 1, (1, 2): 1}},
'cave': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (2, 3), (4, 1), (5, 4), (2, 5), (3, 6), (5, 6), (6, 3), (6, 1), (8, 1), (8, 3), (9, 4), (10, 3), (11, 2), (1, 8), (3, 8), (5, 8), (7, 8), (8, 7), (14, 1), (14, 3), (13, 4), (11, 6), (12, 7), (13, 8), (14, 7), (14, 5), (9, 8)], 'beepers': {(6, 5): 1}},
'cave2': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (2, 3), (4, 1), (4, 3), (4, 5), (3, 6), (1, 8), (3, 8), (5, 8), (6, 7), (7, 8), (9, 8), (10, 7), (9, 6), (8, 5), (8, 1), (10, 1), (10, 3), (7, 4), (6, 3)], 'beepers': {(6, 3): 1}},
'cave3': {'avenues': 10, 'streets': 10, 'walls': [(2, 1), (1, 4), (5, 2), (6, 1), (3, 4), (5, 6), (3, 6), (2, 5), (6, 3), (7, 6), (8, 5), (8, 1), (9, 2), (12, 1), (12, 3), (12, 5), (9, 4), (12, 7), (11, 8), (11, 6), (9, 8), (7, 8), (5, 8), (3, 8)], 'beepers': {(1, 5): 4, (2, 2): 2, (3, 3): 3, (4, 2): 1, (6, 2): 1, (5, 4): 1, (1, 4): 3}},
'cave4': {'avenues': 10, 'streets': 10, 'walls': [(2, 1), (1, 4), (3, 2), (5, 2), (3, 4), (5, 6), (6, 5), (7, 4), (8, 3), (8, 1), (2, 5), (1, 8), (3, 8), (5, 8), (7, 8), (9, 8), (9, 6), (10, 5), (11, 8), (12, 7), (12, 5), (11, 4), (12, 1), (10, 3)], 'beepers': {(3, 2): 1, (2, 4): 3, (4, 4): 3, (7, 2): 4}},
'chimney': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (2, 3), (2, 5), (2, 7), (2, 9), (2, 11), (4, 11), (4, 9), (4, 7), (4, 5), (4, 3), (3, 12), (5, 2), (6, 3), (6, 5), (7, 6), (8, 5), (8, 3), (9, 2), (11, 2), (12, 3), (12, 5), (12, 7), (12, 9), (13, 10), (14, 9), (14, 7), (14, 5), (14, 3), (15, 2), (16, 3), (16, 5), (16, 7), (16, 9), (16, 11), (16, 13), (16, 15), (17, 16), (18, 15), (18, 13), (18, 11), (18, 9), (18, 7), (18, 5), (18, 3), (19, 2)], 'beepers': {(2, 6): 1, (2, 5): 1, (2, 4): 2, (2, 2): 1, (9, 7): 1, (9, 5): 2, (9, 4): 3, (4, 3): 5, (7, 2): 1, (7, 4): 3, (7, 3): 1, (7, 5): 1}},
'chimney2': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (2, 3), (2, 5), (2, 7), (3, 8), (4, 7), (4, 5), (4, 3), (4, 9), (4, 11), (4, 13), (4, 15), (5, 16), (6, 15), (6, 13), (6, 11), (6, 9), (6, 7), (6, 5), (6, 3), (7, 2), (8, 3), (10, 3), (11, 2), (13, 2), (14, 3), (16, 3), (18, 3), (17, 2), (18, 5), (18, 7), (18, 9), (18, 11), (18, 13), (18, 15), (19, 16), (15, 4), (8, 5), (10, 5), (10, 11), (9, 12), (8, 11), (8, 9), (10, 9), (10, 7), (8, 7)], 'beepers': {(3, 8): 2, (8, 2): 3, (2, 3): 2, (2, 4): 1, (3, 3): 3, (3, 2): 2, (3, 5): 3, (3, 6): 1, (5, 2): 2, (5, 6): 1, (10, 7): 2}},
'chimney3': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (2, 3), (2, 5), (2, 7), (3, 8), (4, 7), (4, 5), (4, 3), (4, 9), (4, 11), (5, 12), (6, 11), (6, 9), (6, 7), (6, 5), (6, 3), (7, 2), (9, 2), (10, 3), (10, 5), (10, 7), (11, 8), (12, 9), (12, 11), (13, 12), (14, 11), (14, 9), (15, 8), (16, 9), (16, 11), (16, 15), (16, 13), (16, 17), (18, 17), (18, 15), (18, 13), (18, 11), (18, 9), (19, 8), (13, 2), (15, 2), (17, 2), (19, 2), (13, 4), (15, 4), (17, 4), (19, 4), (13, 6), (15, 6), (17, 6), (19, 6), (17, 18)], 'beepers': {(3, 2): 1, (2, 3): 3, (2, 4): 2, (3, 4): 6, (3, 5): 1, (7, 6): 5, (7, 5): 1, (9, 5): 3, (9, 7): 2}},
'mine': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2)], 'beepers': {(2, 1): 1, (3, 1): 1, (5, 1): 1, (8, 1): 1, (10, 1): 1}},
'mine2':{'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2)], 'beepers': {(2, 1): 2, (3, 1): 2, (6, 1): 3, (5, 1): 1, (8, 1): 1, (10, 1): 4}},
'mine3': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2)], 'beepers': {(10, 1): 5, (9, 1): 1, (8, 1): 3, (6, 1): 2, (1, 1): 2, (2, 1): 1, (3, 1): 3}},
'mine4': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (5, 2), (6, 3), (7, 4), (8, 1), (9, 2), (11, 2), (12, 1), (9, 4), (11, 4), (13, 4), (14, 3), (15, 2), (17, 2), (19, 2)], 'beepers': {(10, 1): 2, (8, 1): 3, (7, 2): 1, (7, 1): 1, (4, 2): 6, (5, 2): 1, (4, 1): 1, (3, 1): 2}},
'mine5': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (5, 2), (6, 3), (9, 2), (8, 1), (10, 1), (7, 4), (9, 4), (11, 4), (12, 3), (13, 2), (14, 3), (14, 5), (14, 7), (15, 8), (17, 8), (19, 8), (17, 6), (16, 5), (18, 5), (19, 4), (16, 3), (16, 1)], 'beepers': {(10, 3): 1, (2, 1): 2, (4, 1): 3, (5, 2): 2, (7, 1): 3, (8, 2): 4, (8, 3): 1, (8, 4): 2}},
'stairs': {'avenues': 10, 'streets': 10, 'walls': [(2, 1), (3, 2), (4, 3), (5, 4), (6, 5), (7, 6), (8, 7), (9, 8), (10, 9), (11, 10), (12, 11), (13, 12), (14, 13), (15, 14), (16, 15), (17, 16), (18, 17), (19, 18)], 'beepers': {(10, 10): 1}},
'stairs2': {'avenues': 10, 'streets': 10, 'walls': [(2, 1), (3, 2), (5, 2), (6, 3), (7, 4), (8, 5), (9, 6), (11, 6), (12, 7), (13, 8), (14, 9), (15, 10), (17, 10), (18, 11), (19, 12)], 'beepers': {(10, 7): 1}},
'stairs3': {'avenues': 10, 'streets': 10, 'walls': [(4, 1), (5, 2), (6, 3), (7, 4), (9, 4), (11, 4), (12, 5), (13, 6), (14, 7), (15, 8), (17, 8), (18, 9), (19, 10)], 'beepers': {(10, 6): 1}},
'stairs4': {'avenues': 10, 'streets': 10, 'walls': [(2, 1), (3, 2), (4, 3), (5, 4), (7, 4), (9, 4), (11, 4), (12, 5), (13, 6), (15, 6), (16, 7), (17, 8), (18, 9), (19, 10)], 'beepers': {(4, 3): 1}},
'coins': {'avenues': 10, 'streets': 10, 'walls': [(3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2), (2, 3), (2, 5), (2, 7), (2, 9), (2, 11), (2, 13), (2, 15), (2, 17), (2, 19)], 'beepers': {(2, 1): 1, (4, 1): 3, (5, 1): 2, (8, 1): 3, (7, 1): 6, (1, 2): 3, (1, 10): 1, (1, 8): 3, (1, 9): 1, (1, 4): 1}},
'coins2': {'avenues': 10, 'streets': 10, 'walls': [(2, 19), (2, 17), (2, 15), (2, 13), (2, 11), (2, 9), (2, 7), (2, 5), (2, 3), (3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2)], 'beepers': {(6, 1): 1, (7, 1): 1, (5, 1): 2, (10, 1): 3, (2, 1): 1, (1, 2): 3, (1, 3): 2, (1, 6): 4, (1, 10): 7}},
'news': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (4, 3), (5, 4), (6, 3), (7, 2), (8, 3), (9, 4), (10, 3), (11, 2), (13, 2), (14, 3), (15, 4), (16, 3), (17, 2), (19, 2)], 'beepers': {}},
'news2': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (2, 3), (3, 4), (4, 3), (5, 2), (6, 3), (7, 4), (8, 3), (9, 2), (10, 3), (11, 4), (12, 3), (15, 2), (17, 2), (13, 2), (18, 3), (19, 4)], 'beepers': {}},
'news3': {'avenues': 10, 'streets': 10, 'walls': [(1, 2), (3, 2), (5, 4), (4, 3), (6, 3), (7, 4), (8, 3), (9, 4), (10, 3), (11, 4), (12, 3), (13, 2), (14, 3), (15, 4), (16, 3), (17, 4), (18, 3), (19, 2)], 'beepers': {}},
'read': {'avenues': 10, 'streets': 10, 'walls': [], 'beepers': {(10, 1): 7}},
'read2': {'avenues': 10, 'streets': 10, 'walls': [], 'beepers': {(9, 1): 2, (10, 1): 4, (8, 1): 3}},
'read3': {'avenues': 10, 'streets': 10, 'walls': [], 'beepers': {(6, 1): 2, (8, 1): 3, (9, 1): 1, (10, 1): 7}},
'hurdles1': {
'avenues': 10,
'streets': 10,
'walls': [(4, 1), (8, 1), (12, 1), (16, 1)],
'beepers': {(10, 1): 1},
},
'hurdles2': {
'avenues': 10,
'streets': 10,
'walls': [(4, 1), (8, 1), (12, 1), (16, 1)],
'beepers': {(7, 1): 1},
},
'hurdles3': {
'avenues': 10,
'streets': 10,
'walls': [(4, 1), (8, 1), (16, 1), (2, 1), (10, 1), (18, 1), (12, 1)],
'beepers': {(10, 1): 1},
},
'beepers1': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(3, 1): 1},
},
'corner3_4': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {},
},
'rain1': {
'avenues': 10,
'streets': 10,
'walls': [(5, 6), (4, 7), (4, 9), (4, 13), (4, 15), (5, 16), (9, 16), (13, 16), (15, 16), (16, 15), (16, 11), (16, 9), (16, 7), (15, 6), (11, 6), (7, 6)],
'beepers': {},
},
'newspaper': {
'avenues': 10,
'streets': 10,
'walls': [(4, 1), (5, 2), (7, 2), (8, 3), (9, 4), (11, 4), (12, 5), (13, 6), (15, 6), (16, 7), (17, 8), (19, 8)],
'beepers': {},
},
'hurdles4': {
'avenues': 10,
'streets': 10,
'walls': [(4, 1), (8, 1), (16, 1), (2, 1), (10, 1), (18, 1), (12, 1), (4, 3), (10, 3), (10, 5)],
'beepers': {(10, 1): 1},
},
'frank18': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(7, 4): 1, (3, 7): 2, (7, 1): 19, (6, 6): 2, (3, 4): 2},
},
'rain2': {
'avenues': 12,
'streets': 9,
'walls': [(5, 6), (7, 6), (11, 6), (13, 6), (15, 6), (16, 5), (17, 4), (21, 4), (22, 5), (22, 9), (22, 11), (22, 15), (21, 16), (19, 16), (15, 16), (13, 16), (9, 16), (5, 16), (4, 15), (4, 13), (4, 9), (4, 7)],
'beepers': {},
},
'wrong': {
'avenues': 10,
'streets': 10,
        'walls': [(10, 1), (10, 3), (10, 5), (1, 10), (3, 10), (5, 10), (2, 1), (2, 3), (1, 6), (3, 6), (4, 5), (4, 3), (5, 2), (6, 3), (7, 8), (5, 8), (2, 7), (7, 10), (8, 7), (9, 6), (8, 3), (9, 4), (9, 10), (10, 9)],
'beepers': {(6, 4): 1},
},
'hanoi3': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(2, 1): 3, (2, 2): 2, (2, 3): 1},
},
'fairy_tale': {
'avenues': 14,
'streets': 8,
'walls': [(1, 10), (3, 10), (4, 9), (5, 8), (6, 7), (9, 8), (11, 8), (12, 7), (12, 5), (12, 3), (12, 1)],
'beepers': {},
},
'hanoi4': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(2, 4): 1, (2, 1): 4, (2, 2): 3, (2, 3): 2},
},
'empty': {
'avenues': 8,
'streets': 8,
'walls': [],
'beepers': {},
},
'trash1': {
'avenues': 10,
'streets': 10,
'walls': [(3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2), (1, 4), (2, 3)],
'beepers': {(6, 1): 1, (3, 1): 3, (5, 1): 1, (10, 1): 2, (7, 1): 2},
},
'trash2': {
'avenues': 10,
'streets': 10,
'walls': [(3, 2), (5, 2), (7, 2), (9, 2), (11, 2), (13, 2), (15, 2), (17, 2), (19, 2), (1, 4), (2, 3)],
'beepers': {(9, 1): 1, (5, 1): 13, (2, 1): 2, (7, 1): 2},
},
'trash3': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(1, 2): 18, (7, 3): 4, (4, 8): 1, (5, 6): 7, (7, 1): 4, (9, 2): 11, (8, 8): 1, (1, 10): 3, (2, 5): 3, (5, 8): 2, (7, 9): 2},
},
'trash4': {
'avenues': 11,
'streets': 10,
'walls': [],
'beepers': {(6, 9): 3, (1, 3): 2, (9, 8): 2, (10, 6): 1, (5, 1): 2, (1, 11): 2, (10, 3): 1, (5, 5): 2, (2, 9): 1, (6, 10): 2, (1, 5): 1, (2, 2): 1, (8, 6): 2, (4, 10): 1, (8, 2): 1, (8, 11): 2, (9, 10): 3, (4, 11): 1, (2, 7): 1, (4, 6): 1, (9, 2): 1, (3, 4): 3, (5, 7): 1, (3, 8): 3, (7, 8): 5},
},
'amazing3a': {
'avenues': 7,
'streets': 7,
'walls': [(2, 1), (3, 2), (5, 2), (6, 3), (6, 5), (6, 7), (6, 9), (6, 11), (6, 13)],
'beepers': {(1, 2): 1, (2, 7): 1, (3, 2): 1, (1, 3): 1, (3, 3): 1, (1, 7): 1, (1, 4): 1, (2, 4): 1, (1, 5): 1, (2, 6): 1, (1, 6): 1, (3, 6): 1, (2, 2): 1, (2, 3): 1, (3, 7): 1, (2, 5): 1, (3, 4): 1, (1, 1): 1, (3, 5): 1},
},
'yardwork': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(1, 2): 18, (7, 3): 4, (4, 8): 1, (5, 6): 7, (7, 1): 4, (9, 2): 11, (8, 8): 1, (1, 10): 3, (2, 5): 3, (5, 8): 2, (7, 9): 2},
},
'sort1': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(1, 2): 1, (1, 3): 1, (2, 2): 1, (1, 4): 1, (2, 4): 1, (1, 5): 1, (1, 6): 1, (2, 1): 1, (1, 7): 1, (2, 3): 1, (2, 5): 1, (1, 1): 1},
},
'harvest4': {
'avenues': 7,
'streets': 7,
'walls': [],
'beepers': {(7, 3): 1, (6, 6): 1, (5, 6): 1, (3, 2): 1, (2, 1): 1, (6, 2): 1, (5, 1): 2, (2, 5): 1, (7, 2): 1, (5, 5): 1, (7, 6): 1, (4, 4): 1, (3, 6): 1, (2, 2): 2, (3, 5): 1, (4, 1): 1, (6, 4): 1, (5, 4): 1, (7, 1): 1, (4, 5): 1, (2, 3): 1, (4, 2): 1, (6, 5): 2, (5, 3): 2, (4, 6): 1, (6, 1): 1, (7, 4): 1, (4, 3): 1, (3, 4): 2, (2, 4): 1},
},
'amazing5': {
'avenues': 7,
'streets': 7,
'walls': [(3, 2), (6, 5), (6, 7), (6, 9), (6, 11), (6, 13), (4, 1), (2, 3), (3, 4), (5, 4)],
'beepers': {},
},
'maze1': {
'avenues': 10,
'streets': 10,
'walls': [(10, 1), (10, 3), (10, 5), (1, 10), (3, 10), (5, 10), (2, 1), (2, 3), (1, 6), (3, 6), (4, 5), (4, 3), (5, 2), (6, 3), (7, 8), (5, 8), (2, 7), (7, 10), (8, 7), (9, 6), (8, 3), (9, 4), (9, 10), (10, 9)],
'beepers': {(6, 4): 1},
},
'harvest1': {
'avenues': 7,
'streets': 7,
'walls': [],
'beepers': {(3, 3): 1, (3, 2): 1, (3, 1): 1, (5, 6): 1, (5, 1): 1, (3, 6): 1, (5, 3): 1, (5, 2): 1, (7, 6): 1, (7, 5): 1, (7, 4): 1, (7, 3): 1, (7, 2): 1, (7, 1): 1, (3, 5): 1, (3, 4): 1, (2, 4): 1, (2, 5): 1, (2, 6): 1, (2, 1): 1, (2, 2): 1, (2, 3): 1, (4, 6): 1, (4, 4): 1, (4, 5): 1, (4, 2): 1, (4, 3): 1, (4, 1): 1, (6, 1): 1, (6, 2): 1, (6, 3): 1, (6, 4): 1, (6, 5): 1, (6, 6): 1, (5, 5): 1, (5, 4): 1},
},
'amazing1': {
'avenues': 5,
'streets': 5,
'walls': [],
'beepers': {},
},
'harvest2': {
'avenues': 12,
'streets': 12,
'walls': [],
'beepers': {(7, 3): 1, (6, 10): 1, (6, 6): 1, (2, 8): 1, (10, 6): 1, (7, 7): 1, (4, 6): 1, (6, 2): 1, (7, 11): 1, (3, 7): 1, (10, 8): 1, (5, 5): 1, (4, 4): 1, (8, 10): 1, (4, 8): 1, (8, 6): 1, (5, 3): 1, (9, 7): 1, (4, 10): 1, (2, 6): 1, (5, 11): 1, (5, 9): 1, (7, 5): 1, (6, 12): 1, (6, 4): 1, (3, 5): 1, (11, 7): 1, (6, 8): 1, (5, 7): 1, (9, 9): 1, (8, 8): 1, (7, 9): 1, (1, 7): 1, (9, 5): 1, (3, 9): 1, (8, 4): 1},
},
'amazing3': {
'avenues': 7,
'streets': 7,
'walls': [(2, 1), (3, 2), (5, 2), (6, 3), (6, 5), (6, 7), (6, 9), (6, 11), (6, 13)],
'beepers': {},
},
'amazing2': {
'avenues': 7,
'streets': 7,
'walls': [(6, 13), (6, 11), (6, 9), (13, 6), (11, 6), (9, 6), (7, 6), (6, 7)],
'beepers': {},
},
'harvest3': {
'avenues': 7,
'streets': 7,
'walls': [],
'beepers': {(7, 3): 1, (6, 6): 1, (5, 6): 1, (3, 2): 1, (2, 1): 1, (6, 2): 1, (5, 1): 1, (2, 5): 1, (7, 2): 1, (7, 6): 1, (4, 4): 1, (3, 6): 1, (2, 2): 1, (3, 5): 1, (4, 1): 1, (6, 4): 1, (5, 4): 1, (7, 1): 1, (4, 5): 1, (5, 5): 1, (2, 3): 1, (4, 2): 1, (6, 5): 1, (5, 3): 1, (4, 6): 1, (3, 4): 1, (6, 1): 1, (7, 4): 1, (4, 3): 1, (2, 4): 1},
},
'add1': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(10, 1): 3, (10, 2): 2}
},
'add2': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(9, 2): 1, (9, 1): 2, (10, 1): 2, (10, 2): 3}
},
'add34': {
'avenues': 10,
'streets': 10,
'walls': [],
'beepers': {(8, 2): 9, (7, 1): 1, (8, 1): 3, (9, 2): 8, (10, 1): 4, (10, 2): 7}
},
}
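The wall tuples in worlds_data use the KAIST screen convention that conv_world decodes: each (sx, sy) pair maps to a pair of endpoint cells. That arithmetic can be pulled out and checked on its own (wall_endpoints is an extraction for illustration, not part of the module):

```python
def wall_endpoints(sx, sy):
    """Map one KAIST wall coordinate (sx, sy) to its two endpoint cells,
    mirroring the arithmetic inside conv_world above."""
    x1, y1 = (sx - 1) // 2, (sy - 1) // 2
    if sx % 2 == 0:
        # Even sx: the endpoints differ in x.
        x2, y2 = x1 + 1, y1
    else:
        # Odd sx: the endpoints differ in y.
        x2, y2 = x1, y1 + 1
    return (x1, y1), (x2, y2)


# Walls taken from the 'hurdles1' and 'cave' worlds above:
print(wall_endpoints(4, 1))   # ((1, 0), (2, 0))
print(wall_endpoints(1, 2))   # ((0, 0), (0, 1))
```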
| 60.36014 | 635 | 0.333835 | 3,103 | 17,263 | 1.848856 | 0.042862 | 0.030678 | 0.117134 | 0.147464 | 0.592644 | 0.493289 | 0.4152 | 0.370577 | 0.315496 | 0.272093 | 0 | 0.272261 | 0.300006 | 17,263 | 285 | 636 | 60.57193 | 0.202499 | 0 | 0 | 0.37234 | 0 | 0 | 0.125123 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007092 | false | 0 | 0 | 0 | 0.014184 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e36dc2963b3e15b6183197cc7bce8f0677915722 | 27 | py | Python | rtmp/__init__.py | notnola/pinybot | 8ad579fe5652b42a8fb9486c8d11962f5972f817 | [
"MIT"
] | null | null | null | rtmp/__init__.py | notnola/pinybot | 8ad579fe5652b42a8fb9486c8d11962f5972f817 | [
"MIT"
] | null | null | null | rtmp/__init__.py | notnola/pinybot | 8ad579fe5652b42a8fb9486c8d11962f5972f817 | [
"MIT"
] | 1 | 2019-01-31T01:07:56.000Z | 2019-01-31T01:07:56.000Z | __author__ = 'TechWhizZ199' | 27 | 27 | 0.814815 | 2 | 27 | 9 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12 | 0.074074 | 27 | 1 | 27 | 27 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
e374829b389cef040daa81ebe91954032d3a7a55 | 72 | py | Python | __init__.py | hoel-bagard/yolact | 028fd121e94c18531243a73eb4c0d443fc38a079 | [
"MIT"
] | null | null | null | __init__.py | hoel-bagard/yolact | 028fd121e94c18531243a73eb4c0d443fc38a079 | [
"MIT"
] | null | null | null | __init__.py | hoel-bagard/yolact | 028fd121e94c18531243a73eb4c0d443fc38a079 | [
"MIT"
] | null | null | null | from .predict import YolactK
from .data import *
__version__ = "0.1.0"
| 14.4 | 28 | 0.722222 | 11 | 72 | 4.363636 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0.166667 | 72 | 4 | 29 | 18 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0.069444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
8b5e99254ec155e2d433487c1c07674f3203394e | 1,736 | py | Python | Demo/frontend-server.py | hlynch/Penguins_AIforEarth | bccedb68640b20c6c6849040ad57823e99dbd0c6 | [
"MIT"
] | 2 | 2019-06-17T14:09:45.000Z | 2020-08-17T00:20:44.000Z | Demo/frontend-server.py | hlynch/Penguins_AIforEarth | bccedb68640b20c6c6849040ad57823e99dbd0c6 | [
"MIT"
] | 6 | 2019-05-21T16:24:43.000Z | 2019-05-28T18:41:04.000Z | Demo/frontend-server.py | hlynch/Penguins_AIforEarth | bccedb68640b20c6c6849040ad57823e99dbd0c6 | [
"MIT"
] | null | null | null | '''
Webserver for the Penguin Guano Classification AI4Earth API
To run:
export FLASK_APP=frontend-server.py
python -m flask run --host=0.0.0.0
To access the website, enter your IP address:5000 into a browser.
e.g., http://127.0.0.1:5000/
'''
from flask import Flask, send_from_directory, request
import requests
print("Running frontend server")
API_ENDPOINT = "http://penguinguano.eastus.azurecontainer.io:80/v1/pytorch_api/classify"
app = Flask(__name__, static_url_path='')
# front-end server stuff
@app.route('/')
def root():
return send_from_directory('', 'index.html')
@app.route('/about.html')
def send_about():
return send_from_directory('', 'about.html')
@app.route('/instructions.html')
def send_instructions():
return send_from_directory('', 'instructions.html')
@app.route('/static/static-templates/<path:path>')
def send_templates(path):
return send_from_directory('static/static-templates', path)
@app.route('/static/css/<path:path>')
def send_css(path):
return send_from_directory('static/css', path)
@app.route('/static/js/<path:path>')
def send_js(path):
return send_from_directory('static/js', path)
@app.route('/static/images/<path:path>')
def send_image(path):
return send_from_directory('static/images', path)
@app.route('/get-classification', methods=['GET', 'POST'])
def get_classification():
if request.form['type'] == 'sample':
        # TODO: enforce strict pathing to static image dir only
        with open('.' + request.form['file'], 'rb') as f:
            data = f.read()
else:
data = request.files.get('file', '')
    r = requests.post(url=API_ENDPOINT, data=data,
                      headers={'Content-Type': 'application/octet-stream'})
return r.json()['image_url']
if __name__ == '__main__':
app.run()
| 24.111111 | 88 | 0.711982 | 248 | 1,736 | 4.814516 | 0.403226 | 0.053601 | 0.113903 | 0.134841 | 0.110553 | 0.110553 | 0 | 0 | 0 | 0 | 0 | 0.01436 | 0.117512 | 1,736 | 71 | 89 | 24.450704 | 0.765013 | 0.1803 | 0 | 0 | 0 | 0 | 0.299151 | 0.108911 | 0 | 0 | 0 | 0.014085 | 0 | 1 | 0.216216 | false | 0 | 0.054054 | 0.189189 | 0.486486 | 0.027027 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
8b6aace89e3d825b331240e13aabc132d611171f | 2,584 | py | Python | setup_extension.py | kuwayamamasayuki/FeedValidator-extension-for-GTFS-JP | af01375d0cf99c671a8a49f8f3a7aac2083424bc | [
"Apache-2.0"
] | 1 | 2020-04-03T09:18:53.000Z | 2020-04-03T09:18:53.000Z | setup_extension.py | kuwayamamasayuki/FeedValidator-extension-for-GTFS-JP | af01375d0cf99c671a8a49f8f3a7aac2083424bc | [
"Apache-2.0"
] | null | null | null | setup_extension.py | kuwayamamasayuki/FeedValidator-extension-for-GTFS-JP | af01375d0cf99c671a8a49f8f3a7aac2083424bc | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python2.5
# Copyright (C) 2019 KUWAYAMA, Masayuki
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Copyright (C) 2011 Google Inc.
import agency
import agency_jp
import stop
import route
import route_jp
import trip
import office_jp
import fareattribute
import farerule
import shape
import feedinfo
import translation
import gtfsfactory
import schedule
def GetGtfsFactory(factory = None):
if not factory:
factory = gtfsfactory.GetGtfsFactory()
# Agency class extension
factory.UpdateClass('Agency', agency.Agency)
# Agency_jp class extension
factory.UpdateClass('Agency_jp', agency_jp.Agency_jp)
# Stop class extension
factory.UpdateClass('Stop', stop.Stop)
# Route class extension
factory.UpdateClass('Route', route.Route)
# Route_jp class extension
factory.UpdateClass('Route_jp', route_jp.Route_jp)
# Trip class extension
factory.UpdateClass('Trip', trip.Trip)
# Office_jp class extension
factory.UpdateClass('Office_jp', office_jp.Office_jp)
# FareAttribute class extension
factory.UpdateClass('FareAttribute', fareattribute.FareAttribute)
# FareRUles class extension
factory.UpdateClass('FareRule', farerule.FareRule)
# Shape class extension
factory.UpdateClass('Shape', shape.Shape)
# FeedInfo class extension
factory.UpdateClass('FeedInfo', feedinfo.FeedInfo)
# Translation class extension
factory.UpdateClass('Translation', translation.Translation)
# Schedule class extension
factory.UpdateClass('Schedule', schedule.Schedule)
return factory
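GetGtfsFactory swaps extension classes into a shared factory by name. The registry pattern it relies on can be sketched in isolation (MiniFactory and the toy classes below are assumptions for illustration, not part of the real gtfsfactory module):

```python
class BaseAgency:
    def describe(self):
        return 'base agency'


class JpAgency(BaseAgency):
    # Extension class overriding behavior, as the agency/agency_jp classes do above.
    def describe(self):
        return 'agency with GTFS-JP fields'


class MiniFactory:
    """Toy stand-in for gtfsfactory: a name -> class registry."""

    def __init__(self):
        self._classes = {'Agency': BaseAgency}

    def UpdateClass(self, name, cls):
        self._classes[name] = cls

    def GetClass(self, name):
        return self._classes[name]


factory = MiniFactory()
factory.UpdateClass('Agency', JpAgency)  # same call shape as UpdateClass above
print(factory.GetClass('Agency')().describe())
```

Code that asks the factory for 'Agency' after the update transparently receives the extension class, which is how the GTFS-JP validator layers its rules onto the base feed validator.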
| 28.711111 | 74 | 0.768576 | 345 | 2,584 | 5.713043 | 0.275362 | 0.092339 | 0.138508 | 0.21106 | 0.517504 | 0.422121 | 0.422121 | 0.422121 | 0.422121 | 0.422121 | 0 | 0.008238 | 0.154412 | 2,584 | 89 | 75 | 29.033708 | 0.893822 | 0.562694 | 0 | 0 | 0 | 0 | 0.089908 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032258 | false | 0 | 0.451613 | 0 | 0.516129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
8b71d0b65eecf04d767d50cdc3d7516cf1940fbe | 236 | py | Python | routers.py | gabrielangelo/revelo-wallet | 3e91117b673e5aaf50773aa180af4117235965c9 | [
"BSD-3-Clause"
] | null | null | null | routers.py | gabrielangelo/revelo-wallet | 3e91117b673e5aaf50773aa180af4117235965c9 | [
"BSD-3-Clause"
] | 8 | 2020-02-11T23:50:12.000Z | 2022-03-14T22:51:54.000Z | routers.py | gabrielangelo/revelo-wallet | 3e91117b673e5aaf50773aa180af4117235965c9 | [
"BSD-3-Clause"
] | null | null | null | from rest_framework.routers import SimpleRouter
from transactions.api.views import TransactionsViewSet
router_v1 = SimpleRouter(trailing_slash=False)
router_v1.register(r'transactions', TransactionsViewSet, basename='transactions')  # DRF >= 3.9 uses 'basename' (formerly 'base_name')
| 33.714286 | 82 | 0.855932 | 27 | 236 | 7.296296 | 0.703704 | 0.081218 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009132 | 0.072034 | 236 | 6 | 83 | 39.333333 | 0.890411 | 0 | 0 | 0 | 0 | 0 | 0.101695 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
8b898fc8f9613f97a1b09d6b849378dd2047f47d | 51 | py | Python | index.py | JaidevstudioRobot/hackoctober2021 | d5855ac4bc797d7abb85b76f8b4a28e4a0dafaea | [
"MIT"
] | null | null | null | index.py | JaidevstudioRobot/hackoctober2021 | d5855ac4bc797d7abb85b76f8b4a28e4a0dafaea | [
"MIT"
] | null | null | null | index.py | JaidevstudioRobot/hackoctober2021 | d5855ac4bc797d7abb85b76f8b4a28e4a0dafaea | [
"MIT"
] | 1 | 2021-10-04T18:16:06.000Z | 2021-10-04T18:16:06.000Z | # Hello python
a = "Hello, I'm Robot Jai"
print(a)
| 12.75 | 25 | 0.647059 | 10 | 51 | 3.3 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.235294 | 51 | 3 | 26 | 17 | 0.846154 | 0.235294 | 0 | 0 | 0 | 0 | 0.527778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
8b93ff57b731c6c33351b57dd2f1b5402cee9a07 | 72 | py | Python | examples/random_article.py | yusufusta/wikiHowUnofficialAPI | e29ae96a2dcf893f5b587804b9dd37a412cdd561 | [
"MIT"
] | 5 | 2021-04-17T14:02:58.000Z | 2022-03-06T02:18:16.000Z | examples/random_article.py | yusufusta/wikiHowUnofficialAPI | e29ae96a2dcf893f5b587804b9dd37a412cdd561 | [
"MIT"
] | 1 | 2021-07-09T12:28:27.000Z | 2021-07-10T10:04:11.000Z | examples/random_article.py | yusufusta/wikiHowUnofficialAPI | e29ae96a2dcf893f5b587804b9dd37a412cdd561 | [
"MIT"
] | 4 | 2021-02-02T14:23:58.000Z | 2021-11-15T04:38:10.000Z | import wikihowunofficialapi as wha
ra = wha.random_article()
print(ra)
| 14.4 | 34 | 0.791667 | 10 | 72 | 5.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 72 | 4 | 35 | 18 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
8ba7da6d41b49d8a498bec0f19c0c437b815e955 | 558 | py | Python | RecoTracker/CkfPattern/python/CkfTrackCandidatesNoOverlaps_cff.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 852 | 2015-01-11T21:03:51.000Z | 2022-03-25T21:14:00.000Z | RecoTracker/CkfPattern/python/CkfTrackCandidatesNoOverlaps_cff.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 30,371 | 2015-01-02T00:14:40.000Z | 2022-03-31T23:26:05.000Z | RecoTracker/CkfPattern/python/CkfTrackCandidatesNoOverlaps_cff.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 3,240 | 2015-01-02T05:53:18.000Z | 2022-03-31T17:24:21.000Z | import FWCore.ParameterSet.Config as cms
# TrackerTrajectoryBuilders
from RecoTracker.CkfPattern.CkfTrajectoryBuilder_cff import *
# TrajectoryCleaning
from TrackingTools.TrajectoryCleaning.TrajectoryCleanerBySharedHits_cfi import *
# navigation school
from RecoTracker.TkNavigation.NavigationSchoolESProducer_cff import *
from RecoTracker.CkfPattern.CkfTrackCandidates_cfi import *
# generate CTF track candidates ############
ckfTrackCandidatesNoOverlaps = ckfTrackCandidates.clone(
TrajectoryBuilderPSet = dict(refToPSet_ = 'CkfTrajectoryBuilder')
)
| 39.857143 | 80 | 0.844086 | 45 | 558 | 10.355556 | 0.666667 | 0.096567 | 0.107296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084229 | 558 | 13 | 81 | 42.923077 | 0.911937 | 0.164875 | 0 | 0 | 0 | 0 | 0.044643 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.625 | 0 | 0.625 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
8bd89aabbee1878012e7f1e4e9d5683197eea5ee | 342 | py | Python | korzh_bot/uleague/exceptions.py | uleague/universityleague-steam | 65539664c0c6aad94a7ff3a3208323a554e1fddd | [
"MIT"
] | null | null | null | korzh_bot/uleague/exceptions.py | uleague/universityleague-steam | 65539664c0c6aad94a7ff3a3208323a554e1fddd | [
"MIT"
] | 2 | 2020-09-14T21:55:03.000Z | 2020-11-17T17:23:34.000Z | korzh_bot/uleague/exceptions.py | uleague/universityleague-steam | 65539664c0c6aad94a7ff3a3208323a554e1fddd | [
"MIT"
] | null | null | null | class ULeagueRequestError(Exception):
"""
Basic exception for all requests
"""
def __init__(self, message):
self.message = message
super().__init__(self.message)
def __str__(self):
        return "An error occurred while executing a request to ULeague --> {}".format(
self.message
)
| 24.428571 | 87 | 0.614035 | 34 | 342 | 5.823529 | 0.676471 | 0.222222 | 0.151515 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.283626 | 342 | 13 | 88 | 26.307692 | 0.808163 | 0.093567 | 0 | 0 | 0 | 0 | 0.210884 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.125 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
473931c03e87e8ad7ec18f92b6ba042e2eddadd3 | 494 | py | Python | Alice/Sorting/selection_sort.py | sandeepm96/cormen-algos | 9154f6ce9cb0c318bc0d6ecaa13676d080985cec | [
"MIT"
] | 1 | 2017-09-15T13:34:19.000Z | 2017-09-15T13:34:19.000Z | Alice/Sorting/selection_sort.py | sandeepm96/cormen-algos | 9154f6ce9cb0c318bc0d6ecaa13676d080985cec | [
"MIT"
] | null | null | null | Alice/Sorting/selection_sort.py | sandeepm96/cormen-algos | 9154f6ce9cb0c318bc0d6ecaa13676d080985cec | [
"MIT"
] | null | null | null | class SelectionSort:
def __init__(self,array):
self.array = array
def result(self):
n = len(self.array)
for i in range(n):
minimum = i
for j in range(i+1,n):
if self.array[minimum] > self.array[j]:
minimum = j
self.array[i],self.array[minimum] = self.array[minimum],self.array[i]
return self.array
test = list(map(int,input().split(' ')))
t = SelectionSort(test)
print(t.result())
| 29.058824 | 81 | 0.544534 | 66 | 494 | 4.015152 | 0.393939 | 0.339623 | 0.181132 | 0.226415 | 0.249057 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002959 | 0.315789 | 494 | 16 | 82 | 30.875 | 0.781065 | 0 | 0 | 0 | 0 | 0 | 0.002024 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0 | 0 | 0.266667 | 0.066667 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
473f4dc9a600b8c43aff418b905edef20e705990 | 266 | py | Python | Ex031 - Custo de viagem.py | MarcusMendes81/Python | 4af6653da324604930d24542a84a530348029d39 | [
"Apache-2.0"
] | null | null | null | Ex031 - Custo de viagem.py | MarcusMendes81/Python | 4af6653da324604930d24542a84a530348029d39 | [
"Apache-2.0"
] | null | null | null | Ex031 - Custo de viagem.py | MarcusMendes81/Python | 4af6653da324604930d24542a84a530348029d39 | [
"Apache-2.0"
] | null | null | null | km = float(input('Digite qual a distância da sua viagem em km: '))
if km <= 200:
preço = km * 0.50
print('O valor da sua viagem é de {:.2f}R$'.format(preço))
else:
preço = km * 0.45
print('O valor da sua viagem é de {:.2f}R$'.format(preço))
| 29.555556 | 67 | 0.586466 | 48 | 266 | 3.25 | 0.541667 | 0.096154 | 0.211538 | 0.166667 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0 | 0.055556 | 0.255639 | 266 | 8 | 68 | 33.25 | 0.732323 | 0 | 0 | 0.285714 | 0 | 0 | 0.445736 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
476bf06f808044f61843876f04c9e9d7d4868ec8 | 901 | py | Python | palindrome_test.py | gcrowder/palindrome | 31bc1ab62f849dbbfef8e31b0d0d081e7bf2aced | [
"MIT"
] | null | null | null | palindrome_test.py | gcrowder/palindrome | 31bc1ab62f849dbbfef8e31b0d0d081e7bf2aced | [
"MIT"
] | null | null | null | palindrome_test.py | gcrowder/palindrome | 31bc1ab62f849dbbfef8e31b0d0d081e7bf2aced | [
"MIT"
] | null | null | null | import unittest
from palindrome import is_palindrome
class TestPalindrome(unittest.TestCase):
def test_even_numbers(self):
self.assertTrue(is_palindrome('toot'))
def test_odd_numbers(self):
self.assertTrue(is_palindrome('tot'))
def test_simple_values(self):
self.assertTrue(is_palindrome('stunt nuts'))
def test_complete_sentences(self):
self.assertTrue(is_palindrome('Lisa Bonet ate no basil.'))
def test_complex_sentences(self):
self.assertTrue(is_palindrome('A man, a plan, a cat, a ham, a yak, a yam, a hat, a canal: Panama!'))
def test_multiple_sentences(self):
self.assertTrue(is_palindrome('Doc, note, I dissent. A fast never prevents a fatness. I diet on cod.'))
def test_non_palindromes(self):
self.assertFalse(is_palindrome('i am not a palindrome'))
if __name__ == '__main__':
unittest.main()
| 30.033333 | 111 | 0.703663 | 124 | 901 | 4.870968 | 0.475806 | 0.15894 | 0.178808 | 0.198676 | 0.365894 | 0.316225 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188679 | 901 | 29 | 112 | 31.068966 | 0.826265 | 0 | 0 | 0 | 0 | 0.052632 | 0.227525 | 0 | 0 | 0 | 0 | 0 | 0.368421 | 1 | 0.368421 | false | 0 | 0.105263 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
4780e099e5c87546b20f86a673d8da78571df7e4 | 361 | py | Python | tests/test_bad_seeds.py | jklynch/diffrascape | 350bed352fa6c9b30739e3748b7ea57b365f1944 | [
"BSD-3-Clause"
] | null | null | null | tests/test_bad_seeds.py | jklynch/diffrascape | 350bed352fa6c9b30739e3748b7ea57b365f1944 | [
"BSD-3-Clause"
] | null | null | null | tests/test_bad_seeds.py | jklynch/diffrascape | 350bed352fa6c9b30739e3748b7ea57b365f1944 | [
"BSD-3-Clause"
] | null | null | null | from diffrascape.env import BadSeeds
def test_construct():
bad_seeds = BadSeeds()
bad_seeds_states = bad_seeds.states()
print(f"### states: {bad_seeds_states}")
assert bad_seeds_states["shape"][0] == 6
bad_seeds_actions = bad_seeds.actions()
print(f"### actions: {bad_seeds_actions}")
assert bad_seeds_actions["num_values"] == 3
| 25.785714 | 47 | 0.695291 | 49 | 361 | 4.77551 | 0.428571 | 0.307692 | 0.239316 | 0.17094 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01 | 0.168975 | 361 | 13 | 48 | 27.769231 | 0.77 | 0 | 0 | 0 | 0 | 0 | 0.213296 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.111111 | false | 0 | 0.111111 | 0 | 0.222222 | 0.222222 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
47ac2e08152553ebc4a73e2f44181a3dfc25e059 | 165 | py | Python | testapp/another_urls.py | danigosa/django-simple-seo | 17610e50148c6672cb34e96654df1d3515b0444f | [
"BSD-3-Clause"
] | 11 | 2015-01-02T15:44:31.000Z | 2021-07-27T06:54:35.000Z | testapp/another_urls.py | danigosa/django-simple-seo | 17610e50148c6672cb34e96654df1d3515b0444f | [
"BSD-3-Clause"
] | 8 | 2016-02-03T07:07:04.000Z | 2022-01-13T00:42:32.000Z | testapp/another_urls.py | danigosa/django-simple-seo | 17610e50148c6672cb34e96654df1d3515b0444f | [
"BSD-3-Clause"
] | 8 | 2015-02-20T13:51:51.000Z | 2021-06-24T19:11:30.000Z | from django.conf.urls import patterns, url
from .views import template_test
urlpatterns = patterns(
'',
url(r'^', template_test, name='template_test2'),
) | 18.333333 | 52 | 0.709091 | 21 | 165 | 5.428571 | 0.666667 | 0.192982 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007246 | 0.163636 | 165 | 9 | 53 | 18.333333 | 0.818841 | 0 | 0 | 0 | 0 | 0 | 0.090361 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
47bf6e3c9c36dabf9fe1d3cb252c2d9d2f56f9af | 843 | py | Python | tests/tests_query_operations/table_models.py | Robinson04/StructNoSQL | 335c63593025582336bb67ad0b0ed39d30800b74 | [
"MIT"
] | 3 | 2020-10-30T23:31:26.000Z | 2022-03-30T21:48:40.000Z | tests/tests_query_operations/table_models.py | Robinson04/StructNoSQL | 335c63593025582336bb67ad0b0ed39d30800b74 | [
"MIT"
] | 42 | 2020-09-16T15:23:11.000Z | 2021-09-20T13:00:50.000Z | tests/tests_query_operations/table_models.py | Robinson04/StructNoSQL | 335c63593025582336bb67ad0b0ed39d30800b74 | [
"MIT"
] | 2 | 2021-01-03T21:37:22.000Z | 2021-08-12T20:28:52.000Z | from typing import Dict
from StructNoSQL import TableDataModel, BaseField, MapModel
class BaseTableModel(TableDataModel):
type = BaseField(field_type=str, required=False)
fieldOne = BaseField(field_type=str, required=False)
fieldTwo = BaseField(field_type=str, required=False)
class ContainerModel(MapModel):
fieldOne = BaseField(field_type=str, required=False)
fieldTwo = BaseField(field_type=str, required=False)
fieldThree = BaseField(field_type=str, required=False)
container = BaseField(field_type=Dict[str, ContainerModel], key_name='containerKey', required=False)
class DynamoDBTableModel(BaseTableModel):
accountId = BaseField(field_type=str, required=True)
class ExternalDynamoDBApiTableModel(BaseTableModel):
accountProjectTableKeyId = BaseField(field_type=str, required=True)
| 42.15 | 104 | 0.778173 | 90 | 843 | 7.177778 | 0.311111 | 0.195046 | 0.250774 | 0.260062 | 0.467492 | 0.467492 | 0.260062 | 0.260062 | 0.260062 | 0.260062 | 0 | 0 | 0.132859 | 843 | 19 | 105 | 44.368421 | 0.883721 | 0 | 0 | 0.266667 | 0 | 0 | 0.014235 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.133333 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |