hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
a2bd0fd34368e4604144c29b0f69a07f59c44be6 | 12,878 | py | Python | ckanext-hdx_org_group/ckanext/hdx_org_group/tests/test_controller/test_member_controller.py | alexandru-m-g/hdx-ckan | 647f1f23f0505fa195601245b758edcaf4d25985 | [
"Apache-2.0"
] | null | null | null | ckanext-hdx_org_group/ckanext/hdx_org_group/tests/test_controller/test_member_controller.py | alexandru-m-g/hdx-ckan | 647f1f23f0505fa195601245b758edcaf4d25985 | [
"Apache-2.0"
] | null | null | null | ckanext-hdx_org_group/ckanext/hdx_org_group/tests/test_controller/test_member_controller.py | alexandru-m-g/hdx-ckan | 647f1f23f0505fa195601245b758edcaf4d25985 | [
"Apache-2.0"
] | null | null | null | '''
Created on Jun 23, 2015
@author: alexandru-m-g
'''
import logging
import mock
import ckan.model as model
import ckan.common as common
import ckan.lib.helpers as h
import ckan.lib.mailer as mailer
import ckanext.hdx_users.controllers.mailer as hdx_mailer
import ckanext.hdx_theme.tests.hdx_test_base as hdx_test_base
import ckanext.hdx_theme.tests.mock_helper as mock_helper
import ckanext.hdx_org_group.controllers.member_controller as member_controller
import ckanext.hdx_org_group.tests as org_group_base
c = common.c
log = logging.getLogger(__name__)
q = None
sort = None
c_dict = None
invited_user = None
class TestMembersController(org_group_base.OrgGroupBaseWithIndsAndOrgsTest):
@classmethod
def _load_plugins(cls):
hdx_test_base.load_plugin('ytp_request hdx_org_group hdx_theme')
@classmethod
def _create_test_data(cls):
super(TestMembersController, cls)._create_test_data(create_datasets=False, create_members=True)
def setup(self):
global q, sort, c_dict
q = None
sort = None
c_dict = None
user_invite_params = None
def _populate_member_names(self, members, users):
ret = [next(user['fullname'] for user in users if user['id'] == member[0]) for member in members]
return ret
@mock.patch('ckanext.hdx_theme.helpers.helpers.c')
@mock.patch('ckanext.hdx_org_group.helpers.organization_helper.c')
@mock.patch('ckanext.hdx_org_group.controllers.member_controller.c')
def test_members(self, member_c, org_helper_c, theme_c):
global sort, q
test_username = 'testsysadmin'
mock_helper.populate_mock_as_c(member_c, test_username)
mock_helper.populate_mock_as_c(org_helper_c, test_username)
mock_helper.populate_mock_as_c(theme_c, test_username)
context = {
'model': model, 'session': model.Session, 'user': 'testsysadmin'}
org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
# By default the users should be sorted alphabetically asc
user_controller = MockedHDXOrgMemberController()
user_controller.members('hdx-test-org')
user_list = self._populate_member_names(c_dict['members'], org['users'])
for idx, val in enumerate(user_list):
if idx < len(user_list) - 1 and user_list[idx] and user_list[idx + 1]:
assert user_list[idx] < user_list[idx + 1], "{} should be before {}". \
format(user_list[idx], user_list[idx + 1])
# Sorting alphabetically desc
sort = 'title desc'
user_controller.members('hdx-test-org')
user_list = self._populate_member_names(c_dict['members'], org['users'])
for idx, val in enumerate(user_list):
if idx < len(user_list) - 1 and user_list[idx] and user_list[idx + 1]:
                assert user_list[idx] > user_list[idx + 1], "{} should be after {}". \
                    format(user_list[idx], user_list[idx + 1])
        # Filtering by a search query
q = 'anna'
user_controller.members('hdx-test-org')
user_list = self._populate_member_names(c_dict['members'], org['users'])
assert len(user_list) == 1, "Only one user should be found for query"
assert user_list[0] == 'Anna Anderson2'
@mock.patch('ckanext.hdx_theme.helpers.helpers.c')
@mock.patch('ckanext.hdx_org_group.helpers.organization_helper.c')
@mock.patch('ckanext.hdx_org_group.controllers.member_controller.c')
def test_members_delete_add(self, member_c, org_helper_c, theme_c):
test_username = 'testsysadmin'
mock_helper.populate_mock_as_c(member_c, test_username)
mock_helper.populate_mock_as_c(org_helper_c, test_username)
mock_helper.populate_mock_as_c(theme_c, test_username)
url = h.url_for(
controller='ckanext.hdx_org_group.controllers.member_controller:HDXOrgMemberController',
action='member_delete',
id='hdx-test-org'
)
self.app.post(url, params={'user': 'annaanderson2'}, extra_environ={"REMOTE_USER": "testsysadmin"})
context = {
'model': model, 'session': model.Session, 'user': 'testsysadmin'}
org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
user_controller = MockedHDXOrgMemberController()
user_controller.members('hdx-test-org')
user_list = self._populate_member_names(c_dict['members'], org['users'])
deleted_length = len(user_list)
assert 'Anna Anderson2' not in user_list
url = h.url_for(
controller='ckanext.hdx_org_group.controllers.member_controller:HDXOrgMemberController',
action='member_new',
id='hdx-test-org'
)
self.app.post(url, params={'username': 'annaanderson2', 'role': 'editor'},
extra_environ={"REMOTE_USER": "testsysadmin"})
org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
assert len(org['users']) == deleted_length + 1, 'Number of members should have increased by 1'
member_anna = next((user for user in org['users'] if user['name'] == 'annaanderson2'), None)
assert member_anna, 'User annaanderson2 needs to be a member of the org'
assert member_anna['capacity'] == 'editor', 'User annaanderson2 needs to be an editor'
# def test_members_invite(self):
#
# original_send_invite = mailer.send_invite
#
# def mock_send_invite(user):
# global invited_user
# invited_user = user
#
# mailer.send_invite = mock_send_invite
#
# context = {
# 'model': model, 'session': model.Session, 'user': 'testsysadmin'}
# url = h.url_for(
# controller='ckanext.hdx_org_group.controllers.member_controller:HDXOrgMemberController',
# action='member_new',
# id='hdx-test-org'
# )
# self.app.post(url, params={'email': 'hdxtestuser123@test.test', 'role': 'editor'},
# extra_environ={"REMOTE_USER": "testsysadmin"})
# org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
#
# new_member = next((user for user in org['users'] if 'hdxtestuser123' in user['name']), None)
# assert new_member, 'Invited user needs to be a member of the org'
# assert new_member['capacity'] == 'editor', 'Invited user needs to be an editor'
#
# mailer.send_invite = original_send_invite
#
# @mock.patch('ckanext.hdx_theme.helpers.helpers.c')
# @mock.patch('ckanext.hdx_org_group.helpers.organization_helper.c')
# @mock.patch('ckanext.hdx_org_group.controllers.member_controller.c')
# def test_bulk_members_invite(self, member_c, org_helper_c, theme_c):
# test_username = 'testsysadmin'
# mock_helper.populate_mock_as_c(member_c, test_username)
# mock_helper.populate_mock_as_c(org_helper_c, test_username)
# mock_helper.populate_mock_as_c(theme_c, test_username)
# original_send_invite = mailer.send_invite
#
# def mock_send_invite(user):
# global invited_user
# invited_user = user
#
# mailer.send_invite = mock_send_invite
# context = {'model': model, 'session': model.Session, 'user': test_username}
#
# # removing one member from organization
# url = h.url_for(
# controller='ckanext.hdx_org_group.controllers.member_controller:HDXOrgMemberController',
# action='member_delete',
# id='hdx-test-org'
# )
# self.app.post(url, params={'user': 'johndoe1'}, extra_environ={"REMOTE_USER": test_username})
#
# org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
# user_controller = MockedHDXOrgMemberController()
# user_controller.members('hdx-test-org')
# user_list = self._populate_member_names(c_dict['members'], org['users'])
# deleted_length = len(user_list)
# assert 'John Doe1' not in user_list
#
# # bulk adding members
# url = h.url_for(
# controller='ckanext.hdx_org_group.controllers.member_controller:HDXOrgMemberController',
# action='bulk_member_new',
# id='hdx-test-org'
# )
#
# self.app.post(url, params={'emails': 'janedoe3,johndoe1,dan@k.ro', 'role': 'editor'},
# extra_environ={"REMOTE_USER": test_username})
# org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
#
# assert len(org['users']) == deleted_length + 2, 'Number of members should have increased by 2'
# new_member = next((user for user in org['users'] if 'johndoe1' in user['name']), None)
# assert new_member, 'Invited user needs to be a member of the org'
# assert new_member['capacity'] == 'editor', 'Invited user needs to be an editor'
#
# # making john doe1 a member back
# self.app.post(url, params={'emails': 'johndoe1', 'role': 'member'},
# extra_environ={"REMOTE_USER": test_username})
# org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
# new_member = next((user for user in org['users'] if 'johndoe1' in user['name']), None)
# assert new_member, 'Invited user needs to be a member of the org'
# assert new_member['capacity'] == 'member', 'Invited user needs to be an member'
#
# mailer.send_invite = original_send_invite
@mock.patch('ckanext.ytp.request.controller.request')
@mock.patch('ckanext.hdx_theme.helpers.helpers.c')
@mock.patch('ckanext.hdx_org_group.helpers.organization_helper.c')
@mock.patch('ckanext.hdx_org_group.controllers.member_controller.c')
def test_request_membership(self, member_c, org_helper_c, theme_c, request_c):
test_sysadmin = 'testsysadmin'
test_username = 'johndoe1'
mock_helper.populate_mock_as_c(member_c, test_sysadmin)
mock_helper.populate_mock_as_c(org_helper_c, test_sysadmin)
mock_helper.populate_mock_as_c(theme_c, test_sysadmin)
original_send_invite = mailer.send_invite
original_mail_recipient = hdx_mailer._mail_recipient
def mock_send_invite(user):
global invited_user
invited_user = user
def mock_mail_recipient(recipients_list, subject, body, sender_name, bcc_recipients_list=None, footer=None,
headers={}, sender_email=None):
return True
mailer.send_invite = mock_send_invite
hdx_mailer._mail_recipient = mock_mail_recipient
context = {'model': model, 'session': model.Session, 'user': test_sysadmin}
# removing one member from organization
url = h.url_for(
controller='ckanext.hdx_org_group.controllers.member_controller:HDXOrgMemberController',
action='member_delete',
id='hdx-test-org'
)
self.app.post(url, params={'user': 'johndoe1'}, extra_environ={"REMOTE_USER": test_sysadmin})
org = self._get_action('organization_show')(context, {'id': 'hdx-test-org'})
user_controller = MockedHDXOrgMemberController()
user_controller.members('hdx-test-org')
user_list = self._populate_member_names(c_dict['members'], org['users'])
assert 'John Doe1' not in user_list
# user update - email
self._get_action('user_update')(context, {'id': test_sysadmin, 'email': 'test@sys.admin'})
usr_dict = self._get_action('user_show')(context, {'id': test_sysadmin})
# send a membership request
mock_helper.populate_mock_as_c(member_c, test_username)
mock_helper.populate_mock_as_c(org_helper_c, test_username)
mock_helper.populate_mock_as_c(theme_c, test_username)
request_c.referer = '/organization/wfp'
url = h.url_for('member_request_new')
ret_page = self.app.post(url, params={'organization': 'hdx-test-org', 'role': 'member', 'save': 'save',
'message': 'add me to your organization'},
extra_environ={"REMOTE_USER": test_username})
mailer.send_invite = original_send_invite
hdx_mailer._mail_recipient = original_mail_recipient
class MockedHDXOrgMemberController(member_controller.HDXOrgMemberController):
def _find_filter_params(self):
return q, sort
def _set_c_params(self, params):
global c_dict
c_dict = params
def _get_context(self):
context = {'model': model, 'session': model.Session, 'user': 'testsysadmin'}
return context
def _render_template(self, template_name, group_type):
pass
| 43.802721 | 115 | 0.656701 | 1,613 | 12,878 | 4.962182 | 0.115933 | 0.028986 | 0.026237 | 0.035982 | 0.743378 | 0.720015 | 0.68928 | 0.663918 | 0.639805 | 0.614693 | 0 | 0.004513 | 0.225734 | 12,878 | 293 | 116 | 43.952218 | 0.798215 | 0.308045 | 0 | 0.431373 | 0 | 0 | 0.201226 | 0.076836 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.084967 | false | 0.006536 | 0.071895 | 0.013072 | 0.196078 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a2c23bf73602d96dd9316f0d033a88b8764a5ac4 | 3,063 | py | Python | fakenet/diverters/fnconfig.py | AzzOnFire/flare-fakenet-ng | bafd7e97b61cd43190dee7f1d2c3f4388488af76 | [
"Apache-2.0"
] | null | null | null | fakenet/diverters/fnconfig.py | AzzOnFire/flare-fakenet-ng | bafd7e97b61cd43190dee7f1d2c3f4388488af76 | [
"Apache-2.0"
] | null | null | null | fakenet/diverters/fnconfig.py | AzzOnFire/flare-fakenet-ng | bafd7e97b61cd43190dee7f1d2c3f4388488af76 | [
"Apache-2.0"
] | null | null | null | class Config(object):
"""Configuration primitives.
Inherit from or instantiate this class and call configure() when you've got
a dictionary of configuration values you want to process and query.
Would be nice to have _expand_cidrlist() so blacklists can specify ranges.
"""
def __init__(self, config_dict=None, portlists=[]):
if config_dict is not None:
self.configure(config_dict, portlists)
def configure(self, config_dict, portlists=[], stringlists=[]):
"""Parse configuration.
Does three things:
1.) Turn dictionary keys to lowercase
2.) Turn string lists into arrays for quicker access
3.) Expand port range specifications
"""
self._dict = dict((k.lower(), v) for k, v in config_dict.items())
for entry in portlists:
portlist = self.getconfigval(entry)
if portlist:
expanded = self._expand_ports(portlist)
self.setconfigval(entry, expanded)
for entry in stringlists:
stringlist = self.getconfigval(entry)
if stringlist:
expanded = [s.strip() for s in stringlist.split(',')]
self.setconfigval(entry, expanded)
def reconfigure(self, portlists=[], stringlists=[]):
"""Same as configure(), but allows multiple callers to sequentially
apply parsing directives for port and string lists.
For instance, if a base class calls configure() specifying one set of
port lists and string lists, but a derived class knows about further
configuration items that will need to be accessed samewise, this
function can be used to leave the existing parsed data alone and only
re-parse the new port or string lists into arrays.
"""
self.configure(self._dict, portlists, stringlists)
def _expand_ports(self, ports_list):
ports = []
for i in ports_list.split(','):
if '-' not in i:
ports.append(int(i))
else:
l, h = list(map(int, i.split('-')))
ports += list(range(l, h + 1))
return ports
def _fuzzy_true(self, value):
return value.lower() in ['yes', 'on', 'true', 'enable', 'enabled']
def _fuzzy_false(self, value):
return value.lower() in ['no', 'off', 'false', 'disable', 'disabled']
def is_configured(self, opt):
return opt.lower() in list(self._dict.keys())
def is_unconfigured(self, opt):
return not self.is_configured(opt)
def is_set(self, opt):
return (self.is_configured(opt) and
self._fuzzy_true(self._dict[opt.lower()]))
def is_clear(self, opt):
return (self.is_configured(opt) and
self._fuzzy_false(self._dict[opt.lower()]))
def getconfigval(self, opt, default=None):
return self._dict[opt.lower()] if self.is_configured(opt) else default
def setconfigval(self, opt, obj):
self._dict[opt.lower()] = obj
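The port-range expansion that `configure()` applies through `_expand_ports` can be exercised in isolation. This is a minimal standalone sketch — the `expand_ports` function name is illustrative; it just mirrors the method's logic outside the class:

```python
def expand_ports(ports_list):
    # Expand a comma-separated spec like "80,8000-8003" into a flat list of
    # ints, mirroring Config._expand_ports above.
    ports = []
    for part in ports_list.split(','):
        if '-' not in part:
            ports.append(int(part))
        else:
            low, high = map(int, part.split('-'))
            ports.extend(range(low, high + 1))
    return ports

print(expand_ports('80,8000-8003'))  # [80, 8000, 8001, 8002, 8003]
```

After `configure()` runs, any option named in `portlists` holds such an expanded list, so callers can test membership with a plain `in` check instead of re-parsing range syntax.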
| 36.903614 | 79 | 0.614757 | 383 | 3,063 | 4.81201 | 0.37859 | 0.030385 | 0.028215 | 0.041237 | 0.097667 | 0.077048 | 0.047748 | 0.047748 | 0.047748 | 0.047748 | 0 | 0.00182 | 0.282403 | 3,063 | 82 | 80 | 37.353659 | 0.83667 | 0.284035 | 0 | 0.088889 | 0 | 0 | 0.024733 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0 | 0.155556 | 0.466667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
a2d972366674ffee05dbeed1f54a9dc88de6bb40 | 163 | py | Python | MyEircode.py | MrBrianMonaghan/mapping | 1b525eaaad3b22709a53167b46c901ece365ecab | [
"Apache-2.0"
] | null | null | null | MyEircode.py | MrBrianMonaghan/mapping | 1b525eaaad3b22709a53167b46c901ece365ecab | [
"Apache-2.0"
] | null | null | null | MyEircode.py | MrBrianMonaghan/mapping | 1b525eaaad3b22709a53167b46c901ece365ecab | [
"Apache-2.0"
] | null | null | null | import selenium
from selenium import webdriver
try:
    browser = webdriver.Firefox()
    # WebDriver.get() needs a full URL including the scheme; a bare
    # hostname like 'mikekus.com' raises an invalid-argument error.
    browser.get('https://mikekus.com')
except KeyboardInterrupt:
    browser.quit()
| 18.111111 | 33 | 0.742331 | 18 | 163 | 6.722222 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165644 | 163 | 8 | 34 | 20.375 | 0.889706 | 0 | 0 | 0 | 0 | 0 | 0.067485 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a2dec2415ed78800e66aae16391df2b37d8f56eb | 1,193 | py | Python | pysoup/venv/__init__.py | illBeRoy/pysoup | 742fd6630e1be27c275cb8dc6ee94412472cb20b | [
"MIT"
] | 4 | 2016-02-21T12:40:44.000Z | 2019-06-13T13:23:19.000Z | pysoup/venv/__init__.py | illBeRoy/pysoup | 742fd6630e1be27c275cb8dc6ee94412472cb20b | [
"MIT"
] | null | null | null | pysoup/venv/__init__.py | illBeRoy/pysoup | 742fd6630e1be27c275cb8dc6ee94412472cb20b | [
"MIT"
] | 1 | 2020-07-16T12:22:12.000Z | 2020-07-16T12:22:12.000Z | import os.path
from twisted.internet import defer
import pysoup.utils
class Virtualenv(object):
def __init__(self, display_pip, path):
self._display_pipe = display_pip
self._path = path
@property
def path(self):
return self._path
@property
def venv_path(self):
return os.path.join(self._path, 'venv')
@property
def source_path(self):
return os.path.join(self.venv_path, 'bin/activate')
@defer.inlineCallbacks
def create(self):
self._display_pipe.log('Ensuring virtualenv environment at {0}'.format(self._path))
code = yield pysoup.utils.execute_shell_command('mkdir -p {0} && virtualenv --no-site-packages -q {0}'.format(self.venv_path))
if code != 0:
self._display_pipe.error('Failed to setup virtualenv at target! ({0})'.format(self._path))
raise Exception('Could not create virtualenv')
self._display_pipe.notify('Virtualenv is ready')
@defer.inlineCallbacks
def execute_in_venv(self, command):
code = yield pysoup.utils.execute_shell_command('source {0} && {1}'.format(self.source_path, command))
defer.returnValue(code)
| 29.097561 | 134 | 0.668064 | 154 | 1,193 | 4.980519 | 0.38961 | 0.071708 | 0.078227 | 0.041721 | 0.174707 | 0.174707 | 0.174707 | 0 | 0 | 0 | 0 | 0.007471 | 0.214585 | 1,193 | 40 | 135 | 29.825 | 0.811099 | 0 | 0 | 0.178571 | 0 | 0 | 0.177703 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.107143 | 0.107143 | 0.464286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
a2fcecf1decf4817a91d5d880a0ea9320b043380 | 238 | py | Python | Python/Curos_Python_curemvid/Exercicios_dos_videos/Ex029.py | Jhonattan-rocha/Meus-primeiros-programas | f5971b66c0afd049b5d0493e8b7a116b391d058e | [
"MIT"
] | null | null | null | Python/Curos_Python_curemvid/Exercicios_dos_videos/Ex029.py | Jhonattan-rocha/Meus-primeiros-programas | f5971b66c0afd049b5d0493e8b7a116b391d058e | [
"MIT"
] | null | null | null | Python/Curos_Python_curemvid/Exercicios_dos_videos/Ex029.py | Jhonattan-rocha/Meus-primeiros-programas | f5971b66c0afd049b5d0493e8b7a116b391d058e | [
"MIT"
] | null | null | null | velocidade = float(input("Digite a sua velocidade em Km/h: "))
if velocidade > 80:
amais = velocidade - 80
amais = amais*7
print("Você foi multado, devera pagar uma multa de: R${:.2f}".format(amais))
print("FIM, não se mate")
| 34 | 80 | 0.663866 | 37 | 238 | 4.27027 | 0.783784 | 0.151899 | 0.21519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0.193277 | 238 | 6 | 81 | 39.666667 | 0.791667 | 0 | 0 | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c01e08aaee863025867488824fa6692ef88b661 | 468 | py | Python | Python_Advanced_Softuni/Comprehensions_Exericises/venv/number_classification.py | borisboychev/SoftUni | 22062312f08e29a1d85377a6d41ef74966d37e99 | [
"MIT"
] | 1 | 2020-12-14T23:25:19.000Z | 2020-12-14T23:25:19.000Z | Python_Advanced_Softuni/Comprehensions_Exericises/venv/number_classification.py | borisboychev/SoftUni | 22062312f08e29a1d85377a6d41ef74966d37e99 | [
"MIT"
] | null | null | null | Python_Advanced_Softuni/Comprehensions_Exericises/venv/number_classification.py | borisboychev/SoftUni | 22062312f08e29a1d85377a6d41ef74966d37e99 | [
"MIT"
] | null | null | null | elements = [int(x) for x in input().split(', ')]
even_numbers = [x for x in elements if x % 2 == 0]
odd_numbers = [x for x in elements if x % 2 != 0]
positive = [x for x in elements if x >= 0]
negative = [x for x in elements if x < 0]
print(f"Positive: {', '.join(str(x) for x in positive)}")
print(f"Negative: {', '.join(str(x) for x in negative)}")
print(f"Even: {', '.join(str(x) for x in even_numbers)}")
print(f"Odd: {', '.join(str(x) for x in odd_numbers)}")
| 36 | 57 | 0.613248 | 90 | 468 | 3.144444 | 0.211111 | 0.127208 | 0.159011 | 0.222615 | 0.522968 | 0.522968 | 0.325088 | 0.325088 | 0.190813 | 0.190813 | 0 | 0.015748 | 0.185897 | 468 | 12 | 58 | 39 | 0.727034 | 0 | 0 | 0 | 0 | 0 | 0.40257 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
0c097274adeceb2e1e44250ea00c4016e23c60ed | 191 | py | Python | Desafios/desafio009.py | LucasHenrique-dev/Exercicios-Python | b1f6ca56ea8e197a89a044245419dc6079bdb9c7 | [
"MIT"
] | 1 | 2020-04-09T23:18:03.000Z | 2020-04-09T23:18:03.000Z | Desafios/desafio009.py | LucasHenrique-dev/Exercicios-Python | b1f6ca56ea8e197a89a044245419dc6079bdb9c7 | [
"MIT"
] | null | null | null | Desafios/desafio009.py | LucasHenrique-dev/Exercicios-Python | b1f6ca56ea8e197a89a044245419dc6079bdb9c7 | [
"MIT"
] | null | null | null | n1 = int(input('Digite um número e veja qual a sua tabuada: '))
n = 0
print('{} X {:2} = {:2}'.format(n1, 0, n1*n))
while n < 10:
n += 1
print('{} X {:2} = {:2}'.format(n1, n, n1*n))
| 27.285714 | 63 | 0.502618 | 37 | 191 | 2.594595 | 0.567568 | 0.09375 | 0.145833 | 0.166667 | 0.333333 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 0.230366 | 191 | 6 | 64 | 31.833333 | 0.557823 | 0 | 0 | 0 | 0 | 0 | 0.397906 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c0beaeefd6502afde93d7709e2ca76e12632ed9 | 2,560 | py | Python | save.py | regismeyssonnier/NeuralNetwork | c998b9523ed02287e1c811d73b0757270dee773c | [
"MIT"
] | null | null | null | save.py | regismeyssonnier/NeuralNetwork | c998b9523ed02287e1c811d73b0757270dee773c | [
"MIT"
] | null | null | null | save.py | regismeyssonnier/NeuralNetwork | c998b9523ed02287e1c811d73b0757270dee773c | [
"MIT"
] | null | null | null |
def write_file(filess, T):
f = open(filess, "w")
for o in T:
f.write("[\n")
for l in o:
f.write(str(l)+"\n")
f.write("]\n")
f.close()
def save_hidden_weight(nb_hidden, hiddenw):
for i in range(nb_hidden):
write_file("save/base_nn_hid_" + str(i+1) + "w.nn", hiddenw[i])
def load_hiddenw(filess, hiddenw):
f = open(filess, "r")
s = f.read().splitlines()
h = 0
for o in s:
if o == "[":
h = []
elif o == "]":
hiddenw.append(h)
else:
h.append(float(o))
def load_hidden_weight(hiddenw, nb_hidden):
for i in range(nb_hidden):
hiddenw.append([])
load_hiddenw("save/base_nn_hid_" + str(i+1) + "w.nn", hiddenw[i])
def load_hidden_weight_v(hiddenw, nb_hidden):
for i in range(nb_hidden):
hiddenw.append([])
load_hiddenw("valid/NN/base_nn_hid_" + str(i+1) + "w.nn", hiddenw[i])
def display_hidden_weight(hiddenw, nb_hidden):
for i in range(nb_hidden):
for j in hiddenw[i]:
print("------------------------------------")
I = 0
for k in j:
print(k)
I+=1
if I > 3:
break
def write_fileb(filess, T):
f = open(filess, "w")
for o in T:
f.write(str(o)+"\n")
f.close()
def save_hidden_bias(nb_hidden, hiddenb):
for i in range(nb_hidden):
write_fileb("save/base_nn_hid_" + str(i+1) + "b.nn", hiddenb[i])
def load_hiddenb(filess, hiddenb):
f = open(filess, "r")
s = f.read().splitlines()
for o in s:
hiddenb.append(float(o))
def load_hidden_bias(hiddenb, nb_hidden):
for i in range(nb_hidden):
hiddenb.append([])
load_hiddenb("save/base_nn_hid_" + str(i+1) + "b.nn", hiddenb[i])
def load_hidden_bias_v(hiddenb, nb_hidden):
for i in range(nb_hidden):
hiddenb.append([])
load_hiddenb("valid/NN/base_nn_hid_" + str(i+1) + "b.nn", hiddenb[i])
def display_hidden_bias(hiddenb, nb_hidden):
for i in range(nb_hidden):
print("------------------------------------")
for j in hiddenb[i]:
print(j)
def save_output_weight(outputw):
write_file("save/base_nn_out_w.nn", outputw[0])
def load_output_weight(outputw):
outputw.append([])
load_hiddenw("save/base_nn_out_w.nn", outputw[0])
def load_output_weight_v(outputw):
outputw.append([])
load_hiddenw("valid/NN/base_nn_out_w.nn", outputw[0])
def save_output_bias(outputb):
write_fileb("save/base_nn_out_b.nn", outputb[0])
def load_output_bias(outputb):
outputb.append([])
load_hiddenb("save/base_nn_out_b.nn", outputb[0])
def load_output_bias_v(outputb):
outputb.append([])
load_hiddenb("valid/NN/base_nn_out_b.nn", outputb[0])
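The save/load helpers above all use the same one-value-per-line text format. A self-contained sketch of that round trip, written against a temporary directory rather than the `save/` and `valid/NN/` paths the module assumes (the `write_biases`/`read_biases` names are illustrative, mirroring `write_fileb` and `load_hiddenb`):

```python
import os
import tempfile

def write_biases(path, biases):
    # One float per line, matching write_fileb above.
    with open(path, 'w') as f:
        for b in biases:
            f.write(str(b) + '\n')

def read_biases(path):
    # Mirrors load_hiddenb: parse each line back into a float.
    with open(path) as f:
        return [float(line) for line in f.read().splitlines()]

path = os.path.join(tempfile.mkdtemp(), 'bias.nn')
write_biases(path, [0.5, -1.25, 3.0])
print(read_biases(path))  # [0.5, -1.25, 3.0]
```

Because `str()`/`float()` round-trip Python floats exactly for these reprs, the values read back compare equal to the ones written.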
| 17.902098 | 71 | 0.636328 | 429 | 2,560 | 3.564103 | 0.125874 | 0.083715 | 0.031393 | 0.057554 | 0.7724 | 0.717462 | 0.644212 | 0.568345 | 0.507521 | 0.507521 | 0 | 0.007537 | 0.170703 | 2,560 | 142 | 72 | 18.028169 | 0.712671 | 0.003125 | 0 | 0.365854 | 0 | 0 | 0.139772 | 0.097369 | 0 | 0 | 0 | 0 | 0 | 1 | 0.219512 | false | 0 | 0 | 0 | 0.219512 | 0.04878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c0c0154d635c140279cd61ef15b6dfc6c89cd23 | 755 | py | Python | test_knot_hasher.py | mmokko/aoc2017 | 0732ac440775f9e6bd4a8447c665c9b0e6969f74 | [
"MIT"
] | null | null | null | test_knot_hasher.py | mmokko/aoc2017 | 0732ac440775f9e6bd4a8447c665c9b0e6969f74 | [
"MIT"
] | null | null | null | test_knot_hasher.py | mmokko/aoc2017 | 0732ac440775f9e6bd4a8447c665c9b0e6969f74 | [
"MIT"
] | null | null | null |
from unittest import TestCase

from day10 import KnotHasher


class TestKnotHasher(TestCase):
    def test_calc(self):
        sut = KnotHasher(5, [3, 4, 1, 5])
        self.assertEqual(12, sut.calc())

    def test_hash1(self):
        sut = KnotHasher(256, '')
        self.assertEqual('a2582a3a0e66e6e86e3812dcb672a272', sut.hash())

    def test_hash2(self):
        sut = KnotHasher(256, 'AoC 2017')
        self.assertEqual('33efeb34ea91902bb2f59c9920caa6cd', sut.hash())

    def test_hash3(self):
        sut = KnotHasher(256, '1,2,3')
        self.assertEqual('3efbe78a8d82f29979031a4aa0b16a9d', sut.hash())

    def test_hash4(self):
        sut = KnotHasher(256, '1,2,4')
        self.assertEqual('63960835bcdc130f0b66d7ff4f6a5a8e', sut.hash())
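The `day10.KnotHasher` implementation under test is not part of this record. The expected hex digests are the published Advent of Code 2017 day 10 examples, so the algorithm it must implement can be sketched independently (a hypothetical reimplementation, not the repository's code):

```python
from functools import reduce

def knot_hash(text):
    # ASCII codes of the input plus the fixed suffix from the puzzle spec.
    lengths = [ord(c) for c in text] + [17, 31, 73, 47, 23]
    nums = list(range(256))
    pos = skip = 0
    for _ in range(64):                       # 64 rounds over the same list
        for length in lengths:
            idx = [(pos + i) % 256 for i in range(length)]
            rev = [nums[i] for i in reversed(idx)]
            for i, v in zip(idx, rev):        # reverse the wrapped slice in place
                nums[i] = v
            pos = (pos + length + skip) % 256
            skip += 1
    # XOR each block of 16 numbers into one byte of the dense hash.
    dense = [reduce(lambda a, b: a ^ b, nums[i:i + 16])
             for i in range(0, 256, 16)]
    return "".join("%02x" % d for d in dense)

assert knot_hash("") == "a2582a3a0e66e6e86e3812dcb672a272"
```

The same vectors as in the test case above should hold, e.g. `knot_hash("1,2,3")` yields `3efbe78a8d82f29979031a4aa0b16a9d`.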
| 30.2 | 72 | 0.658278 | 83 | 755 | 5.927711 | 0.385542 | 0.071138 | 0.172764 | 0.162602 | 0.089431 | 0.089431 | 0 | 0 | 0 | 0 | 0 | 0.186555 | 0.211921 | 755 | 24 | 73 | 31.458333 | 0.640336 | 0 | 0 | 0 | 0 | 0 | 0.193377 | 0.169536 | 0 | 0 | 0 | 0 | 0.277778 | 1 | 0.277778 | false | 0 | 0.111111 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c1456a33812aa7157896227520f3def0676ad91 | 885 | py | Python | envdsys/envcontacts/apps.py | NOAA-PMEL/envDataSystem | 4db4a3569d2329658799a3eef06ce36dd5c0597d | [
"Unlicense"
] | 1 | 2021-11-06T19:22:53.000Z | 2021-11-06T19:22:53.000Z | envdsys/envcontacts/apps.py | NOAA-PMEL/envDataSystem | 4db4a3569d2329658799a3eef06ce36dd5c0597d | [
"Unlicense"
] | 25 | 2019-06-18T20:40:36.000Z | 2021-07-23T20:56:48.000Z | envdsys/envcontacts/apps.py | NOAA-PMEL/envDataSystem | 4db4a3569d2329658799a3eef06ce36dd5c0597d | [
"Unlicense"
] | null | null | null |
from django.apps import AppConfig


class EnvcontactsConfig(AppConfig):
    name = 'envcontacts'

    # def ready(self) -> None:
    #     from envnet.registry.registry import ServiceRegistry
    #     try:
    #         from setup.ui_server_conf import run_config
    #         host = run_config["HOST"]["name"]
    #     except KeyError:
    #         host = "localhost"
    #     try:
    #         from setup.ui_server_conf import run_config
    #         port = run_config["HOST"]["port"]
    #     except KeyError:
    #         port = "8000"
    #     local = True
    #     config = {
    #         "host": host,
    #         "port": port,
    #         "regkey": None,
    #         "service_list": {"envdsys_contacts": {}},
    #     }
    #     registration = ServiceRegistry.register(local, config)
    #     print(registration)
# return super().ready() | 29.5 | 64 | 0.523164 | 79 | 885 | 5.734177 | 0.518987 | 0.07947 | 0.086093 | 0.06181 | 0.172185 | 0.172185 | 0.172185 | 0.172185 | 0.172185 | 0 | 0 | 0.007042 | 0.358192 | 885 | 30 | 65 | 29.5 | 0.790493 | 0.728814 | 0 | 0 | 0 | 0 | 0.050459 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
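The commented-out `ready()` above uses a per-key `try/except KeyError` ladder so each setting falls back independently. With a hypothetical config dict (stand-in for `setup.ui_server_conf.run_config`), the pattern behaves like this:

```python
# Hypothetical stand-in for setup.ui_server_conf.run_config: "name" is
# present but "port" is not, so only the port falls back to its default.
run_config = {"HOST": {"name": "ui.example.local"}}

try:
    host = run_config["HOST"]["name"]
except KeyError:
    host = "localhost"

try:
    port = run_config["HOST"]["port"]
except KeyError:
    port = "8000"

assert (host, port) == ("ui.example.local", "8000")
```

The same result can be had more compactly with `run_config.get("HOST", {}).get("port", "8000")`, at the cost of also swallowing a malformed `HOST` section.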
0c1eb2fd9329de0c031fe686c52f4c0e67ec1227 | 1,103 | py | Python | tempest/api/hybrid_cloud/compute/flavors/test_flavors_operations.py | Hybrid-Cloud/hybrid-tempest | 319e90c6fa6e46925b495c93cd5258f088a30ec0 | [
"Apache-2.0"
] | null | null | null | tempest/api/hybrid_cloud/compute/flavors/test_flavors_operations.py | Hybrid-Cloud/hybrid-tempest | 319e90c6fa6e46925b495c93cd5258f088a30ec0 | [
"Apache-2.0"
] | null | null | null | tempest/api/hybrid_cloud/compute/flavors/test_flavors_operations.py | Hybrid-Cloud/hybrid-tempest | 319e90c6fa6e46925b495c93cd5258f088a30ec0 | [
"Apache-2.0"
] | null | null | null |
import testtools

from oslo_log import log

from tempest.api.compute import base
import tempest.api.compute.flavors.test_flavors as FlavorsV2Test
import tempest.api.compute.flavors.test_flavors_negative as FlavorsListWithDetailsNegativeTest
import tempest.api.compute.flavors.test_flavors_negative as FlavorDetailsNegativeTest
from tempest.common.utils import data_utils
from tempest.lib import exceptions as lib_exc
from tempest.lib import decorators
from tempest import test
from tempest import config

CONF = config.CONF
LOG = log.getLogger(__name__)


class HybridFlavorsV2TestJSON(FlavorsV2Test.FlavorsV2TestJSON):
    """Test flavors"""


@testtools.skip("testscenarios are not active.")
@test.SimpleNegativeAutoTest
class HybridFlavorsListWithDetailsNegativeTestJSON(
        FlavorsListWithDetailsNegativeTest.FlavorsListWithDetailsNegativeTestJSON):
    """Test FlavorsListWithDetails"""


@testtools.skip("testscenarios are not active.")
@test.SimpleNegativeAutoTest
class HybridFlavorDetailsNegativeTestJSON(
        FlavorDetailsNegativeTest.FlavorDetailsNegativeTestJSON):
    """Test FlavorDetailsNegative"""
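The classes above are skipped wholesale with `testtools.skip(...)`. The standard-library `unittest.skip` behaves the same way, so the mechanism can be demonstrated without tempest or testtools installed:

```python
import io
import unittest

@unittest.skip("testscenarios are not active.")
class SkippedTests(unittest.TestCase):
    def test_never_runs(self):
        raise AssertionError("would fail if it ever executed")

suite = unittest.TestLoader().loadTestsFromTestCase(SkippedTests)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)

# The test is recorded as skipped, not failed or errored.
assert len(result.skipped) == 1
assert not result.failures and not result.errors
```

A class-level skip marks every test method in the class as skipped, so the assertion inside `test_never_runs` is never reached.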
| 36.766667 | 126 | 0.853128 | 111 | 1,103 | 8.369369 | 0.369369 | 0.071044 | 0.073197 | 0.074273 | 0.302476 | 0.302476 | 0.302476 | 0.258342 | 0.258342 | 0 | 0 | 0.003949 | 0.081596 | 1,103 | 29 | 127 | 38.034483 | 0.913129 | 0.06165 | 0 | 0.2 | 0 | 0 | 0.056919 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.55 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
0c34007b8ed98fbad90350a4894f2960e309e1be | 3,306 | py | Python | connect_box/data.py | jtru/python-connect-box | 2d26923e966fbb319760da82e3e71103018ded0b | [
"MIT"
] | null | null | null | connect_box/data.py | jtru/python-connect-box | 2d26923e966fbb319760da82e3e71103018ded0b | [
"MIT"
] | null | null | null | connect_box/data.py | jtru/python-connect-box | 2d26923e966fbb319760da82e3e71103018ded0b | [
"MIT"
] | null | null | null |
"""Handle Data attributes."""
from datetime import datetime
from ipaddress import IPv4Address, IPv6Address, ip_address as convert_ip
from typing import Iterable, Union

import attr


@attr.s
class Device:
    """A single device."""

    mac: str = attr.ib()
    hostname: str = attr.ib(cmp=False)
    ip: Union[IPv4Address, IPv6Address] = attr.ib(cmp=False, converter=convert_ip)
    interface: str = attr.ib()
    speed: int = attr.ib()
    interface_id: int = attr.ib()
    method: int = attr.ib()
    lease_time: str = attr.ib(
        converter=lambda lease_time: datetime.strptime(lease_time, "00:%H:%M:%S")
    )


@attr.s
class DownstreamChannel:
    """A locked downstream channel."""

    frequency: int = attr.ib()
    powerLevel: int = attr.ib()
    modulation: str = attr.ib()
    id: str = attr.ib()
    snr: float = attr.ib()
    preRs: int = attr.ib()
    postRs: int = attr.ib()
    qamLocked: bool = attr.ib()
    fecLocked: bool = attr.ib()
    mpegLocked: bool = attr.ib()


@attr.s
class UpstreamChannel:
    """A locked upstream channel."""

    frequency: int = attr.ib()
    powerLevel: int = attr.ib()
    symbolRate: str = attr.ib()
    id: str = attr.ib()
    modulation: str = attr.ib()
    type: str = attr.ib()
    t1Timeouts: int = attr.ib()
    t2Timeouts: int = attr.ib()
    t3Timeouts: int = attr.ib()
    t4Timeouts: int = attr.ib()
    channelType: str = attr.ib()
    messageType: int = attr.ib()


@attr.s
class Ipv6FilterInstance:
    """An IPv6 filter rule instance."""

    idd: int = attr.ib()
    srcAddr: Union[IPv4Address, IPv6Address] = attr.ib(converter=convert_ip)
    srcPrefix: int = attr.ib()
    dstAddr: Union[IPv4Address, IPv6Address] = attr.ib(converter=convert_ip)
    dstPrefix: int = attr.ib()
    srcPortStart: int = attr.ib()
    srcPortEnd: int = attr.ib()
    dstPortStart: int = attr.ib()
    dstPortEnd: int = attr.ib()
    protocol: int = attr.ib()
    allow: int = attr.ib()
    enabled: int = attr.ib()


@attr.s
class FiltersTimeMode:
    """Filters time setting."""

    TMode: int = attr.ib()
    XmlGeneralTime: str = attr.ib()
    XmlDailyTime: str = attr.ib()


@attr.s
class FilterStatesList:
    """A sequence of filter state instances."""

    entries: Iterable = attr.ib()


@attr.s
class FilterState:
    """A filter state instance."""

    idd: int = attr.ib()
    enabled: int = attr.ib()


@attr.s
class CmStatus:
    provisioningStatus: str = attr.ib()
    cmComment: str = attr.ib()
    cmDocsisMode: str = attr.ib()
    cmNetworkAccess: str = attr.ib()
    firmwareFilename: str = attr.ib()
    # number of IP addresses to assign via DHCP
    numberOfCpes: int = attr.ib()
    # ???
    dMaxCpes: int = attr.ib()
    bpiEnable: int = attr.ib()


@attr.s
class ServiceFlow:
    id: int = attr.ib()
    pMaxTrafficRate: int = attr.ib()
    pMaxTrafficBurst: int = attr.ib()
    pMinReservedRate: int = attr.ib()
    pMaxConcatBurst: int = attr.ib()
    # 2 seems to be Best Effort
    pSchedulingType: int = attr.ib()


@attr.s
class Temperature:
    # temperatures in degrees Celsius
    tunerTemperature: float = attr.ib()
    temperature: float = attr.ib()


# several other stats remain untapped here:
# wan_ipv4_addr
# wan_ipv6_addr, wan_ipv6_addr_entry
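Two attrs features carry most of the weight in these classes: `converter=` normalizes raw strings into rich objects at construction time, and `cmp=False` (spelled `eq=False` in newer attrs releases) excludes volatile fields from equality. A stdlib `dataclasses` analogue — a sketch, not the `connect_box` code — shows the same two ideas:

```python
from dataclasses import dataclass, field
from ipaddress import IPv4Address, ip_address

@dataclass
class MiniDevice:
    mac: str
    hostname: str = field(compare=False)   # like attrs' cmp=False / eq=False
    ip: object = field(compare=False)

    def __post_init__(self):
        # Mimic attrs' converter=: accept a string, store an *Address object.
        self.ip = ip_address(self.ip)

a = MiniDevice("aa:bb:cc", "tv", "192.168.0.2")
b = MiniDevice("aa:bb:cc", "television", "192.168.0.3")

assert isinstance(a.ip, IPv4Address)   # conversion ran on construction
assert a == b                          # compare=False fields are ignored
```

Excluding `hostname` and `ip` from comparison means a device is identified purely by MAC address, which is stable across DHCP lease changes.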
| 24.857143 | 82 | 0.635209 | 424 | 3,306 | 4.915094 | 0.32783 | 0.190019 | 0.15547 | 0.042226 | 0.238964 | 0.185221 | 0.143954 | 0.12476 | 0.075816 | 0.033589 | 0 | 0.007813 | 0.22565 | 3,306 | 132 | 83 | 25.045455 | 0.80625 | 0.12311 | 0 | 0.23913 | 0 | 0 | 0.003857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.043478 | 0 | 0.869565 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
0c380570b168add317dd67b7037f3b6ec7e93c2b | 392 | py | Python | pages/main_page.py | thaidem/selenium-training-page-objects | 1f37a2b5287a502295bb57050c95455d68c2d3eb | [
"Apache-2.0"
] | null | null | null | pages/main_page.py | thaidem/selenium-training-page-objects | 1f37a2b5287a502295bb57050c95455d68c2d3eb | [
"Apache-2.0"
] | null | null | null | pages/main_page.py | thaidem/selenium-training-page-objects | 1f37a2b5287a502295bb57050c95455d68c2d3eb | [
"Apache-2.0"
] | null | null | null |
from selenium.webdriver.support.wait import WebDriverWait


class MainPage:
    def __init__(self, driver):
        self.driver = driver
        self.wait = WebDriverWait(driver, 10)

    def open(self):
        self.driver.get("https://litecart.stqa.ru/en/")
        return self

    @property
    def product_item(self):
        return self.driver.find_element_by_css_selector(".product")
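`open()` returning `self` is what makes the page object chainable. A minimal stand-in driver (so no browser or Selenium install is needed) demonstrates the fluent style this enables:

```python
class FakeDriver:
    """Minimal stand-in for a Selenium WebDriver; records calls instead of browsing."""
    def get(self, url):
        self.current_url = url
    def find_element_by_css_selector(self, selector):
        return "element<%s>" % selector

class Page:
    def __init__(self, driver):
        self.driver = driver
    def open(self):
        self.driver.get("https://litecart.stqa.ru/en/")
        return self               # returning self enables method chaining
    @property
    def product_item(self):
        return self.driver.find_element_by_css_selector(".product")

# One expression: construct, navigate, locate.
item = Page(FakeDriver()).open().product_item
assert item == "element<.product>"
```

Note that `find_element_by_css_selector` was removed in Selenium 4; modern code would use `driver.find_element(By.CSS_SELECTOR, ".product")` instead.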
| 23.058824 | 67 | 0.670918 | 48 | 392 | 5.291667 | 0.625 | 0.15748 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006557 | 0.221939 | 392 | 16 | 68 | 24.5 | 0.82623 | 0 | 0 | 0 | 0 | 0 | 0.091837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.090909 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
0c3b1affbabd1c858deb93d0a0302a8d675091d1 | 8,090 | py | Python | tools/xenserver/cleanup_sm_locks.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | tools/xenserver/cleanup_sm_locks.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | tools/xenserver/cleanup_sm_locks.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'#!/usr/bin/env python'
nl|'\n'
nl|'\n'
comment|'# Copyright 2013 OpenStack Foundation'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License");'
nl|'\n'
comment|'# you may not use this file except in compliance with the License.'
nl|'\n'
comment|'# You may obtain a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS,'
nl|'\n'
comment|'# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.'
nl|'\n'
comment|'# See the License for the specific language governing permissions and'
nl|'\n'
comment|'# limitations under the License.'
nl|'\n'
string|'"""\nScript to cleanup old XenServer /var/lock/sm locks.\n\nXenServer 5.6 and 6.0 do not appear to always cleanup locks when using a\nFileSR. ext3 has a limit of 32K inode links, so when we have 32K-2 (31998)\nlocks laying around, builds will begin to fail because we can\'t create any\nadditional locks. This cleanup script is something we can run periodically as\na stop-gap measure until this is fixed upstream.\n\nThis script should be run on the dom0 of the affected machine.\n"""'
newline|'\n'
name|'import'
name|'errno'
newline|'\n'
name|'import'
name|'optparse'
newline|'\n'
name|'import'
name|'os'
newline|'\n'
name|'import'
name|'sys'
newline|'\n'
name|'import'
name|'time'
newline|'\n'
nl|'\n'
DECL|variable|BASE
name|'BASE'
op|'='
string|"'/var/lock/sm'"
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_get_age_days
name|'def'
name|'_get_age_days'
op|'('
name|'secs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'float'
op|'('
name|'time'
op|'.'
name|'time'
op|'('
op|')'
op|'-'
name|'secs'
op|')'
op|'/'
number|'86400'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_parse_args
dedent|''
name|'def'
name|'_parse_args'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'parser'
op|'='
name|'optparse'
op|'.'
name|'OptionParser'
op|'('
op|')'
newline|'\n'
name|'parser'
op|'.'
name|'add_option'
op|'('
string|'"-d"'
op|','
string|'"--dry-run"'
op|','
nl|'\n'
name|'action'
op|'='
string|'"store_true"'
op|','
name|'dest'
op|'='
string|'"dry_run"'
op|','
name|'default'
op|'='
name|'False'
op|','
nl|'\n'
name|'help'
op|'='
string|'"don\'t actually remove locks"'
op|')'
newline|'\n'
name|'parser'
op|'.'
name|'add_option'
op|'('
string|'"-l"'
op|','
string|'"--limit"'
op|','
nl|'\n'
name|'action'
op|'='
string|'"store"'
op|','
name|'type'
op|'='
string|"'int'"
op|','
name|'dest'
op|'='
string|'"limit"'
op|','
nl|'\n'
name|'default'
op|'='
name|'sys'
op|'.'
name|'maxint'
op|','
nl|'\n'
name|'help'
op|'='
string|'"max number of locks to delete (default: no limit)"'
op|')'
newline|'\n'
name|'parser'
op|'.'
name|'add_option'
op|'('
string|'"-v"'
op|','
string|'"--verbose"'
op|','
nl|'\n'
name|'action'
op|'='
string|'"store_true"'
op|','
name|'dest'
op|'='
string|'"verbose"'
op|','
name|'default'
op|'='
name|'False'
op|','
nl|'\n'
name|'help'
op|'='
string|'"don\'t print status messages to stdout"'
op|')'
newline|'\n'
nl|'\n'
name|'options'
op|','
name|'args'
op|'='
name|'parser'
op|'.'
name|'parse_args'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'days_old'
op|'='
name|'int'
op|'('
name|'args'
op|'['
number|'0'
op|']'
op|')'
newline|'\n'
dedent|''
name|'except'
op|'('
name|'IndexError'
op|','
name|'ValueError'
op|')'
op|':'
newline|'\n'
indent|' '
name|'parser'
op|'.'
name|'print_help'
op|'('
op|')'
newline|'\n'
name|'sys'
op|'.'
name|'exit'
op|'('
number|'1'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'options'
op|','
name|'days_old'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|main
dedent|''
name|'def'
name|'main'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'options'
op|','
name|'days_old'
op|'='
name|'_parse_args'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'os'
op|'.'
name|'path'
op|'.'
name|'exists'
op|'('
name|'BASE'
op|')'
op|':'
newline|'\n'
indent|' '
name|'print'
op|'>>'
name|'sys'
op|'.'
name|'stderr'
op|','
string|'"error: \'%s\' doesn\'t exist. Make sure you\'re"'
string|'" running this on the dom0."'
op|'%'
name|'BASE'
newline|'\n'
name|'sys'
op|'.'
name|'exit'
op|'('
number|'1'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'lockpaths_removed'
op|'='
number|'0'
newline|'\n'
name|'nspaths_removed'
op|'='
number|'0'
newline|'\n'
nl|'\n'
name|'for'
name|'nsname'
name|'in'
name|'os'
op|'.'
name|'listdir'
op|'('
name|'BASE'
op|')'
op|'['
op|':'
name|'options'
op|'.'
name|'limit'
op|']'
op|':'
newline|'\n'
indent|' '
name|'nspath'
op|'='
name|'os'
op|'.'
name|'path'
op|'.'
name|'join'
op|'('
name|'BASE'
op|','
name|'nsname'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'os'
op|'.'
name|'path'
op|'.'
name|'isdir'
op|'('
name|'nspath'
op|')'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
nl|'\n'
comment|'# Remove old lockfiles'
nl|'\n'
dedent|''
name|'removed'
op|'='
number|'0'
newline|'\n'
name|'locknames'
op|'='
name|'os'
op|'.'
name|'listdir'
op|'('
name|'nspath'
op|')'
newline|'\n'
name|'for'
name|'lockname'
name|'in'
name|'locknames'
op|':'
newline|'\n'
indent|' '
name|'lockpath'
op|'='
name|'os'
op|'.'
name|'path'
op|'.'
name|'join'
op|'('
name|'nspath'
op|','
name|'lockname'
op|')'
newline|'\n'
name|'lock_age_days'
op|'='
name|'_get_age_days'
op|'('
name|'os'
op|'.'
name|'path'
op|'.'
name|'getmtime'
op|'('
name|'lockpath'
op|')'
op|')'
newline|'\n'
name|'if'
name|'lock_age_days'
op|'>'
name|'days_old'
op|':'
newline|'\n'
indent|' '
name|'lockpaths_removed'
op|'+='
number|'1'
newline|'\n'
name|'removed'
op|'+='
number|'1'
newline|'\n'
nl|'\n'
name|'if'
name|'options'
op|'.'
name|'verbose'
op|':'
newline|'\n'
indent|' '
name|'print'
string|"'Removing old lock: %03d %s'"
op|'%'
op|'('
name|'lock_age_days'
op|','
nl|'\n'
name|'lockpath'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'options'
op|'.'
name|'dry_run'
op|':'
newline|'\n'
indent|' '
name|'os'
op|'.'
name|'unlink'
op|'('
name|'lockpath'
op|')'
newline|'\n'
nl|'\n'
comment|'# Remove empty namespace paths'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'if'
name|'len'
op|'('
name|'locknames'
op|')'
op|'=='
name|'removed'
op|':'
newline|'\n'
indent|' '
name|'nspaths_removed'
op|'+='
number|'1'
newline|'\n'
nl|'\n'
name|'if'
name|'options'
op|'.'
name|'verbose'
op|':'
newline|'\n'
indent|' '
name|'print'
string|"'Removing empty namespace: %s'"
op|'%'
name|'nspath'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'options'
op|'.'
name|'dry_run'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'os'
op|'.'
name|'rmdir'
op|'('
name|'nspath'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'OSError'
op|','
name|'e'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'e'
op|'.'
name|'errno'
op|'=='
name|'errno'
op|'.'
name|'ENOTEMPTY'
op|':'
newline|'\n'
indent|' '
name|'print'
op|'>>'
name|'sys'
op|'.'
name|'stderr'
op|','
string|'"warning: directory \'%s\'"'
string|'" not empty"'
op|'%'
name|'nspath'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
dedent|''
dedent|''
name|'if'
name|'options'
op|'.'
name|'dry_run'
op|':'
newline|'\n'
indent|' '
name|'print'
string|'"** Dry Run **"'
newline|'\n'
nl|'\n'
dedent|''
name|'print'
string|'"Total locks removed: "'
op|','
name|'lockpaths_removed'
newline|'\n'
name|'print'
string|'"Total namespaces removed: "'
op|','
name|'nspaths_removed'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
name|'if'
name|'__name__'
op|'=='
string|"'__main__'"
op|':'
newline|'\n'
indent|' '
name|'main'
op|'('
op|')'
newline|'\n'
dedent|''
endmarker|''
end_unit
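The script above is Python 2 only (`print` statements, `sys.maxint`, `except OSError, e`) and uses the long-deprecated `optparse`. A hedged sketch of the equivalent option handling in modern `argparse` (not part of the original script) looks like this:

```python
import argparse

parser = argparse.ArgumentParser(
    description="cleanup old /var/lock/sm locks")
parser.add_argument("-d", "--dry-run", action="store_true",
                    help="don't actually remove locks")
parser.add_argument("-l", "--limit", type=int, default=None,
                    help="max number of locks to delete (default: no limit)")
parser.add_argument("-v", "--verbose", action="store_true",
                    help="print status messages to stdout")
parser.add_argument("days_old", type=int,
                    help="remove locks older than this many days")

# Parse a sample command line instead of sys.argv.
args = parser.parse_args(["--dry-run", "-l", "100", "30"])
assert args.dry_run and args.limit == 100 and args.days_old == 30
```

Using `default=None` for the limit works with the original's `os.listdir(BASE)[:options.limit]` slice, since `list[:None]` takes the whole list, and the positional `days_old` replaces the manual `args[0]` parsing with its `try/except`.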
| 13.758503 | 495 | 0.591595 | 1,208 | 8,090 | 3.918874 | 0.194536 | 0.108999 | 0.082383 | 0.070976 | 0.581749 | 0.469793 | 0.362273 | 0.291297 | 0.256232 | 0.241867 | 0 | 0.005838 | 0.131891 | 8,090 | 587 | 496 | 13.781942 | 0.668233 | 0 | 0 | 0.877342 | 0 | 0.001704 | 0.3822 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.008518 | 0 | 0.008518 | 0.015332 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c4b5ba22b3ba7761012b4918404bffd6258a269 | 370 | py | Python | network_monitor/__init__.py | brennanhfredericks/network-monitor-client | 618d222bb015662c3958f0100a965f3c71b29d32 | [
"MIT"
] | null | null | null | network_monitor/__init__.py | brennanhfredericks/network-monitor-client | 618d222bb015662c3958f0100a965f3c71b29d32 | [
"MIT"
] | null | null | null | network_monitor/__init__.py | brennanhfredericks/network-monitor-client | 618d222bb015662c3958f0100a965f3c71b29d32 | [
"MIT"
] | null | null | null |
import argparse
import netifaces
import sys
import signal
import os
import asyncio

from asyncio import CancelledError, Task
from typing import Optional, List, Any

from .services import (
    Service_Manager,
    Packet_Parser,
    Packet_Submitter,
    Packet_Filter,
)
from .configurations import generate_configuration_template, DevConfig, load_config_from_file
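This package imports `asyncio` and `CancelledError`, which suggests long-running services shut down via task cancellation. The standard pattern — a generic sketch, not this package's actual service code — is to catch `CancelledError` inside the worker and exit cleanly:

```python
import asyncio

async def capture_service():
    """Stand-in for a long-running service loop."""
    try:
        while True:
            await asyncio.sleep(3600)   # placeholder for real work
    except asyncio.CancelledError:
        # Flush buffers, close sockets, etc., then finish normally.
        return "stopped cleanly"

async def main():
    task = asyncio.create_task(capture_service())
    await asyncio.sleep(0)              # let the worker start and suspend
    task.cancel()                       # request shutdown
    return await task                   # worker suppressed the cancellation

result = asyncio.run(main())
assert result == "stopped cleanly"
```

Because the coroutine catches the cancellation and returns a value, awaiting the cancelled task yields that value instead of raising `CancelledError` into the caller.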
| 16.818182 | 93 | 0.802703 | 45 | 370 | 6.4 | 0.644444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162162 | 370 | 21 | 94 | 17.619048 | 0.929032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
0c50ef47cd53ea48685602b6b3d98c7fea184c96 | 263 | py | Python | setup.py | thevoxium/netspeed | 9e16a49d64da90a173ef9eaf491d4245c1023105 | [
"MIT"
] | null | null | null | setup.py | thevoxium/netspeed | 9e16a49d64da90a173ef9eaf491d4245c1023105 | [
"MIT"
] | null | null | null | setup.py | thevoxium/netspeed | 9e16a49d64da90a173ef9eaf491d4245c1023105 | [
"MIT"
] | null | null | null |
from setuptools import setup

setup(
    name='netspeed',
    version='0.1',
    py_modules=['netspeed'],
    install_requires=[
        'Click',
        'pyspeedtest'
    ],
    entry_points='''
        [console_scripts]
        netspeed=netspeed:cli
    ''',
)
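The `[console_scripts]` entry `netspeed=netspeed:cli` tells setuptools to generate a `netspeed` executable whose wrapper effectively does `from netspeed import cli; sys.exit(cli())`. The spec string itself splits as `name=module:attribute`, which a small parser (illustrative, not setuptools' actual implementation) makes explicit:

```python
def parse_entry_point(spec):
    """Split a console_scripts spec into (script name, module, attribute)."""
    name, _, target = spec.partition("=")
    module, _, attr = target.partition(":")
    return name.strip(), module.strip(), attr.strip()

assert parse_entry_point("netspeed=netspeed:cli") == ("netspeed", "netspeed", "cli")
```

Here the script name and the module happen to coincide, but they need not: `net-check=netspeed:cli` would install the same Click command under a different executable name.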
| 16.4375 | 29 | 0.558935 | 24 | 263 | 5.958333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010811 | 0.296578 | 263 | 15 | 30 | 17.533333 | 0.762162 | 0 | 0 | 0 | 0 | 0 | 0.365019 | 0.079848 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.071429 | 0 | 0.071429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c5549700625606ae1bd959bf730c22c941eb303 | 4,255 | py | Python | bottleneck/tests/list_input_test.py | stroxler/bottleneck | 6e91bcb8a21170588ee9a3f2c425a4e307ae05de | [
"BSD-2-Clause"
] | 2 | 2015-05-26T09:06:32.000Z | 2015-05-26T09:06:46.000Z | bottleneck/tests/list_input_test.py | stroxler/bottleneck | 6e91bcb8a21170588ee9a3f2c425a4e307ae05de | [
"BSD-2-Clause"
] | null | null | null | bottleneck/tests/list_input_test.py | stroxler/bottleneck | 6e91bcb8a21170588ee9a3f2c425a4e307ae05de | [
"BSD-2-Clause"
] | null | null | null |
"Test list input."

# For support of python 2.5
from __future__ import with_statement

import numpy as np
from numpy.testing import assert_equal, assert_array_almost_equal

import bottleneck as bn


# ---------------------------------------------------------------------------
# Check that functions can handle list input

def lists():
    "Iterator that yields lists to use for unit testing."
    ss = {}
    ss[1] = {'size': 4, 'shapes': [(4,)]}
    ss[2] = {'size': 6, 'shapes': [(1, 6), (2, 3)]}
    ss[3] = {'size': 6, 'shapes': [(1, 2, 3)]}
    ss[4] = {'size': 24, 'shapes': [(1, 2, 3, 4)]}  # Unaccelerated
    for ndim in ss:
        size = ss[ndim]['size']
        shapes = ss[ndim]['shapes']
        a = np.arange(size)
        for shape in shapes:
            a = a.reshape(shape)
            yield a.tolist()


def unit_maker(func, func0, args=tuple()):
    "Test that bn.xxx gives the same output as bn.slow.xxx for list input."
    msg = '\nfunc %s | input %s | shape %s\n'
    msg += '\nInput array:\n%s\n'
    for i, arr in enumerate(lists()):
        argsi = tuple([list(arr)] + list(args))
        actual = func(*argsi)
        desired = func0(*argsi)
        tup = (func.__name__, 'a' + str(i), str(np.array(arr).shape), arr)
        err_msg = msg % tup
        assert_array_almost_equal(actual, desired, err_msg=err_msg)


def test_nansum():
    "Test nansum."
    yield unit_maker, bn.nansum, bn.slow.nansum


def test_nanmax():
    "Test nanmax."
    yield unit_maker, bn.nanmax, bn.slow.nanmax


def test_nanargmin():
    "Test nanargmin."
    yield unit_maker, bn.nanargmin, bn.slow.nanargmin


def test_nanargmax():
    "Test nanargmax."
    yield unit_maker, bn.nanargmax, bn.slow.nanargmax


def test_nanmin():
    "Test nanmin."
    yield unit_maker, bn.nanmin, bn.slow.nanmin


def test_nanmean():
    "Test nanmean."
    yield unit_maker, bn.nanmean, bn.slow.nanmean


def test_nanstd():
    "Test nanstd."
    yield unit_maker, bn.nanstd, bn.slow.nanstd


def test_nanvar():
    "Test nanvar."
    yield unit_maker, bn.nanvar, bn.slow.nanvar


def test_median():
    "Test median."
    yield unit_maker, bn.median, bn.slow.median


def test_nanmedian():
    "Test nanmedian."
    yield unit_maker, bn.nanmedian, bn.slow.nanmedian


def test_rankdata():
    "Test rankdata."
    yield unit_maker, bn.rankdata, bn.slow.rankdata


def test_nanrankdata():
    "Test nanrankdata."
    yield unit_maker, bn.nanrankdata, bn.slow.nanrankdata


def test_partsort():
    "Test partsort."
    yield unit_maker, bn.partsort, bn.slow.partsort, (2,)


def test_argpartsort():
    "Test argpartsort."
    yield unit_maker, bn.argpartsort, bn.slow.argpartsort, (2,)


def test_ss():
    "Test ss."
    yield unit_maker, bn.ss, bn.slow.ss


def test_nn():
    "Test nn."
    a = [[1, 2], [3, 4]]
    a0 = [1, 2]
    assert_equal(bn.nn(a, a0), bn.slow.nn(a, a0))


def test_anynan():
    "Test anynan."
    yield unit_maker, bn.anynan, bn.slow.anynan


def test_allnan():
    "Test allnan."
    yield unit_maker, bn.allnan, bn.slow.allnan


def test_move_sum():
    "Test move_sum."
    yield unit_maker, bn.move_sum, bn.slow.move_sum, (2,)


def test_move_nansum():
    "Test move_nansum."
    yield unit_maker, bn.move_nansum, bn.slow.move_nansum, (2,)


def test_move_mean():
    "Test move_mean."
    yield unit_maker, bn.move_mean, bn.slow.move_mean, (2,)


def test_move_median():
    "Test move_median."
    yield unit_maker, bn.move_median, bn.slow.move_median, (2,)


def test_move_nanmean():
    "Test move_nanmean."
    yield unit_maker, bn.move_nanmean, bn.slow.move_nanmean, (2,)


def test_move_std():
    "Test move_std."
    yield unit_maker, bn.move_std, bn.slow.move_std, (2,)


def test_move_nanstd():
    "Test move_nanstd."
    yield unit_maker, bn.move_nanstd, bn.slow.move_nanstd, (2,)


def test_move_min():
    "Test move_min."
    yield unit_maker, bn.move_min, bn.slow.move_min, (2,)


def test_move_max():
    "Test move_max."
    yield unit_maker, bn.move_max, bn.slow.move_max, (2,)


def test_move_nanmin():
    "Test move_nanmin."
    yield unit_maker, bn.move_nanmin, bn.slow.move_nanmin, (2,)


def test_move_nanmax():
    "Test move_nanmax."
    yield unit_maker, bn.move_nanmax, bn.slow.move_nanmax, (2,)
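These `yield unit_maker, bn.xxx, bn.slow.xxx` tests rely on the nose generator-test protocol: the runner iterates the generator and calls each yielded tuple as `func(*rest)`. A minimal stand-in consumer (a sketch of the protocol, not nose itself) shows what happens to each yielded case:

```python
def check_positive(x):
    assert x > 0

def test_generator():
    # nose-style generator test: yields (callable, arg, arg, ...) tuples
    for x in (1, 2, 3):
        yield check_positive, x

# What the nose runner effectively did with each yielded tuple:
ran = 0
for case in test_generator():
    func, args = case[0], case[1:]
    func(*args)        # each call is reported as a separate test
    ran += 1
assert ran == 3
```

Modern pytest removed this protocol; the idiomatic port parametrizes `unit_maker` with `@pytest.mark.parametrize` over the `(func, func0, args)` triples instead.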
| 22.632979 | 77 | 0.636193 | 636 | 4,255 | 4.064465 | 0.16195 | 0.069633 | 0.151644 | 0.173308 | 0.15087 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013918 | 0.206345 | 4,255 | 187 | 78 | 22.754011 | 0.751555 | 0.172268 | 0 | 0 | 0 | 0 | 0.157869 | 0 | 0 | 0 | 0 | 0 | 0.02521 | 1 | 0.260504 | false | 0 | 0.033613 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0c587de94c3ee270415110f012b7d77cb256c5a4 | 1,475 | py | Python | hanzo/warcindex.py | ukwa/warctools | f74061382d6bc37b6eec889a3aec26c5748d90d3 | [
"MIT"
] | 1 | 2020-09-03T00:51:50.000Z | 2020-09-03T00:51:50.000Z | hanzo/warcindex.py | martinsbalodis/warc-tools | d9d5e708e00bd0f6d9d0c2d95cbc9332f51b05e4 | [
"MIT"
] | null | null | null | hanzo/warcindex.py | martinsbalodis/warc-tools | d9d5e708e00bd0f6d9d0c2d95cbc9332f51b05e4 | [
"MIT"
] | 1 | 2021-04-12T01:45:14.000Z | 2021-04-12T01:45:14.000Z | #!/usr/bin/env python
"""warcindex - dump warc index"""
import os
import sys
import sys
import os.path
from optparse import OptionParser
from .warctools import WarcRecord, expand_files
parser = OptionParser(usage="%prog [options] warc warc warc")
parser.add_option("-l", "--limit", dest="limit")
parser.add_option("-O", "--output-format", dest="output_format", help="output format (ignored)")
parser.add_option("-o", "--output", dest="output_format", help="output file (ignored)")
parser.add_option("-L", "--log-level", dest="log_level")
parser.set_defaults(output=None, limit=None, log_level="info")
def main(argv):
(options, input_files) = parser.parse_args(args=argv[1:])
out = sys.stdout
if len(input_files) < 1:
parser.error("no imput warc file(s)")
print '#WARC filename offset warc-type warc-subject-uri warc-record-id content-type content-length'
for name in expand_files(input_files):
fh = WarcRecord.open_archive(name, gzip="auto")
for (offset, record, errors) in fh.read_records(limit=None):
if record:
print name, offset, record.type, record.url, record.id, record.content_type, record.content_length
elif errors:
pass
# ignore
else:
pass
# no errors at tail
fh.close()
return 0
def run():
sys.exit(main(sys.argv))
if __name__ == '__main__':
run()
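One `optparse` behavior worth noting in this tool: options declared without `type=` (like `--limit` here) always come back as strings, and `set_defaults` happily sets attributes (like `log_level`) regardless. Both can be shown with the parser configured as above:

```python
from optparse import OptionParser

p = OptionParser(usage="%prog [options] warc warc warc")
p.add_option("-l", "--limit", dest="limit")
p.set_defaults(limit=None, log_level="info")

# Parse a sample command line instead of sys.argv.
opts, args = p.parse_args(["-l", "10", "a.warc", "b.warc"])
assert opts.limit == "10"              # no type= given, so the value stays a string
assert opts.log_level == "info"        # default applied even with no matching flag used
assert args == ["a.warc", "b.warc"]    # positionals fall through to args
```

So any code that slices or compares against `options.limit` numerically would need `type='int'` on the option (or an explicit `int()` call), which this script sidesteps by passing `limit=None` to `read_records`.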
| 23.412698 | 114 | 0.633898 | 195 | 1,475 | 4.65641 | 0.451282 | 0.039648 | 0.066079 | 0.035242 | 0.105727 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002643 | 0.230508 | 1,475 | 62 | 115 | 23.790323 | 0.797357 | 0.030508 | 0 | 0.121212 | 0 | 0.030303 | 0.209052 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.060606 | 0.181818 | null | null | 0.060606 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
0c5f5d9ac8242efc8ccf5bafaa6e567b8ee2cc86 | 5,808 | py | Python | cog/cli/user_argparser.py | Demonware/cog | b206066ebfd5faae000b1a1708988db8ca592b94 | [
"BSD-3-Clause"
] | 2 | 2016-06-02T02:15:56.000Z | 2016-08-16T08:37:27.000Z | cog/cli/user_argparser.py | Demonware/cog | b206066ebfd5faae000b1a1708988db8ca592b94 | [
"BSD-3-Clause"
] | null | null | null | cog/cli/user_argparser.py | Demonware/cog | b206066ebfd5faae000b1a1708988db8ca592b94 | [
"BSD-3-Clause"
] | null | null | null |
# -*- coding: utf-8 -*-

import sys
import argparse

arg_no = len(sys.argv)

tool_parser = argparse.ArgumentParser(add_help=False)
tool_subparsers = tool_parser.add_subparsers(help='commands', dest='command')

# The rename command.
rename_parser = tool_subparsers.add_parser('rename', help='rename an existing user account.')
rename_parser.add_argument(
    'name', action='store', metavar='<name>', help='account name'
)
rename_parser.add_argument(
    '--new-name', '-n', action='store', dest='newName', metavar='<new account name>'
)

# The add command.
add_parser = tool_subparsers.add_parser('add', help='add new user account to the directory.')
add_parser.add_argument(
    '--type', '-t', action='store', default='generic', dest='account_type', metavar='<type of account>'
)
add_parser.add_argument(
    'name', action='store', help='account name', metavar='<name>'
)
group1_parser = add_parser.add_argument_group('account specific')
group1_parser.add_argument(
    '--password', '-P', action='store', dest='userPassword', metavar='<account\'s owner password>'
)
group1_parser.add_argument(
    '--home', action='store', dest='homeDirectory', metavar='<path to the home directory>'
)
group1_parser.add_argument(
    '--shell', action='store', dest='loginShell', metavar='<path to the shell interpreter>'
)
group1_parser = add_parser.add_argument_group('personal information')
group1_parser.add_argument(
    '--phone-no', action='append', dest='telephoneNumber', metavar='<phone number>'
)
group1_parser.add_argument(
    '--last-name', action='store', dest='sn', metavar='<account owner\'s last name>'
)
group1_parser.add_argument(
    '--first-name', action='store', dest='givenName', metavar='<account owner\'s first name>'
)
group1_parser.add_argument(
    '--organization', '-o', action='store', dest='o', metavar='<organization>'
)
group1_parser.add_argument(
    '--email', action='append', dest='mail', metavar='<email>'
)
group1_parser.add_argument(
    '--full-name', action='store', dest='cn', metavar='<account owner\'s full name>'
)
group1_parser = add_parser.add_argument_group('uid and group management')
group1_parser.add_argument(
    '--uid', action='store', dest='uid', metavar='<user\'s uid>'
)
group1_parser.add_argument(
    '--add-group', action='append', dest='group', metavar='<secondary group>'
)
group1_parser.add_argument(
    '--uid-number', action='store', dest='uidNumber', metavar='<user id number>'
)
group1_parser.add_argument(
    '--gid', action='store', dest='gidNumber', metavar='<primary group id>'
)

# The show command.
show_parser = tool_subparsers.add_parser('show', help='show account data')
show_parser.add_argument(
    'name', action='append', nargs='*', help='account name'
)
show_parser.add_argument(
    '--verbose', '-v', action='store_true', dest='verbose', help='be verbose about it'
)

# The edit command.
edit_parser = tool_subparsers.add_parser('edit', help='edit existing user data in the directory')
edit_parser.add_argument(
    '--type', '-t', action='store', dest='account_type', metavar='<change account type>'
)
edit_parser.add_argument(
    'name', action='store', help='account name'
)
group1_parser = edit_parser.add_argument_group('account specific')
group1_parser.add_argument(
    '--reset-password', '-r', dest='resetPassword', action='store_true', help='<reset user\'s password>'
)
group1_parser.add_argument(
    '--home', action='store', dest='homeDirectory', metavar='<new home directory path>'
)
group1_parser.add_argument(
    '--shell', action='store', dest='loginShell', metavar='<new shell interpreter path>'
)
group1_parser = edit_parser.add_argument_group('personal information')
group1_parser.add_argument(
    '--first-name', action='store', dest='givenName', metavar='<new first name>'
)
group1_parser.add_argument(
    '--del-email', action='append', dest='delMail', metavar='<remove email address>'
)
group1_parser.add_argument(
    '--last-name', action='store', dest='sn', metavar='<new last name>'
)
group1_parser.add_argument(
    '--add-email', action='append', dest='addMail', metavar='<add new email address>'
)
group1_parser.add_argument(
    '--del-phone-no', action='append', dest='delTelephoneNumber', metavar='<phone number to remove>'
)
group1_parser.add_argument(
    '--organization', '-o', action='store', dest='o', metavar='<organization>'
)
group1_parser.add_argument(
    '--add-phone-no', action='append', dest='addTelephoneNumber', metavar='<phone number to add>'
)
group1_parser.add_argument(
    '--full-name', action='store', dest='cn', metavar='<new full name>'
)
group1_parser = edit_parser.add_argument_group('uid and group management')
group1_parser.add_argument(
    '--del-group', action='append', dest='delgroup', metavar='<remove user from the group>'
)
group1_parser.add_argument(
    '--group-id', action='store', dest='gidNumber', metavar='<change primary group ID>'
)
group1_parser.add_argument(
    '--add-group', action='append', dest='addgroup', metavar='<add user to the group>'
)
group1_parser.add_argument(
    '--uid-number', action='store', dest='uidNumber', metavar='<change user ID number>'
)
group1_parser.add_argument(
    '--uid', action='store', dest='uid', metavar='<user\'s uid>'
)

# The retire command.
retire_parser = tool_subparsers.add_parser('retire', help='retire an existing account and remove all its privileges.')
retire_parser.add_argument(
    'name', action='store', metavar='<name>', help='account name'
)
# The type command.
type_parser = tool_subparsers.add_parser('type', help='manage user types')
type_parser.add_argument(
'--list', '-l', action='store_true', dest='list_types', help='list user types'
)
# The remove command.
remove_parser = tool_subparsers.add_parser('remove', help='remove an existing account.')
remove_parser.add_argument(
'name', action='store', metavar='<name>', help='account name'
)
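The tail of this CLI definition repeats one pattern throughout: a subparser per command, related options grouped via `add_argument_group`, and repeatable options collected with `action='append'`. Below is a minimal, self-contained sketch of that pattern; the `usertool` name, flags, and dest names are illustrative, not the tool's real interface.

```python
import argparse

# Hypothetical miniature of the subparser + argument-group layout used above.
parser = argparse.ArgumentParser(prog='usertool')
tool_subparsers = parser.add_subparsers(dest='command')

add_parser = tool_subparsers.add_parser('add', help='add a new account')
add_parser.add_argument('name', action='store', help='account name')

group = add_parser.add_argument_group('personal information')
# action='append' collects every occurrence of --email into one list.
group.add_argument('--email', action='append', dest='mail', metavar='<email>')

args = parser.parse_args(['add', 'jdoe', '--email', 'a@example.com', '--email', 'b@example.com'])
```

Because of `action='append'`, `args.mail` ends up as a list with one entry per `--email` flag.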
# ---- file: Aula01 e exercicios/exercicio_06.py | repo: Dorcival/PYTHON | license: MIT ----
# CELSIUS to FAHRENHEIT converter v.0.1
# By Dorcival Leite 202003362174
import time
print("CONVERTER TEMPERATURA DE CELSIUS PARA FAHRENHEIT\n")
c = float(input("Digite a temperatura em CELSIUS: "))
f = float((9 * c)/5)+32
print("\nA temperatura de", c, "graus CELSIUS é igual a", f, "graus FAHRENHEIT")
time.sleep(20)
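The script above applies the standard Celsius-to-Fahrenheit formula, F = 9C/5 + 32. The same arithmetic as a reusable function, a hypothetical refactor rather than part of the original exercise:

```python
def celsius_to_fahrenheit(c):
    """Convert degrees Celsius to degrees Fahrenheit: F = 9C/5 + 32."""
    return (9 * c) / 5 + 32

# Reference points: water freezes at 0 C (32 F) and boils at 100 C (212 F).
freezing = celsius_to_fahrenheit(0)
boiling = celsius_to_fahrenheit(100)
```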
# ---- file: exerc18/18.py | repo: WilliamSampaio/ExerciciosPython | license: MIT ----
import os
numeros = [0,0]
numeros[0] = float(input('Digite o numero 1: '))
numeros[1] = float(input('Digite o numero 2: '))
print(f'o maior valor entre os dois é: {max(numeros)}')
os.system('pause')
# ---- file: ooobuild/lo/packages/x_data_sink_encr_support.py | repo: Amourspirit/ooo_uno_tmpl | license: Apache-2.0 ----
# coding: utf-8
#
# Copyright 2022 :Barry-Thomas-Paul: Moss
#
# Licensed under the Apache License, Version 2.0 (the "License")
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http: // www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Interface Class
# this is an auto-generated file, generated by Cheetah
# Libre Office Version: 7.3
# Namespace: com.sun.star.packages
import typing
from abc import abstractmethod
from ..uno.x_interface import XInterface as XInterface_8f010a43
if typing.TYPE_CHECKING:
from ..io.x_input_stream import XInputStream as XInputStream_98d40ab4
class XDataSinkEncrSupport(XInterface_8f010a43):
"""
Allows to get access to the stream of a PackageStream.
See Also:
`API XDataSinkEncrSupport <https://api.libreoffice.org/docs/idl/ref/interfacecom_1_1sun_1_1star_1_1packages_1_1XDataSinkEncrSupport.html>`_
"""
__ooo_ns__: str = 'com.sun.star.packages'
__ooo_full_ns__: str = 'com.sun.star.packages.XDataSinkEncrSupport'
__ooo_type_name__: str = 'interface'
__pyunointerface__: str = 'com.sun.star.packages.XDataSinkEncrSupport'
@abstractmethod
def getDataStream(self) -> 'XInputStream_98d40ab4':
"""
Allows to get access to the data of the PackageStream.
In case stream is encrypted one and the key for the stream is not set, an exception must be thrown.
Raises:
com.sun.star.packages.WrongPasswordException: ``WrongPasswordException``
com.sun.star.packages.zip.ZipException: ``ZipException``
com.sun.star.io.IOException: ``IOException``
"""
@abstractmethod
def getPlainRawStream(self) -> 'XInputStream_98d40ab4':
"""
Allows to get access to the raw data of the stream as it is stored in the package.
Raises:
com.sun.star.io.IOException: ``IOException``
com.sun.star.packages.NoEncryptionException: ``NoEncryptionException``
"""
@abstractmethod
def getRawStream(self) -> 'XInputStream_98d40ab4':
"""
Allows to get access to the data of the PackageStream as to raw stream.
In case stream is not encrypted an exception will be thrown.
The difference of raw stream is that it contains header for encrypted data, so an encrypted stream can be copied from one PackageStream to another one without decryption.
Raises:
com.sun.star.packages.NoEncryptionException: ``NoEncryptionException``
com.sun.star.io.IOException: ``IOException``
"""
@abstractmethod
def setDataStream(self, aStream: 'XInputStream_98d40ab4') -> None:
"""
Allows to set a data stream for the PackageStream.
In case PackageStream is marked as encrypted the data stream will be encrypted on storing.
Raises:
com.sun.star.io.IOException: ``IOException``
"""
@abstractmethod
def setRawStream(self, aStream: 'XInputStream_98d40ab4') -> None:
"""
Allows to set raw stream for the PackageStream.
The PackageStream object can not be marked as encrypted one, an exception will be thrown in such case.
Raises:
com.sun.star.packages.EncryptionNotAllowedException: ``EncryptionNotAllowedException``
com.sun.star.packages.NoRawFormatException: ``NoRawFormatException``
com.sun.star.io.IOException: ``IOException``
"""
__all__ = ['XDataSinkEncrSupport']
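The interface above relies on `@abstractmethod` to force implementers to supply every stream accessor. A self-contained sketch of that contract using plain `abc.ABC` follows; this is an assumption for illustration only, since the real class inherits its behaviour through the UNO `XInterface` hierarchy:

```python
from abc import ABC, abstractmethod

class DataSink(ABC):
    """Hypothetical one-method mini-interface in the style of XDataSinkEncrSupport."""
    @abstractmethod
    def getDataStream(self):
        ...

class MemorySink(DataSink):
    def getDataStream(self):
        return b'payload'

sink = MemorySink()
try:
    DataSink()  # abstract classes cannot be instantiated directly
    abstract_enforced = False
except TypeError:
    abstract_enforced = True
```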
# ---- file: organizational_area/admin.py | repo: mspasiano/uniTicket | license: Apache-2.0 ----
from django.contrib import admin
from .models import *
from .admin_inlines import *
class AbstractAdmin(admin.ModelAdmin):
list_display = ('name', 'description')
class Media:
js = ('js/textarea-autosize.js',)
css = {'all': ('css/textarea-small.css',),}
@admin.register(OrganizationalStructureFunction)
class OrganizationalStructureFunctionAdmin(AbstractAdmin):
pass
@admin.register(OrganizationalStructureType)
class OrganizationalStructureTypeAdmin(AbstractAdmin):
pass
@admin.register(OrganizationalStructure)
class OrganizationalStructureAdmin(AbstractAdmin):
prepopulated_fields = {'slug': ('name',)}
list_display = ('name', 'structure_type',
'description', 'is_active')
list_filter = ('structure_type', 'is_active')
list_editable = ('is_active',)
inlines = [OrganizationalStructureLocationInline,
OrganizationalStructureOfficeInline,]
@admin.register(OrganizationalStructureOffice)
class OrganizationalStructureOfficeAdmin(AbstractAdmin):
prepopulated_fields = {'slug': ('name',)}
list_display = ('name', 'organizational_structure', 'is_active')
list_filter = ('organizational_structure',
'is_active')
list_editable = ('is_active',)
inlines = [OrganizationalStructureOfficeEmployeeInline,
OrganizationalStructureOfficeLocationInline,]
#@admin.register(TipoDotazione)
#class TipoDotazioneAdmin(admin.ModelAdmin):
#list_display = ('nome', 'descrizione')
#class Media:
#js = ('js/textarea-autosize.js',)
#css = {'all': ('css/textarea-small.css',),}
#@admin.register(Locazione)
#class LocazioneAdmin(admin.ModelAdmin):
#list_display = ('nome', 'indirizzo', 'descrizione_breve',)
#class Media:
#js = ('js/textarea-autosize.js',)
#css = {'all': ('css/textarea-small.css',),}
# @admin.register(OrganizationalStructureFunction)
# class OrganizationalStructureFunction(AbstractAdmin):
# pass
# ---- file: 01-logica-de-programacao-e-algoritmos/Aula 04/1/exercicio01.py | repo: rafaelbarretomg/Uninter | license: MIT ----
# Exercise 01: Tuples
x = int(input('Digite o primeiro numero: '))
y = int(input('Digite o segundo numero: '))
cont = 1
soma = x
while cont < y:
soma = soma + x
cont = cont + 1
print('O resultado eh: {}' .format(soma))
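The `while` loop above computes x * y by repeated addition: starting from x, it adds x another y - 1 times. The same idea as a function that also handles y = 0, an illustrative rewrite rather than part of the exercise:

```python
def multiply_by_addition(x, y):
    """Multiply x by a non-negative integer y using only addition."""
    total = 0
    for _ in range(y):
        total += x
    return total

product = multiply_by_addition(7, 6)
```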
# ---- file: docs/OOPS/Accessing_pvt_var2.py | repo: munyumunyu/Python-for-beginners | license: MIT ----
'''
To have an error-free way of accessing and updating private variables, we create specific methods for this.
Methods meant to set a value on a private variable are called setter methods, and methods
meant to read private variable values are called getter methods.
The code below is an example of getter and setter methods:
'''
class Customer:
def __init__(self, id, name, age, wallet_balance):
self.id = id
self.name = name
self.age = age
self.__wallet_balance = wallet_balance
def set_wallet_balance(self, amount):
if amount < 1000 and amount> 0:
self.__wallet_balance = amount
def get_wallet_balance(self):
return self.__wallet_balance
c1=Customer(100, "Gopal", 24, 1000)
c1.set_wallet_balance(120)
print(c1.get_wallet_balance())
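One point worth adding to the getter/setter example above: Python's double-underscore attributes are protected by name mangling, not true access control. A self-contained sketch of that behaviour; the `Account` class here is hypothetical, mirroring `Customer`:

```python
class Account:
    """Hypothetical mini version of Customer, keeping one private attribute."""
    def __init__(self, balance):
        self.__balance = balance  # stored as _Account__balance via name mangling

    def get_balance(self):
        return self.__balance

acct = Account(500)
try:
    acct.__balance  # direct access fails outside the class body
    direct_access_worked = True
except AttributeError:
    direct_access_worked = False
```

The attribute is still reachable as `acct._Account__balance`, which is why the double-underscore convention signals intent rather than enforcing secrecy.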
# ---- file: fixtures/requests.py | repo: AzatAza/december-api-tests | license: Apache-2.0 ----
import requests
from requests import Response
class Client:
@staticmethod
def request(method: str, url: str, **kwargs) -> Response:
"""
Request method
method: method for the new Request object: GET, OPTIONS, HEAD, POST, PUT, PATCH, or DELETE. # noqa
url – URL for the new Request object.
**kwargs:
params – (optional) Dictionary, list of tuples or bytes to send in the query string for the Request. # noqa
json – (optional) A JSON serializable Python object to send in the body of the Request. # noqa
headers – (optional) Dictionary of HTTP Headers to send with the Request.
"""
return requests.request(method, url, **kwargs)
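The `Client.request` wrapper above only forwards `method`, `url`, and `**kwargs` to `requests.request`. Below is a network-free sketch of that forwarding pattern, useful for testing thin wrappers; the `FakeTransport` class is hypothetical, standing in for the `requests` module:

```python
class FakeTransport:
    """Hypothetical stand-in for requests: records the call instead of sending it."""
    @staticmethod
    def request(method, url, **kwargs):
        return {'method': method, 'url': url, 'kwargs': kwargs}

class StubClient:
    @staticmethod
    def request(method, url, **kwargs):
        # Same shape as Client.request above, minus the real HTTP round trip.
        return FakeTransport.request(method, url, **kwargs)

resp = StubClient.request('GET', 'https://example.com/api', params={'q': 'ping'})
```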
# ---- file: src/ralph/api/__init__.py | repo: DoNnMyTh/ralph | license: Apache-2.0 ----
from ralph.api.serializers import RalphAPISerializer
from ralph.api.viewsets import RalphAPIViewSet, RalphReadOnlyAPIViewSet
from ralph.api.routers import router
__all__ = [
'RalphAPISerializer',
'RalphAPIViewSet',
'RalphReadOnlyAPIViewSet',
'router',
]
# ---- file: UDTherapy/__init__.py | repo: JonSn0w/Urban-Dictionary-Therapy | license: MIT ----
name = 'Urban Dictionary Therapy'
__all__ = ['UDTherapy',
'helper']
# ---- file: tests/unit/utils/test_attributes.py | repo: pyqgis/plutil | license: MIT ----
# -*- coding: utf-8 -*-
"""
Unit tests for Attributes.
"""
from __future__ import unicode_literals
from __future__ import print_function
import logging
import os
import shutil
import tempfile
from unittest import TestCase, SkipTest
from unittest.mock import MagicMock
from qgis.PyQt.QtCore import QVariant
from qgis.core import (
QgsWkbTypes, QgsProject, QgsVectorLayer, QgsField,
QgsVectorDataProvider
)
from qgis_plutil.utils.attributes import (
variant_ctor_for_object, fields_from_data,
merge_fields_in_provider,
)
logger = logging.getLogger('tests.attributes')
class TestVariantCtorForObject(TestCase):
def test_invalid_params(self):
with self.assertRaises(NotImplementedError):
variant_ctor_for_object(None)
with self.assertRaises(NotImplementedError):
variant_ctor_for_object({})
with self.assertRaises(NotImplementedError):
variant_ctor_for_object(b'dd')
def test_valid_params(self):
self.assertEqual(variant_ctor_for_object(0.5), QVariant.Double)
self.assertEqual(variant_ctor_for_object(1), QVariant.Int)
self.assertEqual(variant_ctor_for_object(""), QVariant.String)
self.assertEqual(variant_ctor_for_object("test"), QVariant.String)
class TestFieldsFromData(TestCase):
def test_invalid(self):
with self.assertRaises(ValueError):
fields_from_data("abc")
with self.assertRaises(TypeError):
fields_from_data(1)
with self.assertRaises(ValueError):
fields_from_data(["abc"])
with self.assertRaises(ValueError):
fields_from_data([1])
def test_mixed_dict(self):
with self.assertRaises(ValueError):
fields, are_dicts = fields_from_data([{1: 1, 2: 1, 3: 1}, [1]])
with self.assertRaises(ValueError):
fields, are_dicts = fields_from_data([[1, 4, 6], {1: 1, 6: 9}])
def test_valid_list_of_one(self):
fields, are_dicts = fields_from_data([[1]])
self.assertFalse(are_dicts)
self.assertEqual(len(fields), 1)
self.assertIsInstance(fields["Field 1"], QgsField)
self.assertEqual(fields["Field 1"].type(), QVariant.Int)
self.assertEqual(fields["Field 1"].name(), "Field 1")
def test_valid_lists(self):
fields, are_dicts = fields_from_data([[1, "2", 3.5]])
self.assertFalse(are_dicts)
self.assertEqual(len(fields), 3)
self.assertIsInstance(fields["Field 1"], QgsField)
self.assertIsInstance(fields["Field 2"], QgsField)
self.assertIsInstance(fields["Field 3"], QgsField)
self.assertEqual(fields["Field 1"].type(), QVariant.Int)
self.assertEqual(fields["Field 1"].name(), "Field 1")
self.assertEqual(fields["Field 2"].type(), QVariant.String)
self.assertEqual(fields["Field 2"].name(), "Field 2")
self.assertEqual(fields["Field 3"].type(), QVariant.Double)
self.assertEqual(fields["Field 3"].name(), "Field 3")
def test_two_valid_lists(self):
fields, are_dicts = fields_from_data([
[1, "2", 3.5], [4, 5, 6, 7]
])
self.assertFalse(are_dicts)
self.assertEqual(len(fields), 4)
self.assertIsInstance(fields["Field 1"], QgsField)
self.assertIsInstance(fields["Field 2"], QgsField)
self.assertIsInstance(fields["Field 3"], QgsField)
self.assertIsInstance(fields["Field 4"], QgsField)
self.assertEqual(fields["Field 1"].type(), QVariant.Int)
self.assertEqual(fields["Field 1"].name(), "Field 1")
self.assertEqual(fields["Field 2"].type(), QVariant.String)
self.assertEqual(fields["Field 2"].name(), "Field 2")
self.assertEqual(fields["Field 3"].type(), QVariant.Double)
self.assertEqual(fields["Field 3"].name(), "Field 3")
self.assertEqual(fields["Field 4"].type(), QVariant.Int)
self.assertEqual(fields["Field 4"].name(), "Field 4")
def test_valid_dict_of_one(self):
fields, are_dicts = fields_from_data([{1: 2}])
self.assertTrue(are_dicts)
self.assertEqual(len(fields), 1)
self.assertIsInstance(fields['1'], QgsField)
self.assertEqual(fields['1'].type(), QVariant.Int)
def test_valid_dicts(self):
fields, are_dicts = fields_from_data([{1: 'a', "2": 4, 3: 3.5}])
self.assertTrue(are_dicts)
self.assertEqual(len(fields), 3)
self.assertIsInstance(fields["1"], QgsField)
self.assertIsInstance(fields["2"], QgsField)
self.assertIsInstance(fields["3"], QgsField)
self.assertEqual(fields["1"].type(), QVariant.String)
self.assertEqual(fields["1"].name(), "1")
self.assertEqual(fields["2"].type(), QVariant.Int)
self.assertEqual(fields["2"].name(), "2")
self.assertEqual(fields["3"].type(), QVariant.Double)
self.assertEqual(fields["3"].name(), "3")
def test_two_valid_dicts(self):
fields, are_dicts = fields_from_data([
{1: 'a', "2": 4, 3: 3.5},
{1: 'a', "2": 4, 3: 3.5, "some other": 5},
])
self.assertTrue(are_dicts)
self.assertEqual(len(fields), 4)
self.assertIsInstance(fields["1"], QgsField)
self.assertIsInstance(fields["2"], QgsField)
self.assertIsInstance(fields["3"], QgsField)
self.assertIsInstance(fields["some other"], QgsField)
self.assertEqual(fields["1"].type(), QVariant.String)
self.assertEqual(fields["1"].name(), "1")
self.assertEqual(fields["2"].type(), QVariant.Int)
self.assertEqual(fields["2"].name(), "2")
self.assertEqual(fields["3"].type(), QVariant.Double)
self.assertEqual(fields["3"].name(), "3")
self.assertEqual(fields["some other"].type(), QVariant.Int)
self.assertEqual(fields["some other"].name(), "some other")
class TestMergeFieldsInProvider(TestCase):
def test_no_layer(self):
provider = MagicMock(spec=QgsVectorDataProvider)
pf1 = MagicMock(spec=QgsField)
pf1.name.return_value = "1"
pf2 = MagicMock(spec=QgsField)
pf2.name.return_value = "2"
pf3 = MagicMock(spec=QgsField)
pf3.name.return_value = "3"
provider.fields.return_value = [pf1, pf2, pf3]
pn1 = MagicMock(spec=QgsField)
pn1.name.return_value = "1"
pn4 = MagicMock(spec=QgsField)
pn4.name.return_value = "4"
pn9 = MagicMock(spec=QgsField)
pn9.name.return_value = "9"
merge_fields_in_provider(
provider,
fields={'1': pn1, '4': pn4, '9': pn9},
layer=None)
provider.addAttributes.assert_called_once_with([pn4, pn9])
def test_layer(self):
pf1 = MagicMock(spec=QgsField)
pf1.name.return_value = "1"
pf2 = MagicMock(spec=QgsField)
pf2.name.return_value = "2"
pf3 = MagicMock(spec=QgsField)
pf3.name.return_value = "3"
provider = MagicMock(spec=QgsVectorDataProvider)
provider.fields.return_value = [pf1, pf2, pf3]
layer = MagicMock(spec=QgsVectorLayer)
merge_fields_in_provider(
provider,
fields={'1': pf1, '2': pf2, '3': pf3},
layer=layer)
provider.addAttributes.assert_called_once_with([])
layer.updateFields.assert_called_once()
# ---- file: tests/test_runner.py | repo: varunvarma/panoptes | license: Apache-2.0 ----
"""
import connexion
if __name__ == '__main__':
app = connexion.App('a-pi-api')
app.add_api('v0/spec.yml')
app.run(host='0.0.0.0', port=80)
| 20.5 | 33 | 0.682927 | 31 | 164 | 3.322581 | 0.677419 | 0.058252 | 0.058252 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048611 | 0.121951 | 164 | 7 | 34 | 23.428571 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0.207317 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
a7cb29cc32a2319fccf961ffb48796199a5ff0d3 | 1,110 | py | Python | jskparser/ast/stmt/ifstmt.py | natebragg/java-sketch | f5ac26f2cc46ae4556f9a61c55afd37f55c961ff | [
"MIT"
] | 15 | 2015-12-15T18:33:50.000Z | 2021-09-29T11:48:54.000Z | jskparser/ast/stmt/ifstmt.py | natebragg/java-sketch | f5ac26f2cc46ae4556f9a61c55afd37f55c961ff | [
"MIT"
] | 11 | 2015-11-16T22:14:58.000Z | 2021-09-23T05:28:40.000Z | jskparser/ast/stmt/ifstmt.py | natebragg/java-sketch | f5ac26f2cc46ae4556f9a61c55afd37f55c961ff | [
"MIT"
] | 8 | 2015-11-16T21:50:08.000Z | 2021-03-23T15:15:34.000Z | #!/usr/bin/env python
from .statement import Statement
from . import _import
class IfStmt(Statement):
def __init__(self, kwargs={}):
super(IfStmt, self).__init__(kwargs)
locs = _import()
# Expression condition;
con = kwargs.get(u'condition', {})
self._condition = locs[con[u'@t']](con) if con else None
# Statement thenStmt;
then = kwargs.get(u'thenStmt', {})
self._thenStmt = locs[then[u'@t']](then) if then else None
# Statement elseStmt;
el = kwargs.get(u'elseStmt', {})
self._elseStmt = locs[el[u'@t']](el) if el else None
self.add_as_parent([self.condition, self.thenStmt, self.elseStmt])
@property
def condition(self): return self._condition
@condition.setter
def condition(self, v): self._condition = v
@property
def thenStmt(self): return self._thenStmt
@thenStmt.setter
def thenStmt(self, v): self._thenStmt = v
@property
def elseStmt(self): return self._elseStmt
@elseStmt.setter
def elseStmt(self, v): self._elseStmt = v
| 27.75 | 74 | 0.621622 | 137 | 1,110 | 4.883212 | 0.255474 | 0.077728 | 0.044843 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.251351 | 1,110 | 39 | 75 | 28.461538 | 0.805054 | 0.073874 | 0 | 0.12 | 0 | 0 | 0.030273 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.28 | false | 0 | 0.12 | 0.12 | 0.44 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
a7ce8d54807f93e96a382235bdf7d3f14bebe67b | 467 | py | Python | src/screenlogger.py | swbooking/RobotMaria | 2553358629a870b10458564524315ff4cfda0bd1 | [
"MIT"
] | null | null | null | src/screenlogger.py | swbooking/RobotMaria | 2553358629a870b10458564524315ff4cfda0bd1 | [
"MIT"
] | null | null | null | src/screenlogger.py | swbooking/RobotMaria | 2553358629a870b10458564524315ff4cfda0bd1 | [
"MIT"
] | null | null | null | class ScreenLogger:
def __init__(self, loghandler=None, verbose = True):
self.LogMessage = None
self.LogHandler = loghandler
self.Verbose = verbose
return
def Log(self, message):
if self.LogMessage != message:
self.LogMessage = message
if self.LogHandler != None:
self.LogHandler(self.LogMessage)
if self.Verbose:
print self.LogMessage
return
| 31.133333 | 56 | 0.578158 | 45 | 467 | 5.911111 | 0.333333 | 0.263158 | 0.135338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.346895 | 467 | 14 | 57 | 33.357143 | 0.872131 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.071429 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
38f003c85d91841bc389c08c6a91fa5429cad832 | 40,888 | py | Python | tests/test_runner.py | varunvarma/panoptes | 733e1b17e01d47fe0a399e2fe635f614cc5a0b88 | [
"Apache-2.0"
] | null | null | null | tests/test_runner.py | varunvarma/panoptes | 733e1b17e01d47fe0a399e2fe635f614cc5a0b88 | [
"Apache-2.0"
] | null | null | null | tests/test_runner.py | varunvarma/panoptes | 733e1b17e01d47fe0a399e2fe635f614cc5a0b88 | [
"Apache-2.0"
] | null | null | null | """
Copyright 2018, Oath Inc.
Licensed under the terms of the Apache 2.0 license. See LICENSE file in project root for terms.
"""
import re
import unittest
from mock import patch, MagicMock, Mock, PropertyMock
from testfixtures import LogCapture
from yahoo_panoptes.framework.plugins.panoptes_base_plugin import PanoptesPluginInfo, PanoptesBasePlugin
from yahoo_panoptes.polling.polling_plugin import PanoptesPollingPlugin
from yahoo_panoptes.polling.polling_plugin_agent import polling_plugin_task, PanoptesPollingPluginKeyValueStore, \
PanoptesSecretsStore, PanoptesPollingPluginAgentKeyValueStore
from yahoo_panoptes.discovery.discovery_plugin_agent import PanoptesDiscoveryPluginAgentKeyValueStore, \
PanoptesDiscoveryPluginKeyValueStore, PanoptesSecretsStore, discovery_plugin_task
from yahoo_panoptes.framework.resources import PanoptesContext, PanoptesResource, PanoptesResourcesKeyValueStore
from yahoo_panoptes.framework.plugins.runner import PanoptesPluginRunner, PanoptesPluginWithEnrichmentRunner
from yahoo_panoptes.framework.metrics import PanoptesMetric, PanoptesMetricsGroupSet
from tests.mock_panoptes_producer import MockPanoptesMessageProducer
from test_framework import PanoptesTestKeyValueStore, panoptes_mock_kazoo_client, panoptes_mock_redis_strict_client
from helpers import get_test_conf_file
_TIMESTAMP = 1
def _callback(*args):
pass
def _callback_with_exception(*args):
raise Exception
class PanoptesTestPluginNoLock(PanoptesBasePlugin):
name = None
signature = None
data = {}
execute_now = True
plugin_object = None
def run(self, context):
pass
class PanoptesTestPluginRaisePluginReleaseException:
name = None
version = None
last_executed = None
last_executed_age = None
last_results = None
last_results_age = None
moduleMtime = None
configMtime = None
signature = None
data = {}
execute_now = True
lock = MagicMock(locked=True, release=MagicMock(side_effect=Exception))
def run(self, context):
raise Exception
class MockPluginExecuteNow:
execute_now = False
class MockPluginLockException:
name = None
signature = None
data = {}
execute_now = True
lock = MagicMock(side_effect=Exception)
class MockPluginLockNone:
name = None
signature = None
data = {}
execute_now = True
lock = None
class MockPluginLockIsNotLocked:
name = None
signature = None
data = {}
execute_now = True
lock = MagicMock(locked=False)
_, global_panoptes_test_conf_file = get_test_conf_file()
class TestPanoptesPluginRunner(unittest.TestCase):
@staticmethod
def extract(record):
message = record.getMessage()
match_obj = re.match(r'(?P<name>.*):\w+(?P<body>.*)', message)
if match_obj:
message = match_obj.group('name') + match_obj.group('body')
        match_obj = re.match(r'(?P<start>.*[Rr]an in\s)\d+\.?\d*.*(?P<end>seconds.*)', message)
if match_obj:
return record.name, record.levelname, match_obj.group('start') + match_obj.group('end')
match_obj = re.match(r'(?P<start>.*took\s*)\d+\.?\d*.*(?P<seconds>seconds\D*)\d+\s(?P<end>garbage objects.*)',
message)
if match_obj:
return record.name, record.levelname, match_obj.group('start') + match_obj.group('seconds') + \
match_obj.group('end')
match_obj = re.match(
r'(?P<start>Attempting to get lock for plugin .*with lock path) \".*\".*(?P<id> and identifier).*'
r'(?P<in> in) \d\.?\d*(?P<seconds> seconds)',
message)
if match_obj:
return record.name, record.levelname, match_obj.group('start') + match_obj.group('id') + \
match_obj.group('in') + match_obj.group('seconds')
match_obj = re.match(
r'(?P<delete>Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin_Second_Instance|'
r'Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin).*',
message
)
if match_obj:
return record.name, record.levelname, match_obj.group('delete')
return record.name, record.levelname, message
@patch('redis.StrictRedis', panoptes_mock_redis_strict_client)
@patch('kazoo.client.KazooClient', panoptes_mock_kazoo_client)
def setUp(self):
self.my_dir, self.panoptes_test_conf_file = get_test_conf_file()
self._panoptes_context = PanoptesContext(self.panoptes_test_conf_file,
key_value_store_class_list=[PanoptesTestKeyValueStore,
PanoptesResourcesKeyValueStore,
PanoptesPollingPluginKeyValueStore,
PanoptesSecretsStore,
PanoptesPollingPluginAgentKeyValueStore,
PanoptesDiscoveryPluginAgentKeyValueStore,
PanoptesDiscoveryPluginKeyValueStore],
create_message_producer=False, async_message_producer=False,
create_zookeeper_client=True)
self._runner_class = PanoptesPluginRunner
self._log_capture = LogCapture(attributes=self.extract)
def tearDown(self):
self._log_capture.uninstall()
def test_logging_methods(self):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
# Ensure logging methods run:
runner.info(PanoptesTestPluginNoLock(), "Test Info log message")
runner.warn(PanoptesTestPluginNoLock(), "Test Warning log message")
runner.error(PanoptesTestPluginNoLock(), "Test Error log message", Exception)
runner.exception(PanoptesTestPluginNoLock(), "Test Exception log message")
self._log_capture.check(('panoptes.tests.test_runner', 'INFO', '[None] [{}] Test Info log message'),
('panoptes.tests.test_runner', 'WARNING', '[None] [{}] Test Warning log message'),
('panoptes.tests.test_runner', 'ERROR',
"[None] [{}] Test Error log message: <type 'exceptions.Exception'>"),
('panoptes.tests.test_runner', 'ERROR', '[None] [{}] Test Exception log message:'))
def test_basic_operations(self):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'INFO',
'Attempting to execute plugin "Test Polling Plugin"'),
('panoptes.tests.test_runner', 'DEBUG',
'''Starting Plugin Manager for "polling" plugins with the following '''
'''configuration: {'polling': <class'''
""" 'yahoo_panoptes.polling.polling_plugin.PanoptesPollingPlugin'>}, """
"""['tests/plugins/polling'], panoptes-plugin"""),
('panoptes.tests.test_runner', 'DEBUG', 'Found 3 plugins'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin '
'"Test Polling Plugin", version "0.1" of type "polling"'
', category "polling"'),
('panoptes.tests.test_runner',
'DEBUG',
'Loaded plugin "Test Polling Plugin 2", '
'version "0.1" of type "polling", category "polling"'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin "Test Polling Plugin Second Instance", '
'version "0.1" of type "polling", category "polling"'),
('panoptes.tests.test_runner', 'INFO',
'''[Test Polling Plugin] [None] '''
'''Attempting to get lock for plugin "Test Polling Plugin"'''),
('panoptes.tests.test_runner', 'DEBUG',
'Attempting to get lock for plugin "Test Polling Plugin", with lock path and '
'identifier in seconds'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [None] Acquired lock'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [None]'
' Ran in seconds'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [None] Released lock'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [None] Plugin returned'
' a result set with 1 members'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [None]'
' Callback function ran in seconds'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [None] GC took seconds. There are garbage objects.'),
('panoptes.tests.test_runner',
'DEBUG',
'Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin'),
('panoptes.tests.test_runner',
'DEBUG',
'Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin'),
('panoptes.tests.test_runner',
'DEBUG',
'Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin_Second_Instance'),
order_matters=False
)
def test_nonexistent_plugin(self):
runner = self._runner_class("Non-existent Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'INFO',
'Attempting to execute plugin "Non-existent Plugin"'),
('panoptes.tests.test_runner', 'DEBUG',
'Starting Plugin Manager for "polling" plugins with the following '
"configuration: {'polling': <class 'yahoo_panoptes.polling.polling_plugin."
"PanoptesPollingPlugin'>}, "
"['tests/plugins/polling'], panoptes-plugin"),
('panoptes.tests.test_runner', 'DEBUG', 'Found 3 plugins'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin "Test Polling Plugin", version "0.1" of type "polling", '
'category "polling"'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin "Test Polling Plugin Second Instance", version "0.1" of type '
'"polling", category "polling"'),
('panoptes.tests.test_runner', 'WARNING',
'No plugin named "Non-existent Plugin" found in "'
'''['tests/plugins/polling']"'''),
order_matters=False)
def test_bad_plugin_type(self):
runner = self._runner_class("Test Polling Plugin", "bad", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'ERROR',
'''Error trying to load plugin "Test Polling Plugin": KeyError('bad',)'''))
def test_execute_now_false(self):
mock_get_plugin_by_name = MagicMock(return_value=MockPluginExecuteNow())
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginManager.getPluginByName',
mock_get_plugin_by_name):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'INFO',
'Attempting to execute plugin "Test Polling Plugin"'),
('panoptes.tests.test_runner', 'DEBUG',
'''Starting Plugin Manager for '''
'''"polling" plugins with the '''
'''following configuration: {'polling': '''
"""<class 'yahoo_panoptes.polling.polling_plugin.PanoptesPollingPlugin'"""
""">}, ['tests/plugins/polling'], panoptes-plugin"""),
('panoptes.tests.test_runner', 'DEBUG', 'Found 3 plugins'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin '
'"Test Polling Plugin", version "0.1" of type "polling"'
', category "polling"'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin "Test Polling Plugin Second Instance", '
'version "0.1" of type "polling", category "polling"'),
order_matters=False)
def test_callback_failure(self):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback_with_exception)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'ERROR',
'[Test Polling Plugin] '
'[None] Results callback function failed: :'))
def test_lock_no_lock_object(self):
mock_plugin = MagicMock(return_value=PanoptesTestPluginNoLock)
mock_get_context = MagicMock(return_value=self._panoptes_context)
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginManager.getPluginByName',
mock_plugin):
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginRunner._get_context', mock_get_context):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'ERROR',
'[None] [{}] Error in acquiring lock:'))
def test_lock_is_none(self):
mock_get_plugin_by_name = MagicMock(return_value=MockPluginLockNone())
mock_get_context = MagicMock(return_value=self._panoptes_context)
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginManager.getPluginByName',
mock_get_plugin_by_name):
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginRunner._get_context', mock_get_context):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin,
PanoptesPluginInfo, None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'INFO',
'[None] [{}] Attempting to get lock for plugin'
' "Test Polling Plugin"'))
def test_lock_is_not_locked(self):
mock_get_plugin_by_name = MagicMock(return_value=MockPluginLockIsNotLocked())
mock_get_context = MagicMock(return_value=self._panoptes_context)
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginManager.getPluginByName',
mock_get_plugin_by_name):
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginRunner._get_context', mock_get_context):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin,
PanoptesPluginInfo, None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'INFO',
'[None] [{}] Attempting to get lock for plugin'
' "Test Polling Plugin"'))
def test_plugin_failure(self):
mock_plugin = MagicMock(return_value=PanoptesTestPluginRaisePluginReleaseException)
mock_get_context = MagicMock(return_value=self._panoptes_context)
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginManager.getPluginByName',
mock_plugin):
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesPluginRunner._get_context', mock_get_context):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'ERROR',
'[None] [{}] Failed to execute plugin:'),
('panoptes.tests.test_runner', 'INFO',
'[None] [{}] Ran in seconds'),
('panoptes.tests.test_runner', 'ERROR',
'[None] [{}] Failed to release lock for plugin:'),
('panoptes.tests.test_runner', 'WARNING',
'[None] [{}] Plugin did not return any results'),
order_matters=False)
def test_plugin_wrong_result_type(self):
runner = self._runner_class("Test Polling Plugin 2", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'WARNING',
'[Test Polling Plugin 2] [None] Plugin returned an unexpected result type: '
'"PanoptesMetricsGroup"'))
class TestPanoptesPluginWithEnrichmentRunner(TestPanoptesPluginRunner):
@patch('redis.StrictRedis', panoptes_mock_redis_strict_client)
@patch('kazoo.client.KazooClient', panoptes_mock_kazoo_client)
def setUp(self):
super(TestPanoptesPluginWithEnrichmentRunner, self).setUp()
self._panoptes_resource = PanoptesResource(resource_site="test", resource_class="test",
resource_subclass="test", resource_type="test", resource_id="test",
resource_endpoint="test", resource_creation_timestamp=_TIMESTAMP,
resource_plugin="test")
self._runner_class = PanoptesPluginWithEnrichmentRunner
def test_basic_operations(self):
# Test where enrichment is None
mock_panoptes_enrichment_cache = Mock(return_value=None)
with patch('yahoo_panoptes.framework.plugins.runner.PanoptesEnrichmentCache', mock_panoptes_enrichment_cache):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
self._panoptes_resource, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'ERROR',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test] '
'Could not setup context for plugin:'),
order_matters=False
)
self._log_capture.uninstall()
self._log_capture = LogCapture(attributes=self.extract)
# Test with enrichment
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
self._panoptes_resource, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'INFO',
'Attempting to execute plugin "Test Polling Plugin"'),
('panoptes.tests.test_runner', 'DEBUG',
'''Starting Plugin Manager for "polling" plugins with the following '''
'''configuration: {'polling': <class'''
""" 'yahoo_panoptes.polling.polling_plugin.PanoptesPollingPlugin'>}, """
"""['tests/plugins/polling'], panoptes-plugin"""),
('panoptes.tests.test_runner', 'DEBUG', 'Found 3 plugins'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin '
'"Test Polling Plugin", version "0.1" of type "polling"'
', category "polling"'),
('panoptes.tests.test_runner',
'DEBUG',
'Loaded plugin "Test Polling Plugin 2", '
'version "0.1" of type "polling", category "polling"'),
('panoptes.tests.test_runner', 'DEBUG',
'Loaded plugin "Test Polling Plugin Second Instance", '
'version "0.1" of type "polling", category "polling"'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test] Attempting to get lock for plugin '
'"Test Polling Plugin"'),
('panoptes.tests.test_runner', 'DEBUG',
'Attempting to get lock for plugin "Test Polling Plugin", with lock path and '
'identifier in seconds'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test] Acquired lock'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test]'
' Ran in seconds'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test] Released lock'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test] Plugin returned'
' a result set with 1 members'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test]'
' Callback function ran in seconds'),
('panoptes.tests.test_runner',
'INFO',
'[Test Polling Plugin] [plugin|test|site|test|class|test|subclass|test|type|'
'test|id|test|endpoint|test] GC took seconds. There are garbage objects.'),
('panoptes.tests.test_runner',
'ERROR',
'No enrichment data found on KV store for plugin Test Polling Plugin '
'resource test namespace test using key test'),
('panoptes.tests.test_runner',
'DEBUG',
'Successfully created PanoptesEnrichmentCache enrichment_data {} for plugin '
'Test Polling Plugin'),
order_matters=False
)
def test_callback_failure(self):
runner = self._runner_class("Test Polling Plugin", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
self._panoptes_resource, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetricsGroupSet, _callback_with_exception)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner', 'ERROR',
'[Test Polling Plugin] '
'[plugin|test|site|test|class|test|subclass|test|'
'type|test|id|test|endpoint|test] Results callback function failed: :'))
# 'pass' is needed for these methods because the only difference in their logging output from
# TestPanoptesPluginRunner is the presence of the PanoptesResource in some log messages.
def test_lock_no_lock_object(self):
pass
def test_lock_is_none(self):
pass
def test_lock_is_not_locked(self):
pass
def test_plugin_failure(self):
pass
def test_plugin_wrong_result_type(self):
runner = self._runner_class("Test Polling Plugin 2", "polling", PanoptesPollingPlugin, PanoptesPluginInfo,
None, self._panoptes_context, PanoptesTestKeyValueStore,
PanoptesTestKeyValueStore, PanoptesTestKeyValueStore, "plugin_logger",
PanoptesMetric, _callback)
runner.execute_plugin()
self._log_capture.check_present(('panoptes.tests.test_runner',
'ERROR',
'[Test Polling Plugin 2] [None] Could not setup context for plugin:'))
class TestPanoptesPollingPluginRunner(TestPanoptesPluginWithEnrichmentRunner):
@patch('yahoo_panoptes.framework.metrics.time')
@patch('yahoo_panoptes.framework.context.PanoptesContext._get_message_producer')
@patch('yahoo_panoptes.framework.context.PanoptesContext.message_producer', new_callable=PropertyMock)
@patch('yahoo_panoptes.polling.polling_plugin_agent.PanoptesPollingTaskContext')
@patch('yahoo_panoptes.framework.resources.PanoptesResourceStore.get_resource')
def test_polling_plugin_agent(self, resource, panoptes_context, message_producer, message_producer_property, time):
producer = MockPanoptesMessageProducer()
time.return_value = 1
message_producer.return_value = producer
message_producer_property.return_value = producer
resource.return_value = self._panoptes_resource
panoptes_context.return_value = self._panoptes_context
polling_plugin_task('Test Polling Plugin', 'polling')
log_prefix = '[Test Polling Plugin] [plugin|test|site|test|class|test|' \
'subclass|test|type|test|id|test|endpoint|test]'
self._log_capture.check_present(
('panoptes.tests.test_runner', 'INFO', 'Attempting to execute plugin "Test Polling Plugin"'),
('panoptes.tests.test_runner', 'DEBUG',
'''Starting Plugin Manager for "polling" plugins with the following '''
'''configuration: {'polling': <class'''
""" 'yahoo_panoptes.polling.polling_plugin.PanoptesPollingPlugin'>}, """
"""['tests/plugins/polling'], panoptes-plugin"""),
('panoptes.tests.test_runner', 'DEBUG', 'Loaded plugin "Test Polling Plugin", '
'version "0.1" of type "polling", category "polling"'),
('panoptes.tests.test_runner', 'DEBUG', 'Loaded plugin "Test Polling Plugin 2", '
'version "0.1" of type "polling", category "polling"'),
('panoptes.tests.test_runner', 'ERROR', 'No enrichment data found on KV store for plugin Test'
' Polling Plugin resource test namespace test using key test'),
('panoptes.tests.test_runner', 'DEBUG', 'Successfully created PanoptesEnrichmentCache enrichment_data '
'{} for plugin Test Polling Plugin'),
('panoptes.tests.test_runner', 'DEBUG', 'Attempting to get lock for plugin "Test Polling Plugin", '
'with lock path and identifier in seconds'),
('panoptes.tests.test_runner', 'INFO', '{} Acquired lock'.format(log_prefix)),
('panoptes.tests.test_runner', 'INFO', '{} Plugin returned a result set with 1 members'.format(log_prefix)),
('panoptes.tests.test_runner', 'INFO', '{} Callback function ran in seconds'.format(log_prefix)),
('panoptes.tests.test_runner', 'INFO', '{} Ran in seconds'.format(log_prefix)),
('panoptes.tests.test_runner', 'INFO', '{} Released lock'.format(log_prefix)),
('panoptes.tests.test_runner', 'INFO', '{} GC took seconds. There are garbage objects.'.format(log_prefix)),
('panoptes.tests.test_runner', 'DEBUG', 'Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin'),
('panoptes.tests.test_runner', 'DEBUG', 'Deleting module: yapsy_loaded_plugin_Test_Polling_Plugin'),
('panoptes.tests.test_runner', 'DEBUG', 'Deleting module: '
'yapsy_loaded_plugin_Test_Polling_Plugin_Second_Instance'),
order_matters=False
)
kafka_push_log = '"{"metrics_group_interval": 60, "resource": {"resource_site": "test", ' \
'"resource_class": "test", "resource_subclass": "test", "resource_type": ' \
'"test", "resource_id": "test", "resource_endpoint": "test", "resource_metadata":' \
' {"_resource_ttl": "604800"}, "resource_creation_timestamp": 1.0, ' \
'"resource_plugin": "test"}, "dimensions": [], "metrics_group_type": "Test", ' \
'"metrics": [{"metric_creation_timestamp": 1.0, "metric_type": "gauge", ' \
'"metric_name": "test", "metric_value": 0.0}], "metrics_group_creation_timestamp": ' \
'1.0, "metrics_group_schema_version": "0.2"}" to topic "test-processed" ' \
'with key "test:test" and partitioning key "test|Test|"'
# Timestamps need to be removed to check Panoptes Metrics
metric_groups_seen = 0
for line in self._log_capture.actual():
_, _, log = line
            if log.find('metric group') != -1:
log = re.sub(r"resource_creation_timestamp\": \d+\.\d+,",
"resource_creation_timestamp\": 1.0,",
log)
if log.startswith('Sent metric group'):
metric_groups_seen += 1
self.assertEqual(log.strip(), "Sent metric group {}".format(kafka_push_log))
if log.startswith('Going to send metric group'):
metric_groups_seen += 1
self.assertEqual(log.strip(), "Going to send metric group {}".format(kafka_push_log))
self.assertEqual(metric_groups_seen, 2)
class TestPanoptesDiscoveryPluginRunner(TestPanoptesPluginRunner):
@patch('yahoo_panoptes.framework.context.PanoptesContext._get_message_producer')
@patch('yahoo_panoptes.framework.context.PanoptesContext.message_producer', new_callable=PropertyMock)
@patch('yahoo_panoptes.discovery.discovery_plugin_agent.PanoptesDiscoveryTaskContext')
def test_discovery_plugin_task(self, panoptes_context, message_producer_property, message_producer):
producer = MockPanoptesMessageProducer()
message_producer_property.return_value = message_producer.return_value = producer
panoptes_context.return_value = self._panoptes_context
discovery_plugin_task("Test Discovery Plugin")
plugin_result = producer.messages
self.assertEqual(len(plugin_result), 1)
plugin_result = plugin_result[0]
self.assertTrue('Test_Discovery_Plugin' in plugin_result['key'])
plugin_result['key'] = 'Test_Discovery_Plugin'
expected_result = {
'topic': 'test_site-resources',
'message': '{"resource_set_creation_timestamp": 1.0, '
'"resource_set_schema_version": "0.1", "resources": '
'[{"resource_site": "test_site", "resource_class": '
'"test_class", "resource_subclass": "test_subclass", '
'"resource_type": "test_type", "resource_id": '
'"test_resource_id", "resource_endpoint": '
'"test_resource_endpoint", "resource_metadata": '
'{"_resource_ttl": "604800"},'
' "resource_creation_timestamp": 1.0,'
' "resource_plugin": "test_resource_plugin"}]}',
'key': 'Test_Discovery_Plugin'}
plugin_result['message'] = re.sub(
r"resource_set_creation_timestamp\": \d+\.\d+,",
"resource_set_creation_timestamp\": 1.0,",
plugin_result['message'])
plugin_result['message'] = re.sub(
r"resource_creation_timestamp\": \d+\.\d+,",
"resource_creation_timestamp\": 1.0,",
plugin_result['message'])
self.assertEqual(plugin_result, expected_result)
| 60.574815 | 123 | 0.531183 | 3,296 | 40,888 | 6.359223 | 0.089806 | 0.050859 | 0.062452 | 0.084494 | 0.750716 | 0.720515 | 0.686737 | 0.659447 | 0.645181 | 0.6302 | 0 | 0.003465 | 0.37884 | 40,888 | 674 | 124 | 60.664688 | 0.821797 | 0.010688 | 0 | 0.584416 | 0 | 0.020408 | 0.298975 | 0.13612 | 0 | 0 | 0 | 0 | 0.011132 | 1 | 0.051948 | false | 0.011132 | 0.025974 | 0 | 0.166976 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ac07185d13ca3e632e2ca4e17fcc91869d099238 | 3,677 | py | Python | test/SMSGateway_test.py | S2Innovation/ds-s2i-smsgateway | eed5ce3d630c26b0fd73117d79c84606a12bc783 | [
"MIT"
] | null | null | null | test/SMSGateway_test.py | S2Innovation/ds-s2i-smsgateway | eed5ce3d630c26b0fd73117d79c84606a12bc783 | [
"MIT"
] | null | null | null | test/SMSGateway_test.py | S2Innovation/ds-s2i-smsgateway | eed5ce3d630c26b0fd73117d79c84606a12bc783 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# This file is part of the SMSGateway project
#
#
#
# Distributed under the terms of the MIT license.
# See LICENSE.txt for more info.
"""Contain the tests for the SMSGateway for PANIC."""
# Path
import sys
import os
path = os.path.join(os.path.dirname(__file__), os.pardir)
sys.path.insert(0, os.path.abspath(path))
# Imports
from time import sleep
from mock import MagicMock
from PyTango import DevFailed, DevState
from devicetest import DeviceTestCase, main
from SMSGateway import SMSGateway
# Note:
#
# Since the device uses an inner thread, it is necessary to
# wait during the tests in order the let the device update itself.
# Hence, the sleep calls have to be secured enough not to produce
# any inconsistent behavior. However, the unittests need to run fast.
# Here, we use a factor 3 between the read period and the sleep calls.
#
# Look at devicetest examples for more advanced testing
# Device test case
class SMSGatewayDeviceTestCase(DeviceTestCase):
"""Test case for packet generation."""
# PROTECTED REGION ID(SMSGateway.test_additionnal_import) ENABLED START #
# PROTECTED REGION END # // SMSGateway.test_additionnal_import
device = SMSGateway
properties = {'IP': '', 'PIN': '9044',
}
empty = None # Should be []
@classmethod
def mocking(cls):
"""Mock external libraries."""
# Example : Mock numpy
# cls.numpy = SMSGateway.numpy = MagicMock()
# PROTECTED REGION ID(SMSGateway.test_mocking) ENABLED START #
# PROTECTED REGION END # // SMSGateway.test_mocking
def test_properties(self):
# test the properties
# PROTECTED REGION ID(SMSGateway.test_properties) ENABLED START #
# PROTECTED REGION END # // SMSGateway.test_properties
pass
def test_State(self):
"""Test for State"""
# PROTECTED REGION ID(SMSGateway.test_State) ENABLED START #
self.device.State()
# PROTECTED REGION END # // SMSGateway.test_State
def test_Status(self):
"""Test for Status"""
# PROTECTED REGION ID(SMSGateway.test_Status) ENABLED START #
self.device.Status()
# PROTECTED REGION END # // SMSGateway.test_Status
def test_Reset(self):
"""Test for Reset"""
# PROTECTED REGION ID(SMSGateway.test_Reset) ENABLED START #
self.device.Reset()
# PROTECTED REGION END # // SMSGateway.test_Reset
def test_Connect(self):
"""Test for Connect"""
# PROTECTED REGION ID(SMSGateway.test_Connect) ENABLED START #
self.device.Connect()
# PROTECTED REGION END # // SMSGateway.test_Connect
def test_SendSMS(self):
"""Test for SendSMS"""
# PROTECTED REGION ID(SMSGateway.test_SendSMS) ENABLED START #
self.device.SendSMS()
# PROTECTED REGION END # // SMSGateway.test_SendSMS
def test_SetPin(self):
"""Test for SetPin"""
# PROTECTED REGION ID(SMSGateway.test_SetPin) ENABLED START #
self.device.SetPin()
# PROTECTED REGION END # // SMSGateway.test_SetPin
def test_TextMessage(self):
"""Test for TextMessage"""
# PROTECTED REGION ID(SMSGateway.test_TextMessage) ENABLED START #
self.device.TextMessage
# PROTECTED REGION END # // SMSGateway.test_TextMessage
def test_Phone(self):
"""Test for Phone"""
# PROTECTED REGION ID(SMSGateway.test_Phone) ENABLED START #
self.device.Phone
# PROTECTED REGION END # // SMSGateway.test_Phone
# Main execution
if __name__ == "__main__":
main()
| 32.830357 | 77 | 0.661409 | 437 | 3,677 | 5.462243 | 0.318078 | 0.138249 | 0.078341 | 0.124424 | 0.305404 | 0.0553 | 0.0553 | 0 | 0 | 0 | 0 | 0.00252 | 0.244493 | 3,677 | 111 | 78 | 33.126126 | 0.856731 | 0.596138 | 0 | 0 | 0 | 0 | 0.0125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.277778 | false | 0.027778 | 0.194444 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ac07f9a51ba5bae2e9b9b9afd0ca35481fa33be3 | 214 | py | Python | Flask/Lezione4/webapp/project/serate/templates/serate/forms.py | nick87ds/MaterialeSerate | 51627e47ff1d3c3ecfc9ce6741c04b91b3295359 | [
"MIT"
] | 12 | 2021-12-12T22:19:52.000Z | 2022-03-18T11:45:17.000Z | Flask/Lezione4/webapp/project/serate/templates/serate/forms.py | nick87ds/MaterialeSerate | 51627e47ff1d3c3ecfc9ce6741c04b91b3295359 | [
"MIT"
] | 1 | 2022-03-23T13:58:33.000Z | 2022-03-23T14:05:08.000Z | Flask/Lezione4/webapp/project/serate/templates/serate/forms.py | nick87ds/MaterialeSerate | 51627e47ff1d3c3ecfc9ce6741c04b91b3295359 | [
"MIT"
] | 7 | 2021-02-01T22:09:14.000Z | 2021-06-22T08:30:16.000Z | from time import strftime
from flask_wtf import FlaskForm
from wtforms import (
    Form,
    validators,
    StringField,
    IntegerField,
    SubmitField,
    BooleanField,
    SelectField,
    TextAreaField,
)
| 16.461538 | 31 | 0.705607 | 20 | 214 | 7.5 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.247664 | 214 | 12 | 32 | 17.833333 | 0.931677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ac0e680fa5ad08e1900fc7ebe2eb246aebdc7e1d | 148 | py | Python | automation/openwebsites.py | abrahammachuki/dnav3-code | d278bf4facbc0702342f9c86a3845f0fb1c247bf | [
"MIT"
] | null | null | null | automation/openwebsites.py | abrahammachuki/dnav3-code | d278bf4facbc0702342f9c86a3845f0fb1c247bf | [
"MIT"
] | null | null | null | automation/openwebsites.py | abrahammachuki/dnav3-code | d278bf4facbc0702342f9c86a3845f0fb1c247bf | [
"MIT"
] | null | null | null | import webbrowser
website = ['site1', 'site2', 'site3', 'site4']
for i in range(len(website)):
    site = 'http://' + website[i]
    webbrowser.open(site) | 29.6 | 46 | 0.662162 | 20 | 148 | 4.9 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031008 | 0.128378 | 148 | 5 | 47 | 29.6 | 0.728682 | 0 | 0 | 0 | 0 | 0 | 0.181208 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ac105d162c447186bd1f92785b821628a3aa1ff5 | 1,865 | py | Python | hedger/tournament.py | dmalison/hedger | 8db634a484769fb4f3feb945c1847ef50803fafe | [
"MIT"
] | null | null | null | hedger/tournament.py | dmalison/hedger | 8db634a484769fb4f3feb945c1847ef50803fafe | [
"MIT"
] | null | null | null | hedger/tournament.py | dmalison/hedger | 8db634a484769fb4f3feb945c1847ef50803fafe | [
"MIT"
] | null | null | null | import hedger
from hedger import Result
class Tournament:
    def __init__(self, entries):
        self._entries = entries
        self._brackets = self._get_brackets()
        self._brackets_info = self._get_brackets_info()

    @property
    def entries(self):
        return self._entries

    @property
    def brackets(self):
        return self._brackets

    @property
    def brackets_info(self):
        return self._brackets_info

    def _get_brackets(self):
        n_brackets = self._get_n_brackets()
        brackets = list()
        for code in range(n_brackets):
            results = self._get_results_from_code(code)
            bracket = self._make_bracket(results)
            brackets.append(bracket)
        return brackets

    def _get_brackets_info(self):
        return {
            bracket.code: (bracket.prob, bracket.winner_names)
            for bracket in self._brackets
        }

    def _get_n_brackets(self):
        n_matches = self._get_n_matches()
        return 2 ** n_matches

    def _get_n_matches(self):
        n_entries = len(self._entries)
        return n_entries - 1

    def _get_results_from_code(self, bracket_index):
        binary = self._convert_to_binary(bracket_index)
        results = [self._decode_bit_as_result(b) for b in binary]
        return results

    def _convert_to_binary(self, bracket_index):
        n_digits = self._get_n_matches()
        binary_fmt = "{" + "0:0{}b".format(n_digits) + "}"
        return binary_fmt.format(bracket_index)

    def _decode_bit_as_result(self, bit):
        if int(bit) == Result.TOP_WINS.value:
            return Result.TOP_WINS
        else:
            return Result.BOTTOM_WINS

    def _make_bracket(self, results):
        bracket_builder = hedger.BracketBuilder(self, results)
        bracket = bracket_builder.get_bracket()
        return bracket
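An illustrative sketch (not part of the module) of the encoding behind `_convert_to_binary`: a tournament with n entries has n - 1 matches, so every bracket corresponds to one code in `range(2 ** (n - 1))`, and the zero-padded binary digits of that code give one result per match.

```python
# A 4-entry single-elimination bracket has 3 matches, hence 2**3 = 8 brackets.
n_matches = 3

# Zero-padded binary strings: one digit per match, mirroring _convert_to_binary.
codes = ["{0:0{1}b}".format(code, n_matches) for code in range(2 ** n_matches)]
print(codes[5])    # "101"
print(len(codes))  # 8 distinct brackets
```

Each digit then decodes to one of the two `Result` values, exactly as `_decode_bit_as_result` does above.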
| 27.835821 | 65 | 0.641287 | 228 | 1,865 | 4.864035 | 0.232456 | 0.064923 | 0.043282 | 0.039675 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002961 | 0.275603 | 1,865 | 66 | 66 | 28.257576 | 0.817913 | 0 | 0 | 0.057692 | 0 | 0 | 0.00429 | 0 | 0.019231 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.038462 | 0.076923 | 0.519231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ac14c5baab8284824cd35d4e64729e5b1523569f | 582 | py | Python | elif_bayindir/phase_1/python_basic_1/day_4/q8.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 6 | 2020-05-23T19:53:25.000Z | 2021-05-08T20:21:30.000Z | elif_bayindir/phase_1/python_basic_1/day_4/q8.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 8 | 2020-05-14T18:53:12.000Z | 2020-07-03T00:06:20.000Z | elif_bayindir/phase_1/python_basic_1/day_4/q8.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 39 | 2020-05-10T20:55:02.000Z | 2020-09-12T17:40:59.000Z | # Question 8
# Print even numbers in a list, stop printing when the number is 237
numbers = [
386, 462, 47, 418, 907, 344, 236, 375, 823, 566, 597, 978, 328, 615, 953, 345,
399, 162, 758, 219, 918, 237, 412, 566, 826, 248, 866, 950, 626, 949, 687, 217,
815, 67, 104, 58, 512, 24, 892, 894, 767, 553, 81, 379, 843, 831, 445, 742, 717,
958,743, 527
]
for i in range(len(numbers)):
if numbers[i] % 2 == 0:
print(numbers[i])
elif numbers[i] == 237:
break
# Alternative,
""" for x in numbers:
if x % 2 == 0:
print(x)
elif x == 237:
break """
| 22.384615 | 85 | 0.573883 | 102 | 582 | 3.27451 | 0.745098 | 0.071856 | 0.041916 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.384615 | 0.262887 | 582 | 25 | 86 | 23.28 | 0.393939 | 0.154639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ac36030e8e89e493e372409b81e3e6f1ab9b3e03 | 1,173 | py | Python | QUANTAXIS/example/DataFetcher.py | cyy1229/QUANTAXIS | 320eff53dfa2cde8032a5e066499f4da0b5064a2 | [
"MIT"
] | null | null | null | QUANTAXIS/example/DataFetcher.py | cyy1229/QUANTAXIS | 320eff53dfa2cde8032a5e066499f4da0b5064a2 | [
"MIT"
] | null | null | null | QUANTAXIS/example/DataFetcher.py | cyy1229/QUANTAXIS | 320eff53dfa2cde8032a5e066499f4da0b5064a2 | [
"MIT"
] | null | null | null | from QUANTAXIS import QA_fetch_stock_day_adv, QA_fetch_stock_list_adv, QA_fetch_stock_day_full_adv, QA_Setting
import pandas as pd
QASETTING = QA_Setting()
DATABASE = QASETTING.client.quantaxis
# def getAllTradeCal():
# return pd.DataFrame(DATABASE.trade_date.find({"is_open": 1}))
class MongoDataLoader:
    def __init__(self):
        pass

    def load_stock_day(self, code, start='all', end=None):
        QA_fetch_stock_day_adv(code, start, end)

    def load_stock_list(self):
        return QA_fetch_stock_list_adv()

    def load_trade_cal(self):
        return pd.DataFrame(DATABASE.trade_date.find({"is_open": 1}))

    def load_stock_day_full(self, date):
        return QA_fetch_stock_day_full_adv(date)

    def load_tushare_stock_day(self, end, start='20150101'):
        """Load tushare daily bar data for the given date range."""
        return pd.DataFrame(DATABASE.tushare_stock_day.find({"trade_date": {
            "$lte": end,
            "$gte": start
        }}))


if __name__ == '__main__':
    print(MongoDataLoader().load_tushare_stock_day(end='20210630'))
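An illustrative sketch (not part of the module): the `{"$lte": end, "$gte": start}` filter selects documents whose `trade_date` falls in the closed range `[start, end]`. On `YYYYMMDD` strings that is plain lexicographic comparison, which this pure-Python equivalent demonstrates without a MongoDB instance.

```python
# Stand-in documents in place of the tushare_stock_day collection.
rows = [{"trade_date": d} for d in ("20141231", "20150102", "20210630", "20210701")]
start, end = "20150101", "20210630"

# Equivalent of find({"trade_date": {"$gte": start, "$lte": end}}).
selected = [r["trade_date"] for r in rows if start <= r["trade_date"] <= end]
print(selected)  # ['20150102', '20210630']
```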
| 27.27907 | 110 | 0.627451 | 144 | 1,173 | 4.680556 | 0.340278 | 0.106825 | 0.106825 | 0.089021 | 0.308605 | 0.198813 | 0.133531 | 0.133531 | 0.133531 | 0.133531 | 0 | 0.020979 | 0.268542 | 1,173 | 42 | 111 | 27.928571 | 0.764569 | 0.074169 | 0 | 0 | 0 | 0 | 0.049149 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0.038462 | 0.076923 | 0.153846 | 0.5 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
ac4fb7ef759fec615c1233d88bd6d5b5c8a82c1d | 144 | py | Python | backend/puzzle/apps.py | mductran/puzzle | c4598f5420dff126fa67db1e0adee1677a8baf8f | [
"Apache-2.0"
] | null | null | null | backend/puzzle/apps.py | mductran/puzzle | c4598f5420dff126fa67db1e0adee1677a8baf8f | [
"Apache-2.0"
] | null | null | null | backend/puzzle/apps.py | mductran/puzzle | c4598f5420dff126fa67db1e0adee1677a8baf8f | [
"Apache-2.0"
] | null | null | null | from django.apps import AppConfig
class PuzzleConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'puzzle'
| 20.571429 | 56 | 0.756944 | 17 | 144 | 6.294118 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152778 | 144 | 6 | 57 | 24 | 0.877049 | 0 | 0 | 0 | 0 | 0 | 0.243056 | 0.201389 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ac5d5c3626cc5c773bf91a5f517bfdbe0b549607 | 687 | py | Python | tests/api/v2/test_datasources.py | droessmj/python-sdk | 42ea2366d08ef5e4d1fa45029480b800352ab765 | [
"MIT"
] | 2 | 2020-09-08T20:42:05.000Z | 2020-09-09T14:27:55.000Z | tests/api/v2/test_datasources.py | droessmj/python-sdk | 42ea2366d08ef5e4d1fa45029480b800352ab765 | [
"MIT"
] | null | null | null | tests/api/v2/test_datasources.py | droessmj/python-sdk | 42ea2366d08ef5e4d1fa45029480b800352ab765 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Test suite for the community-developed Python SDK for interacting with Lacework APIs.
"""
import pytest
from laceworksdk.api.v2.datasources import DatasourcesAPI
from tests.api.test_base_endpoint import BaseEndpoint
# Tests
@pytest.fixture(scope="module")
def api_object(api):
    return api.datasources


class TestDatasources(BaseEndpoint):

    OBJECT_ID_NAME = "name"
    OBJECT_TYPE = DatasourcesAPI

    def test_api_get(self, api_object):
        response = api_object.get()
        assert "data" in response.keys()

    def test_api_get_by_type(self, api_object):
        self._get_object_classifier_test(api_object, "type", self.OBJECT_ID_NAME)
| 22.9 | 85 | 0.737991 | 92 | 687 | 5.271739 | 0.51087 | 0.092784 | 0.049485 | 0.053608 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00349 | 0.165939 | 687 | 29 | 86 | 23.689655 | 0.842932 | 0.165939 | 0 | 0 | 0 | 0 | 0.031915 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 1 | 0.214286 | false | 0 | 0.214286 | 0.071429 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ac75e12c17c4b689ad5e95e21f9b92a7a82c808e | 2,108 | py | Python | my_loc/__init__.py | PIYUSH-GEEK/my_loc | 777eeefec3bc29f03c1be956037c10bf8457dfc9 | [
"MIT"
] | 1 | 2019-08-18T07:06:36.000Z | 2019-08-18T07:06:36.000Z | my_loc/__init__.py | PIYUSH-GEEK/my_loc | 777eeefec3bc29f03c1be956037c10bf8457dfc9 | [
"MIT"
] | null | null | null | my_loc/__init__.py | PIYUSH-GEEK/my_loc | 777eeefec3bc29f03c1be956037c10bf8457dfc9 | [
"MIT"
] | null | null | null | import requests
from bs4 import BeautifulSoup
def locate(i):
    my_loc = requests.get('https://ipinfo.io').json()
    if str(i).isdigit():
        return 'Invalid input. Try a string.'
    else:
        if i == 'ip':
            return my_loc['ip']
        elif i == 'city' or i == 'capital':
            return my_loc['city']
        elif i == 'region' or i == 'state':
            return my_loc['region']
        elif i == 'country':
            return my_loc['country']
        elif i == 'loc':
            return my_loc['loc']
        elif i == 'org':
            return my_loc['org']
        elif i == 'postal' or i == 'pin':
            return my_loc['postal']
        elif i == 'readme':
            return my_loc['readme']


def lat(input):
    if input == 'Gomia' or input == 'gomia':
        return 'Try \'gumia\' or \'Gumia\''
    try:
        page_get = requests.get('https://www.latlong.net/search.php?keyword=%s' % input)
        page_content = BeautifulSoup(page_get.content, 'html.parser')
        tr1 = page_content.find_all('tr')[1]
        td1 = tr1.find_all('td')[0].text
        td2 = tr1.find_all('td')[1].text
        print('Place:', td1)
        return td2
    except IndexError:
        return 'Search for some other place.'


def lng(input):
    if input == 'Gomia' or input == 'gomia':
        return 'Try \'gumia\' or \'Gumia\''
    try:
        page_get = requests.get('https://www.latlong.net/search.php?keyword=%s' % input)
        page_content = BeautifulSoup(page_get.content, 'html.parser')
        tr1 = page_content.find_all('tr')[1]
        td1 = tr1.find_all('td')[0].text
        td3 = tr1.find_all('td')[2].text
        print('Place:', td1)
        return td3
    except IndexError:
        return 'Search for some other place.'


def ltlng(input):
    if input == 'Gomia' or input == 'gomia':
        return 'Try \'gumia\' or \'Gumia\''
    try:
        page_get = requests.get('https://www.latlong.net/search.php?keyword=%s' % input)
        page_content = BeautifulSoup(page_get.content, 'html.parser')
        tr1 = page_content.find_all('tr')[1]
        td1 = tr1.find_all('td')[0].text
        td2 = tr1.find_all('td')[1].text
        td3 = tr1.find_all('td')[2].text
        print('Place:', td1)
        return td2, td3
    except IndexError:
        return 'Search for some other place.'
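An illustrative note (not part of the module): the raw `'%s'` interpolation used in the search URL does not URL-encode the keyword, so a multi-word place name produces a malformed query string. The standard library can escape it first.

```python
from urllib.parse import quote_plus

# Spaces become '+', so the query string stays valid.
keyword = "new york"
url = 'https://www.latlong.net/search.php?keyword=%s' % quote_plus(keyword)
print(url)  # ...keyword=new+york
```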
| 22.913043 | 80 | 0.624763 | 317 | 2,108 | 4.056782 | 0.233438 | 0.054432 | 0.068429 | 0.065319 | 0.715397 | 0.715397 | 0.695956 | 0.695956 | 0.695956 | 0.695956 | 0 | 0.02071 | 0.198292 | 2,108 | 91 | 81 | 23.164835 | 0.740237 | 0 | 0 | 0.539683 | 0 | 0 | 0.231025 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.063492 | false | 0 | 0.031746 | 0 | 0.380952 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ac8080a2e2bb7b553d1d31e52508d8e6de00b522 | 595 | py | Python | kwikposts/admin.py | Vicynet/kwiktalk | 198efdd5965cc0cd3ee8dcf5e469d9022330ec25 | [
"bzip2-1.0.6"
] | null | null | null | kwikposts/admin.py | Vicynet/kwiktalk | 198efdd5965cc0cd3ee8dcf5e469d9022330ec25 | [
"bzip2-1.0.6"
] | null | null | null | kwikposts/admin.py | Vicynet/kwiktalk | 198efdd5965cc0cd3ee8dcf5e469d9022330ec25 | [
"bzip2-1.0.6"
] | null | null | null | from django.contrib import admin
from .models import KwikPost, Comment, Like
# Register your models here.
@admin.register(KwikPost)
class KwikPostAdmin(admin.ModelAdmin):
    list_display = ['user', 'featured_image', 'slug', 'post_body', 'created_at']
    list_filter = ['created_at']
    prepopulated_fields = {'slug': ('post_body',)}


@admin.register(Comment)
class CommentAdmin(admin.ModelAdmin):
    list_display = ['user', 'post', 'user_comment', 'created_at']


@admin.register(Like)
class LikeAdmin(admin.ModelAdmin):
    list_display = ['user', 'post', 'values', 'created_at']
| 25.869565 | 80 | 0.710924 | 71 | 595 | 5.774648 | 0.450704 | 0.087805 | 0.139024 | 0.190244 | 0.239024 | 0.165854 | 0 | 0 | 0 | 0 | 0 | 0.003868 | 0.131092 | 595 | 22 | 81 | 27.045455 | 0.789168 | 0.043697 | 0 | 0 | 0 | 0 | 0.208481 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3bbd5cc24a379d3da78746ccf10468524d2749f7 | 2,421 | py | Python | _int_tools.py | CaptainSora/Python-Project-Euler | 056400f434eec837ece5ef06653b310ebfcc3d4e | [
"MIT"
] | null | null | null | _int_tools.py | CaptainSora/Python-Project-Euler | 056400f434eec837ece5ef06653b310ebfcc3d4e | [
"MIT"
] | null | null | null | _int_tools.py | CaptainSora/Python-Project-Euler | 056400f434eec837ece5ef06653b310ebfcc3d4e | [
"MIT"
] | null | null | null | """
This module contains functions related to integer formatting and math.
"""
from functools import reduce
from itertools import count
from math import gcd, prod
# ================ ARRAY FORMATTING FUNCTIONS ================
def str_array_to_int(intarray):
    return int(''.join(intarray))


def int_array_to_int(intarray):
    return str_array_to_int(map(str, intarray))


def int_to_int_array(num):
    """
    Deprecated, use int_to_digit_array(num)
    """
    return [int(str(num)[a]) for a in range(len(str(num)))]


def int_to_str_array(num):
    return [str(num)[a] for a in range(len(str(num)))]


def int_to_digit_array(num):
    return [int(str(num)[a]) for a in range(len(str(num)))]


# ================ CALCULATION FUNCTIONS ================


def product(numlist):
    """
    Deprecated since Python 3.8, use math.prod instead
    Also remove functools.reduce
    """
    return reduce(lambda x, y: x * y, numlist, 1)


def factorial(num):
    return prod(list(range(1, num + 1)))


def nCr(n, r):
    # Integer division keeps the result exact for large n (the quotient is
    # always an integer); float division could lose precision.
    return prod(range(n - r + 1, n + 1)) // prod(range(1, r + 1))


def phi(n):
    """
    Returns the value of ϕ(n), or the Euler Totient function.
    """
    return len([x for x in range(1, n) if gcd(n, x) == 1])


# ================ COUNTING FUNCTIONS ================


def counting_summations(values, target):
    """
    Returns the number of ways to write target as the sum of numbers in values.
    """
    csums = [[0 for _ in values]]
    while len(csums) <= target:
        tempsum = [0 for _ in values]
        for a in range(len(values)):
            if values[a] > len(csums):
                break
            elif values[a] == len(csums):
                tempsum[a] = 1
            else:
                tempsum[a] += sum(csums[len(csums) - values[a]][:a+1])
        csums.append(tempsum)
    return sum(csums[target])


def partition():
    """
    Calculates the partition function using Euler's method.
    Much faster than the above function.
    """
    yield 1
    p = [1]
    for i in count(1):
        new_p = 0
        for j in count(1):
            # move i
            if j % 2 == 0:
                i -= j // 2
            else:
                i -= j
            if i < 0:
                break
            # add to new_p
            if (j - 1) % 4 < 2:
                new_p += p[i]
            else:
                new_p -= p[i]
        p.append(new_p)
        yield new_p
| 23.278846 | 79 | 0.534077 | 341 | 2,421 | 3.703812 | 0.296188 | 0.028504 | 0.019002 | 0.034838 | 0.164687 | 0.115598 | 0.115598 | 0.115598 | 0.115598 | 0.115598 | 0 | 0.016091 | 0.306898 | 2,421 | 103 | 80 | 23.504854 | 0.736591 | 0.251136 | 0 | 0.134615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.211538 | false | 0 | 0.057692 | 0.115385 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
3bd3e56f8e3f7640af1c0c1de7776e8679289263 | 103 | py | Python | Exercise-1/Q4_reverse.py | abhay-lal/18CSC207J-APP | 79a955a99837e6d41c89cb1a9e84eb0230c0fa7b | [
"MIT"
] | null | null | null | Exercise-1/Q4_reverse.py | abhay-lal/18CSC207J-APP | 79a955a99837e6d41c89cb1a9e84eb0230c0fa7b | [
"MIT"
] | null | null | null | Exercise-1/Q4_reverse.py | abhay-lal/18CSC207J-APP | 79a955a99837e6d41c89cb1a9e84eb0230c0fa7b | [
"MIT"
] | null | null | null | word = input('Enter a word')
length = len(word)
for i in range(length - 1, -1, -1):
    print(word[i], end='')
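An equivalent one-liner (not part of the original exercise) uses Python's negative-step slicing, which walks the string from the last character to the first.

```python
word = "palindrome"
print(word[::-1])  # emordnilap
```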
| 20.6 | 30 | 0.572816 | 20 | 103 | 2.95 | 0.6 | 0.067797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036145 | 0.194175 | 103 | 4 | 31 | 25.75 | 0.674699 | 0 | 0 | 0 | 0 | 0 | 0.116505 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3be4dea35fbe812684c863cfb56967cde0971e92 | 1,679 | py | Python | buildroot/support/testing/tests/init/test_busybox.py | rbrenton/hassos | fa6f7ac74ddba50e76f5779c613c56d937684844 | [
"Apache-2.0"
] | 617 | 2015-01-04T14:33:56.000Z | 2022-03-24T22:42:25.000Z | buildroot/support/testing/tests/init/test_busybox.py | rbrenton/hassos | fa6f7ac74ddba50e76f5779c613c56d937684844 | [
"Apache-2.0"
] | 631 | 2015-01-01T22:53:25.000Z | 2022-03-17T18:41:00.000Z | buildroot/support/testing/tests/init/test_busybox.py | rbrenton/hassos | fa6f7ac74ddba50e76f5779c613c56d937684844 | [
"Apache-2.0"
] | 133 | 2015-03-03T18:40:05.000Z | 2022-03-18T13:34:26.000Z | import infra.basetest
from tests.init.base import InitSystemBase as InitSystemBase
class InitSystemBusyboxBase(InitSystemBase):
    config = infra.basetest.BASIC_TOOLCHAIN_CONFIG + \
        """
        # BR2_TARGET_ROOTFS_TAR is not set
        """

    def check_init(self):
        super(InitSystemBusyboxBase, self).check_init("/bin/busybox")


class TestInitSystemBusyboxRo(InitSystemBusyboxBase):
    config = InitSystemBusyboxBase.config + \
        """
        # BR2_TARGET_GENERIC_REMOUNT_ROOTFS_RW is not set
        BR2_TARGET_ROOTFS_SQUASHFS=y
        """

    def test_run(self):
        self.start_emulator("squashfs")
        self.check_init()
        self.check_network("eth0", 1)


class TestInitSystemBusyboxRw(InitSystemBusyboxBase):
    config = InitSystemBusyboxBase.config + \
        """
        BR2_TARGET_ROOTFS_EXT2=y
        """

    def test_run(self):
        self.start_emulator("ext2")
        self.check_init()
        self.check_network("eth0", 1)


class TestInitSystemBusyboxRoNet(InitSystemBusyboxBase):
    config = InitSystemBusyboxBase.config + \
        """
        BR2_SYSTEM_DHCP="eth0"
        # BR2_TARGET_GENERIC_REMOUNT_ROOTFS_RW is not set
        BR2_TARGET_ROOTFS_SQUASHFS=y
        """

    def test_run(self):
        self.start_emulator("squashfs")
        self.check_init()
        self.check_network("eth0")


class TestInitSystemBusyboxRwNet(InitSystemBusyboxBase):
    config = InitSystemBusyboxBase.config + \
        """
        BR2_SYSTEM_DHCP="eth0"
        BR2_TARGET_ROOTFS_EXT2=y
        """

    def test_run(self):
        self.start_emulator("ext2")
        self.check_init()
        self.check_network("eth0")
| 25.830769 | 69 | 0.659321 | 170 | 1,679 | 6.223529 | 0.270588 | 0.07656 | 0.070888 | 0.204159 | 0.669187 | 0.669187 | 0.567108 | 0.567108 | 0.567108 | 0.555766 | 0 | 0.016562 | 0.244789 | 1,679 | 64 | 70 | 26.234375 | 0.817823 | 0 | 0 | 0.666667 | 0 | 0 | 0.041834 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.066667 | 0 | 0.566667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3bed882365f0c947238e86347d95e522a56968a9 | 2,380 | py | Python | deprecated.py | tungr/CoeusBot | 90bdc869a1f8c077a1f88dcf1335d20a19d49fee | [
"MIT"
] | null | null | null | deprecated.py | tungr/CoeusBot | 90bdc869a1f8c077a1f88dcf1335d20a19d49fee | [
"MIT"
] | null | null | null | deprecated.py | tungr/CoeusBot | 90bdc869a1f8c077a1f88dcf1335d20a19d49fee | [
"MIT"
] | null | null | null | #### Transfer data from JSON file to MongoDB ####
# @client.command()
# async def qupload(self, ctx):
# mclient = MongoClient(host="localhost", port=27017)
# db = mclient.coeusbot
# quotesdb = db.quotes
# with open('quotes.json', 'r') as f:
# quotes = json.load(f)
# for quotenum in range(1, len(quotes)):
# datetime = quotes[str(quotenum)]['date_time']
# author = quotes[str(quotenum)]['author']
# quote = quotes[str(quotenum)]['quote']
# guild = ctx.guild.id
# qamount = quotesdb.find({"guild": ctx.guild.id}) # Grab all quotes of same guild id
# qid = 1
# # Increment qid based on # of quotes in guild
# for qnum in qamount:
# qid += 1
# mquote = {
# "datetime": datetime,
# "author": author,
# "quote": quote,
# "guild": guild,
# "qid": qid
# }
# result = quotesdb.insert_one(mquote)
# mclient.close()
# await ctx.reply(f'Quotes transferred')
#### Add quote to JSON file ####
# @client.command(aliases=['qua'])
# async def quoteadd(self, ctx, *quote):
# with open('quotes.json', 'r') as f:
# quotes = json.load(f)
# if str(len(quotes)+1) not in quotes:
# now = dt.datetime.now()
# date_time = now.strftime("%m/%d/%Y, %I:%M%p")
# q_amount = len(quotes) + 1
# quotes[str(q_amount)] = {}
# quotes[str(q_amount)]['quote'] = quote
# quotes[str(q_amount)]['date_time'] = date_time
# quotes[str(q_amount)]['author'] = str(ctx.author)
# with open('quotes.json', 'w') as f:
# json.dump(quotes, f)
# await ctx.reply(f'Quote added')
#### Grab quote from JSON file ####
# @client.command()
# async def quotes(self, ctx):
# with open('quotes.json', 'r') as f:
# quotes = json.load(f)
# randquote = random.randint(1,len(quotes))
# quote = quotes[str(randquote)]['quote']
# date_time = quotes[str(randquote)]['date_time']
# author = quotes[str(randquote)]['author']
# quote_embed = discord.Embed(title=f'💬 Quote #{randquote}', color=0x03fcce)
# newquote = ' '.join(quote)
# quote_embed.add_field(name='\u200b', value=f'{newquote}', inline=False)
# quote_embed.set_footer(text=f'{date_time}')
# await ctx.send(embed=quote_embed) | 32.162162 | 93 | 0.561765 | 299 | 2,380 | 4.411371 | 0.344482 | 0.068234 | 0.042456 | 0.054587 | 0.11903 | 0.084155 | 0.084155 | 0.084155 | 0.084155 | 0.084155 | 0 | 0.009703 | 0.263866 | 2,380 | 74 | 94 | 32.162162 | 0.74258 | 0.928992 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3beed423b84aed994aacbe9098f28892995cd210 | 491 | py | Python | ramda/memoize_with_test.py | jakobkolb/ramda.py | 982b2172f4bb95b9a5b09eff8077362d6f2f0920 | [
"MIT"
] | 56 | 2018-08-06T08:44:58.000Z | 2022-03-17T09:49:03.000Z | ramda/memoize_with_test.py | jakobkolb/ramda.py | 982b2172f4bb95b9a5b09eff8077362d6f2f0920 | [
"MIT"
] | 28 | 2019-06-17T11:09:52.000Z | 2022-02-18T16:59:21.000Z | ramda/memoize_with_test.py | jakobkolb/ramda.py | 982b2172f4bb95b9a5b09eff8077362d6f2f0920 | [
"MIT"
] | 5 | 2019-09-18T09:24:38.000Z | 2021-07-21T08:40:23.000Z | from ramda.memoize_with import memoize_with
from ramda.product import product
from ramda.private.asserts import assert_equal as e
count = 0
def memoize_with_test():
    @memoize_with(lambda x: -x)
    def factorial(n):
        global count
        count += 1
        return product(range(1, n + 1))

    e(factorial(5), 120)
    e(factorial(5), 120)
    e(factorial(5), 120)
    e(factorial(4), 24)
    e(factorial(4), 24)
    e(factorial(4), 24)
    e(factorial(4), 24)
    e(count, 2)
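The behaviour this test pins down (each distinct argument evaluates the wrapped function body exactly once) can be sketched with only the standard library, using `functools.lru_cache` in place of ramda's `memoize_with`. This is an illustrative stand-in, not the library under test.

```python
from functools import lru_cache

calls = 0


@lru_cache(maxsize=None)
def slow_square(n):
    global calls
    calls += 1  # count real evaluations, not cache hits
    return n * n


print(slow_square(4), slow_square(4), slow_square(5))  # 16 16 25
print(calls)  # 2 -- repeated arguments are served from the cache
```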
| 20.458333 | 51 | 0.631365 | 76 | 491 | 4 | 0.381579 | 0.230263 | 0.144737 | 0.171053 | 0.3125 | 0.3125 | 0.3125 | 0.3125 | 0.3125 | 0.3125 | 0 | 0.077957 | 0.242363 | 491 | 23 | 52 | 21.347826 | 0.739247 | 0 | 0 | 0.388889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 1 | 0.111111 | false | 0 | 0.166667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ce1986e97c39f7b0d9070c20a8cf44a57d43a5a3 | 13,093 | py | Python | tests/integration_tests/test_solution/test_solution_interior.py | cwentland0/perform | e08771cb776a7e6518c43350746e2ca72f79b153 | [
"MIT"
] | 6 | 2021-03-24T21:42:06.000Z | 2022-01-28T20:00:13.000Z | tests/integration_tests/test_solution/test_solution_interior.py | cwentland0/perform | e08771cb776a7e6518c43350746e2ca72f79b153 | [
"MIT"
] | 38 | 2021-04-15T15:30:21.000Z | 2022-01-29T01:23:57.000Z | tests/integration_tests/test_solution/test_solution_interior.py | cwentland0/perform | e08771cb776a7e6518c43350746e2ca72f79b153 | [
"MIT"
] | 1 | 2021-07-03T03:13:36.000Z | 2021-07-03T03:13:36.000Z | import unittest
import os
import numpy as np
from constants import (
    del_test_dir,
    gen_test_dir,
    get_output_mode,
    solution_domain_setup,
    CHEM_DICT_REACT,
    SOL_PRIM_IN_REACT,
    TEST_DIR,
)
from perform.constants import REAL_TYPE
from perform.system_solver import SystemSolver
from perform.input_funcs import read_restart_file
from perform.gas_model.calorically_perfect_gas import CaloricallyPerfectGas
from perform.time_integrator.implicit_integrator import BDF
from perform.solution.solution_interior import SolutionInterior
class SolutionIntInitTestCase(unittest.TestCase):
    def setUp(self):
        self.output_mode, self.output_dir = get_output_mode()

        # set chemistry
        self.chem_dict = CHEM_DICT_REACT
        self.gas = CaloricallyPerfectGas(self.chem_dict)

        # set time integrator
        self.param_dict = {}
        self.param_dict["dt"] = 1e-7
        self.param_dict["time_scheme"] = "bdf"
        self.param_dict["time_order"] = 2
        self.time_int = BDF(self.param_dict)

        # generate working directory
        gen_test_dir()

        # generate input text files
        solution_domain_setup()

        # set SystemSolver
        self.solver = SystemSolver(TEST_DIR)

        self.num_cells = 2
        self.num_reactions = 1

    def tearDown(self):
        del_test_dir()

    def test_solution_int_init(self):
        sol = SolutionInterior(
            self.gas, SOL_PRIM_IN_REACT, self.solver, self.num_cells, self.num_reactions, self.time_int
        )

        if self.output_mode:
            np.save(os.path.join(self.output_dir, "sol_int_init_sol_cons.npy"), sol.sol_cons)
        else:
            self.assertTrue(np.array_equal(sol.sol_prim, SOL_PRIM_IN_REACT))
            self.assertTrue(
                np.allclose(sol.sol_cons, np.load(os.path.join(self.output_dir, "sol_int_init_sol_cons.npy")))
            )

        # TODO: a LOT of checking of other variables
class SolutionIntMethodsTestCase(unittest.TestCase):
    def setUp(self):
        self.output_mode, self.output_dir = get_output_mode()

        # set chemistry
        self.chem_dict = CHEM_DICT_REACT
        self.gas = CaloricallyPerfectGas(self.chem_dict)

        # set time integrator
        self.param_dict = {}
        self.param_dict["dt"] = 1e-7
        self.param_dict["time_scheme"] = "bdf"
        self.param_dict["time_order"] = 2
        self.time_int = BDF(self.param_dict)

        # generate working directory
        gen_test_dir()

        # generate input text files
        solution_domain_setup()

        # set SystemSolver
        self.solver = SystemSolver(TEST_DIR)

        self.num_cells = 2
        self.num_reactions = 1

        self.sol = SolutionInterior(
            self.gas, SOL_PRIM_IN_REACT, self.solver, self.num_cells, self.num_reactions, self.time_int
        )

    def tearDown(self):
        del_test_dir()
    def test_calc_sol_jacob(self):
        sol_jacob = self.sol.calc_sol_jacob(inverse=False)
        sol_jacob_inv = self.sol.calc_sol_jacob(inverse=True)

        if self.output_mode:
            np.save(os.path.join(self.output_dir, "sol_int_sol_jacob.npy"), sol_jacob)
            np.save(os.path.join(self.output_dir, "sol_int_sol_jacob_inv.npy"), sol_jacob_inv)
        else:
            self.assertTrue(np.allclose(sol_jacob, np.load(os.path.join(self.output_dir, "sol_int_sol_jacob.npy"))))
            self.assertTrue(
                np.allclose(sol_jacob_inv, np.load(os.path.join(self.output_dir, "sol_int_sol_jacob_inv.npy")))
            )

    def test_update_snapshots(self):

        # update the snapshot matrix
        for self.solver.iter in range(1, self.solver.num_steps + 1):
            if (self.solver.iter % self.solver.out_interval) == 0:
                self.sol.update_snapshots(self.solver)

        self.assertTrue(np.array_equal(self.sol.prim_snap, np.repeat(self.sol.sol_prim[:, :, None], 6, axis=2)))
        self.assertTrue(np.array_equal(self.sol.cons_snap, np.repeat(self.sol.sol_cons[:, :, None], 6, axis=2)))
        self.assertTrue(
            np.array_equal(self.sol.reaction_source_snap, np.repeat(self.sol.reaction_source[:, :, None], 5, axis=2))
        )
        self.assertTrue(
            np.array_equal(self.sol.heat_release_snap, np.repeat(self.sol.heat_release[:, None], 5, axis=1))
        )
        self.assertTrue(np.array_equal(self.sol.rhs_snap, np.repeat(self.sol.rhs[:, :, None], 5, axis=2)))
    def test_snapshot_output(self):

        for self.solver.iter in range(1, self.solver.num_steps + 1):

            # update the snapshot matrix
            if (self.solver.iter % self.solver.out_interval) == 0:
                self.sol.update_snapshots(self.solver)

            # write and check intermediate results
            if ((self.solver.iter % self.solver.out_itmdt_interval) == 0) and (
                self.solver.iter != self.solver.num_steps
            ):
                self.sol.write_snapshots(self.solver, intermediate=True, failed=False)

                sol_prim_itmdt = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "sol_prim_" + self.solver.sim_type + "_ITMDT.npy")
                )
                sol_cons_itmdt = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "sol_cons_" + self.solver.sim_type + "_ITMDT.npy")
                )
                source_itmdt = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "source_" + self.solver.sim_type + "_ITMDT.npy")
                )
                heat_release_itmdt = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "heat_release_" + self.solver.sim_type + "_ITMDT.npy")
                )
                rhs_itmdt = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "rhs_" + self.solver.sim_type + "_ITMDT.npy")
                )

                self.assertTrue(np.array_equal(sol_prim_itmdt, np.repeat(self.sol.sol_prim[:, :, None], 3, axis=2)))
                self.assertTrue(np.array_equal(sol_cons_itmdt, np.repeat(self.sol.sol_cons[:, :, None], 3, axis=2)))
                self.assertTrue(
                    np.array_equal(source_itmdt, np.repeat(self.sol.reaction_source[:, :, None], 2, axis=2))
                )
                self.assertTrue(
                    np.array_equal(heat_release_itmdt, np.repeat(self.sol.heat_release[:, None], 2, axis=1))
                )
                self.assertTrue(np.array_equal(rhs_itmdt, np.repeat(self.sol.rhs[:, :, None], 2, axis=2)))

            # write and check "failed" snapshots
            if self.solver.iter == 7:
                self.sol.write_snapshots(self.solver, intermediate=False, failed=True)

                sol_prim_failed = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "sol_prim_" + self.solver.sim_type + "_FAILED.npy")
                )
                sol_cons_failed = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "sol_cons_" + self.solver.sim_type + "_FAILED.npy")
                )
                source_failed = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "source_" + self.solver.sim_type + "_FAILED.npy")
                )
                heat_release_failed = np.load(
                    os.path.join(
                        self.solver.unsteady_output_dir, "heat_release_" + self.solver.sim_type + "_FAILED.npy"
                    )
                )
                rhs_failed = np.load(
                    os.path.join(self.solver.unsteady_output_dir, "rhs_" + self.solver.sim_type + "_FAILED.npy")
                )

                self.assertTrue(np.array_equal(sol_prim_failed, np.repeat(self.sol.sol_prim[:, :, None], 4, axis=2)))
                self.assertTrue(np.array_equal(sol_cons_failed, np.repeat(self.sol.sol_cons[:, :, None], 4, axis=2)))
                self.assertTrue(
                    np.array_equal(source_failed, np.repeat(self.sol.reaction_source[:, :, None], 3, axis=2))
                )
                self.assertTrue(
                    np.array_equal(heat_release_failed, np.repeat(self.sol.heat_release[:, None], 3, axis=1))
                )
                self.assertTrue(np.array_equal(rhs_failed, np.repeat(self.sol.rhs[:, :, None], 3, axis=2)))

        # delete intermediate results and check that they deleted properly
        self.sol.delete_itmdt_snapshots(self.solver)
        self.assertFalse(
            os.path.isfile(
                os.path.join(self.solver.unsteady_output_dir, "sol_prim_" + self.solver.sim_type + "_ITMDT.npy")
            )
        )
        self.assertFalse(
            os.path.isfile(
                os.path.join(self.solver.unsteady_output_dir, "sol_cons_" + self.solver.sim_type + "_ITMDT.npy")
            )
        )
        self.assertFalse(
            os.path.isfile(
                os.path.join(self.solver.unsteady_output_dir, "source_" + self.solver.sim_type + "_ITMDT.npy")
            )
        )
        self.assertFalse(
            os.path.isfile(
                os.path.join(self.solver.unsteady_output_dir, "heat_release_" + self.solver.sim_type + "_ITMDT.npy")
            )
        )
        self.assertFalse(
            os.path.isfile(os.path.join(self.solver.unsteady_output_dir, "rhs_" + self.solver.sim_type + "_ITMDT.npy"))
        )

        # write final snapshots
        self.sol.write_snapshots(self.solver, intermediate=False, failed=False)
        sol_prim_final = np.load(
            os.path.join(self.solver.unsteady_output_dir, "sol_prim_" + self.solver.sim_type + ".npy")
        )
        sol_cons_final = np.load(
            os.path.join(self.solver.unsteady_output_dir, "sol_cons_" + self.solver.sim_type + ".npy")
        )
        source_final = np.load(os.path.join(self.solver.unsteady_output_dir, "source_" + self.solver.sim_type + ".npy"))
        heat_release_final = np.load(
os.path.join(self.solver.unsteady_output_dir, "heat_release_" + self.solver.sim_type + ".npy")
)
rhs_final = np.load(os.path.join(self.solver.unsteady_output_dir, "rhs_" + self.solver.sim_type + ".npy"))
self.assertTrue(np.array_equal(sol_prim_final, np.repeat(self.sol.sol_prim[:, :, None], 6, axis=2)))
self.assertTrue(np.array_equal(sol_cons_final, np.repeat(self.sol.sol_cons[:, :, None], 6, axis=2)))
self.assertTrue(np.array_equal(source_final, np.repeat(self.sol.reaction_source[:, :, None], 5, axis=2)))
self.assertTrue(np.array_equal(heat_release_final, np.repeat(self.sol.heat_release[:, None], 5, axis=1)))
self.assertTrue(np.array_equal(rhs_final, np.repeat(self.sol.rhs[:, :, None], 5, axis=2)))
def test_write_restart_file(self):
sol_cons = self.sol.sol_cons
self.solver.sol_time = 1e-4
self.solver.iter = 2
self.solver.restart_iter = 4
self.sol.write_restart_file(self.solver)
self.assertEqual(self.solver.restart_iter, 5)
# check restart files
restart_data = np.load(os.path.join(self.solver.restart_output_dir, "restart_file_4.npz"))
self.assertTrue(
np.array_equal(
restart_data["sol_prim"],
np.repeat(SOL_PRIM_IN_REACT[:, :, None], 2, axis=-1),
)
)
self.assertTrue(
np.array_equal(
restart_data["sol_cons"],
np.repeat(sol_cons[:, :, None], 2, axis=-1),
)
)
self.assertEqual(float(restart_data["sol_time"]), 1e-4)
# check iteration files
restart_iter = int(np.loadtxt(os.path.join(self.solver.restart_output_dir, "restart_iter.dat")))
self.assertEqual(restart_iter, 4)
def test_read_restart_file(self):
self.solver.sol_time = 1e-4
self.solver.iter = 2
self.solver.restart_iter = 4
self.sol.write_restart_file(self.solver)
sol_time, sol_prim, restart_iter = read_restart_file(self.solver)
self.assertEqual(sol_time, 1e-4)
self.assertEqual(restart_iter, 5) # 1 is added to avoid overwriting
self.assertTrue(
np.array_equal(
sol_prim,
np.repeat(SOL_PRIM_IN_REACT[:, :, None], 2, axis=-1),
)
)
def test_calc_d_sol_norms(self):
self.solver.iter = 3
self.sol.d_sol_norm_hist = np.zeros((self.solver.num_steps, 2), dtype=REAL_TYPE)
self.sol.sol_hist_prim[0] = self.sol.sol_prim * 2.0
self.sol.calc_d_sol_norms(self.solver, "implicit")
self.assertAlmostEqual(self.sol.d_sol_norm_hist[2, 0], 3.46573790883)
self.assertAlmostEqual(self.sol.d_sol_norm_hist[2, 1], 3.45416666667)
def test_calc_res_norms(self):
self.solver.iter = 3
self.sol.res = self.sol.sol_prim.copy()
self.sol.calc_res_norms(self.solver, 0)
self.assertAlmostEqual(self.sol.res_norm_hist[2, 0], 3.46573790883)
self.assertAlmostEqual(self.sol.res_norm_hist[2, 1], 3.45416666667)
# === app/_version.py (sunhailin-Leo/myMacAssistant) ===
__version__ = "1.0.0"
__author__ = "sunhailin-Leo"
# === venv/Lib/site-packages/bootstrap4/widgets.py (HRangelov/gallery) ===
from django.forms import RadioSelect

class RadioSelectButtonGroup(RadioSelect):
"""
This widget renders a Bootstrap 4 set of buttons horizontally instead of typical radio buttons.
Much more mobile friendly.
"""
template_name = "bootstrap4/widgets/radio_select_button_group.html"
# === questions.py (lasyasreepada/iplaw-for-digital-teens) ===
"""Set of questions for the IP Law quiz
questions.py
Lasya Sreepada
Yale College '19
May 6, 2017
"""
from random import shuffle
import time
def quiz():
questions = [
("Copyright protects both expression of an idea and the idea itself. \nTrue or False?", "f", "cp"),
("Clothing, such as Katy Perry’s “Left Shark” costume is a useful article and is therefore copyrightable. \nTrue or False?", "f", "cp"),
("One of the factors of evaluation for fair use is the effect of the use upon the potential market for the work in question. \nTrue or False?", "t", "cp"),
("In Cariou vs. Prince, the defendant was brought to court because he used images from Cariou’s 2000 book, Yes Rasta, to create a new exhibition of photos with some apparent modifications. This was not fair use because it did not comment on the original work about the nature of the photographs. \nTrue or False?", "t", "cp"),
("Copyright is an inevitable, “divine” grant entrusting total ownership rights to the creator of a work. \nTrue or False?", "f", "cp"),
("In Christian Louboutin vs. YSL, the defendant was brought to court for using a red outsole on women’s shoes that were also red in color. The court ruled that this was a trademark infringement. \nTrue or False? ", "f", "tm"),
("Descriptive names for a company or product (e.g. FishFri) are never trademarkable. \nTrue or False?", "f", "tm"),
("Companies such as Google can potentially lose their trademark protection because of genericide. \nTrue or False?", "t", "tm"),
("The same trademark (e.g. a word) can be registered by different parties, so long as the trademarks are in different classes. \nTrue or False?", "t", "tm"),
("Trademarked goods or services must be made available for commercial sale on a national level (beyond state boundaries). \nTrue or False?", "t", "tm"),
("A utility patent application consists of three parts: drawings, a written description, and claim statements. The drawings of the product are most important in determining what exactly gets patented. \nTrue or False?", "f", "pt"),
("Naturally occurring processes or products, such as human DNA, are not patentable. \nTrue or False?", "t", "pt"),
("The tests that the court used in Alice Corp v. CLS Bank Intl to determine whether the computer software was patent eligible were (1) whether the claims directed to an abstract idea and (2) whether the claims added something inventive. \nTrue or False?", "t", "pt"),
("The level of skill required to develop a product or process is not considered when determining whether it satisfies the non-obviousness requirement for patent eligibility. \nTrue or False?", "f", "pt"),
("There is a distinction between utility and design, therefore, it is possible for one product to have both a utility patent and a design patent. \nTrue or False?", "t", "pt")
]
shuffle(questions)
cp_correct = 0
tm_correct = 0
pt_correct = 0
print("Welcome to Which IP Law Is for You!")
time.sleep(2)
print("We will ask you a series of true or false questions about various cases related with copyrights, trademarks, and patents.")
time.sleep(5)
print("When a question is displayed, you will be prompted for an answer. Please type t for true, and f for false.")
time.sleep(5)
print("At the end of the quiz, we will sort you into a branch of IP Law based on your quiz performance. Good Luck!")
time.sleep(5)
print()
for question, correct_ans, typ in questions:
print(question)
answer = input()
if answer == correct_ans:
print("correct!")
print()
if typ == "cp":
cp_correct += 1
elif typ == "tm":
tm_correct += 1
            else:
                pt_correct += 1
time.sleep(1)
else:
print("incorrect")
print()
time.sleep(1)
total_correct = cp_correct + tm_correct + pt_correct
if total_correct == len(questions):
print("Congratulations, you are the IP Law Supreme Overlord! You got all the questions right and would do well in any branch.")
else:
if (cp_correct >= tm_correct and cp_correct >= pt_correct):
print("You are Copyright Law!")
elif (tm_correct >= cp_correct and tm_correct >= pt_correct):
print("You are Trademark Law!")
else:
print("You are Patent Law!")
if __name__ == "__main__":
    quiz()
# === features/environment.py (geeksforsocialchange/imok) ===
from django.conf import settings

settings.NOTIFY_EMAIL = 'root@localhost'
settings.DEBUG = True
def before_all(context):
context.users = {}
context.members = {}
# === pirates/ai/PiratesMagicWordManager.py (itsyaboyrocket/pirates) ===
# uncompyle6 version 3.2.0
# Python bytecode 2.4 (62061)
# Decompiled from: Python 2.7.14 (v2.7.14:84471935ed, Sep 16 2017, 20:19:30) [MSC v.1500 32 bit (Intel)]
# Embedded file name: pirates.ai.PiratesMagicWordManager
from direct.showbase.ShowBaseGlobal import *
from direct.distributed import DistributedObject
from direct.directnotify import DirectNotifyGlobal
from direct.task import Task
from otp.avatar import Avatar
from otp.chat import ChatManager
import string
from direct.showbase import PythonUtil
from otp.otpbase import OTPGlobals
from direct.distributed.ClockDelta import *
from otp.ai import MagicWordManager
from pirates.pirate import DistributedPlayerPirate
from pirates.npc import DistributedNPCTownfolk
from direct.distributed import DistributedCartesianGrid
from pirates.piratesbase import PiratesGlobals
from pirates.piratesgui.RadarUtil import RadarUtil
from pirates.cutscene import Cutscene, CutsceneData
from pirates.effects.Fireflies import Fireflies
from pirates.effects.GroundFog import GroundFog
from pirates.effects.Bonfire import Bonfire
from pirates.effects.CeilingDust import CeilingDust
from pirates.effects.CeilingDebris import CeilingDebris
from pirates.effects.CameraShaker import CameraShaker
from pirates.effects.DarkWaterFog import DarkWaterFog
from pirates.ship import DistributedSimpleShip
from pirates.world import WorldGlobals
from pirates.effects.FireworkGlobals import *
from pirates.effects.FireworkShowManager import FireworkShowManager
from pirates.piratesbase import PLocalizer
class PiratesMagicWordManager(MagicWordManager.MagicWordManager):
__module__ = __name__
notify = DirectNotifyGlobal.directNotify.newCategory('PiratesMagicWordManager')
neverDisable = 1
GameAvatarClass = DistributedPlayerPirate.DistributedPlayerPirate
def __init__(self, cr):
MagicWordManager.MagicWordManager.__init__(self, cr)
self.pendingCameraReparent = None
self.originalLocation = None
self.groundFog = None
self.fireflies = None
self.rainDrops = None
self.rainMist = None
self.rainSplashes = None
self.rainSplashes2 = None
self.stormEye = None
self.stormRing = None
self.fishCamEnabled = False
return
def generate(self):
MagicWordManager.MagicWordManager.generate(self)
self.accept('magicWord', self.b_setMagicWord)
def doLoginMagicWords(self):
MagicWordManager.MagicWordManager.doLoginMagicWords(self)
if base.config.GetBool('want-chat', 0):
self.d_setMagicWord('~chat', localAvatar.doId, 0)
if base.config.GetBool('want-run', 0) or base.config.GetBool('want-pirates-run', 0):
self.toggleRun()
if base.config.GetBool('immortal-mode', 0):
self.d_setMagicWord('~immortal', localAvatar.doId, 0)
def disable(self):
self.ignore('magicWord')
MagicWordManager.MagicWordManager.disable(self)
if self.pendingCameraReparent:
base.cr.relatedObjectMgr.abortRequest(self.pendingCameraReparent)
self.pendingCameraReparent = None
return
def doMagicWord(self, word, avId, zoneId):
def wordIs(w, word=word):
return word[:len(w) + 1] == '%s ' % w or word == w
if word == '~rio':
self.doMagicWord('~run', avId, zoneId)
        if MagicWordManager.MagicWordManager.doMagicWord(self, word, avId, zoneId) == 1:
            # base class handled this word; nothing more to do here
            return
if word == '~walk':
localAvatar.b_setGameState('LandRoam')
localAvatar.motionFSM.on()
if word == '~players':
players = base.cr.doFindAll('DistributedPlayerPirate')
for player in players:
playerText = '%s %s' % (player.getName(), player.doId)
base.talkAssistant.receiveGameMessage(playerText)
if word == '~rocketman':
if localAvatar.rocketOn == 0:
localAvatar.startRocketJumpMode()
base.talkAssistant.receiveGameMessage('Zero hour nine a.m. (Bill Shattner Version)')
else:
localAvatar.endRocketJumpMode()
base.talkAssistant.receiveGameMessage("And I think it's gonna be a long long time")
if word == '~shipUpgrade':
localAvatar.guiMgr.toggleShipUpgrades()
if word == '~shipCam':
if base.shipLookAhead:
base.talkAssistant.receiveGameMessage('Ship Look ahead camera off!')
base.setShipLookAhead(0)
else:
base.talkAssistant.receiveGameMessage('Ship Look ahead camera on!')
base.setShipLookAhead(1)
if word == '~time':
base.talkAssistant.receiveGameMessage('The time is %s' % base.cr.timeOfDayManager.getCurrentIngameTime())
if word == '~todDebug':
base.cr.timeOfDayManager.toggleDebugMode()
if word == '~vismask':
base.talkAssistant.receiveGameMessage('Vis Mask %s' % localAvatar.invisibleMask)
if word == '~target':
localAvatar.setAvatarViewTarget()
if word == '~collisions_on':
pass
if word == '~collisions_off':
pass
if word == '~topten':
base.cr.guildManager.requestLeaderboardTopTen()
if word == '~airender':
pass
if __dev__ and wordIs('~shiphat'):
args = word.split()
if hasattr(localAvatar, 'shipHat'):
localAvatar.shipHat.modelRoot.detachNode()
localAvatar.shipHat = None
if len(args) == 1:
ship = base.shipFactory.getShip(23)
else:
shipClass = args[1]
ship = base.shipFactory.getShip(int(shipClass))
ship.startSailing()
ship.modelRoot.reparentTo(localAvatar.headNode)
ship.modelRoot.setR(90)
ship.modelRoot.setP(-90)
ship.modelRoot.setX(0.8)
ship.modelRoot.setScale(0.004)
ship.modelRoot.setZ(-0.2)
ship.forceLOD(2)
ship.modelCollisions.detachNode()
localAvatar.shipHat = ship
if __dev__ and wordIs('~cr'):
pass
if __dev__ and wordIs('~watch'):
if taskMgr.hasTaskNamed('lookAtDude'):
taskMgr.remove('lookAtDude')
localAvatar.guiMgr.setIgnoreAllKeys(False)
localAvatar.guiMgr.combatTray.initCombatTray()
localAvatar.unstash()
else:
args = word.split()
if len(args) >= 2:
tgtDoId = int(args[1])
def doHeadsUp(task=None):
targetObj = self.cr.doId2do.get(tgtDoId)
if targetObj:
localAvatar.lookAt(targetObj)
return Task.cont
taskMgr.add(doHeadsUp, 'lookAtDude')
localAvatar.guiMgr.setIgnoreAllKeys(True)
localAvatar.guiMgr.combatTray.skillMapping.clear()
localAvatar.stash()
else:
print 'need a target object doId to watch'
if __dev__ and (wordIs('~ccNPC') or wordIs('~ccShip')):
pass
if wordIs('~bonfire'):
bf = Bonfire()
bf.reparentTo(render)
bf.setPos(localAvatar, 0, 0, 0)
bf.startLoop()
print 'bonfire at %s, %s' % (localAvatar.getPos(), localAvatar.getHpr())
if __dev__ and wordIs('~mario'):
localAvatar.toggleMario()
if wordIs('~islandShips'):
args = word.split()
try:
if args[1] == '1':
localAvatar.getParentObj().setOceanVisEnabled(1)
localAvatar.getParentObj().setFlatShips(0)
else:
localAvatar.getParentObj().setOceanVisEnabled(0)
except:
pass
if wordIs('~swamp'):
if self.fireflies:
self.fireflies.destroy()
self.fireflies = None
self.groundFog.destroy()
self.groundFog = None
else:
self.fireflies = Fireflies()
if self.fireflies:
self.fireflies.reparentTo(localAvatar)
self.fireflies.startLoop()
self.groundFog = GroundFog()
if self.groundFog:
self.groundFog.reparentTo(localAvatar)
self.groundFog.startLoop()
if wordIs('~darkfog'):
if self.groundFog:
self.groundFog.destroy()
self.groundFog = None
else:
self.groundFog = DarkWaterFog()
if self.groundFog:
self.groundFog.reparentTo(localAvatar)
self.groundFog.startLoop()
if wordIs('~dust'):
effect = CeilingDust.getEffect()
if effect:
effect.reparentTo(localAvatar)
effect.setPos(0, 0, 10)
effect.play()
effect = CeilingDebris.getEffect()
if effect:
effect.reparentTo(localAvatar)
effect.setPos(0, 0, 20)
effect.play()
cameraShakerEffect = CameraShaker()
cameraShakerEffect.reparentTo(localAvatar)
cameraShakerEffect.setPos(0, 0, 0)
cameraShakerEffect.shakeSpeed = 0.05
cameraShakerEffect.shakePower = 4.5
cameraShakerEffect.numShakes = 2
cameraShakerEffect.scalePower = 1
cameraShakerEffect.play(80.0)
if wordIs('~rain'):
if self.rainDrops:
self.rainDrops.stopLoop()
self.rainDrops = None
if self.rainMist:
self.rainMist.stopLoop()
self.rainMist = None
if self.rainSplashes:
self.rainSplashes.stopLoop()
self.rainSplashes = None
if self.rainSplashes2:
self.rainSplashes2.stopLoop()
self.rainSplashes2 = None
else:
from pirates.effects.RainDrops import RainDrops
self.rainDrops = RainDrops(base.camera)
self.rainDrops.reparentTo(render)
self.rainDrops.startLoop()
from pirates.effects.RainMist import RainMist
self.rainMist = RainMist(base.camera)
self.rainMist.reparentTo(render)
self.rainMist.startLoop()
from pirates.effects.RainSplashes import RainSplashes
self.rainSplashes = RainSplashes(base.camera)
self.rainSplashes.reparentTo(render)
self.rainSplashes.startLoop()
from pirates.effects.RainSplashes2 import RainSplashes2
self.rainSplashes2 = RainSplashes2(base.camera)
self.rainSplashes2.reparentTo(render)
self.rainSplashes2.startLoop()
if wordIs('~clouds'):
args = word.split()
if len(args) >= 2:
level = int(args[1])
base.cr.timeOfDayManager.skyGroup.transitionClouds(level).start()
if wordIs('~storm'):
if self.stormEye:
self.stormEye.stopLoop()
self.stormEye = None
if self.stormRing:
self.stormRing.stopLoop()
self.stormRing = None
else:
args = word.split()
grid = 0
if len(args) > 1:
grid = int(args[1])
pos = Vec3(base.cr.doId2do[201100017].getZoneCellOrigin(grid)[0], base.cr.doId2do[201100017].getZoneCellOrigin(grid)[1], base.cr.doId2do[201100017].getZoneCellOrigin(grid)[2])
from pirates.effects.StormEye import StormEye
self.stormEye = StormEye()
self.stormEye.reparentTo(render)
self.stormEye.startLoop()
from pirates.effects.StormRing import StormRing
self.stormRing = StormRing()
self.stormRing.reparentTo(render)
self.stormRing.setZ(100)
self.stormRing.startLoop()
if wordIs('~alight'):
args = word.split()
if len(args) > 3:
color = Vec4(float(args[1]), float(args[2]), float(args[3]), 1)
base.cr.timeOfDayManager.alight.node().setColor(color)
if wordIs('~dlight'):
args = word.split()
if len(args) > 3:
color = Vec4(float(args[1]), float(args[2]), float(args[3]), 1)
base.cr.timeOfDayManager.dlight.node().setColor(color)
if wordIs('~fog'):
args = word.split()
if len(args) > 3:
color = Vec4(float(args[1]), float(args[2]), float(args[3]), 1)
base.cr.timeOfDayManager.fog.setColor(color)
if len(args) > 4:
base.cr.timeOfDayManager.fog.setExpDensity(float(args[4]))
if len(args) == 2:
base.cr.timeOfDayManager.fog.setExpDensity(float(args[1]))
if __dev__ and wordIs('~turbo'):
localAvatar.toggleTurbo()
if __dev__ and wordIs('~joincrew'):
base.cr.crewManager.requestNewCrew()
if wordIs('~tm'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_TM, 'treasureMapCove')
if wordIs('~tml'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_MAIN, WorldGlobals.PiratesWorldSceneFileBase)
if wordIs('~pg'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_PG, 'ParlorWorld')
if wordIs('~pgvip'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_PG, 'ParlorVIPWorld')
if wordIs('~pgl'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_MAIN, WorldGlobals.PiratesWorldSceneFileBase)
if wordIs('~tutorial'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_TUTORIAL, 'RambleshackWorld', self.cr.playGame.handleTutorialGeneration)
if wordIs('~tutoriall'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_MAIN, WorldGlobals.PiratesWorldSceneFileBase)
if wordIs('~pvp'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_PVP, 'pvp_mayhemWorld1')
if wordIs('~pirateer'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_PVP, 'pirateerMap')
if wordIs('~pvpl'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_MAIN, WorldGlobals.PiratesWorldSceneFileBase)
if wordIs('~tortuga'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'TortugaWorld')
if wordIs('~portRoyal'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'PortRoyalWorld')
if wordIs('~delFuego'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'DelFuegoWorld')
if wordIs('~bilgewater'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'BilgewaterWorld')
if wordIs('~kingshead'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'KingsheadWorld')
if wordIs('~cuba'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'CubaWorld')
if wordIs('~rumrunner'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'RumrunnerWorld')
if wordIs('~wildisland'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'WildIslandWorld')
if wordIs('~caveA'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'CaveAWorld')
if wordIs('~caveB'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'CaveBWorld')
if wordIs('~caveC'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'CaveCWorld')
if wordIs('~caveD'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'CaveDWorld')
if wordIs('~caveE'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'CaveEWorld')
if wordIs('~jungleA'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'JungleTestWorldA')
if wordIs('~jungleB'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'JungleTestWorld')
if wordIs('~jungleC'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'JungleTestWorldC')
if wordIs('~swampA'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'SwampTestWorld')
if wordIs('~mainWorld'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_MAIN, WorldGlobals.PiratesWorldSceneFileBase)
if wordIs('~gameArea'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_GENERIC, 'GameAreaSandbox')
if wordIs('~blackpearl') or wordIs('~bp'):
args = word.split()
if len(args) == 1:
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_TM, 'BlackpearlWorld')
if wordIs('~scrimmage'):
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_SCRIMMAGE, 'ScrimmageTestWorld')
if wordIs('~fireworks') or wordIs('~fw'):
args = word.split()
if len(args) >= 2 and args[1] in ['show', 's']:
if len(args) >= 3:
showType = args[2]
timestamp = 0.0
if len(args) >= 4:
timestamp = args[3]
if base.cr.activeWorld:
localAvatar.getParentObj().fireworkShowType = int(showType)
localAvatar.getParentObj().beginFireworkShow(timeStamp=timestamp)
else:
if len(args) >= 2 and args[1] in ['type', 't']:
fireworkType = 0
if len(args) >= 3:
fireworkType = int(args[2])
from pirates.effects.Firework import Firework
firework = Firework(fireworkType)
firework.reparentTo(render)
firework.setPos(Point3(10525, 19000, 245))
firework.play()
else:
if len(args) >= 2 and args[1] in ['effect', 'e']:
trailType = 0
burstType = 0
if len(args) >= 3:
burstType = int(args[2])
if len(args) >= 4:
trailType = int(args[3])
from pirates.effects.FireworkEffect import FireworkEffect
firework = FireworkEffect(burstType, trailType)
firework.reparentTo(render)
firework.setPos(Point3(10525, 19000, 245))
firework.play()
if wordIs('~te'):
if localAvatar.gameFSM.getCurrentOrNextState() == 'LandRoam':
localAvatar.b_setGameState('TeleportOut')
else:
if localAvatar.gameFSM.getCurrentOrNextState() == 'TeleportOut':
localAvatar.b_setGameState('LandRoam')
if wordIs('~lfa'):
args = word.split()
activityName = None
if len(args) >= 2:
activityName = args[1]
if activityName == 'blackjack':
localAvatar.requestActivity(PiratesGlobals.GAME_STYLE_BLACKJACK)
else:
if activityName == 'poker':
localAvatar.requestActivity(PiratesGlobals.GAME_STYLE_POKER)
else:
if activityName == 'pvp':
localAvatar.requestActivity(PiratesGlobals.GAME_TYPE_PVP)
else:
if activityName == 'tm':
localAvatar.requestActivity(PiratesGlobals.GAME_TYPE_TM)
else:
if activityName == 'hsa':
localAvatar.requestActivity(PiratesGlobals.GAME_TYPE_HSA)
else:
if activityName == 'mmp':
self.cr.teleportMgr.initiateTeleport(PiratesGlobals.INSTANCE_MAIN, WorldGlobals.PiratesWorldSceneFileBase)
if wordIs('~term') or wordIs('terminator'):
localAvatar.setEquippedWeapons([10103, 10106, 10115])
localAvatar.d_requestEquipWeapons([10103, 10106, 10115])
if wordIs('~battleRandom'):
args = word.split()
if len(args) >= 2:
command = args[1]
if command == 'resync':
localAvatar.battleRandom.resync()
self.notify.info('Client Battle random resynced, counter=0')
else:
response = 'Client Battle random attack counter=%s main counter=%s' % (localAvatar.battleRandom.attackCounter, localAvatar.battleRandom.mainCounter)
self.setMagicWordResponse(response)
if wordIs('~cutscene'):
args = word.split()
name = None
if len(args) >= 2:
csId = args[1]
else:
csId = base.config.GetString('default-cutscene', '0')
if int(csId) >= len(CutsceneData.CutsceneNames):
return
name = CutsceneData.CutsceneNames[int(csId)]
cs = PythonUtil.ScratchPad()
def destroyCutscene(cs=cs):
cs.cutscene.destroy()
c = Cutscene.Cutscene(self.cr, name, PythonUtil.DelayedFunctor(destroyCutscene, '~cutscene-destroy'))
cs.cutscene = c
c.play()
destroyCutscene = None
if wordIs('~forceLod'):
for n in render.findAllMatches('**/+LODNode'):
n.node().forceSwitch(n.node().getHighestSwitch())
if wordIs('~wave'):
args = word.split()
patch = base.cr.doFind('OceanGrid').water.patch
if len(args) < 4:
response = '~wave num amplitude wavelength speed'
numWaves = patch.getNumWaves()
num = 0
while numWaves > 0:
if patch.isWaveEnabled(num):
numWaves -= 1
if patch.getWaveTarget(num) != SeaPatchRoot.WTZ or patch.getWaveFunc(num) != SeaPatchRoot.WFSin:
response = '%s\n%s NON-SINE-WAVE' % (response, num)
else:
response = '%s\n%s amp=%s len=%s spd=%s' % (response, num, patch.getWaveAmplitude(num), patch.getWaveLength(num), patch.getWaveSpeed(num))
num += 1
else:
num = int(args[1])
amplitude = float(args[2])
wavelength = float(args[3])
speed = float(args[4])
patch.enableWave(num)
patch.setWaveTarget(num, SeaPatchRoot.WTZ)
patch.setWaveFunc(num, SeaPatchRoot.WFSin)
patch.setChoppyK(num, 0)
patch.setWaveAmplitude(num, amplitude)
patch.setWaveLength(num, wavelength)
patch.setWaveSpeed(num, speed)
response = 'wave %s modified' % num
self.setMagicWordResponse(response)
if wordIs('~roll'):
args = word.split()
if len(args) < 2:
response = '~roll angle [fakeMass]'
else:
if localAvatar.ship is None:
response = 'not on a ship'
else:
if len(args) > 2:
localAvatar.ship._rocker.setFakeMass(float(args[2]))
localAvatar.ship.addRoll(float(args[1]))
response = 'rolling!'
self.setMagicWordResponse(response)
if wordIs('~ru'):
if hasattr(self, 'radarUtil') and self.radarUtil and not self.radarUtil.isDestroyed():
self.radarUtil.destroy()
else:
self.radarUtil = RadarUtil()
if __dev__ and wordIs('~todpanel'):
tod = base.cr.timeOfDayManager
from pirates.leveleditor import TimeOfDayPanel
p = TimeOfDayPanel.TimeOfDayPanel(tod)
if __dev__ and wordIs('~kraken'):
args = word.split()[1:]
if args and args[0]:
if not hasattr(base, 'oobeMode') or not base.oobeMode:
base.oobe()
base.oobeCamera.wrtReparentTo(render)
if wordIs('~pvpmoney') or wordIs('~pvpinfamy'):
if localAvatar.ship and localAvatar.ship.renownDisplay:
taskMgr.doMethodLater(2.0, localAvatar.ship.renownDisplay.loadRank, 'pvp-infamy-display', [])
if localAvatar.guiMgr and localAvatar.guiMgr.pvpPanel and hasattr(localAvatar.guiMgr.pvpPanel, 'renownDisplay') and localAvatar.guiMgr.pvpPanel.renownDisplay:
taskMgr.doMethodLater(2.0, localAvatar.guiMgr.pvpPanel.renownDisplay.loadRank, 'pvp-infamy-display', [])
if localAvatar.guiMgr and localAvatar.guiMgr.titlesPage:
taskMgr.doMethodLater(2.0, localAvatar.guiMgr.titlesPage.refresh, 'titles-refresh', [])
if wordIs('~profileCard'):
args = word.split()
if len(args) >= 2:
profileId = int(args[1])
else:
profileId = localAvatar.getDoId()
localAvatar.guiMgr.handleAvatarDetails(profileId)
if wordIs('~gmNameTag'):
args = word.split()
if len(args) < 2 and localAvatar.isGM():
response = PLocalizer.MAGICWORD_GMNAMETAG
self.setMagicWordResponse(response)
if len(args) >= 2 and localAvatar.isGM():
if args[1] == 'enable':
localAvatar.setGMNameTagState(1)
else:
if args[1] == 'disable':
localAvatar.setGMNameTagState(0)
else:
if args[1] == 'setString':
stringToSet = ''
for arg in args[2:]:
    stringToSet = '%s %s' % (stringToSet, arg)
localAvatar.setGMNameTagString(stringToSet)
else:
if args[1] == 'setColor':
localAvatar.setGMNameTagColor(args[2])
else:
if wordIs('~liveCam'):
LiveCamTransforms = {'1': [Vec3(-385.776, -2369.64, 52.4644), Vec3(-18.0412, -3.24766, 0), 39.3076, 0], '2': [Vec3(79.1195, -2521.26, 52.4644), Vec3(-18.0412, -3.24766, 0), 39.3076, 0], '3': [Vec3(2858.35, 931.111, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1], '4': [Vec3(3551.93, 532.437, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1], '5': [Vec3(4245.52, 133.763, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1], '6': [Vec3(4939.1, -264.911, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1]}
lodNodes = render.findAllMatches('**/+LODNode')
for i in xrange(0, lodNodes.getNumPaths()):
lodNodes[i].node().forceSwitch(lodNodes[i].node().getHighestSwitch())
localAvatar.clearInterestNamed(None, ['liveCam'])
localAvatar.getParentObj().setOceanVisEnabled(0)
args = word.split()
if len(args) > 1:
camNum = args[1]
camData = LiveCamTransforms[camNum]
localAvatar.cameraFSM.request('Control')
if camData[3]:
camParent = render
else:
camParent = localAvatar.getParentObj()
base.cam.reparentTo(camParent)
base.cam.setPos(camData[0])
base.cam.setHpr(camData[1])
base.camLens.setFov(camData[2])
if camData[3] == 0:
localAvatar.setInterest(localAvatar.getParentObj().doId, [
11622, 11621, 11443, 11442, 11620, 11619, 11441, 11086, 11085, 11263, 11264, 11265, 11444, 11266, 11267, 11445, 11446, 11268, 11269, 11447, 11449, 11270, 11448, 11271, 11272, 11450, 11451, 11273, 11095, 11093, 11094, 11092, 11091, 11090, 11089, 11088, 11087, 11623, 11624, 11625, 11626, 11627, 11628, 11629, 11807, 11630, 11452, 11274, 11096, 11275, 11277, 11276, 11099, 11098, 11097, 11455, 11454, 11453, 11631, 11632, 11633, 11100, 11278, 11456, 11634, 11990, 11812, 11811, 11989, 11988, 11987, 11809, 11810, 11808, 11986, 11985, 12164, 12163, 12162, 11984, 11806, 11805, 11983, 12161, 12160, 11982, 11804, 11803, 11981, 11980, 12159, 11802, 11801, 11979, 12158, 12157, 12156, 11978, 11799, 11800, 11977, 11798, 11976, 11975, 11797, 11796, 11974, 11084, 11262, 11440, 11618, 11795, 11617, 11439, 11261, 11083, 11082, 11260, 11438, 11616, 11794, 11793, 11615, 11437, 11081, 11259, 11080, 11258, 11436, 11614, 11435, 11257, 11079, 11973, 11972, 12155, 12154, 12153], [
'liveCam'])
else:
localAvatar.getParentObj().setOceanVisEnabled(1)
localAvatar.getParentObj().setFlatShips(0)
else:
localAvatar.cameraFSM.request('FPS')
base.cam.reparentTo(camera)
base.cam.setPos(0, 0, 0)
base.cam.setHpr(0, 0, 0)
base.camLens.setFov(63.742)
else:
if wordIs('~showCams'):
render.findAllMatches('**/liveCamParent*').detach()
LiveCamTransforms = {'1': [Vec3(-385.776, -2369.64, 52.4644), Vec3(-18.0412, -3.24766, 0), 39.3076, 0], '2': [Vec3(79.1195, -2521.26, 52.4644), Vec3(-18.0412, -3.24766, 0), 39.3076, 0], '3': [Vec3(2858.35, 931.111, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1], '4': [Vec3(3551.93, 532.437, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1], '5': [Vec3(4245.52, 133.763, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1], '6': [Vec3(4939.1, -264.911, 37.9564), Vec3(-29.8904, -7.12525, 0), 39.3076, 1]}
camModel = NodePath('camera')
lens = PerspectiveLens()
lens.setFov(base.camLens.getFov())
lens.setFov(39.3076)
g = lens.makeGeometry()
gn = GeomNode('frustum')
gn.addGeom(g)
gnp = camModel.attachNewNode(gn)
if not localAvatar.getShip():
for camNum in range(1, 3):
camData = LiveCamTransforms[str(camNum)]
camParent = localAvatar.getParentObj().attachNewNode('liveCamParent-%s' % camNum)
camParent.setPos(camData[0])
camParent.setHpr(camData[1])
camParent.setScale(10)
camModel.instanceTo(camParent)
for camNum in range(3, 7):
camData = LiveCamTransforms[str(camNum)]
camParent = render.attachNewNode('liveCamParent-%s' % camNum)
camParent.setPos(camData[0])
camParent.setHpr(camData[1])
camParent.setScale(10)
camModel.instanceTo(camParent)
else:
if wordIs('~hideCams'):
render.findAllMatches('**/liveCamParent*').detach()
else:
if wordIs('~dropBlockers'):
ga = localAvatar.getParentObj()
blockers = ga.findAllMatches('**/blocker_*')
blockers.stash()
else:
if __dev__ and wordIs('~effects'):
args = word.split()
self.configEffects(args)
else:
if __dev__ and wordIs('~shipsRock'):
configIs = 'ships-rock'
args = word.split()
self.configShipsRock(configIs, args)
else:
if __dev__ and wordIs('~shipsRockWithoutWaves'):
configIs = 'ships-rock-without-waves'
args = word.split()
self.configShipsRock(configIs, args)
else:
if __dev__ and wordIs('~wantCompassTask'):
self.configToggleBool('want-compass-task')
else:
if __dev__ and wordIs('~wantPatchie'):
def turnOffSeapatch():
if hasattr(base.cr.activeWorld.worldGrid, 'cleanupWater'):
base.cr.activeWorld.worldGrid.cleanupWater()
def turnOnSeapatch():
if hasattr(base.cr.activeWorld.worldGrid, 'setupWater'):
base.cr.activeWorld.worldGrid.setupWater()
self.configToggleBool('want-compass-task', offCode=turnOffSeapatch, onCode=turnOnSeapatch)
else:
if __dev__ and wordIs('~wantShipColl'):
if localAvatar.ship and localAvatar.ship.controlManager.controls.has_key('ship'):
if localAvatar.ship.controlManager.controls['ship'].collisionsActive:
localAvatar.ship.controlManager.controls['ship'].setCollisionsActive(0)
self.setMagicWordResponse('ship collisions OFF')
else:
localAvatar.ship.controlManager.controls['ship'].setCollisionsActive()
self.setMagicWordResponse('ship collisions ON')
else:
self.setMagicWordResponse('get on a ship!')
else:
if __dev__ and wordIs('~wantCannonColl'):
if localAvatar.ship:
args = word.split()
if len(args) > 1:
type = int(args[1])
base.cr.cannonballCollisionDebug = type
else:
if base.cr.cannonballCollisionDebug == 0:
base.cr.cannonballCollisionDebug = 1
else:
base.cr.cannonballCollisionDebug = 0
if base.cr.cannonballCollisionDebug == 0:
self.setMagicWordResponse('cannonball collisions set to ALL OFF')
else:
if base.cr.cannonballCollisionDebug == 1:
self.setMagicWordResponse('cannonball collisions set to ALL ON')
else:
if base.cr.cannonballCollisionDebug == 2:
self.setMagicWordResponse('cannonball collisions set to Broadside ONLY ON')
else:
if base.cr.cannonballCollisionDebug == 3:
self.setMagicWordResponse('cannonball collisions set to Deck ONLY ON')
else:
self.setMagicWordResponse('get on a ship!')
else:
if __dev__ and wordIs('~wantEventCollider'):
self.configWantEventCollider()
else:
if __dev__ and wordIs('~wantFloorEventRay'):
self.configWantFloorEventRay()
else:
if __dev__ and wordIs('~optimized1'):
if not localAvatar.ship:
self.setMagicWordResponse('get on a ship FIRST')
self.configWantFloorEventRay()
self.configWantEventCollider()
self.configWantWaterRippleRay()
self.configToggleBool('want-compass-task')
configIs = 'ships-rock'
args = word.split()
self.configShipsRock(configIs, args)
self.configEffects(args)
else:
if __dev__ and wordIs('~optimized2'):
if not localAvatar.ship:
self.setMagicWordResponse('get on a ship FIRST')
self.configWantFloorEventRay()
self.configWantEventCollider()
self.configWantWaterRippleRay()
else:
if wordIs('~setCannonFireVis'):
args = word.split()
type = 'all'
if len(args) > 2:
if args[2] == 'broadside':
type = 'broadside'
else:
if args[2] == 'deck':
type = 'deck'
if len(args) > 1:
dist = int(args[1])
else:
if type == 'broadside':
dist = config.GetInt('cannon-fire-broadside-dist', 3500)
else:
dist = config.GetInt('cannon-fire-dist', 3500)
if type == 'all' or type == 'deck':
DistributedSimpleShip.DistributedSimpleShip.CannonFireDist = dist
self.setMagicWordResponse('setting deck cannon visibility distance to %s' % dist)
if type == 'all' or type == 'broadside':
DistributedSimpleShip.DistributedSimpleShip.CannonFireBroadsideDist = dist
self.setMagicWordResponse('setting broadside cannon visibility distance to %s' % dist)
else:
if wordIs('~setWakeVis'):
args = word.split()
dist = config.GetInt('ship-wake-dist', 3800)
if len(args) > 1:
dist = int(args[1])
DistributedSimpleShip.DistributedSimpleShip.ShipWakeDist = dist
self.setMagicWordResponse('setting wake visibility distance to %s' % dist)
else:
if wordIs('~setRockVis'):
args = word.split()
dist = config.GetInt('ship-rock-dist', 1000)
if len(args) > 1:
dist = int(args[1])
DistributedSimpleShip.DistributedSimpleShip.ShipRockDist = dist
self.setMagicWordResponse('setting rocking visibility distance to %s' % dist)
else:
if __dev__ and wordIs('~wantReducedShipColl'):
shipPilot = localAvatar.ship.controlManager.controls.get('ship')
shipCollNode = shipPilot.cNodePath.node()
if shipCollNode.getNumSolids() > 1:
shipCollNode.removeSolid(2)
shipCollNode.removeSolid(1)
self.setMagicWordResponse('removing mid and stern spheres')
else:
shipCollNode.addSolid(shipPilot.cMidSphere)
shipCollNode.addSolid(shipPilot.cSternSphere)
self.setMagicWordResponse('adding mid and stern spheres')
else:
if __dev__ and wordIs('~wantCollideMasks'):
args = word.split()
force = None
if len(args) > 1:
force = int(args[1])
from pirates.ship import DistributedSimpleShip
clientShips = filter(lambda x: isinstance(x, DistributedSimpleShip.DistributedSimpleShip) and x is not localAvatar.ship, base.cr.doId2do.values())
cleared = False
for currShip in clientShips:
shipCollWall = currShip.hull[0].collisions.find('**/collision_hull')
if not shipCollWall.isEmpty():
if shipCollWall.getCollideMask() & PiratesGlobals.ShipCollideBitmask == BitMask32.allOff():
shipCollWall.setCollideMask(shipCollWall.getCollideMask() | PiratesGlobals.ShipCollideBitmask)
else:
shipCollWall.setCollideMask(shipCollWall.getCollideMask() ^ PiratesGlobals.ShipCollideBitmask)
cleared = True
if cleared:
self.setMagicWordResponse('cleared ship collide bitmasks')
else:
self.setMagicWordResponse('set ship collide bitmasks')
else:
if __dev__ and wordIs('~saveCamera'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
id = cameraOV.saveFixture(int(args[0]))
else:
id = cameraOV.saveFixture()
self.setMagicWordResponse('camera saved: %d' % id)
else:
if __dev__ and wordIs('~removeCamera'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
cameraOV.removeFixture(int(args[0]))
else:
self.setMagicWordResponse('need camera id to remove')
else:
if __dev__ and wordIs('~standbyCamera'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
cameraOV.standbyFixture(int(args[0]))
else:
self.setMagicWordResponse('need camera id to standby')
else:
if __dev__ and wordIs('~blinkCamera'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
cameraOV.blinkFixture(int(args[0]))
else:
self.setMagicWordResponse('need camera id to blink')
else:
if __dev__ and wordIs('~testCamera'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
cameraOV.testFixture(int(args[0]))
else:
self.setMagicWordResponse('need camera id to test')
else:
if __dev__ and wordIs('~storeCameras'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
cameraOV.storeToFile(args[0])
else:
self.setMagicWordResponse('need name to store')
else:
if __dev__ and wordIs('~loadCameras'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
args = word.split()[1:]
if args:
cameraOV.loadFromFile(args[0])
else:
self.setMagicWordResponse('need name to load')
else:
if __dev__ and wordIs('~startRecording'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
cameraOV.startRecording()
else:
if __dev__ and wordIs('~stopRecording'):
camera = base.cr.doFind('DistributedCamera')
cameraOV = camera.getOV()
cameraOV.stopRecording()
else:
if __dev__ and base.config.GetBool('want-fishing-game', 0) and wordIs('~fishcam'):
self.toggleFishCam()
self.setMagicWordResponse('toggling fish cam')
else:
if wordIs('~fishR'):
self.doRequestFish(word, localAvatar, zoneId, localAvatar.doId)
else:
if wordIs('~leg'):
args = word.split()[1:]
if args:
base.fishingGame.wantLeg = args[0]
else:
base.fishingGame.wantLeg = 1
else:
if wordIs('~legWin'):
if hasattr(base, 'fishingGame'):
if base.fishingGame.fsm.getCurrentOrNextState() == 'LegendaryFish':
base.fishingGame.lfgFsm.request('Win')
else:
self.setMagicWordResponse('Not battling legendary fish! (use ~leg)')
else:
self.setMagicWordResponse('Fishing Game not started.')
else:
if wordIs('~cdunlockall'):
messenger.send('cdUnlockAll')
else:
if wordIs('~camSpin'):
args = word.split()
dist = 40
if len(args) > 1:
dist = float(args[1])
def spin(task=None):
localAvatar.cameraFSM.getCurrentCamera().setH(localAvatar.cameraFSM.getCurrentCamera().getH() + 1)
return Task.cont
if taskMgr.hasTaskNamed('camSpin'):
localAvatar.cameraFSM.getCurrentCamera().setH(0)
localAvatar.cameraFSM.getCurrentCamera()._setCamDistance(14)
localAvatar.cameraFSM.getCurrentCamera().forceMaxDistance = True
localAvatar.cameraFSM.getCurrentCamera()._startCollisionCheck()
taskMgr.remove('camSpin')
else:
localAvatar.cameraFSM.getCurrentCamera()._stopCollisionCheck()
localAvatar.cameraFSM.getCurrentCamera().forceMaxDistance = False
localAvatar.cameraFSM.getCurrentCamera()._setCamDistance(dist)
taskMgr.add(spin, 'camSpin')
else:
if wordIs('~hostilizeNear'):
interactivesNear = base.cr.interactionMgr.sortInteractives()
for currInteractive in interactivesNear:
if isinstance(currInteractive, DistributedNPCTownfolk.DistributedNPCTownfolk):
self.b_setMagicWord('~hostilize ' + str(currInteractive.doId))
return
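Every handler above follows the same convention: the magic word is the first whitespace token of `word` and everything after it is arguments. A minimal standalone sketch of that dispatch shape (the `dispatch` helper and the handler table are illustrative, not part of the game code):

```python
# Toy sketch of the magic-word dispatch convention used above:
# token 0 names the command, the remaining tokens are its arguments,
# and each handler returns a response string.
def dispatch(word, handlers):
    args = word.split()
    handler = handlers.get(args[0])
    if handler is None:
        return 'unknown magic word: %s' % args[0]
    return handler(args[1:])
```

A handler can report usage when called with too few arguments, mirroring the `~roll angle [fakeMass]` style responses above.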
def configEffects(self, args):
effectCats = args[1:]
def toggleEffects(on=None):
if effectCats:
for currEffectCat in effectCats:
if currEffectCat == 'clearCustom':
base.cr.effectToggles = {}
continue
if currEffectCat == 'listEffectCats':
response = 'known effect types are: \n%s' % base.cr.effectTypes.keys()
self.setMagicWordResponse(response)
continue
effectTypes = base.cr.effectTypes.get(currEffectCat, [currEffectCat])
for currEffectType in effectTypes:
newStatus = not base.cr.effectToggles.get(currEffectType, base.config.GetBool('want-special-effects', 1))
base.cr.effectToggles[currEffectType] = newStatus
response = 'effect %s set to %s' % (currEffectType, choice(newStatus, 'ON', 'OFF'))
self.setMagicWordResponse(response)
base.cr.wantSpecialEffects = base.config.GetBool('want-special-effects', 1)
from pirates.ship import DistributedSimpleShip
clientShips = filter(lambda x: isinstance(x, DistributedSimpleShip.DistributedSimpleShip), base.cr.doId2do.values())
if base.cr.queryShowEffect('BlackSmoke') or base.cr.queryShowEffect('Fire'):
for ship in clientShips:
if base.cr.queryShowEffect('BlackSmoke'):
ship.startSmoke()
if base.cr.queryShowEffect('Fire'):
ship.startFire()
else:
if not base.cr.queryShowEffect('BlackSmoke') or not base.cr.queryShowEffect('Fire'):
for ship in clientShips:
if not base.cr.queryShowEffect('BlackSmoke'):
ship.stopSmoke()
if not base.cr.queryShowEffect('Fire'):
ship.stopFire()
if effectCats:
toggleEffects()
else:
self.configToggleBool('want-special-effects', offCode=lambda p1=False: toggleEffects(p1), onCode=lambda p1=True: toggleEffects(p1))
return
def configWantEventCollider(self):
currControls = localAvatar.controlManager.currentControls
if currControls == None:
return
colliderExists = base.shadowTrav.hasCollider(currControls.cEventSphereNodePath) or currControls.cTrav.hasCollider(currControls.cEventSphereNodePath)
if colliderExists:
currControls.cTrav.removeCollider(currControls.cEventSphereNodePath)
base.shadowTrav.removeCollider(currControls.cEventSphereNodePath)
currControls.pusher.addInPattern('enter%in')
currControls.pusher.addOutPattern('exit%in')
self.setMagicWordResponse('event sphere OFF')
else:
currControls.pusher.clearInPatterns()
currControls.pusher.clearOutPatterns()
avatarRadius = 1.4
base.shadowTrav.addCollider(currControls.cEventSphereNodePath, currControls.event)
self.setMagicWordResponse('event sphere ON')
return
def configWantFloorEventRay(self):
if localAvatar.cTrav.hasCollider(localAvatar.cFloorNodePath):
localAvatar.cTrav.removeCollider(localAvatar.cFloorNodePath)
self.setMagicWordResponse('floor event ray OFF')
else:
localAvatar.cTrav.addCollider(localAvatar.cFloorNodePath, localAvatar.floorEventHandler)
self.setMagicWordResponse('floor event ray ON')
def configWantWaterRippleRay(self):
if localAvatar.cTrav.hasCollider(localAvatar.cWaterNodePath):
localAvatar.cTrav.removeCollider(localAvatar.cWaterNodePath)
self.setMagicWordResponse('water ripple ray OFF')
else:
localAvatar.cTrav.addCollider(localAvatar.cWaterNodePath, localAvatar.waterEventHandler)
self.setMagicWordResponse('water ripple ray ON')
def configWantShadowPlacer(self):
if localAvatar.shadowPlacer.cTrav.hasCollider(localAvatar.shadowPlacer.cRayNodePath):
localAvatar.shadowPlacer.cTrav.removeCollider(localAvatar.shadowPlacer.cRayNodePath)
self.setMagicWordResponse('shadow placer ray OFF')
else:
localAvatar.shadowPlacer.cTrav.addCollider(localAvatar.shadowPlacer.cRayNodePath, localAvatar.shadowPlacer.lifter)
self.setMagicWordResponse('shadow placer ray ON')
def configShipsRock(self, configIs, args):
onlyPlayerRocks = False
if len(args) > 1:
if args[1] == 'playerOnly':
onlyPlayerRocks = True
if config.GetInt(configIs, 1) in (1, 2):
    ConfigVariableInt(configIs).setValue(0)
    self.setMagicWordResponse('%s OFF (all ships)' % configIs)
else:
if onlyPlayerRocks:
ConfigVariableInt(configIs).setValue(2)
self.setMagicWordResponse('%s ON (local player ship only)' % configIs)
else:
ConfigVariableInt(configIs).setValue(1)
self.setMagicWordResponse('%s ON (all ships)' % configIs)
def configToggleBool(self, configName, defaultVal=1, offCode=None, onCode=None):
currVal = not config.GetBool(configName, defaultVal)
loadPrcFileData('', '%s %s' % (configName, currVal))
self.setMagicWordResponse('%s %s' % (configName, choice(currVal, 'ON', 'OFF')))
if currVal:
onCode and onCode()
else:
if not currVal and offCode:
offCode()
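configToggleBool above flips a boolean config value and fires the matching on/off callback. A standalone sketch of that pattern, using a plain dict in place of Panda3D's config store (an assumption of this sketch):

```python
# Standalone sketch of the configToggleBool pattern: flip a named boolean
# setting, then run the callback matching the new state. The `settings`
# dict stands in for the engine's config system here.
def toggle_bool(settings, name, default=True, on_code=None, off_code=None):
    new_val = not settings.get(name, default)
    settings[name] = new_val
    if new_val and on_code:
        on_code()
    elif not new_val and off_code:
        off_code()
    return new_val
```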
def cameraFollowTgt(self, target, parentId):
localAvatar.cTrav.removeCollider(localAvatar.cFloorNodePath)
localAvatar.controlManager.use('observer', localAvatar)
localAvatar.controlManager.currentControls.disableAvatarControls()
localAvatar.guiMgr.setIgnoreAllKeys(True)
localAvatar.guiMgr.combatTray.skillMapping.clear()
localAvatar.reparentTo(target)
localAvatar.setScale(1)
parentObj = base.cr.doId2do[parentId]
localAvatar.setPos(0, 0, 0)
localAvatar.setHpr(render, target.getHpr(render))
localAvatar.stash()
if self.pendingCameraReparent:
base.cr.relatedObjectMgr.abortRequest(self.pendingCameraReparent)
self.pendingCameraReparent = None
return
def cameraUnfollowTgt(self, target):
localAvatar.cTrav.addCollider(localAvatar.cFloorNodePath, localAvatar.floorEventHandler)
localAvatar.controlManager.currentControls.enableAvatarControls()
localAvatar.controlManager.use('walk', localAvatar)
localAvatar.guiMgr.setIgnoreAllKeys(False)
localAvatar.guiMgr.combatTray.initCombatTray()
localAvatar.unstash()
if hasattr(localAvatar, 'followTgt'):
del localAvatar.followTgt
def cameraReparent(self, targetId, targetParentId, zoneId):
targetObj = base.cr.doId2do.get(targetParentId)
if targetObj:
if not isinstance(targetObj, NodePath):
return
currParentObj = localAvatar.getParentObj()
if self.originalLocation == None:
self.originalLocation = [
localAvatar.getLocation(), localAvatar.getPos(currParentObj)]
prevPos = None
if targetId == 0:
if targetParentId == 0 and zoneId == 0 and self.originalLocation:
targetParentId = self.originalLocation[0][0]
zoneId = self.originalLocation[0][1]
prevPos = self.originalLocation[1]
self.originalLocation = None
targetObj = base.cr.doId2do.get(targetParentId)
if targetObj == None or not isinstance(targetObj, NodePath):
self.notify.debug('Parent of target object to reparent avatar/camera to does not yet exist, skipping reparent request')
return
newPos = prevPos
else:
newPos = Point3(*targetObj.getZoneCellOriginCenter(zoneId))
localAvatar.reparentTo(targetObj)
localAvatar.setPos(newPos)
localAvatar.isGhosting = True
if base.cr.doId2do.has_key(targetId):
    self.cameraFollowTgt(base.cr.doId2do[targetId], targetParentId)
else:
if targetId:
self.pendingCameraReparent = base.cr.relatedObjectMgr.requestObjects([targetId], eachCallback=lambda param=None, param2=targetParentId: self.cameraFollowTgt(param, param2))
else:
if self.pendingCameraReparent:
base.cr.relatedObjectMgr.abortRequest(self.pendingCameraReparent)
self.pendingCameraReparent = None
self.cameraUnfollowTgt(targetObj)
localAvatar.isGhosting = False
return
def shipCreated(self, shipId):
return
print 'shipCreated(%s)' % shipId
ship = base.cr.doId2do.get(shipId)
if ship:
print 'ship created: %s' % ship
ship.localAvatarInstantBoard()
ship.enableOnDeckInteractions()
def toggleFishCam(self):
self.fishCamEnabled = not self.fishCamEnabled
if self.fishCamEnabled:
base.oobe()
base.oobeCamera.setPos(-13.0, 4.0, -6.0)
base.oobeCamera.setHpr(90.0, 0.0, 0.0)
from pandac.PandaModules import CardMaker
from direct.interval.IntervalGlobal import PosInterval, ProjectileInterval, Sequence, Wait
cm = CardMaker('fishBackdrop')
self.fishBackdrop = render.attachNewNode(cm.generate())
tex = loader.loadTexture('maps/underseaBackdrop.jpg')
self.fishBackdrop.setTexture(tex)
self.fishBackdrop.reparentTo(localAvatar)
self.fishBackdrop.setHpr(90, 0, 0)
self.fishBackdrop.setPos(0, -100, -108.7)
self.fishBackdrop.setScale(400, 1, 100)
self.fishBackdrop.setBin('ground', 20)
self.fishBackdrop.setDepthWrite(0)
self.fishCamProjectileInterval = Sequence(Wait(4), ProjectileInterval(base.oobeCamera, startPos=Point3(-13.0, 4.0, -6.0), endPos=Point3(-13.0, 164.0, -36.0), duration=3), ProjectileInterval(base.oobeCamera, startPos=Point3(-13.0, 164.0, -36.0), endPos=Point3(-13.0, 4.0, -24.0), gravityMult=-0.5, duration=5), base.oobeCamera.posInterval(5, Point3(-13.0, 4.0, -6.0)))
self.fishCamProjectileInterval.start()
else:
self.fishCamProjectileInterval.finish()
del self.fishCamProjectileInterval
self.fishBackdrop.reparentTo(hidden)
del self.fishBackdrop
base.oobe()
def doRequestFish(self, word, av, zoneId, senderId):
args = word.split()
doid = args[1]
spot = self.cr.doId2do[int(doid)]
spot.requestInteraction(localAvatar.doId)
# Python/DDUtil.py (from dalek7/umbrella)
import os
import datetime
def exit():
os._exit(0)
def GetTimeString(m = -1):
if m==0:
s1 = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
else:
s1 = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
return s1
def MakeDir(directory):
if not os.path.exists(directory):
os.makedirs(directory)
# picoCTF-web/tests/api/functional/common.py (from MongYahHsieh/picoCTF)
import datetime
import json
import re
import pymongo
import pytest
import api
RATE_LIMIT_BYPASS = "test_bypass"
TESTING_DB_NAME = 'ctf_test'
db = None
def decode_response(res):
"""Parse a WebSuccess or WebError response."""
decoded_dict = json.loads(res.data.decode('utf-8'))
return (decoded_dict['status'], decoded_dict['message'],
decoded_dict['data'])
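decode_response only needs an object exposing a `.data` bytes attribute, so its parsing can be exercised without a real Flask client. A hedged sketch with a stub response object (the `FakeResponse` class and `decode` helper are illustrative, not part of this module):

```python
import json

# Minimal stub standing in for a Flask test-client response; only the
# `.data` bytes attribute that decode_response() reads is modeled.
class FakeResponse:
    def __init__(self, payload):
        self.data = json.dumps(payload).encode('utf-8')

# Same parse as decode_response(): unpack the WebSuccess/WebError JSON
# body into a (status, message, data) tuple.
def decode(res):
    d = json.loads(res.data.decode('utf-8'))
    return (d['status'], d['message'], d['data'])
```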
def get_csrf_token(res):
"""Extract the CSRF token from a response."""
for header in res.headers:
m = re.search('token=(.+?);', header[1])
if m:
return m.group(1)
raise RuntimeError('Could not find CSRF token in response headers: ' + str(res.headers))
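The extraction above boils down to one non-greedy regex applied to each header value. The same match in isolation (the `extract_token` helper name is illustrative):

```python
import re

# Same non-greedy pattern used by get_csrf_token(), applied to a single
# Set-Cookie style header value; returns None when no token is present.
def extract_token(header_value):
    m = re.search('token=(.+?);', header_value)
    return m.group(1) if m else None
```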
def get_conn():
"""Get a connection to the testing database."""
global db
if db is None:
client = pymongo.MongoClient(host='127.0.0.1', port=27018)
db = client[TESTING_DB_NAME]
return db
def clear_db():
"""Clear out the testing database."""
db = get_conn()
db.command('dropDatabase')
@pytest.fixture
def client():
"""Create a test client of the Flask app."""
app = api.create_app({
'TESTING': True,
'MONGO_DB_NAME': TESTING_DB_NAME,
'MONGO_PORT': 27018,
'RATE_LIMIT_BYPASS': RATE_LIMIT_BYPASS
})
return app.test_client()
def app():
"""Create an instance of the Flask app for testing."""
app = api.create_app({
'TESTING': True,
'MONGO_DB_NAME': TESTING_DB_NAME,
'MONGO_PORT': 27018
})
return app
def cache(f, *args, **kwargs):
    """Call a memoized function, forcing it to recompute its result."""
result = f(reset_cache=True, *args, **kwargs)
return result
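cache() assumes the wrapped function understands a `reset_cache` keyword, as the API's memoization layer does. A hypothetical memoizer showing that convention (a sketch, not the project's actual implementation):

```python
# Hypothetical memoizer illustrating the reset_cache convention that
# cache() relies on: reset_cache=True forces the wrapped function to
# recompute instead of returning the stored result.
def memoize(f):
    results = {}
    def wrapper(*args, reset_cache=False):
        if reset_cache or args not in results:
            results[args] = f(*args)
        return results[args]
    return wrapper
```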
def update_all_scoreboards():
    """Recompute the cached team, ineligible-inclusive, and group scores."""
api.stats.get_all_team_scores()
api.stats.get_all_team_scores(include_ineligible=True)
for group in api.group.get_all_groups():
api.stats.get_group_scores(gid=group['gid'])
ADMIN_DEMOGRAPHICS = {
'username': 'adminuser',
'password': 'adminpw',
'firstname': 'Admin',
'lastname': 'User',
'email': 'admin@example.com',
'country': 'US',
'affiliation': 'Admin School',
'usertype': 'other',
'demo': {
'parentemail': 'admin@example.com',
'age': '18+'
},
'gid': None,
'rid': None
}
TEACHER_DEMOGRAPHICS = {
'username': 'teacheruser',
'password': 'teacherpw',
'firstname': 'Teacher',
'lastname': 'User',
'email': 'teacher@example.com',
'country': 'US',
'affiliation': 'Sample School',
'usertype': 'teacher',
'demo': {
'parentemail': 'teacher@example.com',
'age': '18+'
},
'gid': None,
'rid': None
}
STUDENT_DEMOGRAPHICS = {
'username': 'studentuser',
'password': 'studentpw',
'firstname': 'Student',
'lastname': 'User',
'email': 'student@example.com',
'country': 'US',
'affiliation': 'Sample School',
'usertype': 'student',
'demo': {
'parentemail': 'student@example.com',
'age': '13-17'
},
'gid': None,
'rid': None
}
STUDENT_2_DEMOGRAPHICS = {
'username': 'studentuser2',
'password': 'studentpw2',
'firstname': 'Student',
'lastname': 'Usertwo',
'email': 'student2@example.com',
'country': 'US',
'affiliation': 'Sample School',
'usertype': 'student',
'demo': {
'parentemail': 'student2@example.com',
'age': '18+'
},
'gid': None,
'rid': None
}
OTHER_USER_DEMOGRAPHICS = {
'username': 'otheruser',
'password': 'otherpw',
'firstname': 'Other',
'lastname': 'User',
'email': 'other@example.com',
'country': 'US',
'affiliation': 'Sample Organization',
'usertype': 'other',
'demo': {
'age': '18+'
},
'gid': None,
'rid': None
}
def register_test_accounts():
"""
Register an admin, teacher, and student account with known demographics.
Intended to be used, if needed, in conjunction with clear_db()
to set up a clean environment for each test.
"""
with app().app_context():
api.user.add_user(ADMIN_DEMOGRAPHICS)
api.user.add_user(TEACHER_DEMOGRAPHICS)
api.user.add_user(STUDENT_DEMOGRAPHICS)
api.user.add_user(STUDENT_2_DEMOGRAPHICS)
api.user.add_user(OTHER_USER_DEMOGRAPHICS)
sample_shellserver_publish_output = r'''
{
"problems": [
{
"name": "ECB 1",
"category": "Cryptography",
"description": "There is a crypto service running at {{server}}:{{port}}. We were able to recover the source code, which you can download at {{url_for(\"ecb.py\")}}.",
"hints": [],
"walkthrough": "Let me google that for you.",
"score": 70,
"author": "Tim Becker",
"organization": "ForAllSecure",
"event": "Sample",
"pip_requirements": [
"pycrypto"
],
"pip_python_version": "3",
"unique_name": "ecb-1-b06174a",
"instances": [
{
"user": "ecb-1_0",
"deployment_directory": "/problems/ecb-1_0_73a0108a98d2862a86f4b71534aaf7c3",
"service": "ecb-1_0",
"socket": null,
"server": "192.168.2.3",
"description": "There is a crypto service running at 192.168.2.3:46981. We were able to recover the source code, which you can download at <a href='//192.168.2.3/static/fd59acc6b8d2359d48bd939a08ecb8ab/ecb.py'>ecb.py</a>.",
"flag": "49e56ea9bf2e2b60ba9af034b5b2a5fd",
"flag_sha1": "77cec418714d6eb0dc48afa6d6f38200402a83c0",
"instance_number": 0,
"should_symlink": false,
"files": [
{
"path": "flag",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "key",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "ecb.py",
"permissions": 1517,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
}
],
"port": 46981
},
{
"user": "ecb-1_1",
"deployment_directory": "/problems/ecb-1_1_83b2ed9a1806c86219347bc4982a66de",
"service": "ecb-1_1",
"socket": null,
"server": "192.168.2.3",
"description": "There is a crypto service running at 192.168.2.3:21953. We were able to recover the source code, which you can download at <a href='//192.168.2.3/static/beb9874a05a1810fa8c9d79152ace1b3/ecb.py'>ecb.py</a>.",
"flag": "85a32ccd05fa30e0efd8da555c1a101a",
"flag_sha1": "f28581a86561c885152f7622200057585787c063",
"instance_number": 1,
"should_symlink": false,
"files": [
{
"path": "flag",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "key",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "ecb.py",
"permissions": 1517,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
}
],
"port": 21953
},
{
"user": "ecb-1_2",
"deployment_directory": "/problems/ecb-1_2_1998c2cc0f0d17ae54170200f5478b7f",
"service": "ecb-1_2",
"socket": null,
"server": "192.168.2.3",
"description": "There is a crypto service running at 192.168.2.3:17648. We were able to recover the source code, which you can download at <a href='//192.168.2.3/static/19e863cba0bf14ad676e4b4799eacc72/ecb.py'>ecb.py</a>.",
"flag": "f76d2f6b885255450ed2f7307d96e28e",
"flag_sha1": "43cf6f1dab026cf2100e2f663509512416112219",
"instance_number": 2,
"should_symlink": false,
"files": [
{
"path": "flag",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "key",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "ecb.py",
"permissions": 1517,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
}
],
"port": 17648
}
],
"sanitized_name": "ecb-1"
},
{
"name": "SQL Injection 1",
"category": "Web Exploitation",
"pkg_dependencies": [
"php7.2-sqlite3"
],
"description": "There is a website running at http://{{server}}:{{port}}. Try to see if you can login!",
"score": 40,
"hints": [],
"author": "Tim Becker",
"organization": "ForAllSecure",
"event": "Sample",
"unique_name": "sql-injection-1-0c436d0",
"instances": [
{
"user": "sql-injection-1_0",
"deployment_directory": "/problems/sql-injection-1_0_9e114b246c48eb158b16525f71ae2a00",
"service": "sql-injection-1_0",
"socket": null,
"server": "192.168.2.3",
"description": "There is a website running at http://192.168.2.3:46984. Try to see if you can login!",
"flag": "9ac0a74de6bced3cdce8e7fd466f32d0",
"flag_sha1": "958416d52940e4948eca8d9fb1eca21e4cf7eda1",
"instance_number": 0,
"should_symlink": false,
"files": [
{
"path": "webroot/index.html",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/login.php",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/login.phps",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/config.php",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "users.db",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
}
],
"port": 46984
},
{
"user": "sql-injection-1_1",
"deployment_directory": "/problems/sql-injection-1_1_10a4b1cdfd3a0f78d0d8b9759e6d69c5",
"service": "sql-injection-1_1",
"socket": null,
"server": "192.168.2.3",
"description": "There is a website running at http://192.168.2.3:21955. Try to see if you can login!",
"flag": "28054fef0f362256c78025f82e6572c3",
"flag_sha1": "f57fa5d3861c22a657eecafe30a43bd4ad7a4a2a",
"instance_number": 1,
"should_symlink": false,
"files": [
{
"path": "webroot/index.html",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/login.php",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/login.phps",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/config.php",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "users.db",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
}
],
"port": 21955
},
{
"user": "sql-injection-1_2",
"deployment_directory": "/problems/sql-injection-1_2_57a103ad26a005f69b4332e62d611372",
"service": "sql-injection-1_2",
"socket": null,
"server": "192.168.2.3",
"description": "There is a website running at http://192.168.2.3:17649. Try to see if you can login!",
"flag": "6ed19af4c4540d444ae08735aa5664af",
"flag_sha1": "19bbc88ca231ddfde8063acdda75a92b1e6fd993",
"instance_number": 2,
"should_symlink": false,
"files": [
{
"path": "webroot/index.html",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/login.php",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/login.phps",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "webroot/config.php",
"permissions": 436,
"user": null,
"group": null
},
{
"path": "users.db",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
},
{
"path": "xinet_startup.sh",
"permissions": 1517,
"user": null,
"group": null
}
],
"port": 17649
}
],
"sanitized_name": "sql-injection-1"
},
{
"name": "Buffer Overflow 1",
"category": "Binary Exploitation",
"description": "Exploit the {{url_for(\"vuln\", display=\"Buffer Overflow\")}} found here: {{directory}}.",
"score": 50,
"hints": [
"This is a classic buffer overflow with no modern protections."
],
"walkthrough": "PROTIP: Find the correct answer to get the points.",
"author": "Tim Becker",
"organization": "ForAllSecure",
"event": "Sample",
"unique_name": "buffer-overflow-1-35e6d9d",
"instances": [
{
"user": "buffer-overflow-1_0",
"deployment_directory": "/problems/buffer-overflow-1_0_bab40cd8ebd7845e1c4c2951c6f82e1f",
"service": null,
"socket": null,
"server": "192.168.2.3",
"description": "Exploit the <a href='//192.168.2.3/static/bd08ee41f495f8bff378c13157d0f511/vuln'>Buffer Overflow</a> found here: /problems/buffer-overflow-1_0_bab40cd8ebd7845e1c4c2951c6f82e1f.",
"flag": "638608c79eca2165e7b241ff365df05b",
"flag_sha1": "4b97abef055a11ec19c14622eb31eb1168d98aca",
"instance_number": 0,
"should_symlink": true,
"files": [
{
"path": "flag.txt",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "vuln",
"permissions": 1517,
"user": null,
"group": null
}
]
},
{
"user": "buffer-overflow-1_1",
"deployment_directory": "/problems/buffer-overflow-1_1_f49b6bd5da29513569bd87f98a934fa6",
"service": null,
"socket": null,
"server": "192.168.2.3",
"description": "Exploit the <a href='//192.168.2.3/static/c95410042007bb17f49b891a2a87afb2/vuln'>Buffer Overflow</a> found here: /problems/buffer-overflow-1_1_f49b6bd5da29513569bd87f98a934fa6.",
"flag": "35013564b97b80d4fd3f2be45e5836ff",
"flag_sha1": "5675d2d5819084d4203c1ef314239527074938a9",
"instance_number": 1,
"should_symlink": true,
"files": [
{
"path": "flag.txt",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "vuln",
"permissions": 1517,
"user": null,
"group": null
}
]
},
{
"user": "buffer-overflow-1_2",
"deployment_directory": "/problems/buffer-overflow-1_2_6c4daed04928f80dd29290060827be61",
"service": null,
"socket": null,
"server": "192.168.2.3",
"description": "Exploit the <a href='//192.168.2.3/static/dbeb4d34945e752ea988dcdb4454f57d/vuln'>Buffer Overflow</a> found here: /problems/buffer-overflow-1_2_6c4daed04928f80dd29290060827be61.",
"flag": "8dfabcb5c4a18d03ad5ecea19eef27a6",
"flag_sha1": "aef4789685665a1bf4994d62ef10941dbce5647a",
"instance_number": 2,
"should_symlink": true,
"files": [
{
"path": "flag.txt",
"permissions": 288,
"user": null,
"group": null
},
{
"path": "vuln",
"permissions": 1517,
"user": null,
"group": null
}
]
}
],
"sanitized_name": "buffer-overflow-1"
}
],
"bundles": [
{
"name": "Challenge Sampler",
"author": "Christopher Ganas",
"description": "Dependency weightmap for the example challenges provided in the picoCTF-Problems repository.",
"dependencies": {
"ecb-1-b06174a": {
"threshold": 1,
"weightmap": {
"buffer-overflow-1-35e6d9d": 1
}
},
"sql-injection-1-0c436d0": {
"threshold": 1,
"weightmap": {
"buffer-overflow-1-35e6d9d": 1,
"ecb-1-b06174a": 1
}
}
}
}
],
"sid": "728f36885f7c4686805593b9e4988c30"
}
'''
problems_endpoint_response = [{'name': 'SQL Injection 1', 'category': 'Web Exploitation', 'description': 'There is a website running at http://192.168.2.3:17648. Try to see if you can login!', 'score': 40, 'hints': [], 'author': 'Tim Becker', 'organization': 'ForAllSecure', 'sanitized_name': 'sql-injection-1', 'disabled': False, 'pid': '4508167aa0b219fd9d131551d10aa58e', 'solves': 0, 'socket': None, 'server': '192.168.2.3', 'port': 17648, 'server_number': 1, 'solved': False, 'unlocked': True}, {'name': 'Buffer Overflow 1', 'category': 'Binary Exploitation', 'description': "Exploit the <a href='//192.168.2.3/static/bd08ee41f495f8bff378c13157d0f511/vuln'>Buffer Overflow</a> found here: /problems/buffer-overflow-1_0_bab40cd8ebd7845e1c4c2951c6f82e1f.", 'score': 50, 'hints': ['This is a classic buffer overflow with no modern protections.'], 'author': 'Tim Becker', 'organization': 'ForAllSecure', 'sanitized_name': 'buffer-overflow-1', 'disabled': False, 'pid': '1bef644c399e10a3f35fecdbf590bd0c', 'solves': 0, 'socket': None, 'server': '192.168.2.3', 'server_number': 1, 'solved': False, 'unlocked': True}, {'name': 'ECB 1', 'category': 'Cryptography', 'description': "There is a crypto service running at 192.168.2.3:21953. We were able to recover the source code, which you can download at <a href='//192.168.2.3/static/beb9874a05a1810fa8c9d79152ace1b3/ecb.py'>ecb.py</a>.", 'hints': [], 'score': 70, 'author': 'Tim Becker', 'organization': 'ForAllSecure', 'sanitized_name': 'ecb-1', 'disabled': False, 'pid': '7afda419da96e8471b49df9c2009e2ef', 'solves': 0, 'socket': None, 'server': '192.168.2.3', 'port': 21953, 'server_number': 1, 'solved': False, 'unlocked': True}]
def load_sample_problems():
"""Load the sample problems and bundle into the DB."""
with app().app_context():
db = get_conn()
db.shell_servers.insert_one({
'sid': '728f36885f7c4686805593b9e4988c30',
'name': 'Test shell server',
'host': 'testing.picoctf.com',
'port': '22',
'username': 'username',
'password': 'password',
'protocol': 'HTTPS',
'server_number': 1
})
api.problem.load_published(
json.loads(sample_shellserver_publish_output)
)
def enable_sample_problems():
"""Enable any sample problems in the DB."""
db = get_conn()
db.problems.update_many({}, {'$set': {'disabled': False}})
def ensure_within_competition():
"""Adjust the competition times so that protected methods are callable."""
db = get_conn()
db.settings.update_one({}, {'$set': {
'start_time': datetime.datetime.utcnow() - datetime.timedelta(1),
'end_time': datetime.datetime.utcnow() + datetime.timedelta(1),
}})
def ensure_before_competition():
"""Adjust the competition times so that @block_before_competition fails."""
db = get_conn()
db.settings.update_one({}, {'$set': {
'start_time': datetime.datetime.utcnow() + datetime.timedelta(11),
'end_time': datetime.datetime.utcnow() + datetime.timedelta(10),
}})
def ensure_after_competition():
"""Adjust the competition times so that @block_before_competition fails."""
db = get_conn()
db.settings.update_one({}, {'$set': {
'start_time': datetime.datetime.utcnow() - datetime.timedelta(11),
'end_time': datetime.datetime.utcnow() - datetime.timedelta(10),
}})
def get_problem_key(pid, team_name):
"""Get the flag for a given pid and team name."""
db = get_conn()
assigned_instance_id = db.teams.find_one({
'team_name': team_name
})['instances'][pid]
problem_instances = db.problems.find_one({
'pid': pid
})['instances']
assigned_instance = None
for instance in problem_instances:
if instance['iid'] == assigned_instance_id:
assigned_instance = instance
break
return assigned_instance['flag']
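The instance-lookup logic in `get_problem_key` can be exercised without a live MongoDB connection. The sketch below mirrors the same team → assigned instance → flag resolution using plain dicts in place of the `teams` and `problems` collections; the data values are made up for illustration:

```python
# Standalone sketch of get_problem_key's lookup, with plain dicts standing
# in for the MongoDB documents. IDs and flags here are hypothetical.
def lookup_flag(team_doc, problem_doc, pid):
    """Resolve the flag for the instance assigned to a team for one problem."""
    assigned_iid = team_doc['instances'][pid]
    for instance in problem_doc['instances']:
        if instance['iid'] == assigned_iid:
            return instance['flag']
    return None  # no instance matched the assignment

team = {'team_name': 'demo', 'instances': {'pid-1': 'iid-b'}}
problem = {'pid': 'pid-1', 'instances': [
    {'iid': 'iid-a', 'flag': 'flag-a'},
    {'iid': 'iid-b', 'flag': 'flag-b'},
]}
print(lookup_flag(team, problem, 'pid-1'))  # flag-b
```

The real helper additionally performs the two `find_one` queries shown above; only the matching loop is reproduced here.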
| 34.621969 | 1,680 | 0.486279 | 2,130 | 24,270 | 5.430516 | 0.177934 | 0.026973 | 0.043832 | 0.057318 | 0.596006 | 0.565229 | 0.512147 | 0.478689 | 0.435722 | 0.42604 | 0 | 0.103791 | 0.37396 | 24,270 | 700 | 1,681 | 34.671429 | 0.657496 | 0.032798 | 0 | 0.465517 | 0 | 0.023511 | 0.729656 | 0.124866 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023511 | false | 0.012539 | 0.009404 | 0 | 0.043887 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02223351c3f6f455c742ce52e04a38d560dc3455 | 299 | py | Python | src/z3c/saconfig/__init__.py | zopefoundation/z3c.saconfig | 69a32e7f7617ec4a1f9667d673a1ddc00aff59c2 | [
"ZPL-2.1"
] | 2 | 2016-03-12T14:22:23.000Z | 2019-05-22T04:18:26.000Z | src/z3c/saconfig/__init__.py | zopefoundation/z3c.saconfig | 69a32e7f7617ec4a1f9667d673a1ddc00aff59c2 | [
"ZPL-2.1"
] | 13 | 2015-05-05T12:27:48.000Z | 2021-05-20T11:11:49.000Z | src/z3c/saconfig/__init__.py | zopefoundation/z3c.saconfig | 69a32e7f7617ec4a1f9667d673a1ddc00aff59c2 | [
"ZPL-2.1"
] | 4 | 2015-05-04T12:18:31.000Z | 2019-11-18T09:47:31.000Z | from z3c.saconfig.scopedsession import Session, named_scoped_session
from z3c.saconfig.utility import (
GloballyScopedSession, SiteScopedSession, EngineFactory)
__all__ = [
'Session',
'named_scoped_session',
'GloballyScopedSession',
'SiteScopedSession',
'EngineFactory',
]
| 23 | 68 | 0.752508 | 25 | 299 | 8.68 | 0.52 | 0.064516 | 0.138249 | 0.230415 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007937 | 0.157191 | 299 | 12 | 69 | 24.916667 | 0.853175 | 0 | 0 | 0 | 0 | 0 | 0.26087 | 0.070234 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0223c05bd579183b627da44b67aca37eba1114e5 | 557 | py | Python | src/triage/experiments/singlethreaded.py | josephbajor/triage_NN | cbaee6e5a06e597c91fec372717d89a2b5f34fa5 | [
"MIT"
] | 160 | 2017-06-13T09:59:59.000Z | 2022-03-21T22:00:35.000Z | src/triage/experiments/singlethreaded.py | josephbajor/triage_NN | cbaee6e5a06e597c91fec372717d89a2b5f34fa5 | [
"MIT"
] | 803 | 2016-10-21T19:44:02.000Z | 2022-03-29T00:02:33.000Z | src/triage/experiments/singlethreaded.py | josephbajor/triage_NN | cbaee6e5a06e597c91fec372717d89a2b5f34fa5 | [
"MIT"
] | 59 | 2017-01-31T22:10:22.000Z | 2022-03-19T12:35:03.000Z | from triage.experiments import ExperimentBase
class SingleThreadedExperiment(ExperimentBase):
def process_query_tasks(self, query_tasks):
self.feature_generator.process_table_tasks(query_tasks)
def process_matrix_build_tasks(self, matrix_build_tasks):
self.matrix_builder.build_all_matrices(matrix_build_tasks)
def process_train_test_batches(self, batches):
self.model_train_tester.process_all_batches(batches)
def process_subset_tasks(self, subset_tasks):
self.subsetter.process_all_tasks(subset_tasks)
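`SingleThreadedExperiment` is pure delegation: each `process_*` hook hands its task list to one collaborator and runs it inline on the calling thread, in order. A minimal stand-alone sketch of the same pattern, using a stub component rather than the real triage classes (names are illustrative):

```python
class StubFeatureGenerator:
    # Stand-in for the real feature generator; just records what it ran.
    def process_table_tasks(self, tasks):
        return [f"ran {t}" for t in tasks]

class SingleThreaded:
    """Runs every task batch inline, in order, on the calling thread."""
    def __init__(self, feature_generator):
        self.feature_generator = feature_generator

    def process_query_tasks(self, query_tasks):
        # No pooling or queueing: the delegate executes synchronously.
        return self.feature_generator.process_table_tasks(query_tasks)

runner = SingleThreaded(StubFeatureGenerator())
print(runner.process_query_tasks(["q1", "q2"]))  # ['ran q1', 'ran q2']
```

The multiprocess variants of the experiment would override the same hooks to fan tasks out to workers instead.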
| 34.8125 | 66 | 0.800718 | 70 | 557 | 5.942857 | 0.385714 | 0.129808 | 0.115385 | 0.096154 | 0.110577 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13465 | 557 | 15 | 67 | 37.133333 | 0.863071 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.1 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
0236d15dce7606a0d8edbca50d378b142b6663f7 | 127 | py | Python | mynlp/__init__.py | Suneel123/mynlp | 9dcf6fb57df66ebd4a359b8cd866323f43bc8ec4 | [
"MIT"
] | null | null | null | mynlp/__init__.py | Suneel123/mynlp | 9dcf6fb57df66ebd4a359b8cd866323f43bc8ec4 | [
"MIT"
] | null | null | null | mynlp/__init__.py | Suneel123/mynlp | 9dcf6fb57df66ebd4a359b8cd866323f43bc8ec4 | [
"MIT"
] | null | null | null | """Top-level package for mynlp."""
__author__ = """Suneel Dondapati"""
__email__ = 'dsuneel1@gmail.com'
__version__ = '0.1.0'
| 21.166667 | 35 | 0.685039 | 16 | 127 | 4.6875 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0.11811 | 127 | 5 | 36 | 25.4 | 0.633929 | 0.220472 | 0 | 0 | 0 | 0 | 0.419355 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02426c5e9ebc5b6e7797b501d9a365d58338fa41 | 159 | py | Python | Defer/__init__.py | loynoir/defer.py | 46f37a046028b1854586301a45870c2b3a628f65 | [
"MIT"
] | null | null | null | Defer/__init__.py | loynoir/defer.py | 46f37a046028b1854586301a45870c2b3a628f65 | [
"MIT"
] | null | null | null | Defer/__init__.py | loynoir/defer.py | 46f37a046028b1854586301a45870c2b3a628f65 | [
"MIT"
] | null | null | null | __all__ = ['Defer']
from contextlib import contextmanager, ExitStack
@contextmanager
def Defer():
with ExitStack() as stack:
yield stack.callback
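`Defer` yields `ExitStack.callback`, so callers can register Go-style deferred calls; `ExitStack` unwinds them in LIFO order when the `with` block exits. A small usage sketch (the `Defer` definition is repeated so the block is self-contained):

```python
from contextlib import contextmanager, ExitStack

@contextmanager
def Defer():
    with ExitStack() as stack:
        yield stack.callback

order = []
with Defer() as defer:
    defer(order.append, 'first registered')
    defer(order.append, 'second registered')
    order.append('body')
# Callbacks fire last-in-first-out after the body finishes.
print(order)  # ['body', 'second registered', 'first registered']
```
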
| 19.875 | 48 | 0.72327 | 17 | 159 | 6.529412 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188679 | 159 | 7 | 49 | 22.714286 | 0.860465 | 0 | 0 | 0 | 0 | 0 | 0.031447 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0243fa264d20be4663ad37da1958e0275ed6a559 | 3,100 | py | Python | ArcGISDesktop/reconcile_post_versions.py | jonhusen/ArcGIS | 1d39a627888ce6039c490cdad810cd6d8035cb77 | [
"MIT"
] | null | null | null | ArcGISDesktop/reconcile_post_versions.py | jonhusen/ArcGIS | 1d39a627888ce6039c490cdad810cd6d8035cb77 | [
"MIT"
] | null | null | null | ArcGISDesktop/reconcile_post_versions.py | jonhusen/ArcGIS | 1d39a627888ce6039c490cdad810cd6d8035cb77 | [
"MIT"
] | null | null | null | """
Reconcile and post versions at ArcGIS 10.0.
TODO:WIP
"""
import arcpy, os
#Populate parent and child versions in the following manner ('Parent': 'Child', etc.). DO NOT LIST DEFAULT.
vTree = {'SDE.Parent':'SDE.Child','SDE.QA':'SDE.Edit'}
#Reconcile and post child versions with parent
def RecPostNonDefault(workspace,logWorkspace,logName):
outLog = open(os.path.join(logWorkspace, logName), 'w')
for key, val in vTree.iteritems():
arcpy.ReconcileVersion_management(workspace, val, key, "BY_OBJECT", "FAVOR_TARGET_VERSION", "NO_LOCK_ACQUIRED", "NO_ABORT", "POST")
print "Reconciling and posting {0} to {1}".format(val, key)
outLog.write("Reconciling and posting {0} to {1}".format(val, key))
outLog.write("\n")
outLog.close()
del outLog, key, val
#Reconcile and post parent versions with DEFAULT
def RecPostDefault(workspace,logWorkspace,logName2,defaultVersion):
outLog = open(os.path.join(logWorkspace, logName2), 'w')
#Reconcile and post parents with DEFAULT
for key, val in vTree.iteritems():
arcpy.ReconcileVersion_management(workspace, key, defaultVersion, "BY_OBJECT", "FAVOR_TARGET_VERSION", "NO_LOCK_ACQUIRED", "NO_ABORT", "POST")
print "Reconciling and posting {0} to DEFAULT".format(key)
outLog.write("Reconciling and posting {0} to DEFAULT".format(key))
outLog.write("\n")
outLog.close()
del outLog, key, val
def DeleteChildVersions(workspace):
arcpy.ClearWorkspaceCache_management()
for key, val in vTree.iteritems():
arcpy.DeleteVersion_management(workspace, val)
print "Deleted {0}".format(val)
def DeleteParentVersions(workspace):
arcpy.ClearWorkspaceCache_management()
for key, val in vTree.iteritems():
arcpy.DeleteVersion_management(workspace, key)
print "Deleted {0}".format(key)
#Compress database
def Compress(workspace,logWorkspace,logName3):
arcpy.ClearWorkspaceCache_management()
outLog = open(os.path.join(logWorkspace, logName3), 'w')
arcpy.Compress_management(workspace)
print ("Compressed database {0}".format(workspace))
outLog.write("Compressed database {0}".format(workspace))
outLog.close()
def RecreateVersions(workspace, defaultVersion):
for key, val in vTree.iteritems():
arcpy.CreateVersion_management(workspace,defaultVersion, key[4:], "PUBLIC")
print "Created version {0}".format(key)
arcpy.CreateVersion_management(workspace, key, val[4:], "PUBLIC")
print "Created version {0}".format(val)
if __name__=="__main__":
workspace = r"Database Connections\MXD2.sde"
defaultVersion = "sde.DEFAULT"
logName = "RecPostLog.txt"
logName2 = "RecPostDefaultLog.txt"
logName3 = "CompressLog.txt"
logWorkspace = r"C:\temp"
RecPostNonDefault(workspace,logWorkspace,logName)
RecPostDefault(workspace,logWorkspace,logName2,defaultVersion)
DeleteChildVersions(workspace)
DeleteParentVersions(workspace)
Compress(workspace,logWorkspace,logName3)
RecreateVersions(workspace, defaultVersion) | 40.789474 | 148 | 0.709677 | 352 | 3,100 | 6.164773 | 0.272727 | 0.02212 | 0.020737 | 0.025346 | 0.501382 | 0.448848 | 0.367742 | 0.323502 | 0.323502 | 0.323502 | 0 | 0.010109 | 0.170323 | 3,100 | 76 | 149 | 40.789474 | 0.833593 | 0.07871 | 0 | 0.267857 | 0 | 0 | 0.185556 | 0.007508 | 0 | 0 | 0 | 0.013158 | 0 | 0 | null | null | 0 | 0.017857 | null | null | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
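One caveat about the script above: `vTree` is a plain dict, and under the Python 2 interpreters this arcpy script targets (and CPython < 3.7 generally), dict iteration order is arbitrary, so the parent/child pairs are not guaranteed to reconcile in any particular sequence. If ordering ever matters — for example with deeper version trees — an ordered structure makes the traversal deterministic. A stand-alone sketch of that idea, no arcpy required (the version names mirror the script's sample data):

```python
from collections import OrderedDict

# Deterministic parent -> child mapping; insertion order is preserved.
v_tree = OrderedDict([
    ('SDE.Parent', 'SDE.Child'),
    ('SDE.QA', 'SDE.Edit'),
])

def reconcile_order(tree):
    """Yield (child, parent) pairs in the order they were declared."""
    return [(child, parent) for parent, child in tree.items()]

print(reconcile_order(v_tree))
# [('SDE.Child', 'SDE.Parent'), ('SDE.Edit', 'SDE.QA')]
```

The reconcile calls themselves would then loop over `reconcile_order(v_tree)` instead of a bare `iteritems()`.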
024cdbf14b841e1da6f77d24cda6ea8444019523 | 1,320 | py | Python | application/src/app_pkg/routes/get_messages.py | eyardley/CSC648-SoftwareEngineering-Snapster | 6dbe1cf9b34de6d6dbc6be75db3a34583f67c01a | [
"MIT"
] | null | null | null | application/src/app_pkg/routes/get_messages.py | eyardley/CSC648-SoftwareEngineering-Snapster | 6dbe1cf9b34de6d6dbc6be75db3a34583f67c01a | [
"MIT"
] | 3 | 2021-06-08T21:39:12.000Z | 2022-01-13T02:46:20.000Z | application/src/app_pkg/routes/get_messages.py | eyardley/CSC648-SoftwareEngineering-Snapster | 6dbe1cf9b34de6d6dbc6be75db3a34583f67c01a | [
"MIT"
] | 1 | 2021-05-09T21:01:28.000Z | 2021-05-09T21:01:28.000Z | # from flask import render_template, request, make_response, jsonify
# from src.app_pkg.routes.common import validate_helper
# from src.app_pkg import app, db
# from src.app_pkg.forms import MessageForm
#
# ################################################
# # Show All Messages / User Profile #
# ################################################
# # AUTHOR: Bakulia Kurmant
# # NOTE: This function handles the route of the show all message functionality.
# # It show the list of messages the user sent or received and single view message modal with message body
# # Once the Database manager API returns a result (as a list), it passes that resulting list
# # to the HTML page to be rendered.
#
#
# @app.route('/user_profile', method=['GET'])
# def all_messages(msg_id):
# isloggedin = validate_helper(request.cookies.get('token'))
#
# if not isloggedin:
# return render_template('search.html')
#
# msg_result_size = 0
# msg_results = []
# print('calling db...')
# msg_result_size, msg_results = db.get_all_messages(isloggedin, msg_id)
#
# if msg_result_size == 0:
# print("You have no messages!")
#
# return render_template('user_profile.html', isloggedin=isloggedin, msg_result_size=msg_result_size,
# msg_results=msg_results)
#
#
| 37.714286 | 106 | 0.641667 | 169 | 1,320 | 4.83432 | 0.491124 | 0.05508 | 0.079559 | 0.047736 | 0.056304 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001871 | 0.190152 | 1,320 | 34 | 107 | 38.823529 | 0.762395 | 0.869697 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
025c491da627375770263331eb452c03d4b317b0 | 431 | py | Python | src/terra/contracts/levana.py | fentas/staketaxcsv | ad37a32d8864111dbf88e926b80eb4ccacb921c6 | [
"MIT"
] | null | null | null | src/terra/contracts/levana.py | fentas/staketaxcsv | ad37a32d8864111dbf88e926b80eb4ccacb921c6 | [
"MIT"
] | null | null | null | src/terra/contracts/levana.py | fentas/staketaxcsv | ad37a32d8864111dbf88e926b80eb4ccacb921c6 | [
"MIT"
] | null | null | null | # known contracts from protocol
CONTRACTS = [
# NFT - Meteor Dust
"terra1p70x7jkqhf37qa7qm4v23g4u4g8ka4ktxudxa7",
# NFT - Eggs
"terra1k0y373yxqne22pc9g7jvnr4qclpsxtafevtrpg",
# NFT - Dragons
"terra1vhuyuwwr4rkdpez5f5lmuqavut28h5dt29rpn6",
# NFT - Loot
"terra14gfnxnwl0yz6njzet4n33erq5n70wt79nm24el",
]
def handle(exporter, elem, txinfo, contract):
print(f"Levana! {contract}")
#print(elem)
| 26.9375 | 51 | 0.723898 | 30 | 431 | 10.4 | 0.766667 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139205 | 0.183295 | 431 | 15 | 52 | 28.733333 | 0.747159 | 0.218097 | 0 | 0 | 0 | 0 | 0.587879 | 0.533333 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.125 | 0.125 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0268f15772e163a48707362a23538e64ee3c364e | 4,744 | py | Python | operators/elastic-cloud-eck/python/pulumi_pulumi_kubernetes_crds_operators_elastic_cloud_eck/_tables.py | pulumi/pulumi-kubernetes-crds | 372c4c0182f6b899af82d6edaad521aa14f22150 | [
"Apache-2.0"
] | null | null | null | operators/elastic-cloud-eck/python/pulumi_pulumi_kubernetes_crds_operators_elastic_cloud_eck/_tables.py | pulumi/pulumi-kubernetes-crds | 372c4c0182f6b899af82d6edaad521aa14f22150 | [
"Apache-2.0"
] | 2 | 2020-09-18T17:12:23.000Z | 2020-12-30T19:40:56.000Z | operators/elastic-cloud-eck/python/pulumi_pulumi_kubernetes_crds_operators_elastic_cloud_eck/_tables.py | pulumi/pulumi-kubernetes-crds | 372c4c0182f6b899af82d6edaad521aa14f22150 | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by crd2pulumi. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
SNAKE_TO_CAMEL_CASE_TABLE = {
"access_modes": "accessModes",
"api_group": "apiGroup",
"api_version": "apiVersion",
"app_protocol": "appProtocol",
"association_status": "associationStatus",
"available_nodes": "availableNodes",
"change_budget": "changeBudget",
"client_ip": "clientIP",
"cluster_ip": "clusterIP",
"config_ref": "configRef",
"daemon_set": "daemonSet",
"data_source": "dataSource",
"elasticsearch_association_status": "elasticsearchAssociationStatus",
"elasticsearch_ref": "elasticsearchRef",
"expected_nodes": "expectedNodes",
"external_i_ps": "externalIPs",
"external_name": "externalName",
"external_traffic_policy": "externalTrafficPolicy",
"file_realm": "fileRealm",
"health_check_node_port": "healthCheckNodePort",
"ip_family": "ipFamily",
"kibana_association_status": "kibanaAssociationStatus",
"kibana_ref": "kibanaRef",
"last_probe_time": "lastProbeTime",
"last_transition_time": "lastTransitionTime",
"load_balancer_ip": "loadBalancerIP",
"load_balancer_source_ranges": "loadBalancerSourceRanges",
"match_expressions": "matchExpressions",
"match_labels": "matchLabels",
"max_surge": "maxSurge",
"max_unavailable": "maxUnavailable",
"min_available": "minAvailable",
"node_port": "nodePort",
"node_sets": "nodeSets",
"pod_disruption_budget": "podDisruptionBudget",
"pod_template": "podTemplate",
"publish_not_ready_addresses": "publishNotReadyAddresses",
"remote_clusters": "remoteClusters",
"rolling_update": "rollingUpdate",
"secret_name": "secretName",
"secret_token_secret": "secretTokenSecret",
"secure_settings": "secureSettings",
"self_signed_certificate": "selfSignedCertificate",
"service_account_name": "serviceAccountName",
"session_affinity": "sessionAffinity",
"session_affinity_config": "sessionAffinityConfig",
"storage_class_name": "storageClassName",
"subject_alt_names": "subjectAltNames",
"target_port": "targetPort",
"timeout_seconds": "timeoutSeconds",
"topology_keys": "topologyKeys",
"update_strategy": "updateStrategy",
"volume_claim_templates": "volumeClaimTemplates",
"volume_mode": "volumeMode",
"volume_name": "volumeName",
}
CAMEL_TO_SNAKE_CASE_TABLE = {
"accessModes": "access_modes",
"apiGroup": "api_group",
"apiVersion": "api_version",
"appProtocol": "app_protocol",
"associationStatus": "association_status",
"availableNodes": "available_nodes",
"changeBudget": "change_budget",
"clientIP": "client_ip",
"clusterIP": "cluster_ip",
"configRef": "config_ref",
"daemonSet": "daemon_set",
"dataSource": "data_source",
"elasticsearchAssociationStatus": "elasticsearch_association_status",
"elasticsearchRef": "elasticsearch_ref",
"expectedNodes": "expected_nodes",
"externalIPs": "external_i_ps",
"externalName": "external_name",
"externalTrafficPolicy": "external_traffic_policy",
"fileRealm": "file_realm",
"healthCheckNodePort": "health_check_node_port",
"ipFamily": "ip_family",
"kibanaAssociationStatus": "kibana_association_status",
"kibanaRef": "kibana_ref",
"lastProbeTime": "last_probe_time",
"lastTransitionTime": "last_transition_time",
"loadBalancerIP": "load_balancer_ip",
"loadBalancerSourceRanges": "load_balancer_source_ranges",
"matchExpressions": "match_expressions",
"matchLabels": "match_labels",
"maxSurge": "max_surge",
"maxUnavailable": "max_unavailable",
"minAvailable": "min_available",
"nodePort": "node_port",
"nodeSets": "node_sets",
"podDisruptionBudget": "pod_disruption_budget",
"podTemplate": "pod_template",
"publishNotReadyAddresses": "publish_not_ready_addresses",
"remoteClusters": "remote_clusters",
"rollingUpdate": "rolling_update",
"secretName": "secret_name",
"secretTokenSecret": "secret_token_secret",
"secureSettings": "secure_settings",
"selfSignedCertificate": "self_signed_certificate",
"serviceAccountName": "service_account_name",
"sessionAffinity": "session_affinity",
"sessionAffinityConfig": "session_affinity_config",
"storageClassName": "storage_class_name",
"subjectAltNames": "subject_alt_names",
"targetPort": "target_port",
"timeoutSeconds": "timeout_seconds",
"topologyKeys": "topology_keys",
"updateStrategy": "update_strategy",
"volumeClaimTemplates": "volume_claim_templates",
"volumeMode": "volume_mode",
"volumeName": "volume_name",
}
| 39.533333 | 80 | 0.707841 | 407 | 4,744 | 7.857494 | 0.41769 | 0.031895 | 0.018762 | 0.011882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000494 | 0.147344 | 4,744 | 119 | 81 | 39.865546 | 0.790111 | 0.030987 | 0 | 0 | 1 | 0 | 0.697583 | 0.178097 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
027134b2e08ff17613c7279b030cfe1fcf0d8e8e | 309 | py | Python | pycon/tutorials/urls.py | azkarmoulana/pycon | 931388e6f640c35b892bb4b2d12581ba7ec8cf4e | [
"BSD-3-Clause"
] | 154 | 2015-01-17T02:29:24.000Z | 2022-03-20T20:37:24.000Z | pycon/tutorials/urls.py | azkarmoulana/pycon | 931388e6f640c35b892bb4b2d12581ba7ec8cf4e | [
"BSD-3-Clause"
] | 316 | 2015-01-10T04:01:50.000Z | 2020-09-30T20:18:08.000Z | pycon/tutorials/urls.py | azkarmoulana/pycon | 931388e6f640c35b892bb4b2d12581ba7ec8cf4e | [
"BSD-3-Clause"
] | 89 | 2015-01-10T05:25:21.000Z | 2022-02-27T03:28:59.000Z | from django.conf.urls import url, patterns
from .views import tutorial_email, tutorial_message
urlpatterns = patterns("", # flake8: noqa
url(r"^mail/(?P<pk>\d+)/(?P<pks>[0-9,]+)/$", tutorial_email, name="tutorial_email"),
url(r"^message/(?P<pk>\d+)/$", tutorial_message, name="tutorial_message"),
)
| 34.333333 | 88 | 0.679612 | 44 | 309 | 4.636364 | 0.522727 | 0.191176 | 0.039216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01087 | 0.106796 | 309 | 8 | 89 | 38.625 | 0.728261 | 0.038835 | 0 | 0 | 0 | 0 | 0.298305 | 0.19661 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
027b51903bbc31466f05349aa598a39bb4d2919d | 447 | py | Python | 6.00.1x/quiz/flatten.py | NicholasAsimov/courses | d60981f25816445578eb9e89bbbeef2d38eaf014 | [
"MIT"
] | null | null | null | 6.00.1x/quiz/flatten.py | NicholasAsimov/courses | d60981f25816445578eb9e89bbbeef2d38eaf014 | [
"MIT"
] | null | null | null | 6.00.1x/quiz/flatten.py | NicholasAsimov/courses | d60981f25816445578eb9e89bbbeef2d38eaf014 | [
"MIT"
] | null | null | null | def flatten(aList):
'''
aList: a list
Returns a copy of aList, which is a flattened version of aList
'''
if aList == []:
return aList
if type(aList[0]) == list:
return flatten(aList[0]) + flatten(aList[1:])
return aList[:1] + flatten(aList[1:])
aList = [[1, 'a', ['cat'], 2], [[[3]], 'dog'], 4, 5]
print flatten(aList)
testCase = [1, 'a', 'cat', 2, 3, 'dog', 4, 5]
print flatten(aList) == testCase
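The recursive `flatten` above can hit Python's recursion limit on deeply nested input. An equivalent iterative version using an explicit stack of iterators — a sketch with the same semantics, not part of the original exercise:

```python
def flatten_iterative(a_list):
    """Flatten arbitrarily nested lists without recursion."""
    result = []
    stack = [iter(a_list)]
    while stack:
        for item in stack[-1]:
            if type(item) == list:
                stack.append(iter(item))
                break  # descend into the nested list first
            result.append(item)
        else:
            stack.pop()  # current iterator exhausted
    return result

print(flatten_iterative([[1, 'a', ['cat'], 2], [[[3]], 'dog'], 4, 5]))
# [1, 'a', 'cat', 2, 3, 'dog', 4, 5]
```

Keeping partially-consumed iterators on the stack lets the loop resume a list exactly where it left off after descending into a nested one.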
| 22.35 | 66 | 0.548098 | 65 | 447 | 3.769231 | 0.384615 | 0.293878 | 0.106122 | 0.04898 | 0.302041 | 0.302041 | 0.302041 | 0.302041 | 0.302041 | 0.302041 | 0 | 0.044776 | 0.250559 | 447 | 19 | 67 | 23.526316 | 0.686567 | 0 | 0 | 0 | 0 | 0 | 0.039886 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.2 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
65f9d6849276abc9d2abce58b864383e8eca894c | 531 | py | Python | madlib.py | Yukthi-C/python_learing | 340579e2bb767e8fdb209f705fdf12058e8e150f | [
"MIT"
] | null | null | null | madlib.py | Yukthi-C/python_learing | 340579e2bb767e8fdb209f705fdf12058e8e150f | [
"MIT"
] | null | null | null | madlib.py | Yukthi-C/python_learing | 340579e2bb767e8fdb209f705fdf12058e8e150f | [
"MIT"
] | null | null | null | ad1 = input(f"Adjective1: ")
ad2 = input(f"Adjective2: ")
part1 = input(f"body part: ")
dish = input(f"Dish: ")
madlib=f"One day, a {ad1} fox invited a stork for dinner. \
Stork was very {ad2} with the invitation – she reached the fox’s home on time and knocked at the door with her {part1}.\
The fox took her to the dinner table and served some {dish} in shallow bowls for both of them.\
As the bowl was too shallow for the stork, she couldn’t have soup at all. But, the fox licked up his soup quickly."
print(f"{madlib}") | 59 | 121 | 0.706215 | 99 | 531 | 3.79798 | 0.616162 | 0.06383 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018605 | 0.190207 | 531 | 9 | 122 | 59 | 0.853488 | 0 | 0 | 0 | 0 | 0.333333 | 0.093511 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5a0ebed0bb4e1667aef392ee3608c9732dd33560 | 278 | py | Python | systest/tests/test_rm.py | devconsoft/pycred | d72bdae2e703a87a7424f08af326834281b83fee | [
"MIT"
] | null | null | null | systest/tests/test_rm.py | devconsoft/pycred | d72bdae2e703a87a7424f08af326834281b83fee | [
"MIT"
] | 5 | 2018-07-01T22:53:24.000Z | 2018-07-17T21:54:10.000Z | systest/tests/test_rm.py | devconsoft/pycred | d72bdae2e703a87a7424f08af326834281b83fee | [
"MIT"
] | null | null | null | def test_rm_long_opt_help(pycred):
    pycred('rm --help')


def test_rm_short_opt_help(pycred):
    pycred('rm -h')


def test_rm_none_existing_store_gives_exit_code_2(pycred, workspace):
    with workspace():
        pycred('rm non-existing-store user', expected_exit_code=2)
| 23.166667 | 69 | 0.73741 | 44 | 278 | 4.25 | 0.477273 | 0.112299 | 0.144385 | 0.203209 | 0.224599 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008439 | 0.147482 | 278 | 11 | 70 | 25.272727 | 0.780591 | 0 | 0 | 0 | 0 | 0 | 0.143885 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5a17b8b4d053d2409ae3602977dee83dcbebc0b2 | 4,340 | py | Python | scripts/lc/ARES/testing/run_rose_tool.py | ouankou/rose | 76f2a004bd6d8036bc24be2c566a14e33ba4f825 | [
"BSD-3-Clause"
] | 488 | 2015-01-09T08:54:48.000Z | 2022-03-30T07:15:46.000Z | scripts/lc/ARES/testing/run_rose_tool.py | ouankou/rose | 76f2a004bd6d8036bc24be2c566a14e33ba4f825 | [
"BSD-3-Clause"
] | 174 | 2015-01-28T18:41:32.000Z | 2022-03-31T16:51:05.000Z | scripts/lc/ARES/testing/run_rose_tool.py | ouankou/rose | 76f2a004bd6d8036bc24be2c566a14e33ba4f825 | [
"BSD-3-Clause"
] | 146 | 2015-04-27T02:48:34.000Z | 2022-03-04T07:32:53.000Z | #!/usr/bin/env python
"""Runs a ROSE tool. If the tool does not return status 0, then runs the
corresponding non-ROSE compiler. Records whether the tool succeeded, in
passed.txt and failed.txt, but always returns status 0.
"""
import argparse
import inspect
import os

from support.local_logging import Logger
from support.runner import Runner

_SEPARATOR = "================================================================================"


class ROSERunner (object):

    def __init__(self):
        # Will be a Namespace (e.g. can refer to self._args_defined.command_args):
        self._args_defined = None
        # Will be a list:
        self._args_remaining = None
        self._current_dir = ""
        self._failed_file = None
        self._failed_file_path = ""
        self._logger = Logger("run_rose_tool.ROSERunner")
        self._parser = None
        self._passed_file = None
        self._passed_file_path = ""
        self._primary_command = ""
        self._runner = Runner()
        self._script_dir = ""
        self._secondary_command = ""
        self._define_args()

    def _define_args(self):
        """ This script passes all its arguments on to the called
        programs, so there are no args defined.
        """
        parser = argparse.ArgumentParser(
            description="""Runs a ROSE tool. If the tool does not return status 0, then runs the
corresponding non-ROSE compiler. Records whether the tool succeeded, in
passed.txt and failed.txt, but always returns status 0.
""")
        # We want ALL the arguments, so, we are using parse_known_arguments
        # below instead and commenting this out for now:
        ## This matches the first positional and all remaining/following args:
        #parser.add_argument('command_args', nargs=argparse.REMAINDER)
        self._parser = parser

    def _process_args(self):
        self._args_defined, self._args_remaining = self._parser.parse_known_args()
        self._logger.debug("defined args\n" + str(self._args_defined))
        self._logger.debug("remaining args\n" + str(self._args_remaining))
        self._current_dir = os.getcwd()
        #self._script_dir = os.path.dirname(os.path.abspath(__file__))
        # Robustly get this script's directory, even when started by exec or execfiles:
        script_rel_path = inspect.getframeinfo(inspect.currentframe()).filename
        self._script_dir = os.path.dirname(os.path.abspath(script_rel_path))
        self._primary_command = "/g/g17/charles/code/ROSE/rose-0.9.10.64-intel-18.0.1.mpi/tutorial/identityTranslator"
        self._secondary_command = "/usr/tce/packages/mvapich2/mvapich2-2.2-intel-18.0.1/bin/mpicxx"
        self._passed_file_path = os.path.join(self._script_dir, "passed.txt")
        self._failed_file_path = os.path.join(self._script_dir, "failed.txt")

    def _log_success(self, args):
        self._logger.success("\n" + _SEPARATOR + "\nPASSED")
        self._logger.debug("Will log to passed file:")
        self._logger.debug(args)
        self._passed_file.write(str(args) + '\n')

    def _log_failure(self, args):
        self._logger.problem("\n" + _SEPARATOR + "\nFAILED")
        self._logger.debug("Will log to failed file:")
        self._logger.debug(args)
        self._failed_file.write(str(args) + '\n')

    def _run_command(self, args, dir):
        self._logger.info("\n" + _SEPARATOR)
        self._runner.callOrLog(args, dir)

    def run(self):
        """ Run the primary command. If it fails, run the secondary command. If
        that fails, let the exception (Runner.Failed) propagate.
        """
        self._logger.set_debug_off()
        #self._logger._logger.setLevel(Logger.ERROR)
        self._process_args()
        self._passed_file = open(self._passed_file_path, 'a')
        self._failed_file = open(self._failed_file_path, 'a')
        try:
            primary_args = [self._primary_command] + self._args_remaining
            self._run_command(primary_args, self._current_dir)
            self._log_success(primary_args)
        except Runner.Failed:
            self._log_failure(primary_args)
            secondary_args = [self._secondary_command] + self._args_remaining
            self._run_command(secondary_args, self._current_dir)


def main():
    ROSERunner().run()


if __name__ == '__main__':
    main()
| 41.333333 | 118 | 0.653917 | 563 | 4,340 | 4.756661 | 0.300178 | 0.038835 | 0.031367 | 0.031367 | 0.262883 | 0.250934 | 0.197909 | 0.16953 | 0.146378 | 0.117252 | 0 | 0.007111 | 0.22235 | 4,340 | 104 | 119 | 41.730769 | 0.78637 | 0.122811 | 0 | 0.028571 | 0 | 0.028571 | 0.176861 | 0.075625 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.114286 | 0.071429 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
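The try-primary-then-fall-back pattern in `run_rose_tool.py` relies on the project's own `Runner` class; a simplified stdlib sketch of the same idea using `subprocess` (the function name and the `true`/`false`/`echo` commands are illustrative assumptions, not the original API):

```python
import subprocess


def run_with_fallback(primary, secondary, args):
    """Try the primary command; on a nonzero exit, run the secondary instead.

    Returns (used_fallback, exit_code).
    """
    result = subprocess.run(primary + args)
    if result.returncode == 0:
        return False, 0
    # Primary failed: fall back to the secondary command with the same args.
    fallback = subprocess.run(secondary + args)
    return True, fallback.returncode


if __name__ == "__main__":
    # 'false' always exits nonzero, so the 'echo' fallback runs and succeeds.
    used_fallback, code = run_with_fallback(["false"], ["echo"], ["hello"])
    print(used_fallback, code)  # -> True 0
```

Unlike the original script, this sketch returns the status instead of logging to passed.txt/failed.txt, but the control flow is the same.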
5a1caac07eb9f441668b6c4d0592a3fd8fa4aefc | 576 | py | Python | ex4.py | AyeAyeZin/python_exercises | 77079dcd7809dd2967180ffd30df0166dd53edb4 | [
"MIT"
] | null | null | null | ex4.py | AyeAyeZin/python_exercises | 77079dcd7809dd2967180ffd30df0166dd53edb4 | [
"MIT"
] | null | null | null | ex4.py | AyeAyeZin/python_exercises | 77079dcd7809dd2967180ffd30df0166dd53edb4 | [
"MIT"
] | null | null | null | cars=100
space_in_a_car = 5
drivers = 20
passengers = 70
car_not_driven = cars - drivers
cars_driven = drivers
carpool_capacity = cars_driven * space_in_a_car
average_passengers_per_car = passengers / cars_driven
print("There are", cars, "cars available")
print("There are only", drivers, "drivers available")
print("There will be", car_not_driven, "empty cars today")
print("There are", space_in_a_car, "spaces available in a car")
print("We can transport", carpool_capacity, "people today.")
print("We have", passengers, "to carpool today.")
print("We need to put about", average_passengers_per_car, "in each car.")
| 36 | 72 | 0.805556 | 92 | 576 | 4.815217 | 0.413043 | 0.090293 | 0.088036 | 0.076749 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014981 | 0.072917 | 576 | 15 | 73 | 38.4 | 0.814607 | 0 | 0 | 0 | 0 | 0 | 0.338542 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.2 | 0 | 0 | 0 | 0.466667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 2 |
5a1d385aaac2b104c89e97a052215f1dccd44141 | 3,885 | py | Python | backend/src/baserow/contrib/database/migrations/0016_token_tokenpermission.py | ashishdhngr/baserow | b098678d2165eb7c42930ee24dc6753a3cb520c3 | [
"MIT"
] | null | null | null | backend/src/baserow/contrib/database/migrations/0016_token_tokenpermission.py | ashishdhngr/baserow | b098678d2165eb7c42930ee24dc6753a3cb520c3 | [
"MIT"
] | null | null | null | backend/src/baserow/contrib/database/migrations/0016_token_tokenpermission.py | ashishdhngr/baserow | b098678d2165eb7c42930ee24dc6753a3cb520c3 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.11 on 2020-10-23 08:35
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ("core", "0001_initial"),
        ("database", "0015_emailfield"),
    ]

    operations = [
        migrations.CreateModel(
            name="Token",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                (
                    "name",
                    models.CharField(
                        max_length=100,
                        help_text="The human readable name of the token for the user.",
                    ),
                ),
                (
                    "key",
                    models.CharField(
                        db_index=True,
                        max_length=32,
                        unique=True,
                        help_text="The unique token key that can be used to authorize "
                        "for the table row endpoints.",
                    ),
                ),
                ("created", models.DateTimeField(auto_now=True)),
                (
                    "group",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        to="core.Group",
                        help_text="Only the tables of the group can be accessed.",
                    ),
                ),
                (
                    "user",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        to=settings.AUTH_USER_MODEL,
                        help_text="The user that owns the token.",
                    ),
                ),
            ],
            options={
                "ordering": ("id",),
            },
        ),
        migrations.CreateModel(
            name="TokenPermission",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                (
                    "type",
                    models.CharField(
                        choices=[
                            ("create", "Create"),
                            ("read", "Read"),
                            ("update", "Update"),
                            ("delete", "Delete"),
                        ],
                        max_length=6,
                    ),
                ),
                (
                    "database",
                    models.ForeignKey(
                        blank=True,
                        null=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        to="database.Database",
                    ),
                ),
                (
                    "table",
                    models.ForeignKey(
                        blank=True,
                        null=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        to="database.Table",
                    ),
                ),
                (
                    "token",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE, to="database.Token"
                    ),
                ),
            ],
        ),
    ]
| 32.647059 | 88 | 0.344916 | 249 | 3,885 | 5.273092 | 0.369478 | 0.04265 | 0.063976 | 0.100533 | 0.373953 | 0.373953 | 0.373953 | 0.373953 | 0.373953 | 0.36786 | 0 | 0.017878 | 0.568082 | 3,885 | 118 | 89 | 32.923729 | 0.764601 | 0.01184 | 0 | 0.508929 | 1 | 0 | 0.110503 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.026786 | 0 | 0.053571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5a23d3f4e52679a350233bbde834e4fd8f3310ec | 74 | py | Python | pytracetable/__init__.py | filwaitman/pytracetable | eb884953e179fc65677a9e3b3c70fde1b1439ccb | [
"MIT"
] | 1 | 2016-02-10T20:28:00.000Z | 2016-02-10T20:28:00.000Z | pytracetable/__init__.py | filwaitman/pytracetable | eb884953e179fc65677a9e3b3c70fde1b1439ccb | [
"MIT"
] | 1 | 2020-05-27T18:12:10.000Z | 2020-05-27T18:12:10.000Z | pytracetable/__init__.py | filwaitman/pytracetable | eb884953e179fc65677a9e3b3c70fde1b1439ccb | [
"MIT"
] | null | null | null | from pytracetable.core import tracetable
__all__ = [
    'tracetable',
]
| 12.333333 | 40 | 0.716216 | 7 | 74 | 7 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 74 | 5 | 41 | 14.8 | 0.816667 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5a3f1fd52edcbc6a770d3bea9dab8192d49a92e5 | 1,838 | py | Python | dex/section/section.py | callmejacob/dexfactory | 2de996927ee9f036b2c7fc6cb04f43ac790f35af | [
"BSD-2-Clause"
] | 7 | 2018-06-14T10:40:47.000Z | 2021-05-18T08:55:34.000Z | dex/section/section.py | callmejacob/dexfactory | 2de996927ee9f036b2c7fc6cb04f43ac790f35af | [
"BSD-2-Clause"
] | 1 | 2020-05-28T08:59:50.000Z | 2020-05-28T08:59:50.000Z | dex/section/section.py | callmejacob/dexfactory | 2de996927ee9f036b2c7fc6cb04f43ac790f35af | [
"BSD-2-Clause"
] | 3 | 2018-02-28T02:08:06.000Z | 2018-09-12T03:09:18.000Z | # -- coding: utf-8 --
from section_base import *
from section_map_item import *
from section_header import *
from section_string_id import *
from section_type_id import *
from section_proto_id import *
from section_field_id import *
from section_method_id import *
from section_class_def import *
from section_type_list import *
from section_class_data import *
from section_annotation_set_ref_list import *
from section_annotation_set_item import *
from section_annotation_item import *
from section_string_list import *
from section_encoded_array import *
from section_annotations_directory import *
from section_code import *
from section_debug_info import *
'''
Mapping table for sections: (type, Section class)
'''
section_class_map = {
    TYPE_HEADER_ITEM : HeaderSection,
    TYPE_STRING_ID_ITEM : StringIdListSection,
    TYPE_TYPE_ID_ITEM : TypeIdListSection,
    TYPE_PROTO_ID_ITEM : ProtoIdListSection,
    TYPE_FIELD_ID_ITEM : FieldIdListSection,
    TYPE_METHOD_ID_ITEM : MethodIdListSection,
    TYPE_CLASS_DEF_ITEM : ClassDefListSection,
    TYPE_MAP_LIST : MapItemListSection,
    TYPE_TYPE_LIST : TypeListSection,
    TYPE_ANNOTATION_SET_REF_LIST : AnnotationSetRefListSection,
    TYPE_ANNOTATION_SET_ITEM : AnnotationSetItemSection,
    TYPE_CLASS_DATA_ITEM : ClassDataListSection,
    TYPE_CODE_ITEM : CodeSection,
    TYPE_STRING_DATA_ITEM : StringListSection,
    TYPE_DEBUG_INFO_ITEM : DebugInfoSection,
    TYPE_ANNOTATION_ITEM : AnnotationItemSection,
    TYPE_ENCODED_ARRAY_ITEM : EncodedArraySection,
    TYPE_ANNOTATIONS_DIRECTORY_ITEM : AnnotationsDirectorySection,
} | 39.106383 | 69 | 0.699674 | 190 | 1,838 | 6.294737 | 0.268421 | 0.174749 | 0.255853 | 0.079431 | 0.050167 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000735 | 0.259521 | 1,838 | 47 | 70 | 39.106383 | 0.878031 | 0.010337 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.487179 | 0 | 0.487179 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
5a58135dc9e13b466cba75e814598ea999f2751b | 705 | py | Python | COMP-2080/Week-11/knapRecursive.py | kbrezinski/Candidacy-Prep | f4610fb611e6300a7d657af124728d46a8659ba5 | [
"BSD-3-Clause"
] | null | null | null | COMP-2080/Week-11/knapRecursive.py | kbrezinski/Candidacy-Prep | f4610fb611e6300a7d657af124728d46a8659ba5 | [
"BSD-3-Clause"
] | null | null | null | COMP-2080/Week-11/knapRecursive.py | kbrezinski/Candidacy-Prep | f4610fb611e6300a7d657af124728d46a8659ba5 | [
"BSD-3-Clause"
] | null | null | null |
# [weight, value]
I = [[4, 8], [4, 7], [6, 14]]
k = 8
def knapRecursive(I, k):
    return knapRecursiveAux(I, k, len(I) - 1)


def knapRecursiveAux(I, k, hi):
    # final element
    if hi == 0:
        # too big for sack
        if I[hi][0] > k:
            return 0
        # fits
        else:
            return I[hi][1]
    else:
        # too big for sack
        if I[hi][0] > k:
            return knapRecursiveAux(I, k, hi - 1)
        # fits
        else:
            # don't include it
            s1 = knapRecursiveAux(I, k, hi - 1)
            # include it
            s2 = I[hi][1] + knapRecursiveAux(I, k - I[hi][0], hi - 1)
            return max(s1, s2)
print(knapRecursive(I, k))
| 22.03125 | 69 | 0.455319 | 99 | 705 | 3.242424 | 0.343434 | 0.043614 | 0.280374 | 0.186916 | 0.370717 | 0.161994 | 0.161994 | 0.161994 | 0.161994 | 0.161994 | 0 | 0.054632 | 0.402837 | 705 | 31 | 70 | 22.741935 | 0.707838 | 0.143262 | 0 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0.055556 | 0.388889 | 0.055556 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
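The recursion above explores the include/exclude tree exhaustively, which is exponential in the number of items. The same 0/1 knapsack can be solved bottom-up in O(n·k) with a one-dimensional table; a sketch over the same `[weight, value]` item format (the function name is illustrative, not from the original file):

```python
def knap_dp(items, capacity):
    """Bottom-up 0/1 knapsack; items is a list of [weight, value] pairs."""
    # best[c] = maximum value achievable with total weight at most c
    best = [0] * (capacity + 1)
    for weight, value in items:
        # Iterate capacities downward so each item is used at most once.
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]


print(knap_dp([[4, 8], [4, 7], [6, 14]], 8))  # -> 15
```

With `I = [[4, 8], [4, 7], [6, 14]]` and `k = 8` this agrees with the recursive version: the two weight-4 items fit together for a value of 15.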
5a59bbf41d09d9b1b99e57b30f3e8db2c9734a9d | 232 | py | Python | digits/inference/__init__.py | PhysicsTeacher13/Digits-NVIDIA | 80c08ed2b84d5d4eb4f1721ab30f3db2ce67690a | [
"BSD-3-Clause"
] | 111 | 2017-04-21T06:03:04.000Z | 2021-04-26T06:36:54.000Z | digits/inference/__init__.py | PhysicsTeacher13/Digits-NVIDIA | 80c08ed2b84d5d4eb4f1721ab30f3db2ce67690a | [
"BSD-3-Clause"
] | 6 | 2017-05-15T22:02:49.000Z | 2018-03-16T10:25:26.000Z | digits/inference/__init__.py | PhysicsTeacher13/Digits-NVIDIA | 80c08ed2b84d5d4eb4f1721ab30f3db2ce67690a | [
"BSD-3-Clause"
] | 40 | 2017-04-21T07:04:16.000Z | 2019-11-14T14:20:32.000Z | # Copyright (c) 2016, NVIDIA CORPORATION. All rights reserved.
from __future__ import absolute_import
from .images import ImageInferenceJob
from .job import InferenceJob
__all__ = [
    'InferenceJob',
    'ImageInferenceJob',
]
| 21.090909 | 63 | 0.762931 | 24 | 232 | 7 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020619 | 0.163793 | 232 | 10 | 64 | 23.2 | 0.845361 | 0.262931 | 0 | 0 | 0 | 0 | 0.171598 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
5a683a89ea393148d4edd0bc84134016995c858d | 374 | py | Python | runserver.py | chintal/tendril-monitor-vendor | af7577bd88b3d35e09a733607555d5d10e1cd9c7 | [
"MIT"
] | null | null | null | runserver.py | chintal/tendril-monitor-vendor | af7577bd88b3d35e09a733607555d5d10e1cd9c7 | [
"MIT"
] | null | null | null | runserver.py | chintal/tendril-monitor-vendor | af7577bd88b3d35e09a733607555d5d10e1cd9c7 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# encoding: utf-8
# Copyright (C) 2015 Chintalagiri Shashank
# Released under the MIT license.
"""
Simple Deployment Example
-------------------------
"""
from vendor_monitor import worker
from twisted.internet import reactor
import logging
logging.basicConfig(level=logging.INFO)
if __name__ == '__main__':
    worker.start()
    reactor.run()
| 16.26087 | 42 | 0.68984 | 44 | 374 | 5.659091 | 0.840909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015674 | 0.147059 | 374 | 22 | 43 | 17 | 0.76489 | 0.430481 | 0 | 0 | 0 | 0 | 0.04 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
ce456a679b725d44ec91f64a8df14df4d86ae155 | 1,918 | py | Python | src/python/grpcio_tests/tests/interop/_intraop_test_case.py | txl0591/grpc | 8b732dc466fb8a567c1bca9dbb84554d29087395 | [
"Apache-2.0"
] | 117 | 2017-10-02T21:34:35.000Z | 2022-03-02T01:49:03.000Z | src/python/grpcio_tests/tests/interop/_intraop_test_case.py | txl0591/grpc | 8b732dc466fb8a567c1bca9dbb84554d29087395 | [
"Apache-2.0"
] | 4 | 2017-10-03T22:45:30.000Z | 2018-09-27T07:31:00.000Z | src/python/grpcio_tests/tests/interop/_intraop_test_case.py | txl0591/grpc | 8b732dc466fb8a567c1bca9dbb84554d29087395 | [
"Apache-2.0"
] | 24 | 2017-10-31T12:14:15.000Z | 2021-12-11T10:07:46.000Z | # Copyright 2015 gRPC authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Common code for unit tests of the interoperability test code."""
from tests.interop import methods
class IntraopTestCase(object):
    """Unit test methods.

    This class must be mixed in with unittest.TestCase and a class that defines
    setUp and tearDown methods that manage a stub attribute.
    """

    def testEmptyUnary(self):
        methods.TestCase.EMPTY_UNARY.test_interoperability(self.stub, None)

    def testLargeUnary(self):
        methods.TestCase.LARGE_UNARY.test_interoperability(self.stub, None)

    def testServerStreaming(self):
        methods.TestCase.SERVER_STREAMING.test_interoperability(self.stub, None)

    def testClientStreaming(self):
        methods.TestCase.CLIENT_STREAMING.test_interoperability(self.stub, None)

    def testPingPong(self):
        methods.TestCase.PING_PONG.test_interoperability(self.stub, None)

    def testCancelAfterBegin(self):
        methods.TestCase.CANCEL_AFTER_BEGIN.test_interoperability(self.stub,
                                                                  None)

    def testCancelAfterFirstResponse(self):
        methods.TestCase.CANCEL_AFTER_FIRST_RESPONSE.test_interoperability(
            self.stub, None)

    def testTimeoutOnSleepingServer(self):
        methods.TestCase.TIMEOUT_ON_SLEEPING_SERVER.test_interoperability(
            self.stub, None)
| 36.884615 | 80 | 0.728363 | 235 | 1,918 | 5.855319 | 0.485106 | 0.063953 | 0.110465 | 0.162791 | 0.265262 | 0.198401 | 0.122093 | 0 | 0 | 0 | 0 | 0.005212 | 0.199687 | 1,918 | 51 | 81 | 37.607843 | 0.891205 | 0.399374 | 0 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.380952 | false | 0 | 0.047619 | 0 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ce462fe45d9f73cc50c3b487d621d5b2ad86a06b | 99 | py | Python | pubmedpy/__init__.py | dhimmel/pubmedpy | 9d716768f5ab798ec448154588e4fd99afd7584a | [
"BlueOak-1.0.0"
] | 7 | 2019-11-13T09:14:19.000Z | 2022-03-09T01:35:06.000Z | pubmedpy/__init__.py | dhimmel/pubmedpy | 9d716768f5ab798ec448154588e4fd99afd7584a | [
"BlueOak-1.0.0"
] | 2 | 2020-08-24T15:05:57.000Z | 2020-10-21T04:12:56.000Z | pubmedpy/__init__.py | dhimmel/pubmedpy | 9d716768f5ab798ec448154588e4fd99afd7584a | [
"BlueOak-1.0.0"
] | 1 | 2021-02-18T00:01:09.000Z | 2021-02-18T00:01:09.000Z | """
# Utilities for interacting with NCBI EUtilities relating to PubMed
"""
__version__ = "0.0.1"
| 16.5 | 67 | 0.717172 | 13 | 99 | 5.153846 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036145 | 0.161616 | 99 | 5 | 68 | 19.8 | 0.771084 | 0.676768 | 0 | 0 | 0 | 0 | 0.208333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ce4b6a50f11f5cd0ce57c03afebe02596310a357 | 405 | py | Python | src/utils/config.py | mlrepa/automate-ml-with-dvc | b54a2e4818a991362d304890828df70359bab84a | [
"MIT"
] | 4 | 2021-04-11T17:30:14.000Z | 2021-07-27T10:09:53.000Z | src/utils/config.py | mlrepa/automate-ml-with-dvc | b54a2e4818a991362d304890828df70359bab84a | [
"MIT"
] | null | null | null | src/utils/config.py | mlrepa/automate-ml-with-dvc | b54a2e4818a991362d304890828df70359bab84a | [
"MIT"
] | 1 | 2021-09-05T04:15:07.000Z | 2021-09-05T04:15:07.000Z | import box
from typing import Text
import yaml
def load_config(config_path: Text) -> box.ConfigBox:
    """Loads yaml config in instance of box.ConfigBox.

    Args:
        config_path {Text}: path to config
    Returns:
        box.ConfigBox
    """
    with open(config_path) as config_file:
        config = yaml.safe_load(config_file)
    config = box.ConfigBox(config)
    return config
| 20.25 | 54 | 0.659259 | 53 | 405 | 4.90566 | 0.45283 | 0.184615 | 0.107692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.261728 | 405 | 19 | 55 | 21.315789 | 0.869565 | 0.293827 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
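The loader above depends on the third-party `box` and `yaml` packages; for context, a rough stdlib-only approximation of the same attribute-access idea, substituting `json` for YAML and `types.SimpleNamespace` for `box.ConfigBox` (both substitutions are assumptions for illustration, and the function name is hypothetical):

```python
import json
import os
import tempfile
from types import SimpleNamespace


def load_config_ns(config_path):
    """Load a JSON config and expose keys as attributes (ConfigBox-like)."""
    with open(config_path) as config_file:
        # object_hook turns every JSON object into a SimpleNamespace,
        # so nested keys are reachable as config.section.key
        return json.load(config_file,
                         object_hook=lambda d: SimpleNamespace(**d))


if __name__ == "__main__":
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        f.write('{"train": {"epochs": 10}}')
        path = f.name
    config = load_config_ns(path)
    print(config.train.epochs)  # -> 10
    os.remove(path)
```

`box.ConfigBox` additionally supports recursive dict access and mutation, which `SimpleNamespace` does not; this only mirrors the read-side dot-notation.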
ce51bf2481ad7448201c8511a71d60800f43cedd | 350 | py | Python | Logic/Helpers/ChronosTextEntry.py | terexdev/BSDS-V39 | 7deea469fbfbc56c48f8326ba972369679f6b098 | [
"Apache-2.0"
] | 11 | 2021-11-04T01:49:50.000Z | 2022-01-31T16:50:47.000Z | Logic/Helpers/ChronosTextEntry.py | terexdev/BSDS-V39 | 7deea469fbfbc56c48f8326ba972369679f6b098 | [
"Apache-2.0"
] | 6 | 2021-11-04T08:52:01.000Z | 2021-12-27T02:33:19.000Z | Logic/Helpers/ChronosTextEntry.py | terexdev/BSDS-V39 | 7deea469fbfbc56c48f8326ba972369679f6b098 | [
"Apache-2.0"
] | 5 | 2021-11-04T02:31:56.000Z | 2022-03-14T02:04:33.000Z | from Logic.Classes.LogicDataTables import LogicDataTables
from Logic.Data.DataManager import Writer
from Logic.Data.DataManager import Reader
class ChronosTextEntry:
    def decode(self: Reader):
        self.readInt()
        self.readString()

    def encode(self: Writer):
        self.writeInt(0)
        self.writeString("BSDS is catly hacc")
| 25 | 57 | 0.717143 | 41 | 350 | 6.121951 | 0.585366 | 0.10757 | 0.103586 | 0.191235 | 0.239044 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003571 | 0.2 | 350 | 13 | 58 | 26.923077 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0.051429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.3 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
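The `Reader`/`Writer` classes used by `ChronosTextEntry` are project-specific. Assuming the common wire convention their method names suggest (big-endian 4-byte ints and length-prefixed UTF-8 strings — an assumption, not confirmed by the source), the pattern can be sketched with the stdlib `struct` module:

```python
import io
import struct


class Writer:
    """Minimal writer: big-endian 4-byte ints, length-prefixed UTF-8 strings."""
    def __init__(self):
        self.buffer = io.BytesIO()

    def writeInt(self, value):
        self.buffer.write(struct.pack(">i", value))

    def writeString(self, text):
        data = text.encode("utf-8")
        self.writeInt(len(data))   # length prefix, then the raw bytes
        self.buffer.write(data)


class Reader:
    def __init__(self, data):
        self.buffer = io.BytesIO(data)

    def readInt(self):
        return struct.unpack(">i", self.buffer.read(4))[0]

    def readString(self):
        length = self.readInt()
        return self.buffer.read(length).decode("utf-8")


w = Writer()
w.writeInt(0)
w.writeString("BSDS is catly hacc")

r = Reader(w.buffer.getvalue())
print(r.readInt())     # -> 0
print(r.readString())  # -> BSDS is catly hacc
```

A round trip through `Writer` then `Reader` recovers the original values, which is the symmetry the encode/decode pair above relies on.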
ce5459689c023b5b6363dd479cd3042521f3f23d | 1,112 | py | Python | backend-project/small_eod/collections/migrations/0003_auto_20200131_2033.py | WlodzimierzKorza/small_eod | 027022bd71122a949a2787d0fb86518df80e48cd | [
"MIT"
] | 64 | 2019-12-30T11:24:03.000Z | 2021-06-24T01:04:56.000Z | backend-project/small_eod/collections/migrations/0003_auto_20200131_2033.py | WlodzimierzKorza/small_eod | 027022bd71122a949a2787d0fb86518df80e48cd | [
"MIT"
] | 465 | 2018-06-13T21:43:43.000Z | 2022-01-04T23:33:56.000Z | backend-project/small_eod/collections/migrations/0003_auto_20200131_2033.py | WlodzimierzKorza/small_eod | 027022bd71122a949a2787d0fb86518df80e48cd | [
"MIT"
] | 72 | 2018-12-02T19:47:03.000Z | 2022-01-04T22:54:49.000Z | # Generated by Django 3.0.2 on 2020-01-31 20:33
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('collections', '0002_auto_20200109_1348'),
    ]

    operations = [
        migrations.AlterField(
            model_name='collection',
            name='comment',
            field=models.CharField(help_text='Comment for collection.', max_length=256, verbose_name='Comment'),
        ),
        migrations.AlterField(
            model_name='collection',
            name='expired_on',
            field=models.DateTimeField(help_text='An expiration date of collection.', verbose_name='An expiration date'),
        ),
        migrations.AlterField(
            model_name='collection',
            name='public',
            field=models.BooleanField(default=False, help_text='Make public.', verbose_name='Public'),
        ),
        migrations.AlterField(
            model_name='collection',
            name='query',
            field=models.CharField(help_text='Query for collection.', max_length=256, verbose_name='Query'),
        ),
    ]
| 32.705882 | 121 | 0.611511 | 113 | 1,112 | 5.858407 | 0.460177 | 0.120846 | 0.151057 | 0.175227 | 0.453172 | 0.36858 | 0.108761 | 0 | 0 | 0 | 0 | 0.045567 | 0.269784 | 1,112 | 33 | 122 | 33.69697 | 0.769704 | 0.040468 | 0 | 0.444444 | 1 | 0 | 0.213146 | 0.021596 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ce5d8d0f3c28fed69d76da9c81283dbdc6272f7e | 1,505 | py | Python | code/dependancy/smaliparser.py | OmkarMozar/CUPAP | 6055f423e3f9b8bb1a44dd8fab73630554363b3d | [
"Apache-2.0"
] | null | null | null | code/dependancy/smaliparser.py | OmkarMozar/CUPAP | 6055f423e3f9b8bb1a44dd8fab73630554363b3d | [
"Apache-2.0"
] | null | null | null | code/dependancy/smaliparser.py | OmkarMozar/CUPAP | 6055f423e3f9b8bb1a44dd8fab73630554363b3d | [
"Apache-2.0"
] | null | null | null | from smalisca.core.smalisca_main import SmaliscaApp
from smalisca.modules.module_smali_parser import SmaliParser
from smalisca.core.smalisca_app import App
from smalisca.core.smalisca_logging import log
from smalisca.modules.module_sql_models import AppSQLModel
import smalisca.core.smalisca_config as config
import multiprocessing
import os
from cement.core import controller
from cement.core.controller import CementBaseController
import json
def parsesmali(location):
    # Create new smalisca app
    # You'll have to create a new app in order to set logging
    # settings and so on.
    print("Parsing smali files..")
    app = SmaliscaApp()
    app.setup()

    # Set log level
    app.log.set_level('info')

    # Specify the location where your APK has been dumped
    #location = '/root/Downloads/Project/Gmailnew'

    # Specify file name suffix
    suffix = 'smali'

    # Create a new parser
    parser = SmaliParser(location, suffix)

    # Go for it!
    parser.run()

    # Get results
    results = parser.get_results()

    '''
    data = app.get_all()
    with open('iii.json', 'w') as outfile:
        json.dump(data, outfile)
    '''

    ap = App(__name__)
    ap.add_location(location)
    ap.add_parser("%s - %s" % (config.PROJECT_NAME, config.PROJECT_VERSION))

    # Append classes
    for c in results:
        ap.add_class_obj(c)

    # Write results to JSON
    log.info("Exporting results to JSON")
    base = os.path.basename(location)
    out_path = os.getcwd() + '/Data/Apk_data/' + base + '/' + base + 'method.json'
    ap.write_json(out_path)
    print(out_path + " created")
| 24.672131 | 73 | 0.743522 | 219 | 1,505 | 4.990868 | 0.43379 | 0.054895 | 0.073193 | 0.065874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154817 | 1,505 | 60 | 74 | 25.083333 | 0.859277 | 0.208638 | 0 | 0 | 0 | 0 | 0.089649 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.366667 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
ce6eed2c9d0065dffb079ead3cb624c8d3a05810 | 224 | py | Python | wixaward/urls.py | LekamCharity/wix-projects | 76f9ab4429a978a42f0cea3e3a305a7cdfc4541d | [
"MIT"
] | null | null | null | wixaward/urls.py | LekamCharity/wix-projects | 76f9ab4429a978a42f0cea3e3a305a7cdfc4541d | [
"MIT"
] | null | null | null | wixaward/urls.py | LekamCharity/wix-projects | 76f9ab4429a978a42f0cea3e3a305a7cdfc4541d | [
"MIT"
] | null | null | null | from django.urls import path
from django.conf import settings
from django.conf.urls.static import static

from . import views

urlpatterns = [
    path('profile', views.profile, name='profile'),
]

if settings.DEBUG:
    urlpatterns += static(settings.MEDIA_URL,
                          document_root=settings.MEDIA_ROOT)
| 22.4 | 60 | 0.665179 | 25 | 224 | 5.84 | 0.68 | 0.178082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223214 | 224 | 9 | 61 | 24.888889 | 0.83908 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ce735019669e5c6f53493f5d8d363b42ab7d2267 | 1,434 | py | Python | class4/e3_pexpect.py | ktbyers/pynet_wantonik | 601bce26142b6741202c2bdafb9e0d0cec1b3c78 | [
"Apache-2.0"
] | 2 | 2017-05-11T12:05:15.000Z | 2021-07-15T18:13:19.000Z | class4/e3_pexpect.py | ktbyers/pynet_wantonik | 601bce26142b6741202c2bdafb9e0d0cec1b3c78 | [
"Apache-2.0"
] | null | null | null | class4/e3_pexpect.py | ktbyers/pynet_wantonik | 601bce26142b6741202c2bdafb9e0d0cec1b3c78 | [
"Apache-2.0"
] | 1 | 2017-05-11T12:05:18.000Z | 2017-05-11T12:05:18.000Z | #!/usr/bin/env python
'''
Simple script to execute shell command on lab router with Pexpect module.
'''
import pexpect, sys, re
from getpass import getpass


def main():
    ## Define variables
    ip_addr = '184.105.247.71'
    username = 'pyclass'
    port = 22
    password = getpass()

    ## Set up connection with router
    try:
        ssh_conn = pexpect.spawn('ssh -l {} {} -p {}'.format(username, ip_addr, port))
        ssh_conn.timeout = 3
        ## Expects to see line with 'ssword' string on router
        ssh_conn.expect('ssword:')
        ## We then send the password to log in on router
        ssh_conn.sendline(password)
        ## To see the prompt on router
        ssh_conn.expect('pynet-rtr2#')
        ## Execute 'show ip int brief' command on router
        ssh_conn.sendline('show ip int brief\n')
        ssh_conn.expect('pynet-rtr2#')
        print ssh_conn.before
        ## Disable 'more' paging
        ssh_conn.sendline('terminal length 0')
        ssh_conn.expect('pynet-rtr2#')
        print ssh_conn.before
        print ssh_conn.after
    ## Catching timeout and other errors from the above section
    except pexpect.TIMEOUT:
        print 'Found timeout or other error - check your code: variables, args.'

    host_name = ssh_conn.before
    host_name = host_name.strip()
    prompt = host_name + ssh_conn.after
    prompt = prompt.strip()


if __name__ == "__main__":
    main()
| 29.875 | 86 | 0.631799 | 193 | 1,434 | 4.549223 | 0.492228 | 0.111617 | 0.059226 | 0.051253 | 0.149203 | 0.091116 | 0.091116 | 0.091116 | 0.091116 | 0 | 0 | 0.01711 | 0.266388 | 1,434 | 47 | 87 | 30.510638 | 0.81749 | 0.22106 | 0 | 0.178571 | 0 | 0 | 0.183513 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.107143 | 0.071429 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
ce796d88eb98f929fefba1eaa8a093ed6e266e4a | 1,285 | py | Python | icm/__main__.py | MCasari-PMEL/EDD-ICMGUI | 3210e7bb74ff2ace6e1c8c0bf132ecae5713141b | [
"MIT"
] | null | null | null | icm/__main__.py | MCasari-PMEL/EDD-ICMGUI | 3210e7bb74ff2ace6e1c8c0bf132ecae5713141b | [
"MIT"
] | 3 | 2018-01-08T16:44:33.000Z | 2018-01-08T16:47:55.000Z | icm/__main__.py | MCasari-PMEL/EDD-ICMGUI | 3210e7bb74ff2ace6e1c8c0bf132ecae5713141b | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import sys, os, time, serial, json

import numpy as np
import pyqtgraph as pg
import pyqtgraph.console
from PyQt5.QtCore import pyqtSignal, QObject
from pyqtgraph.Qt import QtCore, QtGui
from pyqtgraph.dockarea import *

from icm.ui_clock import *
from icm.ui_createfile import *
from icm.ui_parameter import *
from icm.ui_serial import *
from icm.ui_qicmgui import Ui_QIcmGuiMainWindow


class Communicate(QObject):
    closeApp = pyqtSignal()


class QIcmGuiMainWindow(QtGui.QMainWindow, Ui_QIcmGuiMainWindow):
    """ ICMGUI Main Window """

    def __init__(self, parent=None):
        # Initialize the GUI
        super().__init__(parent)
        self.setupUi(self)

        # Create a communication event to close the window
        self.c = Communicate()

        ## Create docks and place them in the window
        self.cd = SerialWidget(self.gridLayout_2)

    # @QtCore.slot()
    def closeEvent(self, event):
        pass


def main():
    app = QtGui.QApplication([])
    window = QIcmGuiMainWindow()
    window.show()
    sys.exit(app.exec_())


if __name__ == "__main__":
    main()

# if __name__ != "__main__":
#     raise ImportError('this module should not be imported')
| 22.54386 | 65 | 0.662257 | 155 | 1,285 | 5.277419 | 0.548387 | 0.061125 | 0.079462 | 0.091687 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003074 | 0.240467 | 1,285 | 56 | 66 | 22.946429 | 0.835041 | 0.209339 | 0 | 0 | 0 | 0 | 0.007992 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103448 | false | 0.034483 | 0.413793 | 0 | 0.62069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
ce801d6bd90e41604f5f09f5bc95fde822da704c | 728 | py | Python | TASQ/problems/forms.py | harshraj22/smallProjects | b31e9173c60abb778a1c196609757704ec9c3750 | [
"MIT"
] | 2 | 2019-11-18T14:13:57.000Z | 2020-11-08T06:50:32.000Z | TASQ/problems/forms.py | harshraj22/smallProjects | b31e9173c60abb778a1c196609757704ec9c3750 | [
"MIT"
] | 16 | 2019-11-12T13:08:01.000Z | 2022-02-27T10:51:28.000Z | TASQ/problems/forms.py | harshraj22/smallProjects | b31e9173c60abb778a1c196609757704ec9c3750 | [
"MIT"
] | null | null | null | from django import forms
from .models import Problem


class ProblemForm(forms.ModelForm):
    options = (
        ('A', 'A'),
        ('B', 'B'),
        ('C', 'C'),
        ('D', 'D'),
    )
    choice = forms.ChoiceField(choices=options)

    class Meta:
        model = Problem
        fields = '__all__'
        # exclude = ['answer']
        widgets = {
            'answer': forms.HiddenInput(),
            'description': forms.Textarea(attrs={'readonly': 'readonly', 'style': 'border: None'}),
            'opt1': forms.TextInput(attrs={'readonly': 'readonly'}),
            'opt2': forms.TextInput(attrs={'readonly': 'readonly'}),
            'opt3': forms.TextInput(attrs={'readonly': 'readonly'}),
            'opt4': forms.TextInput(attrs={'readonly': 'readonly'}),
            'difficulty': forms.TextInput(attrs={'readonly': 'readonly'}),
        }
| 28 | 89 | 0.634615 | 77 | 728 | 5.948052 | 0.493506 | 0.170306 | 0.275109 | 0.29476 | 0.382096 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0064 | 0.141484 | 728 | 25 | 90 | 29.12 | 0.7264 | 0.027473 | 0 | 0 | 0 | 0 | 0.242553 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ce849a188316eb44b68f6012ce73ef84e1631ac2 | 9,586 | py | Python | satyr/proxies/scheduler.py | usheth/satyr | 01fadbe2f9c294b9a9719a85d5bd032925453ee6 | [
"Apache-2.0"
] | null | null | null | satyr/proxies/scheduler.py | usheth/satyr | 01fadbe2f9c294b9a9719a85d5bd032925453ee6 | [
"Apache-2.0"
] | null | null | null | satyr/proxies/scheduler.py | usheth/satyr | 01fadbe2f9c294b9a9719a85d5bd032925453ee6 | [
"Apache-2.0"
] | 1 | 2018-10-10T18:57:54.000Z | 2018-10-10T18:57:54.000Z | from __future__ import absolute_import, division, print_function
import logging
import sys

from mesos.interface import Scheduler

from .messages import Filters, decode, encode


class SchedulerProxy(Scheduler):

    def __init__(self, scheduler):
        self.scheduler = scheduler

    def registered(self, driver, frameworkId, masterInfo):
        logging.info("Registered with master", extra=dict())
        return self.scheduler.on_registered(SchedulerDriverProxy(driver),
                                            decode(frameworkId),
                                            decode(masterInfo))

    def reregistered(self, driver, masterInfo):
        logging.info("Re-registered with master", extra=dict())
        return self.scheduler.on_reregistered(SchedulerDriverProxy(driver),
                                              decode(masterInfo))

    def disconnected(self, driver):
        logging.debug("Disconnected from master")
        return self.scheduler.on_disconnected(SchedulerDriverProxy(driver))

    def resourceOffers(self, driver, offers):
        logging.debug("Got resource offers",
                      extra=dict(num_offers=len(offers)))
        return self.scheduler.on_offers(SchedulerDriverProxy(driver),
                                        map(decode, offers))

    def offerRescinded(self, driver, offerId):
        logging.debug('Offer rescinded', extra=dict(offer_id=offerId))
        return self.scheduler.on_rescinded(SchedulerDriverProxy(driver),
                                           decode(offerId))

    def statusUpdate(self, driver, status):
        logging.debug('Status update received', extra=dict(
            state=status.state, description=status.message))
        return self.scheduler.on_update(SchedulerDriverProxy(driver),
                                        decode(status))

    def frameworkMessage(self, driver, executorId, slaveId, message):
        logging.debug('Framework message received')
        return self.scheduler.on_message(SchedulerDriverProxy(driver),
                                         decode(executorId),
                                         decode(slaveId),
                                         message)

    def slaveLost(self, driver, slaveId):
        logging.debug('Slave has been lost, tasks should be rescheduled')
        return self.scheduler.on_slave_lost(SchedulerDriverProxy(driver),
                                            decode(slaveId))

    def executorLost(self, driver, executorId, slaveId, status):
        logging.debug('Executor has been lost')
        return self.scheduler.on_executor_lost(SchedulerDriverProxy(driver),
                                               decode(executorId),
                                               decode(slaveId),
                                               status)

    def error(self, driver, message):
        print("Error from Mesos: %s " % message, file=sys.stderr)
        return self.scheduler.on_error(SchedulerDriverProxy(driver), message)


class SchedulerDriverProxy(object):
    """Proxy Interface for Mesos scheduler drivers."""

    def __init__(self, driver):
        self.driver = driver

    def start(self):
        """Starts the scheduler driver.

        This needs to be called before any other driver calls are made.
        """
        return self.driver.start()

    def stop(self, failover=False):
        """Stops the scheduler driver.

        If the 'failover' flag is set to False then it is expected that this
        framework will never reconnect to Mesos and all of its executors and
        tasks can be terminated. Otherwise, all executors and tasks will
        remain running (for some framework specific failover timeout) allowing
        the scheduler to reconnect (possibly in the same process, or from a
        different process, for example, on a different machine.)
        """
        return self.driver.stop(failover)

    def abort(self):
        """Aborts the driver so that no more callbacks can be made to the
        scheduler.

        The semantics of abort and stop have deliberately been separated so that
        code can detect an aborted driver (i.e., via the return status of
        SchedulerDriver.join), and instantiate and start another driver if
        desired (from within the same process.)
        """
        return self.driver.abort()

    def join(self):
        """Waits for the driver to be stopped or aborted, possibly blocking the
        current thread indefinitely.

        The return status of this function can be used to determine if the
        driver was aborted (see mesos.proto for a description of Status).
        """
        return self.driver.join()

    def request(self, requests):
        """Requests resources from Mesos.

        (see mesos.proto for a description of Request and how, for example, to
        request resources from specific slaves.)

        Any resources available are offered to the framework via
        Scheduler.resourceOffers callback, asynchronously.
        """
        return self.driver.requestResources(map(encode, requests))

    def launch(self, offer_id, tasks, filters=Filters()):
        """Launches the given set of tasks.

        Any resources remaining (i.e., not used by the tasks or their executors)
        will be considered declined.
        The specified filters are applied on all unused resources (see
        mesos.proto for a description of Filters). Available resources are
        aggregated when multiple offers are provided. Note that all offers must
        belong to the same slave. Invoking this function with an empty
        collection of tasks declines the offers in entirety (see
        Scheduler.decline).

        Note that passing a single offer is also supported.
        """
        return self.driver.launchTasks(encode(offer_id),
                                       map(encode, tasks),
                                       encode(filters))

    def kill(self, task_id):
        """Kills the specified task.

        Note that attempting to kill a task is currently not reliable.
        If, for example, a scheduler fails over while it was attempting to kill
        a task it will need to retry in the future.
        Likewise, if unregistered / disconnected, the request will be dropped
        (these semantics may be changed in the future).
        """
        return self.driver.killTask(encode(task_id))

    def reconcile(self, statuses):
        """Allows the framework to query the status for non-terminal tasks.

        This causes the master to send back the latest task status for each task
        in 'statuses', if possible. Tasks that are no longer known will result
        in a TASK_LOST update. If statuses is empty, then the master will send
        the latest status for each task currently known.
        """
        return self.driver.reconcileTasks(map(encode, statuses))

    def decline(self, offer_id, filters=Filters()):
        """Declines an offer in its entirety and applies the specified
        filters on the resources (see mesos.proto for a description of
        Filters).

        Note that this can be done at any time, it is not necessary to do this
        within the Scheduler::resourceOffers callback.
        """
        return self.driver.declineOffer(encode(offer_id),
                                        encode(filters))  # TODO filters

    def accept(self, offer_ids, operations, filters=Filters()):
        """Accepts the given offers and performs a sequence of operations
        on those accepted offers.

        See Offer.Operation in mesos.proto for the set of available operations.
        Available resources are aggregated when multiple offers are provided.
        Note that all offers must belong to the same slave. Any unused resources
        will be considered declined. The specified filters are applied on all
        unused resources (see mesos.proto for a description of Filters).
        """
        return self.driver.acceptOffers(map(encode, offer_ids),
                                        map(encode, operations),
                                        encode(filters))

    def revive(self):
        """Removes all filters previously set by the framework (via
        launchTasks()).

        This enables the framework to receive offers from those filtered slaves.
        """
        return self.driver.reviveOffers()

    def suppress(self):
        """Inform Mesos master to stop sending offers to the framework.

        The scheduler should call reviveOffers() to resume getting offers.
        """
        return self.driver.suppressOffers()

    def acknowledge(self, status):
        """Acknowledges the status update.

        This should only be called once the status update is processed durably
        by the scheduler.
        Note that explicit acknowledgements must be requested via the constructor
        argument, otherwise a call to this method will cause the driver to
        crash.
        """
        return self.driver.acknowledgeStatusUpdate(encode(status))

    def message(self, executor_id, slave_id, message):
        """Sends a message from the framework to one of its executors.

        These messages are best effort; do not expect a framework message to be
        retransmitted in any reliable fashion.
        """
        return self.driver.sendFrameworkMessage(encode(executor_id),
                                                encode(slave_id),
                                                message)
| 41.860262 | 80 | 0.628312 | 1,086 | 9,586 | 5.508287 | 0.269797 | 0.043464 | 0.037446 | 0.035105 | 0.130391 | 0.12337 | 0.104982 | 0.094952 | 0.094952 | 0.070545 | 0 | 0 | 0.305654 | 9,586 | 228 | 81 | 42.04386 | 0.898738 | 0.41738 | 0 | 0.120879 | 0 | 0 | 0.049523 | 0 | 0 | 0 | 0 | 0.004386 | 0 | 1 | 0.285714 | false | 0 | 0.054945 | 0 | 0.626374 | 0.021978 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ce86369c63bb6fc50980df6a3068e5a13c86663c | 3,350 | py | Python | xen/xen-4.2.2/tools/python/xen/xm/console.py | zhiming-shen/Xen-Blanket-NG | 47e59d9bb92e8fdc60942df526790ddb983a5496 | [
"Apache-2.0"
] | 1 | 2018-02-02T00:15:26.000Z | 2018-02-02T00:15:26.000Z | xen/xen-4.2.2/tools/python/xen/xm/console.py | zhiming-shen/Xen-Blanket-NG | 47e59d9bb92e8fdc60942df526790ddb983a5496 | [
"Apache-2.0"
] | null | null | null | xen/xen-4.2.2/tools/python/xen/xm/console.py | zhiming-shen/Xen-Blanket-NG | 47e59d9bb92e8fdc60942df526790ddb983a5496 | [
"Apache-2.0"
] | 1 | 2019-05-27T09:47:18.000Z | 2019-05-27T09:47:18.000Z | #============================================================================
# This library is free software; you can redistribute it and/or
# modify it under the terms of version 2.1 of the GNU Lesser General Public
# License as published by the Free Software Foundation.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#============================================================================
# Copyright (C) 2005 XenSource Ltd
#============================================================================

import xen.util.auxbin
import xen.lowlevel.xs

import os
import sys
import signal

from xen.util import utils

XENCONSOLE = "xenconsole"


def execConsole(domid, num=0):
    xen.util.auxbin.execute(XENCONSOLE, [str(domid), "--num", str(num)])


class OurXenstoreConnection:
    def __init__(self):
        self.handle = xen.lowlevel.xs.xs()

    def read_eventually(self, path):
        watch = None
        trans = None
        try:
            signal.alarm(10)
            watch = self.handle.watch(path, None)
            while True:
                result = self.handle.read('0', path)
                if result is not None:
                    signal.alarm(0)
                    return result
                self.handle.read_watch()
        finally:
            signal.alarm(0)
            if watch is not None: self.handle.unwatch(path, watch)

    def read_maybe(self, path):
        return self.handle.read('0', path)


def runVncViewer(domid, do_autopass, do_daemonize=False):
    xs = OurXenstoreConnection()
    d = '/local/domain/%d/' % domid
    vnc_port = xs.read_eventually(d + 'console/vnc-port')
    vfb_backend = xs.read_maybe(d + 'device/vfb/0/backend')

    vnc_listen = None
    vnc_password = None
    vnc_password_tmpfile = None
    cmdl = ['vncviewer']
    if vfb_backend is not None:
        vnc_listen = xs.read_maybe(vfb_backend + '/vnclisten')
        if do_autopass:
            vnc_password = xs.read_maybe(vfb_backend + '/vncpasswd')
    if vnc_password is not None:
        cmdl.append('-autopass')
        vnc_password_tmpfile = os.tmpfile()
        print >>vnc_password_tmpfile, vnc_password
        vnc_password_tmpfile.seek(0)
        vnc_password_tmpfile.flush()
    if vnc_listen is None:
        vnc_listen = 'localhost'
    cmdl.append('%s:%d' % (vnc_listen, int(vnc_port) - 5900))

    if do_daemonize:
        pid = utils.daemonize('vncviewer', cmdl, vnc_password_tmpfile)
        if pid == 0:
            print >>sys.stderr, 'failed to invoke vncviewer'
            os._exit(-1)
    else:
        print 'invoking ', ' '.join(cmdl)
        if vnc_password_tmpfile is not None:
            os.dup2(vnc_password_tmpfile.fileno(), 0)
        try:
            os.execvp('vncviewer', cmdl)
        except OSError:
            print >>sys.stderr, 'Error: external vncviewer missing or not \
in the path\nExiting'
            os._exit(-1)
| 37.640449 | 77 | 0.589851 | 411 | 3,350 | 4.690998 | 0.394161 | 0.068465 | 0.074689 | 0.029564 | 0.093361 | 0.051867 | 0.03527 | 0 | 0 | 0 | 0 | 0.015218 | 0.254627 | 3,350 | 88 | 78 | 38.068182 | 0.756908 | 0.268358 | 0 | 0.090909 | 0 | 0 | 0.072279 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.212121 | 0.090909 | null | null | 0.060606 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
ceac15e5add44827ffdab3055b716cb3256a3e2a | 1,327 | py | Python | src/aws_lambda_typing/events/kinesis_stream.py | chuckwondo/aws-lambda-typing | 8417ab67f2492be1508fe38b2c34bc106619a56d | [
"MIT"
] | 29 | 2021-01-07T13:35:16.000Z | 2022-03-25T07:20:54.000Z | src/aws_lambda_typing/events/kinesis_stream.py | chuckwondo/aws-lambda-typing | 8417ab67f2492be1508fe38b2c34bc106619a56d | [
"MIT"
] | 13 | 2021-02-28T00:31:00.000Z | 2022-03-29T15:24:01.000Z | src/aws_lambda_typing/events/kinesis_stream.py | chuckwondo/aws-lambda-typing | 8417ab67f2492be1508fe38b2c34bc106619a56d | [
"MIT"
] | 5 | 2021-02-27T13:50:42.000Z | 2022-01-13T15:05:44.000Z | #!/usr/bin/env python
import sys

if sys.version_info >= (3, 8):
    from typing import List, TypedDict
else:
    from typing import List

    from typing_extensions import TypedDict


class KinesisStreamKinesis(TypedDict):
    """
    KinesisStreamKinesis

    Attributes:
    ----------
    kinesisSchemaVersion: str

    partitionKey: str

    sequenceNumber: str

    data: str

    approximateArrivalTimestamp: float
    """

    kinesisSchemaVersion: str
    partitionKey: str
    sequenceNumber: str
    data: str
    approximateArrivalTimestamp: float


class KinesisStreamRecord(TypedDict):
    """
    KinesisStreamRecord

    Attributes:
    ----------
    kinesis: :py:class:`KinesisStreamKinesis`

    eventSource: str

    eventVersion: str

    eventID: str

    eventName: str

    invokeIdentityArn: str

    awsRegion: str

    eventSourceARN: str
    """

    kinesis: KinesisStreamKinesis
    eventSource: str
    eventVersion: str
    eventID: str
    eventName: str
    invokeIdentityArn: str
    awsRegion: str
    eventSourceARN: str


class KinesisStreamEvent(TypedDict):
    """
    KinesisStreamEvent
    https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis.html

    Attributes:
    ----------
    Records: List[:py:class:`KinesisStreamRecord`]
    """

    Records: List[KinesisStreamRecord]
| 16.382716 | 66 | 0.666164 | 115 | 1,327 | 7.669565 | 0.426087 | 0.034014 | 0.036281 | 0.045351 | 0.485261 | 0.485261 | 0.485261 | 0.485261 | 0.485261 | 0.485261 | 0 | 0.001976 | 0.237378 | 1,327 | 80 | 67 | 16.5875 | 0.869565 | 0.410701 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.173913 | 0 | 0.913043 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ceadbfc8ec08afd61feb6385ed4d339e585d1115 | 538 | py | Python | exercises/de/test_01_07.py | Jette16/spacy-course | 32df0c8f6192de6c9daba89740a28c0537e4d6a0 | [
"MIT"
] | 2,085 | 2019-04-17T13:10:40.000Z | 2022-03-30T21:51:46.000Z | exercises/de/test_01_07.py | Jette16/spacy-course | 32df0c8f6192de6c9daba89740a28c0537e4d6a0 | [
"MIT"
] | 79 | 2019-04-18T14:42:55.000Z | 2022-03-07T08:15:43.000Z | exercises/de/test_01_07.py | Jette16/spacy-course | 32df0c8f6192de6c9daba89740a28c0537e4d6a0 | [
"MIT"
] | 361 | 2019-04-17T13:34:32.000Z | 2022-03-28T04:42:45.000Z | def test():
    assert "spacy.load" in __solution__, "Rufst du spacy.load auf?"
    assert nlp.meta["lang"] == "de", "Lädst du das korrekte Modell?"
    assert nlp.meta["name"] == "core_news_sm", "Lädst du das korrekte Modell?"
    assert "nlp(text)" in __solution__, "Verarbeitest du den Text korrekt?"
    assert "print(doc.text)" in __solution__, "Druckst du den Text des Doc?"

    __msg__.good(
        "Gut gemacht! Jetzt wo du das Laden von Modellen geübt hast, lass uns "
        "mal ein paar ihrer Vorhersagen anschauen."
    )
| 44.833333 | 79 | 0.669145 | 77 | 538 | 4.441558 | 0.636364 | 0.087719 | 0.076023 | 0.105263 | 0.192982 | 0.192982 | 0.192982 | 0 | 0 | 0 | 0 | 0 | 0.211896 | 538 | 11 | 80 | 48.909091 | 0.806604 | 0 | 0 | 0 | 0 | 0 | 0.574349 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.1 | true | 0 | 0 | 0 | 0.1 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ceafef1d012a2252b4736fc5912d1fe98bb743cd | 10,608 | py | Python | psdet/models/point_detector/utils.py | Jiaolong/gcn-parking-slot | f8c3b445b186e3a7fd13af1f17fa5ba0336027c7 | [
"MIT"
] | 56 | 2021-03-24T08:24:27.000Z | 2022-03-26T13:56:36.000Z | psdet/models/point_detector/utils.py | Jiaolong/gcn-parking-slot | f8c3b445b186e3a7fd13af1f17fa5ba0336027c7 | [
"MIT"
] | 7 | 2021-04-05T03:55:05.000Z | 2022-03-08T03:12:20.000Z | psdet/models/point_detector/utils.py | Jiaolong/gcn-parking-slot | f8c3b445b186e3a7fd13af1f17fa5ba0336027c7 | [
"MIT"
] | 17 | 2021-04-04T02:42:09.000Z | 2022-03-31T01:48:06.000Z | """Universal network structure unit definition."""
import torch
import math
from torch import nn
import torchvision
from torch.utils import model_zoo
from torchvision.models.resnet import BasicBlock, model_urls, Bottleneck


def define_squeeze_unit(basic_channel_size):
    """Define a 1x1 squeeze convolution with norm and activation."""
    conv = nn.Conv2d(2 * basic_channel_size, basic_channel_size, kernel_size=1,
                     stride=1, padding=0, bias=False)
    norm = nn.BatchNorm2d(basic_channel_size)
    relu = nn.LeakyReLU(0.1)
    layers = [conv, norm, relu]
    return layers


def define_expand_unit(basic_channel_size):
    """Define a 3x3 expand convolution with norm and activation."""
    conv = nn.Conv2d(basic_channel_size, 2 * basic_channel_size, kernel_size=3,
                     stride=1, padding=1, bias=False)
    norm = nn.BatchNorm2d(2 * basic_channel_size)
    relu = nn.LeakyReLU(0.1)
    layers = [conv, norm, relu]
    return layers


def define_halve_unit(basic_channel_size):
    """Define a 4x4 stride 2 expand convolution with norm and activation."""
    conv = nn.Conv2d(basic_channel_size, 2 * basic_channel_size, kernel_size=4,
                     stride=2, padding=1, bias=False)
    norm = nn.BatchNorm2d(2 * basic_channel_size)
    relu = nn.LeakyReLU(0.1)
    layers = [conv, norm, relu]
    return layers


def define_depthwise_expand_unit(basic_channel_size):
    """Define a 3x3 expand convolution with norm and activation."""
    conv1 = nn.Conv2d(basic_channel_size, 2 * basic_channel_size,
                      kernel_size=1, stride=1, padding=0, bias=False)
    norm1 = nn.BatchNorm2d(2 * basic_channel_size)
    relu1 = nn.LeakyReLU(0.1)
    conv2 = nn.Conv2d(2 * basic_channel_size, 2 * basic_channel_size, kernel_size=3,
                      stride=1, padding=1, bias=False, groups=2 * basic_channel_size)
    norm2 = nn.BatchNorm2d(2 * basic_channel_size)
    relu2 = nn.LeakyReLU(0.1)
    layers = [conv1, norm1, relu1, conv2, norm2, relu2]
    return layers


def define_detector_block(basic_channel_size):
    """Define a unit composite of a squeeze and expand unit."""
    layers = []
    layers += define_squeeze_unit(basic_channel_size)
    layers += define_expand_unit(basic_channel_size)
    return layers


class YetAnotherDarknet(nn.modules.Module):
    """Yet another darknet, imitating darknet-53 with depth of darknet-19."""

    def __init__(self, input_channel_size, depth_factor):
        super(YetAnotherDarknet, self).__init__()
        layers = []
        # 0
        layers += [nn.Conv2d(input_channel_size, depth_factor, kernel_size=3,
                             stride=1, padding=1, bias=False)]
        layers += [nn.BatchNorm2d(depth_factor)]
        layers += [nn.LeakyReLU(0.1)]
        # 1
        layers += define_halve_unit(depth_factor)
        layers += define_detector_block(depth_factor)
        # 2
        depth_factor *= 2
        layers += define_halve_unit(depth_factor)
        layers += define_detector_block(depth_factor)
        # 3
        depth_factor *= 2
        layers += define_halve_unit(depth_factor)
        layers += define_detector_block(depth_factor)
        layers += define_detector_block(depth_factor)
        # 4
        depth_factor *= 2
        layers += define_halve_unit(depth_factor)
        layers += define_detector_block(depth_factor)
        layers += define_detector_block(depth_factor)
        # 5
        depth_factor *= 2
        layers += define_halve_unit(depth_factor)
        layers += define_detector_block(depth_factor)
        self.model = nn.Sequential(*layers)

    def forward(self, x):
        return self.model(x)


# vgg backbone
class VGG(nn.Module):

    def __init__(self, features, num_classes=1000, init_weights=True):
        super(VGG, self).__init__()
        self.features = features
        if init_weights:
            self._initialize_weights()

    def forward(self, x):
        x = self.features(x)
        return x

    def _initialize_weights(self):
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels
                m.weight.data.normal_(0, math.sqrt(2. / n))
                if m.bias is not None:
                    m.bias.data.zero_()
            elif isinstance(m, nn.BatchNorm2d):
                m.weight.data.fill_(1)
                m.bias.data.zero_()
            elif isinstance(m, nn.Linear):
                m.weight.data.normal_(0, 0.01)
                m.bias.data.zero_()


cfg = {
    'A': [64, 'M', 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],
    'B': [64, 64, 'M', 128, 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],
    'D': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 1024, 'M'],
    'E': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 256, 'M', 512, 512, 512, 512, 'M', 512, 512, 512, 512, 'M'],
}


def make_layers(cfg, batch_norm=False):
    layers = []
    in_channels = 3
    for v in cfg:
        if v == 'M':
            layers += [nn.MaxPool2d(kernel_size=2, stride=2)]
        else:
            conv2d = nn.Conv2d(in_channels, v, kernel_size=3, padding=1)
            if batch_norm:
                layers += [conv2d, nn.BatchNorm2d(v), nn.ReLU(inplace=True)]
            else:
                layers += [conv2d, nn.ReLU(inplace=True)]
            in_channels = v
    return nn.Sequential(*layers)


def vgg16(pretrained=False, **kwargs):
    """VGG 16-layer model (configuration "D")

    Args:
        pretrained (bool): If True, returns a model pre-trained on ImageNet
    """
    if pretrained:
        kwargs['init_weights'] = False
    model = VGG(make_layers(cfg['D']), **kwargs)
    if pretrained:
        model.load_state_dict(model_zoo.load_url(model_urls['vgg16']))
    return model


class ResNet18(nn.Module):

    def __init__(self, block, layers, aux_classes=1000, classes=100, domains=3):
        self.inplanes = 64
        super(ResNet18, self).__init__()
        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
                               bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)
        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.layer1 = self._make_layer(block, 64, layers[0])
        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
        self.layer4 = self._make_layer(block, 1024, layers[3], stride=2)  # resnet 18
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
            elif isinstance(m, nn.BatchNorm2d):
                nn.init.constant_(m.weight, 1)
                nn.init.constant_(m.bias, 0)

    def _make_layer(self, block, planes, blocks, stride=1):
        downsample = None
        if stride != 1 or self.inplanes != planes * block.expansion:
            downsample = nn.Sequential(
                nn.Conv2d(self.inplanes, planes * block.expansion,
                          kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(planes * block.expansion),
            )

        layers = []
        layers.append(block(self.inplanes, planes, stride, downsample))
        self.inplanes = planes * block.expansion
        for i in range(1, blocks):
            layers.append(block(self.inplanes, planes))

        return nn.Sequential(*layers)

    def is_patch_based(self):
        return False

    def forward(self, x, **kwargs):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.maxpool(x)

        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        return x


def resnet18(pretrained=False, **kwargs):
    """Constructs a ResNet-18 model.

    Args:
        pretrained (bool): If True, returns a model pre-trained on ImageNet
    """
    model = ResNet18(BasicBlock, [2, 2, 2, 2], **kwargs)
    if pretrained:
        model.load_state_dict(model_zoo.load_url(model_urls['resnet18']), strict=False)
    return model


class ResNet50(nn.Module):

    def __init__(self, block, layers, aux_classes=1000, classes=100, domains=3):
        self.inplanes = 64
        super(ResNet50, self).__init__()
        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
                               bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)
        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.layer1 = self._make_layer(block, 64, layers[0])
        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
        self.layer4 = self._make_layer(block, 256, layers[3], stride=2)  # resnet50
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
            elif isinstance(m, nn.BatchNorm2d):
                nn.init.constant_(m.weight, 1)
                nn.init.constant_(m.bias, 0)

    def _make_layer(self, block, planes, blocks, stride=1):
        downsample = None
        if stride != 1 or self.inplanes != planes * block.expansion:
            downsample = nn.Sequential(
                nn.Conv2d(self.inplanes, planes * block.expansion,
                          kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(planes * block.expansion),
            )

        layers = []
        layers.append(block(self.inplanes, planes, stride, downsample))
        self.inplanes = planes * block.expansion
        for i in range(1, blocks):
            layers.append(block(self.inplanes, planes))

        return nn.Sequential(*layers)

    def is_patch_based(self):
        return False

    def forward(self, x, **kwargs):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.maxpool(x)

        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        return x


def resnet50(pretrained=False, **kwargs):
    """Constructs a ResNet-50 model.

    Args:
        pretrained (bool): If True, returns a model pre-trained on ImageNet
    """
    model = ResNet50(Bottleneck, [3, 4, 6, 3], **kwargs)
    if pretrained:
        model.load_state_dict(model_zoo.load_url(model_urls['resnet50']), strict=False)
    return model
| 36.961672 | 113 | 0.607089 | 1,391 | 10,608 | 4.463695 | 0.128684 | 0.044291 | 0.059269 | 0.030118 | 0.747463 | 0.714286 | 0.66162 | 0.657916 | 0.639878 | 0.627476 | 0 | 0.054222 | 0.269796 | 10,608 | 286 | 114 | 37.090909 | 0.747353 | 0.073906 | 0 | 0.607306 | 0 | 0 | 0.008328 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.100457 | false | 0 | 0.027397 | 0.013699 | 0.223744 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# --- File: hw_asr/metric/__init__.py (kostyayatsok/asr_project_template, MIT) ---
from hw_asr.metric.cer_metric import ArgmaxCERMetric, BeamsearchCERMetric
from hw_asr.metric.wer_metric import ArgmaxWERMetric, BeamsearchWERMetric
__all__ = [
    "ArgmaxWERMetric",
    "ArgmaxCERMetric",
    "BeamsearchCERMetric",
    "BeamsearchWERMetric",
]
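# Illustrative sketch (added): __all__ above pins down exactly which names a
# wildcard import re-exports. Demo with a throwaway module; the names
# "demo_metrics", "A", and "B" are invented for illustration only.
import sys
import types

_mod = types.ModuleType("demo_metrics")
exec("class A: pass\nclass B: pass\n__all__ = ['A']", _mod.__dict__)
sys.modules["demo_metrics"] = _mod

_ns = {}
exec("from demo_metrics import *", _ns)
assert "A" in _ns and "B" not in _ns  # only names listed in __all__ leak out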
# --- File: Code/Examples/Example_22.py (R6500/SLab, MIT) ---
'''
SLab Example
Example_22.py
Create several waveforms
Connect DAC 1 to ADC 1
'''
# Locate slab in the parent folder
import sys
sys.path.append('..')
sys.path.append('.')
import slab
# Set prefix to locate calibrations
slab.setFilePrefix("../")
# Open serial communication
slab.connect()
# Set sample time to 100us
slab.setSampleTime(0.0001)
# Set storage requirements
slab.setTransientStorage(200,1)
# (A) Creates and measures a square wave
slab.waveSquare(1.0,2.0,100)
slab.wavePlot()
# (B) Creates and measures a triangle wave
slab.waveTriangle(1.0,2.0,100)
slab.wavePlot()
# (C) Creates and measures a sawtooth wave
slab.waveSawtooth(1.0,2.0,100)
slab.wavePlot()
# (D) Creates and measures a sine wave
slab.waveSine(1.0,2.0,100)
slab.wavePlot()
# (E) Creates and measures a 10% duty pulse wave
slab.wavePulse(1.0,2.0,100,90)
slab.wavePlot()
# (F) Creates and measures a staircase waveform
staircase = []  # renamed from "list" to avoid shadowing the built-in
for i in range(0, 10):
    for _ in range(0, 10):
        staircase.append(1.0 + 0.1 * i)
slab.loadWavetable(staircase)
slab.wavePlot()
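# Cross-check (added, hedged): the staircase loop above builds a 100-sample
# table holding each of ten levels (1.0 V .. 1.9 V) for ten samples. The same
# table in plain Python, without SLab hardware attached:
levels = [1.0 + 0.1 * i for i in range(10)]
wavetable = [v for v in levels for _ in range(10)]
assert len(wavetable) == 100
assert wavetable[0] == 1.0 and abs(wavetable[-1] - 1.9) < 1e-9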
# (G) Creates and measures a cosine wave
slab.waveCosine(1.0,2.0,100)
slab.wavePlot()
# (H) Creates and measures a noise wave
slab.waveNoise(1.5,0.1,100)
t,a1 = slab.wavePlot(1,returnData=True)
print("Std Dev is " + str(slab.std(a1)) + " V")
# (I) Creates and measures a random wave between 1V and 2V
slab.waveRandom(1,2,100)
slab.wavePlot()
# Close serial communication
slab.disconnect()
# --- File: micropsi_core/world/island/__init__.py (brucepro/micropsi2, Apache-2.0) ---
#!/usr/local/bin/python
# -*- coding: utf-8 -*-
"""
"""
__author__ = 'joscha'
__date__ = '03.08.12'
# --- File: dataset_generator/learning/imitation/tensorflow/_layers.py (rjean/duckie-segmentation, Apache-2.0) ---
import tensorflow as tf
L2_LAMBDA = 1e-04
def _residual_block(x, size, dropout=False, dropout_prob=0.5, seed=None):
    residual = tf.layers.batch_normalization(x)  # TODO: check if the defaults in TF are the same as in Keras
    residual = tf.nn.relu(residual)
    residual = tf.layers.conv2d(
        residual,
        filters=size,
        kernel_size=3,
        strides=2,
        padding="same",
        kernel_initializer=tf.keras.initializers.he_normal(seed=seed),
        kernel_regularizer=tf.keras.regularizers.l2(L2_LAMBDA),
    )
    if dropout:
        residual = tf.nn.dropout(residual, dropout_prob, seed=seed)
    residual = tf.layers.batch_normalization(residual)
    residual = tf.nn.relu(residual)
    residual = tf.layers.conv2d(
        residual,
        filters=size,
        kernel_size=3,
        padding="same",
        kernel_initializer=tf.keras.initializers.he_normal(seed=seed),
        kernel_regularizer=tf.keras.regularizers.l2(L2_LAMBDA),
    )
    if dropout:
        residual = tf.nn.dropout(residual, dropout_prob, seed=seed)
    return residual


def one_residual(x, keep_prob=0.5, seed=None):
    nn = tf.layers.conv2d(
        x,
        filters=32,
        kernel_size=5,
        strides=2,
        padding="same",
        kernel_initializer=tf.keras.initializers.he_normal(seed=seed),
        kernel_regularizer=tf.keras.regularizers.l2(L2_LAMBDA),
    )
    nn = tf.layers.max_pooling2d(nn, pool_size=3, strides=2)
    rb_1 = _residual_block(nn, 32, dropout_prob=keep_prob, seed=seed)
    nn = tf.layers.conv2d(
        nn,
        filters=32,
        kernel_size=1,
        strides=2,
        padding="same",
        kernel_initializer=tf.keras.initializers.he_normal(seed=seed),
        kernel_regularizer=tf.keras.regularizers.l2(L2_LAMBDA),
    )
    nn = tf.keras.layers.add([rb_1, nn])
    nn = tf.layers.flatten(nn)
    return nn
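# Shape sanity sketch (added, hedged; plain arithmetic, not the TF API): with
# "same" padding a stride-2 conv yields ceil(h / 2) per dimension, while the
# default "valid" 3x3/stride-2 max-pool yields (h - 3) // 2 + 1. For a 64x64
# input, both branches of one_residual() reach 8x8, so the
# tf.keras.layers.add() above is shape-compatible.
import math

def _same_conv(h, stride):
    return math.ceil(h / stride)

def _valid_pool(h, pool, stride):
    return (h - pool) // stride + 1

_h = _same_conv(64, 2)                    # 5x5 conv, stride 2 -> 32
_h = _valid_pool(_h, 3, 2)                # 3x3 max-pool, stride 2 -> 15
_skip = _same_conv(_h, 2)                 # 1x1 conv, stride 2 -> 8
_res = _same_conv(_same_conv(_h, 2), 1)   # the residual block's two convs -> 8
assert _skip == _res == 8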
# --- File: apps/images/migrations/0001_initial.py (coogger/coogger, MIT) ---
# Generated by Django 3.0.3 on 2020-02-28 13:21
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):

    initial = True

    dependencies = []

    operations = [
        migrations.CreateModel(
            name="Image",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                (
                    "title",
                    models.CharField(
                        blank=True,
                        help_text="Title | Optional",
                        max_length=55,
                        null=True,
                        verbose_name="",
                    ),
                ),
                (
                    "image",
                    models.ImageField(upload_to="images/", verbose_name=""),
                ),
                (
                    "created",
                    models.DateTimeField(
                        default=django.utils.timezone.now,
                        verbose_name="Created",
                    ),
                ),
            ],
        ),
    ]
# --- File: ex3.py (SuPoPoo/python-exercise, MIT) ---
print("I will now count my chickens:")
print("Hens", 25 + 30 / 6)
print("Roosters", 100 - 25 * 3 % 4)
print("How I will count the eggs:")
print(3+2+1-5+4%2-1/4+6)
print("Is it true that 3+2<5-7?")
print(3+2<5-7)
print("What is 3+2?", 3+2)
print("What is 5-7?", 5-7)
print("Oh,that's why it's false")
print("How about some more.")
print("Is it greater?",5>-2)
print("Is it greater or equal?",5>=-2)
print("Is it less or equal?",5<=-2)
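# Worked check (added, hedged: values assume Python 3, where "/" is true
# division and "%" binds tighter than "+"/"-"):
assert 25 + 30 / 6 == 30.0                          # Hens
assert 100 - 25 * 3 % 4 == 97                       # Roosters: 75 % 4 == 3
assert 3 + 2 + 1 - 5 + 4 % 2 - 1 / 4 + 6 == 6.75   # the egg-count line
assert (3 + 2 < 5 - 7) is False                     # 5 < -2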
# --- File: 3/node.py (Pavel3P/Machine-Learning, MIT) ---
import numpy as np
class Node:
    def __init__(self,
                 gini: float,
                 num_samples_per_class: np.ndarray,
                 ) -> None:
        self.gini: float = gini
        self.num_samples_per_class: np.ndarray = num_samples_per_class
        self.predicted_class: int = np.argmax(num_samples_per_class)
        self.feature_index: int = 0
        self.threshold: float = 0
        self.left: Node = None
        self.right: Node = None
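# Illustration (added, hedged): predicted_class is just the index of the
# largest per-class count (np.argmax). Plain-Python equivalent, with invented
# example counts:
def _argmax(counts):
    # ties resolve to the lowest index, matching np.argmax
    return max(range(len(counts)), key=lambda i: counts[i])

assert _argmax([3, 10, 2]) == 1   # class 1 has the most samples
assert _argmax([5, 5]) == 0       # tie -> lowest index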
# --- File: tests/test_build_endpoint.py (lsst-sqre/ltd-dasher, MIT) ---
"""Test app.routes.build."""
import responses
mock_product_data = {
    "bucket_name": "lsst-the-docs",
    "doc_repo": "https://github.com/lsst-sqre/test-059.git",
    "domain": "test-059.lsst.io",
    "fastly_domain": "n.global-ssl.fastly.net",
    "published_url": "https://test-059.lsst.io",
    "root_domain": "lsst.io",
    "root_fastly_domain": "n.global-ssl.fastly.net",
    "self_url": "https://keeper-staging.lsst.codes/products/test-059",
    "slug": "test-059",
    "surrogate_key": "235becbe0b8349aa88b7f6e086529d77",
    "title": "Test Technote Via Bot"
}

mock_editions_data = {
    "editions": [
        "https://keeper-staging.lsst.codes/editions/388",
        "https://keeper-staging.lsst.codes/editions/390"
    ]
}

mock_edition_388_data = {
    "build_url": "https://keeper-staging.lsst.codes/builds/1322",
    "date_created": "2017-02-03T23:49:23Z",
    "date_ended": None,
    "date_rebuilt": "2017-02-03T23:51:21Z",
    "product_url": "https://keeper-staging.lsst.codes/products/test-059",
    "published_url": "https://test-059.lsst.io",
    "self_url": "https://keeper-staging.lsst.codes/editions/388",
    "slug": "main",
    "surrogate_key": "c1e29b6b1c97450c9d6d854ee3395ec9",
    "title": "Latest",
    "tracked_refs": [
        "master"
    ]
}

mock_edition_390_data = {
    "build_url": "https://keeper-staging.lsst.codes/builds/1324",
    "date_created": "2017-02-09T23:40:57Z",
    "date_ended": None,
    "date_rebuilt": "2017-02-09T23:41:17Z",
    "product_url": "https://keeper-staging.lsst.codes/products/test-059",
    "published_url": "https://test-059.lsst.io/v/test-branch",
    "self_url": "https://keeper-staging.lsst.codes/editions/390",
    "slug": "test-branch",
    "surrogate_key": "99ab3d93b1b54a4ea49dbe1764b7ea6a",
    "title": "test-branch",
    "tracked_refs": [
        "test-branch"
    ]
}

mock_builds_data = {
    "builds": [
        "https://keeper-staging.lsst.codes/builds/1322",
        "https://keeper-staging.lsst.codes/builds/1324"
    ]
}

mock_build_1322_data = {
    "bucket_name": "lsst-the-docs",
    "bucket_root_dir": "test-059/builds/1",
    "date_created": "2017-02-03T23:51:08Z",
    "date_ended": None,
    "git_refs": [
        "master"
    ],
    "github_requester": None,
    "product_url": "https://keeper-staging.lsst.codes/products/test-059",
    "published_url": "https://test-059.lsst.io/builds/1",
    "self_url": "https://keeper-staging.lsst.codes/builds/1322",
    "slug": "1",
    "surrogate_key": "006e34ec8f714aed956292645bb7e432",
    "uploaded": True
}

mock_build_1324_data = {
    "bucket_name": "lsst-the-docs",
    "bucket_root_dir": "test-059/builds/2",
    "date_created": "2017-02-09T23:40:57Z",
    "date_ended": None,
    "git_refs": [
        "test-branch"
    ],
    "github_requester": None,
    "product_url": "https://keeper-staging.lsst.codes/products/test-059",
    "published_url": "https://test-059.lsst.io/builds/2",
    "self_url": "https://keeper-staging.lsst.codes/builds/1324",
    "slug": "2",
    "surrogate_key": "a7dc0f6b0f4b40cdab851ff68be0ee51",
    "uploaded": True
}

mock_bulk_data = {
    "product": mock_product_data,
    "editions": [
        mock_edition_388_data,
        mock_edition_390_data
    ],
    "builds": [
        mock_build_1322_data,
        mock_build_1324_data
    ]
}
@responses.activate
def test_rebuild_dashboards(anon_client):
    """Test dashboard rebuilds with full client using new bulk metadata
    endpoint.
    """
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/products/test-059/dashboard',
        json=mock_bulk_data,
        status=200,
        content_type='application/json')
    r = anon_client.post(
        '/build',
        {
            'product_urls': ['https://keeper-staging.lsst.codes/'
                             'products/test-059']
        }
    )
    assert r.status == 202
@responses.activate
def test_rebuild_dashboards_oldstyle(anon_client):
    """Test dashboard rebuilds with full client using original endpoints."""
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/products/test-059/dashboard',
        json={},
        status=404,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/products/test-059',
        json=mock_product_data,
        status=200,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/products/test-059/editions/',
        json=mock_editions_data,
        status=200,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/editions/388',
        json=mock_edition_388_data,
        status=200,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/editions/390',
        json=mock_edition_390_data,
        status=200,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/products/test-059/builds/',
        json=mock_builds_data,
        status=200,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/builds/1322',
        json=mock_build_1322_data,
        status=200,
        content_type='application/json')
    responses.add(
        responses.GET,
        'https://keeper-staging.lsst.codes/builds/1324',
        json=mock_build_1324_data,
        status=200,
        content_type='application/json')
    r = anon_client.post(
        '/build',
        {
            'product_urls': ['https://keeper-staging.lsst.codes/'
                             'products/test-059']
        }
    )
    assert r.status == 202
# --- File: HW2/heart.py (MohammadJRanjbar/Data-Mining, MIT) ---
from sklearn import tree
from matplotlib import pyplot as plt
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import train_test_split
from sklearn import model_selection
from sklearn import metrics
import numpy as np
import pandas as pd
import seaborn as sns
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.naive_bayes import MultinomialNB
from sklearn.preprocessing import MinMaxScaler
from sklearn.preprocessing import StandardScaler
from pandas import DataFrame
data = pd.read_csv("heart.csv")
# sns.set(style="ticks", color_codes=True)
# plot=sns.pairplot(data)
# plot.savefig("heart.png")
# pd.crosstab(data.sex,data.target).plot(kind="bar",figsize=(15,6),color=['#1CA53B','#AA1111' ])
# plt.title('Heart Disease Frequency for Sex')
# plt.xlabel('Sex (0 = Female, 1 = Male)')
# plt.xticks(rotation=0)
# plt.legend(["Haven't Disease", "Have Disease"])
# plt.ylabel('Frequency')
# plt.savefig("heart1.png")
# pd.crosstab(data.age,data.target).plot(kind="bar",figsize=(20,6))
# plt.title('Heart Disease Frequency for Ages')
# plt.xlabel('Age')
# plt.ylabel('Frequency')
# plt.savefig('heartDiseaseAndAges.png')
feature_names = ["age", "sex", "cp", "trestbps", "chol", "fbs", "restecg", "thalach", "exang", "oldpeak", "slope", "ca", "thal"]
x = data[feature_names].values
y = data["target"].values
X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=5,shuffle=True)
feature_scaler = StandardScaler()
X_train = feature_scaler.fit_transform(X_train)
X_test = feature_scaler.transform(X_test)
# Krange = range(1,30)
# scores = {}
# scores_list = []
# for k in Krange:
# knn = KNeighborsClassifier(n_neighbors = k)
# knn.fit(X_train,y_train)
# y_pred = knn.predict(X_test)
# scores[k] = metrics.accuracy_score(y_test,y_pred)
# scores_list.append(metrics.accuracy_score(y_test,y_pred))
# plt.plot(Krange,scores_list)
# plt.xlabel("Value of K")
# plt.ylabel("Accuracy")
# plt.savefig("k.png")
# plt.show()
model = KNeighborsClassifier(n_neighbors=7)
model.fit(X_train,y_train)
y_pred = model.predict(X_test)
print("Accuracy KNN:",metrics.accuracy_score(y_test, y_pred))
X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=5,shuffle=True)
# Create a Gaussian Classifier
gnb = GaussianNB()
gnb.fit(X_train, y_train)
y_pred = gnb.predict(X_test)
print("Accuracy NB:",metrics.accuracy_score(y_test, y_pred))
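# Reference sketch (added, hedged): for plain label vectors,
# metrics.accuracy_score is just the fraction of matching positions.
# Equivalent by hand, with invented labels (not the heart.csv data):
def _accuracy(y_true, y_hat):
    return sum(t == p for t, p in zip(y_true, y_hat)) / len(y_true)

assert _accuracy([1, 0, 1, 1], [1, 0, 0, 1]) == 0.75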