hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4c62e0ef93563a827a455e57b66a65400bfbb39d | 239 | py | Python | utils/misc/subscription.py | YoshlikMedia/DTM-Moderator-bot | e6cbf10fdc7844cd3b33d642f8c15ccfe2817b40 | [
"MIT"
] | null | null | null | utils/misc/subscription.py | YoshlikMedia/DTM-Moderator-bot | e6cbf10fdc7844cd3b33d642f8c15ccfe2817b40 | [
"MIT"
] | null | null | null | utils/misc/subscription.py | YoshlikMedia/DTM-Moderator-bot | e6cbf10fdc7844cd3b33d642f8c15ccfe2817b40 | [
"MIT"
] | null | null | null | from typing import Union
from aiogram import Bot
async def check(user_id, channel: Union[int, str]):
bot = Bot.get_current()
member = await bot.get_chat_member(user_id=user_id, chat_id=channel)
return member.is_chat_member() | 26.555556 | 72 | 0.748954 | 39 | 239 | 4.358974 | 0.538462 | 0.105882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158996 | 239 | 9 | 73 | 26.555556 | 0.845771 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
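The `check` coroutine in the row above depends on a live aiogram `Bot`; its membership logic can be sketched without the library using stub objects (the `StubBot` and `StubMember` names below are illustrative stand-ins, not part of aiogram):

```python
import asyncio
from typing import Union


class StubMember:
    """Minimal stand-in for aiogram's ChatMember."""

    def __init__(self, status: str):
        self.status = status

    def is_chat_member(self) -> bool:
        # aiogram treats these statuses as "still in the chat"
        return self.status in {"creator", "administrator", "member", "restricted"}


class StubBot:
    """Fake bot that answers get_chat_member from a fixed table."""

    def __init__(self, members):
        self._members = members  # {(chat_id, user_id): status}

    async def get_chat_member(self, *, chat_id: Union[int, str], user_id: int) -> StubMember:
        return StubMember(self._members.get((chat_id, user_id), "left"))


async def check(bot: StubBot, user_id: int, channel: Union[int, str]) -> bool:
    # same shape as the original: fetch the member, ask if they are in the chat
    member = await bot.get_chat_member(user_id=user_id, chat_id=channel)
    return member.is_chat_member()


bot = StubBot({("@news", 1): "member"})
subscribed = asyncio.run(check(bot, 1, "@news"))
not_subscribed = asyncio.run(check(bot, 2, "@news"))
```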
4c70d86ef4c5e463fe1b428455827b17d471577a | 841 | py | Python | vega/modules/loss/mean_loss.py | jie311/vega | 1bba6100ead802697e691403b951e6652a99ccae | [
"MIT"
] | 724 | 2020-06-22T12:05:30.000Z | 2022-03-31T07:10:54.000Z | vega/modules/loss/mean_loss.py | jie311/vega | 1bba6100ead802697e691403b951e6652a99ccae | [
"MIT"
] | 147 | 2020-06-30T13:34:46.000Z | 2022-03-29T11:30:17.000Z | vega/modules/loss/mean_loss.py | jie311/vega | 1bba6100ead802697e691403b951e6652a99ccae | [
"MIT"
] | 160 | 2020-06-29T18:27:58.000Z | 2022-03-23T08:42:21.000Z | # -*- coding: utf-8 -*-
# Copyright (C) 2020. Huawei Technologies Co., Ltd. All rights reserved.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the MIT License.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# MIT License for more details.
"""MeanLoss for data."""
from vega.modules.module import Module
from vega.common import ClassType, ClassFactory
@ClassFactory.register(ClassType.LOSS)
class MeanLoss(Module):
"""MeanLoss Loss for data."""
def __init__(self):
super(MeanLoss, self).__init__()
def call(self, inputs, targets):
"""Compute loss, mean() to average on multi-gpu."""
return inputs.mean()
| 32.346154 | 72 | 0.715815 | 118 | 841 | 5.033898 | 0.686441 | 0.037037 | 0.043771 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007289 | 0.184304 | 841 | 25 | 73 | 33.64 | 0.858601 | 0.604043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
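`MeanLoss` above simply averages the network output and ignores the targets; the same reduction can be shown with a plain-Python tensor stand-in (`FakeTensor` is illustrative, not part of vega):

```python
class FakeTensor:
    """Tiny stand-in for a framework tensor that supports .mean()."""

    def __init__(self, values):
        self.values = list(values)

    def mean(self) -> float:
        return sum(self.values) / len(self.values)


class MeanLoss:
    """Loss that averages the inputs; targets are unused, as in the original."""

    def call(self, inputs: FakeTensor, targets=None) -> float:
        return inputs.mean()


loss = MeanLoss().call(FakeTensor([1.0, 2.0, 3.0]), targets=None)
```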
4c741820d948931ebf928324e086cf06701b48c4 | 167 | py | Python | Hackerrank/Practice/Python/15.numpy/15.Linear Algebra.py | kushagra1212/Competitive-Programming | 5b68774c617d6abdf1b29893b1b13d47f62161e8 | [
"MIT"
] | 994 | 2017-02-28T06:13:47.000Z | 2022-03-31T10:49:00.000Z | Hackerrank_python/15.numpy/15.Linear Algebra.py | devesh17m/Competitive-Programming | 2d459dc8dc5ac628d94700b739988b0ea364cb71 | [
"MIT"
] | 16 | 2018-01-01T02:59:55.000Z | 2021-11-22T12:49:16.000Z | Hackerrank_python/15.numpy/15.Linear Algebra.py | devesh17m/Competitive-Programming | 2d459dc8dc5ac628d94700b739988b0ea364cb71 | [
"MIT"
] | 325 | 2017-06-15T03:32:43.000Z | 2022-03-28T22:43:42.000Z | import numpy
n=int(input())
numpy.set_printoptions(legacy='1.13')
arr1=([list(map(float,input().split()))for _ in range(n)])
print (numpy.linalg.det(arr1))
| 18.555556 | 59 | 0.664671 | 26 | 167 | 4.192308 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034247 | 0.125749 | 167 | 8 | 60 | 20.875 | 0.712329 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
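The HackerRank snippet above leans on `numpy.linalg.det`; for reference, a pure-Python determinant via Laplace expansion gives the same value on small matrices (up to float rounding):

```python
def det(matrix):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    n = len(matrix)
    if n == 1:
        return matrix[0][0]
    total = 0.0
    for col in range(n):
        # minor: drop row 0 and the current column
        minor = [row[:col] + row[col + 1:] for row in matrix[1:]]
        sign = -1.0 if col % 2 else 1.0
        total += sign * matrix[0][col] * det(minor)
    return total


singular = det([[1.1, 1.1], [1.1, 1.1]])      # identical rows -> determinant 0
identity = det([[1.0, 0.0], [0.0, 1.0]])
```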
4c795d80803d039889dee06f76caf20844797f1c | 249 | py | Python | metricfarmer/extensions/mf/targets/print_text.py | useblocks/metricfarmer | 556b459d081f84b0d9285266ba472c7ed27ddd46 | [
"MIT"
] | null | null | null | metricfarmer/extensions/mf/targets/print_text.py | useblocks/metricfarmer | 556b459d081f84b0d9285266ba472c7ed27ddd46 | [
"MIT"
] | 2 | 2019-08-17T07:32:17.000Z | 2019-08-23T13:21:31.000Z | metricfarmer/extensions/mf/targets/print_text.py | useblocks/metricfarmer | 556b459d081f84b0d9285266ba472c7ed27ddd46 | [
"MIT"
] | null | null | null | import click
from colorama import Fore, Style
def target_print(metrics, **kwargs):
click.echo()
for name, metric in metrics.items():
click.echo(' {name}: '.format(name=name) + Fore.GREEN + str(metric['result']) + Style.RESET_ALL)
| 27.666667 | 105 | 0.670683 | 34 | 249 | 4.852941 | 0.676471 | 0.109091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176707 | 249 | 8 | 106 | 31.125 | 0.804878 | 0 | 0 | 0 | 0 | 0 | 0.064257 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
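`target_print` above interleaves colorama color codes with `click.echo`; the line construction itself can be checked without either dependency, writing the ANSI escapes literally (on POSIX terminals these are what `Fore.GREEN` and `Style.RESET_ALL` expand to):

```python
GREEN = "\x1b[32m"   # what colorama's Fore.GREEN emits on POSIX terminals
RESET = "\x1b[0m"    # what colorama's Style.RESET_ALL emits


def format_metric_line(name: str, result) -> str:
    """Build one output line in the same shape as target_print."""
    return ' {name}: '.format(name=name) + GREEN + str(result) + RESET


line = format_metric_line("loc", 120)
```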
d5b436df14a9c42a6e07de3029e80869957055d6 | 535 | py | Python | Post-Exploitation/LaZagne/Mac/lazagne/softwares/system/system.py | FOGSEC/TID3xploits | b57d8bae454081a3883a5684679e2a329e72d6e5 | [
"MIT"
] | 5 | 2018-01-15T13:58:40.000Z | 2022-02-17T02:38:58.000Z | Post-Exploitation/LaZagne/Mac/lazagne/softwares/system/system.py | bhattsameer/TID3xploits | b57d8bae454081a3883a5684679e2a329e72d6e5 | [
"MIT"
] | null | null | null | Post-Exploitation/LaZagne/Mac/lazagne/softwares/system/system.py | bhattsameer/TID3xploits | b57d8bae454081a3883a5684679e2a329e72d6e5 | [
"MIT"
] | 4 | 2019-06-21T07:51:11.000Z | 2020-11-04T05:20:09.000Z | from lazagne.config.write_output import print_debug
from lazagne.config.moduleInfo import ModuleInfo
from lazagne.config.constant import *
class System(ModuleInfo):
def __init__(self):
options = {'command': '-system', 'action': 'store_true', 'dest': 'system', 'help': 'Print system passwords found (keychain, system account)'}
ModuleInfo.__init__(self, 'system', 'system', options)
def run(self, software_name=None):
pwdFound = []
pwdFound += constant.keychains_pwd
pwdFound += constant.system_pwd
return pwdFound
| 29.722222 | 143 | 0.742056 | 64 | 535 | 5.984375 | 0.53125 | 0.086162 | 0.133159 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13271 | 535 | 17 | 144 | 31.470588 | 0.825431 | 0 | 0 | 0 | 0 | 0 | 0.207865 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.083333 | 0.25 | 0 | 0.583333 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
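`System.run` above just concatenates two lists held on a shared `constant` object; a self-contained sketch with a stand-in constant shows the aggregation (`FakeConstant` and its sample entries are illustrative, not LaZagne data):

```python
class FakeConstant:
    """Stand-in for LaZagne's shared constant module."""
    keychains_pwd = [{'Service': 'wifi', 'Password': 'hunter2'}]
    system_pwd = [{'Account': 'root', 'Hash': 'deadbeef'}]


def run(constant=FakeConstant) -> list:
    """Mirror System.run: merge keychain and system credential lists."""
    pwd_found = []
    pwd_found += constant.keychains_pwd
    pwd_found += constant.system_pwd
    return pwd_found


found = run()
```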
d5b4607a6835cae50892abb3457decb6ea755238 | 4,162 | py | Python | Dashboard/backend/runtest.py | CESNET/Nemea-GUI | 5ab626a23fa8a3cbd58968dfd7bc8ae2263d0595 | [
"BSD-3-Clause"
] | null | null | null | Dashboard/backend/runtest.py | CESNET/Nemea-GUI | 5ab626a23fa8a3cbd58968dfd7bc8ae2263d0595 | [
"BSD-3-Clause"
] | null | null | null | Dashboard/backend/runtest.py | CESNET/Nemea-GUI | 5ab626a23fa8a3cbd58968dfd7bc8ae2263d0595 | [
"BSD-3-Clause"
] | 1 | 2019-06-05T08:04:04.000Z | 2019-06-05T08:04:04.000Z | #!/usr/bin/python3
import pymongo
import unittest
from tests import *
import json
from dashboards import *
from alert_data import *
GENERATED_ALERT_COUNT = 1000 # How many alerts should be generated to test DB before running tests.
dbClient = pymongo.MongoClient("mongodb://localhost:27017/")
testDb = dbClient["testDb"]
testCol = testDb["tests"]
alertTestCol = testDb["testAlerts"]
class TestDashboardOperations(unittest.TestCase):
"""
    Tests for functions in dashboards.py
"""
@staticmethod
def tearDownClass():
testCol.delete_many({})
def test_empty_get(self):
self.assertEqual(get_all_dashboards('test', testCol), json.dumps(['Default']))
def test_insert_dashboard(self):
self.assertEqual(add_dashboard('test', 'Test Dashboard', testCol), json.dumps({"success": True}))
self.assertEqual(get_all_dashboards('test', testCol), json.dumps(['Default', 'Test Dashboard']))
def test_get_data_empty(self):
self.assertEqual(get_dashboard_data('test', 'Nonexistent', testCol), json.dumps([]))
self.assertEqual(get_dashboard_data('test', 'Default', testCol), json.dumps([]))
self.assertEqual(get_dashboard_data('test', 'Test Dashboard', testCol), json.dumps([]))
def test_insert_data(self):
self.assertEqual(modify_dashboard('test', 'Nonexistent', '[]', testCol), json.dumps({'success': False}))
self.assertEqual(modify_dashboard('test', 'Test Dashboard',
["testData"],
testCol), json.dumps({'success': True}))
self.assertEqual(get_dashboard_data('test', 'Test Dashboard', testCol),
json.dumps(["testData"]))
class TestAlertOperations(unittest.TestCase):
"""
Tests for functions in alert_data.py
"""
@staticmethod
def setUpClass():
gen_n_alerts(GENERATED_ALERT_COUNT, alertTestCol)
@staticmethod
def tearDownClass():
cleanup_db(alertTestCol)
def test_get_categories(self):
valid_test_categories = ["any", "Attempt.Login", "Anomaly.Connection", "Recon.Scanning", "Availibility.DDoS",
"Intrusion.Botnet"]
self.assertTrue(
set(valid_test_categories + json.loads(get_available_alert_categories(alertTestCol)))
.issubset(set(valid_test_categories)))
def test_event_count(self):
# Should find all alerts - 52595 hours == 6 years
self.assertEqual(get_alert_count('any', 52595, alertTestCol), json.dumps(GENERATED_ALERT_COUNT))
self.assertTrue(0 <= json.loads(get_alert_count('Attempt.Login', 52595, alertTestCol)) <= GENERATED_ALERT_COUNT)
def test_top_flows(self):
data = json.loads(get_top_flow_alerts(3, 52595, alertTestCol))
self.assertEqual(len(data), 3)
self.assertTrue(data[0]['FlowCount'] >= data[1]['FlowCount'] >= data[2]['FlowCount'])
def test_pie_chart_data(self):
data = json.loads(get_pie_chart_data('Category', 52595, alertTestCol))
self.assertEqual(sum(data['series']), GENERATED_ALERT_COUNT)
self.assertEqual(len(data['series']), len(data['labels']))
def test_pie_chart_category_overflow(self):
self.assertEqual(len(json.loads(get_pie_chart_data('FlowCount', 52595, alertTestCol))['labels']), 15)
def test_bar_chart_data(self):
data = json.loads(get_bar_chart_data('Category', 52595, 600000, alertTestCol))
self.assertEqual(len(data['labels']), len(data['series']))
total = 0
for x in data['series']:
total += sum(x['data'])
self.assertEqual(total, GENERATED_ALERT_COUNT)
def test_datetime_overflow(self):
self.assertEqual(len(json.loads(get_pie_chart_data('Category', 99999999, alertTestCol))), GENERATED_ALERT_COUNT)
# def test_bar_chart_edge_case(self):
# self.assertEqual(json.loads(get_bar_chart_data('Category', 0, 1, alertTestCol))['data'], [])
if __name__ == "__main__":
unittest.main()
| 40.019231 | 121 | 0.649207 | 464 | 4,162 | 5.596983 | 0.256466 | 0.103966 | 0.055449 | 0.041586 | 0.396226 | 0.311513 | 0.235271 | 0.182133 | 0.16134 | 0.135541 | 0 | 0.021826 | 0.218405 | 4,162 | 103 | 122 | 40.407767 | 0.776514 | 0.081451 | 0 | 0.073529 | 1 | 0 | 0.121295 | 0.007071 | 0 | 0 | 0 | 0 | 0.294118 | 1 | 0.205882 | false | 0 | 0.088235 | 0 | 0.323529 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
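The suite above needs a local MongoDB; the core invariant it checks in `test_bar_chart_data` (all series buckets sum back to the alert count) can be demonstrated on plain data with `unittest` alone — the sample payload below is illustrative, not real alert output:

```python
import unittest


class TestSeriesInvariant(unittest.TestCase):
    """Check that bucketed chart data conserves the total alert count."""

    def test_series_sum_matches_total(self):
        # shape mirrors get_bar_chart_data output: one series per category
        data = {
            'labels': ['t0', 't1', 't2'],
            'series': [
                {'name': 'Attempt.Login', 'data': [3, 1, 0]},
                {'name': 'Recon.Scanning', 'data': [0, 2, 4]},
            ],
        }
        total = sum(sum(s['data']) for s in data['series'])
        self.assertEqual(total, 10)


result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSeriesInvariant))
```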
d5bff27151f515af70ae3f4a431ddf1b694ad26a | 241 | py | Python | py/ClientProject/.idea/Common/OutHtml.py | Justyer/Secret | 164ea9c777832772ee6115c21999d97f521c6adb | [
"MIT"
] | null | null | null | py/ClientProject/.idea/Common/OutHtml.py | Justyer/Secret | 164ea9c777832772ee6115c21999d97f521c6adb | [
"MIT"
] | null | null | null | py/ClientProject/.idea/Common/OutHtml.py | Justyer/Secret | 164ea9c777832772ee6115c21999d97f521c6adb | [
"MIT"
] | null | null | null | import os
def outHtml(filedir,filename,HtmlCon):
if os.path.exists(filedir) == False:
os.makedirs(filedir)
fileHeader =open(filedir + filename, 'w',encoding='utf8')
fileHeader.write(HtmlCon)
fileHeader.close()
| 30.125 | 62 | 0.672199 | 28 | 241 | 5.785714 | 0.678571 | 0.185185 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005155 | 0.195021 | 241 | 7 | 63 | 34.428571 | 0.829897 | 0 | 0 | 0 | 0 | 0 | 0.021368 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
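`outHtml` above joins the directory and file name with `+` and closes the handle manually; an equivalent sketch using `os.path.join`, `exist_ok`, and a context manager:

```python
import os
import tempfile


def out_html(file_dir: str, file_name: str, html: str) -> str:
    """Write html to file_dir/file_name, creating the directory if needed."""
    os.makedirs(file_dir, exist_ok=True)   # avoids the race-prone exists() check
    path = os.path.join(file_dir, file_name)
    with open(path, 'w', encoding='utf8') as fh:
        fh.write(html)
    return path


tmp = tempfile.mkdtemp()
written = out_html(tmp, 'index.html', '<p>hi</p>')
```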
d5cdb878b2be1a95d96fb6f869f02e34e35e2470 | 23,893 | py | Python | src-python/tests/test_trp2.py | kmascar/amazon-textract-response-parser | 14e8b8295f25f23df6c23a975584bab0fb17e6e7 | [
"Apache-2.0"
] | null | null | null | src-python/tests/test_trp2.py | kmascar/amazon-textract-response-parser | 14e8b8295f25f23df6c23a975584bab0fb17e6e7 | [
"Apache-2.0"
] | null | null | null | src-python/tests/test_trp2.py | kmascar/amazon-textract-response-parser | 14e8b8295f25f23df6c23a975584bab0fb17e6e7 | [
"Apache-2.0"
] | 1 | 2022-03-23T23:10:12.000Z | 2022-03-23T23:10:12.000Z | from trp.t_pipeline import add_page_orientation, order_blocks_by_geo
from typing import List
from trp.t_pipeline import add_page_orientation, order_blocks_by_geo, pipeline_merge_tables, add_kv_ocr_confidence
from trp.t_tables import MergeOptions, HeaderFooterType
import trp.trp2 as t2
import trp as t1
import json
import os
import pytest
from trp import Document
from uuid import uuid4
import logging
current_folder = os.path.dirname(os.path.realpath(__file__))
def return_json_for_file(filename):
with open(os.path.join(current_folder, filename)) as test_json:
return json.load(test_json)
@pytest.fixture
def json_response():
return return_json_for_file("test-response.json")
def test_serialization():
"""
testing that None values are removed when serializing
"""
bb_1 = t2.TBoundingBox(0.4, 0.3, 0.1, top=None) # type:ignore forcing some None/null values
bb_2 = t2.TBoundingBox(0.4, 0.3, 0.1, top=0.2)
p1 = t2.TPoint(x=0.1, y=0.1)
p2 = t2.TPoint(x=0.3, y=None) # type:ignore
geo = t2.TGeometry(bounding_box=bb_1, polygon=[p1, p2])
geo_s = t2.TGeometrySchema()
s: str = geo_s.dumps(geo)
assert not "null" in s
geo = t2.TGeometry(bounding_box=bb_2, polygon=[p1, p2])
s: str = geo_s.dumps(geo)
assert not "null" in s
def test_tblock_order_blocks_by_geo():
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
new_order = order_blocks_by_geo(t_document)
doc = t1.Document(t2.TDocumentSchema().dump(new_order))
assert "Value 1.1.1" == doc.pages[0].tables[0].rows[0].cells[0].text.strip()
assert "Value 2.1.1" == doc.pages[0].tables[1].rows[0].cells[0].text.strip()
assert "Value 3.1.1" == doc.pages[0].tables[2].rows[0].cells[0].text.strip()
def test_tblock_order_block_by_geo_multi_page():
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib_multi_page_tables.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = order_blocks_by_geo(t_document)
doc = t1.Document(t2.TDocumentSchema().dump(t_document))
assert "Page 1 - Value 1.1.1" == doc.pages[0].tables[0].rows[0].cells[0].text.strip()
assert "Page 1 - Value 2.1.1" == doc.pages[0].tables[1].rows[0].cells[0].text.strip()
def test_tblock():
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
new_order = order_blocks_by_geo(t_document)
doc = t1.Document(t2.TDocumentSchema().dump(new_order))
assert "Value 1.1.1" == doc.pages[0].tables[0].rows[0].cells[0].text.strip()
assert "Value 2.1.1" == doc.pages[0].tables[1].rows[0].cells[0].text.strip()
assert "Value 3.1.1" == doc.pages[0].tables[2].rows[0].cells[0].text.strip()
def test_custom_tblock():
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document.custom = {'testblock': {'here': 'is some fun stuff'}}
assert 'testblock' in t2.TDocumentSchema().dumps(t_document)
def test_custom_page_orientation(json_response):
doc = Document(json_response)
assert 1 == len(doc.pages)
lines = [line for line in doc.pages[0].lines]
assert 22 == len(lines)
words = [word for line in lines for word in line.words]
assert 53 == len(words)
t_document: t2.TDocument = t2.TDocumentSchema().load(json_response)
t_document.custom = {'orientation': 180}
new_t_doc_json = t2.TDocumentSchema().dump(t_document)
assert "Custom" in new_t_doc_json
assert "orientation" in new_t_doc_json["Custom"]
assert new_t_doc_json["Custom"]["orientation"] == 180
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert -1 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 2
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib_10_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert 5 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 15
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__15_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert 10 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 20
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__25_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert 17 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 30
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__180_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert 170 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 190
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__270_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert -100 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < -80
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__90_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert 80 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 100
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__minus_10_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert -10 < t_document.pages[0].custom['PageOrientationBasedOnWords'] < 5
doc = t1.Document(t2.TDocumentSchema().dump(t_document))
for page in doc.pages:
assert page.custom['PageOrientationBasedOnWords']
def test_filter_blocks_by_type():
block_list = [t2.TBlock(id="1", block_type=t2.TextractBlockTypes.WORD.name)]
assert t2.TDocument.filter_blocks_by_type(block_list=block_list,
textract_block_type=[t2.TextractBlockTypes.WORD]) == block_list
def test_next_token_response():
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib.json"))
j = json.load(f)
assert j['NextToken']
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
assert t_document.pages[0].custom
def test_rotate_point():
assert t2.TPoint(2, 2) == t2.TPoint(2, 2)
p = t2.TPoint(2, 2).rotate(degrees=180, origin_y=0, origin_x=0, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(-2, -2)
p = t2.TPoint(3, 4).rotate(degrees=-30, origin_y=0, origin_x=0, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(5, 2)
p = t2.TPoint(3, 4).rotate(degrees=-77, origin_y=0, origin_x=0, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(5, -2)
p = t2.TPoint(3, 4).rotate(degrees=-90, origin_y=0, origin_x=0, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(4, -3)
p = t2.TPoint(3, 4).rotate(degrees=-270, origin_y=0, origin_x=0, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(-4, 3)
p = t2.TPoint(2, 2).rotate(degrees=180, origin_x=1, origin_y=1)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(0, 0)
p = t2.TPoint(3, 4).rotate(degrees=-30, origin_y=0, origin_x=0, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(5, 2)
p = t2.TPoint(3, 4).rotate(degrees=-77, origin_x=4, origin_y=4, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(4, 5)
p = t2.TPoint(3, 4).rotate(degrees=-90, origin_x=4, origin_y=6, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(2, 7)
p = t2.TPoint(3, 4).rotate(degrees=-270, origin_x=4, origin_y=6, force_limits=False)
assert t2.TPoint(x=round(p.x), y=round(p.y)) == t2.TPoint(6, 5)
def test_rotate():
points = []
width = 0.05415758863091469
height = 0.011691284365952015
left = 0.13994796574115753
top = 0.8997916579246521
origin: t2.TPoint = t2.TPoint(x=0.5, y=0.5)
degrees: float = 180
points.append(t2.TPoint(x=left, y=top).rotate(origin_x=origin.x, origin_y=origin.y, degrees=degrees))
points.append(t2.TPoint(x=left + width, y=top).rotate(origin_x=origin.x, origin_y=origin.y, degrees=degrees))
points.append(t2.TPoint(x=left, y=top + height).rotate(origin_x=origin.x, origin_y=origin.y, degrees=degrees))
points.append(
t2.TPoint(x=left + width, y=top + height).rotate(origin_x=origin.x, origin_y=origin.y, degrees=degrees))
assert not None in points
def test_adjust_bounding_boxes_and_polygons_to_orientation():
# p = os.path.dirname(os.path.realpath(__file__))
# f = open(os.path.join(p, "data/gib.json"))
# j = json.load(f)
# t_document: t2.TDocument = t2.TDocumentSchema().load(j)
# t_document = add_page_orientation(t_document)
# doc = t1.Document(t2.TDocumentSchema().dump(t_document))
# key = "Date:"
# fields = doc.pages[0].form.searchFieldsByKey(key)
# for field in fields:
# print(f"Field: Key: {field.key}, Value: {field.value}, Geo: {field.geometry} ")
p = os.path.dirname(os.path.realpath(__file__))
f = open(os.path.join(p, "data/gib__180_degrees.json"))
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_page_orientation(t_document)
new_order = order_blocks_by_geo(t_document)
doc = t1.Document(t2.TDocumentSchema().dump(t_document))
# for line in doc.pages[0].lines:
# print("Line: {}".format(line.text))
# print("=========================== after rotation ========================")
# doc = t1.Document(t2.TDocumentSchema().dump(t_document))
# key = "Date:"
# fields = doc.pages[0].form.searchFieldsByKey(key)
# rotate_point = t2.TPoint(x=0.5, y=0.5)
# for field in fields:
# print(f"Field: Key: {field.key}, Value: {field.value}, Geo: {field.geometry} ")
# bbox = field.geometry.boundingBox
# new_point = t_pipeline.__rotate(origin=rotate_point,
# point=t2.TPoint(x=bbox.left, y=bbox.top),
# angle_degrees=180)
# print(f"new point: {new_point}")
# FIXME: remove duplicates in relationship_recursive!
# [b.rotate(origin=t2.TPoint(0.5, 0.5), degrees=180) for b in t_document.relationships_recursive(block=t_document.pages[0])]
# t_document.rotate(page=t_document.pages[0], degrees=180)
# new_order = order_blocks_by_geo(t_document)
# with open("/Users/schadem/temp/rotation/rotate_json2.jon", "w") as out_file:
# out_file.write(t2.TDocumentSchema().dumps(t_document))
# doc = t1.Document(t2.TDocumentSchema().dump(t_document))
# for line in doc.pages[0].lines:
# print("Line: {}".format(line.text))
# p = t2.TPoint(x=0.75, y=0.03)
# p.rotate(origin_x=0.5, origin_y=0.5, degrees=180)
# print(p)
# new_point = rotate(origin=t2.TPoint(x=0.5, y=0.5), point = )
# print(f"new_point: {new_point.x:.2f}, {new_point.y:.2f}")
# print(rotate(origin=t2.TPoint(x=0.5, y=0.5), point = t2.TPoint(x=.75, y=0.03)))
def test_scale(caplog):
p1: t2.TPoint = t2.TPoint(x=0.5, y=0.5)
p1.scale(doc_width=10, doc_height=10)
assert (p1 == t2.TPoint(x=5, y=5))
b1: t2.TBoundingBox = t2.TBoundingBox(width=0.1, height=0.1, left=0.5, top=0.5)
b1.scale(doc_width=10, doc_height=10)
assert (b1 == t2.TBoundingBox(width=1, height=1, left=5, top=5))
p1: t2.TPoint = t2.TPoint(x=0.5, y=0.5)
b1: t2.TBoundingBox = t2.TBoundingBox(width=0.1, height=0.1, left=0.5, top=0.5)
g1: t2.TGeometry = t2.TGeometry(bounding_box=b1, polygon=[p1])
g1.scale(doc_width=10, doc_height=10)
assert (g1 == t2.TGeometry(bounding_box=t2.TBoundingBox(width=1, height=1, left=5, top=5),
polygon=[t2.TPoint(x=5, y=5)]))
def test_ratio(caplog):
p1: t2.TPoint = t2.TPoint(x=0.5, y=0.5)
p2: t2.TPoint = t2.TPoint(x=5, y=5)
p2.ratio(doc_width=10, doc_height=10)
assert (p1 == p2)
b1: t2.TBoundingBox = t2.TBoundingBox(width=0.1, height=0.1, left=0.5, top=0.5)
b2: t2.TBoundingBox = t2.TBoundingBox(width=1, height=1, left=5, top=5)
b2.ratio(doc_width=10, doc_height=10)
assert (b1 == b2)
p1: t2.TPoint = t2.TPoint(x=0.5, y=0.5)
p2: t2.TPoint = t2.TPoint(x=5, y=5)
b1: t2.TBoundingBox = t2.TBoundingBox(width=0.1, height=0.1, left=0.5, top=0.5)
b2: t2.TBoundingBox = t2.TBoundingBox(width=1, height=1, left=5, top=5)
g1: t2.TGeometry = t2.TGeometry(bounding_box=b1, polygon=[p1])
g2: t2.TGeometry = t2.TGeometry(bounding_box=b2, polygon=[p2])
g2.ratio(doc_width=10, doc_height=10)
assert (g1 == g2)
def test_get_blocks_for_relationship(caplog):
caplog.set_level(logging.DEBUG)
# existing relationships
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
page = t_document.pages[0]
block = t_document.get_block_by_id("458a9301-8a9d-4eb2-9469-70302c62622e")
relationships = block.get_relationships_for_type()
relationships_value = block.get_relationships_for_type(relationship_type="VALUE")
if relationships and relationships_value:
rel = t_document.get_blocks_for_relationships(relationship=relationships)
assert len(rel) == 1
rel_value = t_document.get_blocks_for_relationships(relationship=relationships_value)
assert len(rel_value) == 1
child_rel: List[t2.TBlock] = list()
for value_block in rel_value:
child_rel.extend(t_document.get_blocks_for_relationships(value_block.get_relationships_for_type()))
assert len(child_rel) == 1
else:
assert False
def test_add_ids_to_relationships(caplog):
tdocument = t2.TDocument()
page_block = t2.TBlock(
id=str(uuid4()),
block_type="PAGE",
geometry=t2.TGeometry(bounding_box=t2.TBoundingBox(width=1, height=1, left=0, top=0),
polygon=[t2.TPoint(x=0, y=0), t2.TPoint(x=1, y=1)]),
)
tblock = t2.TBlock(id=str(uuid4()),
block_type="WORD",
text="sometest",
geometry=t2.TGeometry(bounding_box=t2.TBoundingBox(width=0, height=0, left=0, top=0),
polygon=[t2.TPoint(x=0, y=0), t2.TPoint(x=0, y=0)]),
confidence=99,
text_type="VIRTUAL")
tdocument.add_block(page_block)
tdocument.add_block(tblock)
page_block.add_ids_to_relationships([tblock.id])
tblock.add_ids_to_relationships(["1", "2"])
assert page_block.relationships and len(page_block.relationships) > 0
assert tblock.relationships and len(tblock.relationships) > 0
def test_key_value_set_key_name(caplog):
caplog.set_level(logging.DEBUG)
# existing relationships
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
page = t_document.pages[0]
keys = list(t_document.keys(page=page))
assert keys and len(keys) > 0
for key_value in keys:
child_relationship = key_value.get_relationships_for_type('CHILD')
if child_relationship:
for id in child_relationship.ids:
k_b = t_document.get_block_by_id(id=id)
print(k_b.text)
print(' '.join([x.text for x in t_document.value_for_key(key_value)]))
def test_get_relationships_for_type(caplog):
# existing relationships
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
page = t_document.pages[0]
new_block = t2.TBlock(id=str(uuid4()))
t_document.add_block(new_block)
page.add_ids_to_relationships([new_block.id])
assert t_document.get_block_by_id(new_block.id) == new_block
#empty relationships
t_document: t2.TDocument = t2.TDocument()
t_document.add_block(t2.TBlock(id=str(uuid4()), block_type="PAGE"))
page = t_document.pages[0]
new_block = t2.TBlock(id=str(uuid4()))
t_document.add_block(new_block)
page.add_ids_to_relationships([new_block.id])
assert t_document.get_block_by_id(new_block.id) == new_block
def test_merge_tables():
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib_multi_page_tables.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
tbl_id1 = 'fed02fb4-1996-4a15-98dc-29da193cc476'
tbl_id2 = '47c6097f-02d5-4432-8423-13c05fbfacbd'
pre_merge_tbl1_cells_no = len(t_document.get_block_by_id(tbl_id1).relationships[0].ids) # type: ignore
pre_merge_tbl2_cells_no = len(t_document.get_block_by_id(tbl_id2).relationships[0].ids) # type: ignore
pre_merge_tbl1_lastcell = t_document.get_block_by_id(tbl_id1).relationships[0].ids[-1] # type: ignore
pre_merge_tbl2_lastcell = t_document.get_block_by_id(tbl_id2).relationships[0].ids[-1] # type: ignore
pre_merge_tbl1_last_row = t_document.get_block_by_id(pre_merge_tbl1_lastcell).row_index # type: ignore
pre_merge_tbl2_last_row = t_document.get_block_by_id(pre_merge_tbl2_lastcell).row_index # type: ignore
t_document.merge_tables([[tbl_id1, tbl_id2]])
post_merge_tbl1_cells_no = len(t_document.get_block_by_id(tbl_id1).relationships[0].ids) # type: ignore
post_merge_tbl1_lastcell = t_document.get_block_by_id(tbl_id1).relationships[0].ids[-1] # type: ignore
post_merge_tbl1_last_row = t_document.get_block_by_id(post_merge_tbl1_lastcell).row_index # type: ignore
assert post_merge_tbl1_cells_no == pre_merge_tbl1_cells_no + pre_merge_tbl2_cells_no
assert pre_merge_tbl2_last_row
assert post_merge_tbl1_last_row == pre_merge_tbl1_last_row + pre_merge_tbl2_last_row # type: ignore
def test_delete_blocks():
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib_multi_page_tables.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
tbl_id1 = 'fed02fb4-1996-4a15-98dc-29da193cc476'
tbl_id2 = '47c6097f-02d5-4432-8423-13c05fbfacbd'
pre_delete_block_no = len(t_document.blocks)
t_document.delete_blocks([tbl_id1, tbl_id2])
post_delete_block_no = len(t_document.blocks)
assert post_delete_block_no == pre_delete_block_no - 2
def test_link_tables():
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib_multi_page_tables.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
tbl_id1 = 'fed02fb4-1996-4a15-98dc-29da193cc476'
tbl_id2 = '47c6097f-02d5-4432-8423-13c05fbfacbd'
t_document.link_tables([[tbl_id1, tbl_id2]])
assert t_document.get_block_by_id(tbl_id1).custom['next_table'] == tbl_id2 # type: ignore
assert t_document.get_block_by_id(tbl_id2).custom['previous_table'] == tbl_id1 # type: ignore
def test_pipeline_merge_tables():
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib_multi_page_table_merge.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
tbl_id1 = '5685498d-d196-42a7-8b40-594d6d886ca9'
tbl_id2 = 'a9191a66-0d32-4d36-8fd6-58e6917f4ea6'
tbl_id3 = 'e0368543-c9c3-4616-bd6c-f25e66c859b2'
pre_merge_tbl1_cells_no = len(t_document.get_block_by_id(tbl_id1).relationships[0].ids) # type: ignore
pre_merge_tbl2_cells_no = len(t_document.get_block_by_id(tbl_id2).relationships[0].ids) # type: ignore
pre_merge_tbl3_cells_no = len(t_document.get_block_by_id(tbl_id3).relationships[0].ids) # type: ignore
t_document = pipeline_merge_tables(t_document, MergeOptions.MERGE, None, HeaderFooterType.NONE)
post_merge_tbl1_cells_no = len(t_document.get_block_by_id(tbl_id1).relationships[0].ids) # type: ignore
assert post_merge_tbl1_cells_no == pre_merge_tbl1_cells_no + pre_merge_tbl2_cells_no + pre_merge_tbl3_cells_no
def test_pipeline_merge_multiple_tables():
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/gib_multi_tables_multi_page_sample.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
tbl_id1 = '4894d2ba-0479-4196-9cbd-c0fea4d28762'
tbl_id2 = 'b5e061ec-05be-48d5-83fc-6719fdd4397a'
tbl_id3 = '8bbc3f4f-0354-4999-a001-4585631bb7fe'
tbl_id4 = 'cf8e09a1-c317-40c1-9c45-e830e14167d5'
pre_merge_tbl1_cells_no = len(t_document.get_block_by_id(tbl_id1).relationships[0].ids) # type: ignore
pre_merge_tbl2_cells_no = len(t_document.get_block_by_id(tbl_id2).relationships[0].ids) # type: ignore
pre_merge_tbl3_cells_no = len(t_document.get_block_by_id(tbl_id3).relationships[0].ids) # type: ignore
pre_merge_tbl4_cells_no = len(t_document.get_block_by_id(tbl_id4).relationships[0].ids) # type: ignore
t_document = pipeline_merge_tables(t_document, MergeOptions.MERGE, None, HeaderFooterType.NONE)
post_merge_tbl1_cells_no = len(t_document.get_block_by_id(tbl_id1).relationships[0].ids) # type: ignore
post_merge_tbl3_cells_no = len(t_document.get_block_by_id(tbl_id3).relationships[0].ids) # type: ignore
assert post_merge_tbl1_cells_no == pre_merge_tbl1_cells_no + pre_merge_tbl2_cells_no
assert post_merge_tbl3_cells_no == pre_merge_tbl3_cells_no + pre_merge_tbl4_cells_no
def test_kv_ocr_confidence(caplog):
caplog.set_level(logging.DEBUG)
p = os.path.dirname(os.path.realpath(__file__))
with open(os.path.join(p, "data/employment-application.json")) as f:
j = json.load(f)
t_document: t2.TDocument = t2.TDocumentSchema().load(j)
t_document = add_kv_ocr_confidence(t_document)
doc = t1.Document(t2.TDocumentSchema().dump(t_document))
for page in doc.pages:
k1 = page.form.getFieldByKey("Home Address:")
assert k1.key.custom['OCRConfidence'] == {'mean': 99.60698318481445}
assert k1.value.custom['OCRConfidence'] == {'mean': 99.8596928914388}
k1 = page.form.getFieldByKey("Phone Number:")
assert k1.key.custom['OCRConfidence'] == {'mean': 99.55334854125977}
assert k1.value.custom['OCRConfidence'] == {'mean': 99.23233032226562}
# for field in page.form.fields:
# print(
# f"{field.key.text} - {field.key.custom['OCRConfidence']}, {field.value.text} - {field.value.custom['OCRConfidence']}"
# )
# File: gui/serializers.py (repo: cryptosharks131/lndg, license: MIT)
from rest_framework import serializers
from rest_framework.relations import PrimaryKeyRelatedField
from .models import LocalSettings, Payments, PaymentHops, Invoices, Forwards, Channels, Rebalancer, Peers, Onchain, PendingHTLCs, FailedHTLCs
##FUTURE UPDATE 'exclude' TO 'fields'
class PaymentSerializer(serializers.HyperlinkedModelSerializer):
payment_hash = serializers.ReadOnlyField()
class Meta:
model = Payments
exclude = []
class InvoiceSerializer(serializers.HyperlinkedModelSerializer):
r_hash = serializers.ReadOnlyField()
class Meta:
model = Invoices
exclude = []
class ForwardSerializer(serializers.HyperlinkedModelSerializer):
id = serializers.ReadOnlyField()
class Meta:
model = Forwards
exclude = []
class ChannelSerializer(serializers.HyperlinkedModelSerializer):
chan_id = serializers.ReadOnlyField()
remote_pubkey = serializers.ReadOnlyField()
funding_txid = serializers.ReadOnlyField()
output_index = serializers.ReadOnlyField()
capacity = serializers.ReadOnlyField()
local_balance = serializers.ReadOnlyField()
remote_balance = serializers.ReadOnlyField()
unsettled_balance = serializers.ReadOnlyField()
local_commit = serializers.ReadOnlyField()
local_chan_reserve = serializers.ReadOnlyField()
initiator = serializers.ReadOnlyField()
local_base_fee = serializers.ReadOnlyField()
local_fee_rate = serializers.ReadOnlyField()
remote_base_fee = serializers.ReadOnlyField()
remote_fee_rate = serializers.ReadOnlyField()
is_active = serializers.ReadOnlyField()
is_open = serializers.ReadOnlyField()
num_updates = serializers.ReadOnlyField()
class Meta:
model = Channels
exclude = []
class RebalancerSerializer(serializers.HyperlinkedModelSerializer):
id = serializers.ReadOnlyField()
requested = serializers.ReadOnlyField()
start = serializers.ReadOnlyField()
stop = serializers.ReadOnlyField()
status = serializers.ReadOnlyField()
class Meta:
model = Rebalancer
exclude = []
class ConnectPeerSerializer(serializers.Serializer):
peer_id = serializers.CharField(label='peer_pubkey', max_length=200)
class OpenChannelSerializer(serializers.Serializer):
peer_pubkey = serializers.CharField(label='peer_pubkey', max_length=66)
local_amt = serializers.IntegerField(label='local_amt')
sat_per_byte = serializers.IntegerField(label='sat_per_byte')
class CloseChannelSerializer(serializers.Serializer):
chan_id = serializers.IntegerField(label='chan_id')
target_fee = serializers.IntegerField(label='target_fee')
force = serializers.BooleanField(default=False)
class AddInvoiceSerializer(serializers.Serializer):
value = serializers.IntegerField(label='value')
class UpdateAliasSerializer(serializers.Serializer):
peer_pubkey = serializers.CharField(label='peer_pubkey', max_length=66)
class PeerSerializer(serializers.HyperlinkedModelSerializer):
pubkey = serializers.ReadOnlyField()
class Meta:
model = Peers
exclude = []
class OnchainSerializer(serializers.HyperlinkedModelSerializer):
tx_hash = serializers.ReadOnlyField()
class Meta:
model = Onchain
exclude = []
class PaymentHopsSerializer(serializers.HyperlinkedModelSerializer):
payment_hash = PrimaryKeyRelatedField(read_only=True)
class Meta:
model = PaymentHops
exclude = []
class LocalSettingsSerializer(serializers.HyperlinkedModelSerializer):
key = serializers.ReadOnlyField()
class Meta:
model = LocalSettings
exclude = []
class PendingHTLCSerializer(serializers.HyperlinkedModelSerializer):
id = serializers.ReadOnlyField()
class Meta:
model = PendingHTLCs
exclude = []
class FailedHTLCSerializer(serializers.HyperlinkedModelSerializer):
id = serializers.ReadOnlyField()
class Meta:
model = FailedHTLCs
exclude = []
# File: mesonbuild/interpreter/kwargs.py (repo: ManuelAtWork/meson, license: Apache-2.0)
# SPDX-License-Identifier: Apache-2.0
# Copyright © 2021 The Meson Developers
# Copyright © 2021 Intel Corporation
"""Keyword Argument type annotations."""
import typing as T
from typing_extensions import TypedDict
from ..mesonlib import MachineChoice
class FuncAddProjectArgs(TypedDict):
"""Keyword Arguments for the add_*_arguments family of arguments,
including `add_global_arguments`, `add_project_arguments`, and their
link variants.
Because of the use of a convertor function, we get the native keyword as
a MachineChoice instance already.
"""
native: MachineChoice
language: T.List[str]
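# Illustrative sketch (not part of the original module): a typical payload
# once the keyword convertor has run, e.g.
#   {'native': MachineChoice.HOST, 'language': ['c', 'cpp']}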
# File: csst/core/processor.py (repo: csster/csst, license: MIT)
from abc import ABC, ABCMeta, abstractmethod
from enum import Enum
class CsstProcStatus(Enum):
empty = -1
normal = 0
ioerror = 1
runtimeerror = 2
# self['empty'].info = 'Not run yet.'
# self['normal'].info = 'This is a normal run.'
# self['ioerror'].info = 'This run is exceptionally stopped due to IO error.'
# self['runtimeerror'].info = 'This run is exceptionally stopped due to runtime error.'
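# Illustrative example: Enum members can be looked up by value or by name,
# e.g. CsstProcStatus(0) is CsstProcStatus.normal, and
# CsstProcStatus['ioerror'].value == 1.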
class CsstProcessor(ABC):
def __init__(self, **kwargs):
# self._status = CsstProcStatus()
pass
@abstractmethod
def prepare(self, **kwargs):
# do your preparation here
raise NotImplementedError
@abstractmethod
def run(self, **kwargs):
# run your pipeline
raise NotImplementedError
@abstractmethod
def cleanup(self):
# clean up environment
raise NotImplementedError
class CsstDemoProcessor(CsstProcessor):
def __init__(self, **kwargs):
super().__init__()
def some_function(self, **kwargs):
print("some function")
def prepare(self):
print("prepare")
def run(self):
print("run")
def cleanup(self):
print("clear up")
if __name__ == "__main__":
cp = CsstDemoProcessor()
| 21.15 | 91 | 0.62569 | 138 | 1,269 | 5.594203 | 0.42029 | 0.064767 | 0.028497 | 0.033679 | 0.098446 | 0.098446 | 0.098446 | 0.098446 | 0 | 0 | 0 | 0.004292 | 0.265563 | 1,269 | 59 | 92 | 21.508475 | 0.824034 | 0.276596 | 0 | 0.3125 | 0 | 0 | 0.043142 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.28125 | false | 0.03125 | 0.0625 | 0 | 0.5625 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
d5e2fa45d415f896d64b02fec047cb5c2a098c92 | 300 | py | Python | tests/recipes/test_libcurl.py | ht-thomas/python-for-android | 75342099c3e0db8f570351e94c59736d59179bfe | [
"MIT"
] | 6,278 | 2015-01-02T16:34:05.000Z | 2022-03-31T10:24:45.000Z | tests/recipes/test_libcurl.py | ht-thomas/python-for-android | 75342099c3e0db8f570351e94c59736d59179bfe | [
"MIT"
] | 1,877 | 2015-01-01T16:16:10.000Z | 2022-03-27T17:34:34.000Z | tests/recipes/test_libcurl.py | ht-thomas/python-for-android | 75342099c3e0db8f570351e94c59736d59179bfe | [
"MIT"
] | 1,565 | 2015-01-02T19:35:37.000Z | 2022-03-31T15:37:06.000Z | import unittest
from tests.recipes.recipe_lib_test import BaseTestForMakeRecipe
class TestLibcurlRecipe(BaseTestForMakeRecipe, unittest.TestCase):
"""
A unittest for recipe :mod:`~pythonforandroid.recipes.libcurl`
"""
recipe_name = "libcurl"
sh_command_calls = ["./configure"]
| 27.272727 | 67 | 0.753333 | 30 | 300 | 7.366667 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146667 | 300 | 10 | 68 | 30 | 0.863281 | 0.21 | 0 | 0 | 0 | 0 | 0.081448 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
d5e486c352fedd659c08b5d7bfc1aa33ebcb5528 | 268 | py | Python | fastreid/data/transforms/__init__.py | YanzuoLu/fast-reid | dbf6fc8a61b1a60a03691b9bbda29956bd65d88d | [
"Apache-2.0"
] | null | null | null | fastreid/data/transforms/__init__.py | YanzuoLu/fast-reid | dbf6fc8a61b1a60a03691b9bbda29956bd65d88d | [
"Apache-2.0"
] | null | null | null | fastreid/data/transforms/__init__.py | YanzuoLu/fast-reid | dbf6fc8a61b1a60a03691b9bbda29956bd65d88d | [
"Apache-2.0"
] | null | null | null | # encoding: utf-8
"""
@author: sherlock
@contact: sherlockliao01@gmail.com
"""
from .autoaugment import AutoAugment
from .build import build_transforms
from .transforms import *
from .mosaic import *
__all__ = [k for k in globals().keys() if not k.startswith("_")]
# File: mss/factory.py (repo: RedFantom/python-mss, license: MIT)
# coding: utf-8
"""
This is part of the MSS Python's module.
Source: https://github.com/BoboTiG/python-mss
"""
import platform
from .exception import ScreenShotError
def mss(**kwargs):
# type: (**str) -> MSS
""" Factory returning a proper MSS class instance.
It detects the platform we are running on
and chooses the most adapted mss_class to take
screenshots.
It then proxies its arguments to the class for
instantiation.
"""
operating_system = platform.system().lower()
if operating_system == 'darwin':
from .darwin import MSS
elif operating_system == 'linux':
from .linux import MSS
elif operating_system == 'windows':
from .windows import MSS
else:
raise ScreenShotError('System not (yet?) implemented.', locals())
return MSS(**kwargs)
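# Example usage (illustrative sketch; requires a supported OS and, on Linux,
# a running X server):
#   sct = mss()  # returns the MSS implementation matching platform.system()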
# File: 00-example/solution.py (repo: alvarogzp/badoo-challenge-2015, license: MIT)
#!/usr/bin/env python3
def get_case_data():
return [int(i) for i in input().split()]
# Using recursive implementation
def get_gcd(a, b):
return get_gcd(b, a % b) if b != 0 else a
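# Illustrative trace: get_gcd(12, 18) recurses as
#   get_gcd(18, 12) -> get_gcd(12, 6) -> get_gcd(6, 0) -> 6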
def print_number_or_ok_if_equals(number, guess):
print("OK" if number == guess else number)
number_of_cases = int(input())
for case in range(number_of_cases):
first_integer, second_integer, proposed_gcd = get_case_data()
real_gcd = get_gcd(first_integer, second_integer)
print_number_or_ok_if_equals(real_gcd, proposed_gcd)
| 28.5 | 62 | 0.758285 | 90 | 513 | 4 | 0.422222 | 0.05 | 0.061111 | 0.083333 | 0.127778 | 0.127778 | 0 | 0 | 0 | 0 | 0 | 0.004474 | 0.128655 | 513 | 17 | 63 | 30.176471 | 0.800895 | 0.101365 | 0 | 0 | 0 | 0 | 0.004357 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0 | 0.181818 | 0.454545 | 0.272727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
d5eaf6f929c8bf6f13362276a9f77bb820ab074c | 1,413 | py | Python | services/service-pick&drop/src/api/v1/models.py | Beracah-Group/docker-microservices | 2876b05ba585772e97746a11845b64bd4ede61cb | [
"MIT"
] | 1 | 2020-02-18T08:52:02.000Z | 2020-02-18T08:52:02.000Z | services/service-pick&drop/src/api/v1/models.py | Beracah-Group/docker-microservices | 2876b05ba585772e97746a11845b64bd4ede61cb | [
"MIT"
] | null | null | null | services/service-pick&drop/src/api/v1/models.py | Beracah-Group/docker-microservices | 2876b05ba585772e97746a11845b64bd4ede61cb | [
"MIT"
] | null | null | null | # import modules
from datetime import datetime
from src.api.__init__ import databases
# washing class model with methods
class Pickanddrop(databases.Model):
__tablename__ = 'Pickanddrop'
id = databases.Column(databases.Integer, primary_key=True, autoincrement=True)
name = databases.Column(databases.String(20))
price = databases.Column(databases.Integer)
description = databases.Column(databases.String(300))
date_created = databases.Column(databases.DateTime, default=datetime.utcnow())
date_modified = databases.Column(databases.DateTime, default=datetime.utcnow(), onupdate=datetime.utcnow())
type = databases.Column(databases.String(50))
__mapper_args__ = {
'polymorphic_on': type,
'polymorphic_identity': 'Pickanddrop'
}
def save(self):
databases.session.add(self)
databases.session.commit()
def to_json(self):
return {
'id': self.id,
'name': self.name,
'price': self.price,
'description': self.description
}
class Pickdrop(Pickanddrop):
__mapper_args__ = {
'polymorphic_identity': 'pickanddrop'
}
class Selfdrop(Pickanddrop):
__mapper_args__ = {
'polymorphic_identity': 'selfdrop'
}
class Homeservice(Pickanddrop):
__mapper_args__ = {
'polymorphic_identity': 'homeservice'
}
| 27.705882 | 111 | 0.661005 | 136 | 1,413 | 6.595588 | 0.389706 | 0.117057 | 0.187291 | 0.100334 | 0.251951 | 0.118172 | 0.118172 | 0 | 0 | 0 | 0 | 0.006452 | 0.23213 | 1,413 | 51 | 112 | 27.705882 | 0.820277 | 0.033263 | 0 | 0.108108 | 0 | 0 | 0.123167 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054054 | false | 0 | 0.054054 | 0.027027 | 0.567568 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9101d5ab87a82438c9d34c389eef218848ed6051 | 5,737 | py | Python | test/test_point.py | Lofgren/svgelements | e94aa514035b88502612ff699bb8b9f50bdc8a56 | [
"MIT"
] | null | null | null | test/test_point.py | Lofgren/svgelements | e94aa514035b88502612ff699bb8b9f50bdc8a56 | [
"MIT"
] | null | null | null | test/test_point.py | Lofgren/svgelements | e94aa514035b88502612ff699bb8b9f50bdc8a56 | [
"MIT"
] | null | null | null | from __future__ import print_function
import unittest
from random import random
from svgelements import *
class TestElementPoint(unittest.TestCase):
def test_point_init_string(self):
p = Point("(0,24)")
self.assertEqual(p, (0, 24))
self.assertEqual(p, 0 + 24j)
self.assertEqual(p, [0, 24])
self.assertEqual(p, "(0,24)")
def test_polar_angle(self):
for i in range(1000):
p = Point(random() * 50, random() * 50)
a = random() * tau - tau / 2
r = random() * 50
m = Point.polar(p, a, r)
self.assertAlmostEqual(Point.angle(p, m), a)
def test_not_equal_unparsed(self):
self.assertNotEqual(Point(0, 0), "string that doesn't parse to point")
def test_dunder_iadd(self):
p = Point(0)
p += (1, 0)
self.assertEqual(p, (1, 0))
p += Point(1, 1)
self.assertEqual(p, (2, 1))
p += 1 + 2j
self.assertEqual(p, (3, 3))
class c:
def __init__(self):
self.x = 1
self.y = 1
p += c()
self.assertEqual(p, (4, 4))
p += Point("-4,-4")
self.assertEqual(p, (0, 0))
p += 1
self.assertEqual(p, (1, 0))
self.assertRaises(TypeError, 'p += "hello"')
def test_dunder_isub(self):
p = Point(0)
p -= (1, 0)
self.assertEqual(p, (-1, 0))
p -= Point(1, 1)
self.assertEqual(p, (-2, -1))
p -= 1 + 2j
self.assertEqual(p, (-3, -3))
class c:
def __init__(self):
self.x = 1
self.y = 1
p -= c()
self.assertEqual(p, (-4, -4))
p -= Point("-4,-4")
self.assertEqual(p, (0, 0))
p -= 1
self.assertEqual(p, (-1, 0))
r = p - 1
self.assertEqual(r, (-2, 0))
self.assertRaises(TypeError, 'p -= "hello"')
def test_dunder_add(self):
p = Point(0)
p = p + (1, 0)
self.assertEqual(p, (1, 0))
p = p + Point(1, 1)
self.assertEqual(p, (2, 1))
p = p + 1 + 2j
self.assertEqual(p, (3, 3))
class c:
def __init__(self):
self.x = 1
self.y = 1
p = p + c()
self.assertEqual(p, (4, 4))
p = p + Point("-4,-4")
self.assertEqual(p, (0, 0))
p = p + 1
self.assertEqual(p, (1, 0))
self.assertRaises(TypeError, 'p = p + "hello"')
def test_dunder_sub(self):
p = Point(0)
p = p - (1, 0)
self.assertEqual(p, (-1, 0))
p = p - Point(1, 1)
self.assertEqual(p, (-2, -1))
p = p - (1 + 2j)
self.assertEqual(p, (-3, -3))
class c:
def __init__(self):
self.x = 1
self.y = 1
p = p - c()
self.assertEqual(p, (-4, -4))
p = p - Point("-4,-4")
self.assertEqual(p, (0, 0))
p = p - 1
self.assertEqual(p, (-1, 0))
self.assertRaises(TypeError, 'p = p - "hello"')
def test_dunder_rsub(self):
p = Point(0)
p = (1, 0) - p
self.assertEqual(p, (1, 0))
p = Point(1, 1) - p
self.assertEqual(p, (0, 1))
p = (1 + 2j) - p
self.assertEqual(p, (1, 1))
class c:
def __init__(self):
self.x = 1
self.y = 1
p = c() - p
self.assertEqual(p, (0, 0))
p = Point("-4,-4") - p
self.assertEqual(p, (-4, -4))
p = 1 - p
self.assertEqual(p, (5, 4))
self.assertRaises(TypeError, 'p = "hello" - p')
def test_dunder_mult(self):
"""
For backwards compatibility, multiplication of points works like multiplication of complex variables.
:return:
"""
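# e.g. (2+2j) * (1+1j) == 4j, mirrored below by Point(2, 2) * Point(1, 1) == (0, 4)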
p = Point(2, 2)
p *= (1, 0)
self.assertEqual(p, (2, 2))
p *= Point(1, 1)
self.assertEqual(p, (0, 4))
p *= 1 + 2j
self.assertEqual(p, (-8, 4))
class c:
def __init__(self):
self.x = 1
self.y = 1
p *= c()
self.assertEqual(p, (-12, -4))
p *= Point("-4,-4")
self.assertEqual(p, (32, 64))
p *= 1
self.assertEqual(p, (32, 64))
r = p * 1
self.assertEqual(r, (32, 64))
r *= "scale(0.1)"
self.assertEqual(r, (3.2, 6.4))
def test_dunder_transform(self):
p = Point(4, 4)
m = Matrix("scale(4)")
p.matrix_transform(m)
self.assertEqual(p, (16, 16))
def test_move_towards(self):
p = Point(4, 4)
p.move_towards((6, 6), 0.5)
self.assertEqual(p, (5, 5))
def test_distance_to(self):
p = Point(4, 4)
m = p.distance_to((6, 6))
self.assertEqual(m, 2 * sqrt(2))
m = p.distance_to(4)
self.assertEqual(m, 4)
def test_angle_to(self):
p = Point(0)
a = p.angle_to((3, 3))
self.assertEqual(a, Angle.parse("45deg"))
a = p.angle_to((0, 3))
self.assertEqual(a, Angle.parse("0.25turn"))
a = p.angle_to((-3, 0))
self.assertEqual(a, Angle.parse("200grad"))
def test_polar(self):
p = Point(0)
q = p.polar_to(Angle.parse("45deg"), 10)
self.assertEqual(q, (sqrt(2)/2 * 10, sqrt(2)/2 * 10))
def test_reflected_across(self):
p = Point(0)
r = p.reflected_across((10,10))
self.assertEqual(r, (20,20))
# File: core/noise.py (repo: BenSmithers/MultiHex2, license: MIT)
import numpy as np
def perlin(self):
# Gradient setup for Perlin-style noise: random x components in [0, 1)
# and matching y components so each (x, y) pair lies on the unit circle.
nodes = 100
xs = np.random.rand(2*nodes,nodes)
| 14.111111 | 38 | 0.582677 | 23 | 127 | 3.217391 | 0.73913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075269 | 0.267717 | 127 | 9 | 39 | 14.111111 | 0.72043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
91075b55116d940f3d31d51ea539bab16d4fbe2c | 317 | py | Python | nec_calendar/views.py | lindychi/mnec | 3d18f257ed3b0bc327340de988e7552e035dbec1 | [
"MIT"
] | 1 | 2018-02-20T13:46:41.000Z | 2018-02-20T13:46:41.000Z | nec_calendar/views.py | lindychi/mnec | 3d18f257ed3b0bc327340de988e7552e035dbec1 | [
"MIT"
] | 53 | 2017-10-10T02:43:22.000Z | 2022-03-11T23:15:05.000Z | nec_calendar/views.py | lindychi/mnec | 3d18f257ed3b0bc327340de988e7552e035dbec1 | [
"MIT"
] | null | null | null | from django.shortcuts import render
from nec_calendar.classes.calendar import Calendar
from django.utils import timezone
# Create your views here.
def index(request):
now = timezone.now()
calendar = Calendar(now.year, now.month)
return render(request, 'nec_calendar/index.html', {'calendar': calendar})
| 28.818182 | 77 | 0.757098 | 42 | 317 | 5.666667 | 0.52381 | 0.084034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141956 | 317 | 10 | 78 | 31.7 | 0.875 | 0.072555 | 0 | 0 | 0 | 0 | 0.106164 | 0.078767 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
91303d2fbc9b37495ba6d567ea44afda53555c4e | 4,281 | py | Python | interview/models.py | OnGridSystems/RobotVeraWebApp | 01cee658a3983fcaf128b40bb99c1a4272e90c07 | [
"MIT"
] | 11 | 2018-06-13T10:10:11.000Z | 2021-06-05T08:23:43.000Z | interview/models.py | OnGridSystems/RobotVeraWebApp | 01cee658a3983fcaf128b40bb99c1a4272e90c07 | [
"MIT"
] | 5 | 2020-06-05T18:24:25.000Z | 2022-03-11T23:21:01.000Z | interview/models.py | OnGridSystems/RobotVeraWebApp | 01cee658a3983fcaf128b40bb99c1a4272e90c07 | [
"MIT"
] | 5 | 2018-08-17T16:09:33.000Z | 2021-06-06T05:32:10.000Z | import time
from django.core.validators import MinValueValidator
from django.db import models
from django.urls import reverse
from django.utils.timezone import now
from django.utils.translation import ugettext_lazy as _
from jsonfield import JSONField
from model_utils.models import SoftDeletableModel
from jobboard.helpers import BaseAction
from users.models import Member
class ActionInterview(BaseAction, models.Model):
action = models.OneToOneField('pipeline.Action',
on_delete=models.CASCADE,
null=False,
related_name='interview')
start_date = models.DateField(default=now,
help_text=_('Date from which you want to interview candidates'))
end_date = models.DateField(blank=True,
null=True,
help_text=_('Date until which you want to interview candidates'))
start_time = models.TimeField(default='08:00',
help_text=_('Time from which you want to interview'))
end_time = models.TimeField(blank=True,
null=True,
default='18:00',
help_text=_('Time until which you want to interview'))
duration = models.IntegerField(help_text=_('Interview duration'),
default=10,
validators=[
MinValueValidator(10, 'Interview duration cannot be less than 10 minutes'), ])
recruiters = models.ManyToManyField('users.Member')
def get_result_url(self, **kwargs):
pass
def get_candidate_url(self):
return reverse('candidate_interviewing', kwargs={'pk': self.id})
@property
def vacancy(self):
return self.action.pipeline.vacancy
class Meta:
abstract = False
class ScheduledMeeting(SoftDeletableModel):
# all_objects = models.Manager()
action_interview = models.ForeignKey(ActionInterview,
on_delete=models.CASCADE,
related_name='scheduled_meetings')
recruiter = models.ForeignKey(Member,
on_delete=models.CASCADE,
related_name='recruiter_scheduled_meetings')
candidate = models.ForeignKey(Member,
on_delete=models.CASCADE,
related_name='candidate_scheduled_meetings')
uuid = models.CharField(max_length=32,
blank=False,
null=False)
conf_id = models.CharField(max_length=32,
blank=False,
null=False)
link_start = models.URLField(max_length=768,
blank=False,
null=False)
link_join = models.URLField(blank=False,
null=False)
date = models.DateField(blank=False,
null=False)
time = models.TimeField(blank=False,
null=False)
@property
def vacancy(self):
return self.action_interview.action.pipeline.vacancy
def __str__(self):
return '{} {}'.format(self.date, self.time)
class Meta:
unique_together = (('action_interview', 'candidate', 'is_removed'),)
class InterviewPassed(models.Model):
interview = models.ForeignKey(ActionInterview,
on_delete=models.CASCADE,
related_name='passes')
recruiter = models.ForeignKey(Member,
on_delete=models.CASCADE,
related_name='recruiter_passed_interviews')
candidate = models.ForeignKey(Member,
on_delete=models.CASCADE,
related_name='candidate_passed_interviews')
data = JSONField(blank=True,
null=True)
date_created = models.DateTimeField(auto_now_add=True)
duration = models.DurationField(blank=True,
null=True)
| 41.563107 | 117 | 0.547068 | 386 | 4,281 | 5.901554 | 0.300518 | 0.024583 | 0.04302 | 0.06453 | 0.320018 | 0.262511 | 0.262511 | 0.229148 | 0.229148 | 0.18964 | 0 | 0.007865 | 0.376314 | 4,281 | 102 | 118 | 41.970588 | 0.845318 | 0.007008 | 0 | 0.344828 | 0 | 0 | 0.106613 | 0.031066 | 0 | 0 | 0 | 0 | 0 | 1 | 0.057471 | false | 0.057471 | 0.114943 | 0.045977 | 0.528736 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
9138ce8d0c4e2cb41fb8a1573bfe22374c35dceb | 4,411 | py | Python | kitsune/products/migrations/0009_auto__del_field_product_questions_enabled.py | safwanrahman/Ford | 87e91dea1cc22b1759eea81cef069359ccb5cd0b | [
"BSD-3-Clause"
] | 1 | 2017-07-03T12:11:03.000Z | 2017-07-03T12:11:03.000Z | kitsune/products/migrations/0009_auto__del_field_product_questions_enabled.py | maiakangalova/kitsune | b03b099e57a43717796fe0890af44ba96a7b51c8 | [
"BSD-3-Clause"
] | 8 | 2020-06-05T18:42:14.000Z | 2022-03-11T23:26:51.000Z | kitsune/products/migrations/0009_auto__del_field_product_questions_enabled.py | safwanrahman/Ford | 87e91dea1cc22b1759eea81cef069359ccb5cd0b | [
"BSD-3-Clause"
] | 1 | 2020-11-03T23:47:55.000Z | 2020-11-03T23:47:55.000Z | # -*- coding: utf-8 -*-
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
# Deleting field 'Product.questions_enabled'
db.delete_column(u'products_product', 'questions_enabled')
def backwards(self, orm):
# Adding field 'Product.questions_enabled'
db.add_column(u'products_product', 'questions_enabled',
self.gf('django.db.models.fields.BooleanField')(default=False),
keep_default=False)
models = {
u'products.platform': {
'Meta': {'object_name': 'Platform'},
'display_order': ('django.db.models.fields.IntegerField', [], {}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'slug': ('django.db.models.fields.SlugField', [], {'max_length': '50'}),
'visible': ('django.db.models.fields.BooleanField', [], {})
},
u'products.product': {
'Meta': {'ordering': "['display_order']", 'object_name': 'Product'},
'description': ('django.db.models.fields.TextField', [], {}),
'display_order': ('django.db.models.fields.IntegerField', [], {}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'image': ('django.db.models.fields.files.ImageField', [], {'max_length': '250', 'null': 'True', 'blank': 'True'}),
'image_cachebuster': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '32', 'null': 'True'}),
'image_offset': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
'platforms': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['products.Platform']", 'symmetrical': 'False'}),
'slug': ('django.db.models.fields.SlugField', [], {'max_length': '50'}),
'sprite_height': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
'title': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'visible': ('django.db.models.fields.BooleanField', [], {'default': 'False'})
},
u'products.topic': {
'Meta': {'ordering': "['product', 'display_order']", 'unique_together': "(('slug', 'product'),)", 'object_name': 'Topic'},
'description': ('django.db.models.fields.TextField', [], {}),
'display_order': ('django.db.models.fields.IntegerField', [], {}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'image': ('django.db.models.fields.files.ImageField', [], {'max_length': '250', 'null': 'True', 'blank': 'True'}),
'parent': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'subtopics'", 'null': 'True', 'to': u"orm['products.Topic']"}),
'product': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'topics'", 'to': u"orm['products.Product']"}),
'slug': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'title': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'visible': ('django.db.models.fields.BooleanField', [], {'default': 'False'})
},
u'products.version': {
'Meta': {'ordering': "['-max_version']", 'object_name': 'Version'},
'default': ('django.db.models.fields.BooleanField', [], {}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'max_version': ('django.db.models.fields.FloatField', [], {}),
'min_version': ('django.db.models.fields.FloatField', [], {}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'product': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'versions'", 'to': u"orm['products.Product']"}),
'slug': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'visible': ('django.db.models.fields.BooleanField', [], {})
}
}
complete_apps = ['products'] | 63.014286 | 171 | 0.564725 | 447 | 4,411 | 5.465324 | 0.203579 | 0.114613 | 0.194842 | 0.278346 | 0.724519 | 0.677036 | 0.571019 | 0.553009 | 0.553009 | 0.395006 | 0 | 0.009073 | 0.200408 | 4,411 | 70 | 172 | 63.014286 | 0.683584 | 0.023804 | 0 | 0.389831 | 0 | 0 | 0.548222 | 0.302812 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033898 | false | 0 | 0.067797 | 0 | 0.152542 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
913957e8d6735ca07a92b392d22679a4f08d1bdd | 2,342 | py | Python | importing/migrations/0006_importresponse.py | perimetro20/asiaticon | 40e833d9ad3e3d8a97803fa10fec64f69ca622cb | [
"MIT"
] | null | null | null | importing/migrations/0006_importresponse.py | perimetro20/asiaticon | 40e833d9ad3e3d8a97803fa10fec64f69ca622cb | [
"MIT"
] | null | null | null | importing/migrations/0006_importresponse.py | perimetro20/asiaticon | 40e833d9ad3e3d8a97803fa10fec64f69ca622cb | [
"MIT"
] | null | null | null | # Generated by Django 2.0.4 on 2018-11-06 00:44
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('importing', '0005_importrequest_photo'),
]
operations = [
migrations.CreateModel(
name='ImportResponse',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('hs_code', models.CharField(max_length=255, verbose_name='HS CODE')),
('material', models.CharField(max_length=255, verbose_name='Material')),
('height', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Height')),
('width', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Width')),
('depth', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Depth')),
('weight', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Weight')),
('color', models.CharField(max_length=255, verbose_name='Color')),
('time_production', models.CharField(max_length=255, verbose_name='Production Time')),
('moq', models.CharField(max_length=255, verbose_name='MOQ')),
('total_pieces', models.IntegerField(verbose_name='TOTAL pcs')),
('pieces_carton', models.IntegerField(verbose_name='PCS/carton')),
('box_height', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Box Height')),
('box_width', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Box Width')),
('box_depth', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='Box Depth')),
('total_cbm', models.IntegerField(verbose_name='Total CBM')),
('fob_price', models.DecimalField(decimal_places=2, max_digits=12, verbose_name='FOB PRICE / usd')),
('comments', models.TextField(verbose_name='Comments')),
('supplier_information', models.TextField(verbose_name='Supplier Data')),
('import_request', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='importing.ImportRequest')),
],
),
]
| 58.55 | 129 | 0.638343 | 258 | 2,342 | 5.577519 | 0.310078 | 0.14524 | 0.138985 | 0.172342 | 0.507992 | 0.460737 | 0.460737 | 0.3287 | 0.3287 | 0.3287 | 0 | 0.031642 | 0.217336 | 2,342 | 39 | 130 | 60.051282 | 0.75341 | 0.019214 | 0 | 0 | 1 | 0 | 0.173856 | 0.020479 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.151515 | 0 | 0.242424 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
913acae6541ed1a21289cabd865477a0381ebd58 | 428 | py | Python | extraResources/electron-backend/routes/delete_toa_entries.py | vilaj46/ad1-ad2-briefs | 8bd5de28315a0525b28adb4cf8f1a7d22eefef25 | [
"MIT"
] | null | null | null | extraResources/electron-backend/routes/delete_toa_entries.py | vilaj46/ad1-ad2-briefs | 8bd5de28315a0525b28adb4cf8f1a7d22eefef25 | [
"MIT"
] | null | null | null | extraResources/electron-backend/routes/delete_toa_entries.py | vilaj46/ad1-ad2-briefs | 8bd5de28315a0525b28adb4cf8f1a7d22eefef25 | [
"MIT"
] | null | null | null | from classes.Table_Of_Authorities import get_my_toa
def delete_toa_entries():
    TABLE_OF_AUTHORITIES = get_my_toa()
    TABLE_OF_AUTHORITIES.set_entries([])
    TABLE_OF_AUTHORITIES.set_entries_to_one()
    return {
        'entries': TABLE_OF_AUTHORITIES.data['entries'],
        'toaEntriesError': TABLE_OF_AUTHORITIES.data['toaEntriesError'],
        'toaNumbersError': TABLE_OF_AUTHORITIES.data['toaNumbersError']
    }
| 32.923077 | 72 | 0.745327 | 50 | 428 | 5.9 | 0.38 | 0.166102 | 0.427119 | 0.254237 | 0.189831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154206 | 428 | 12 | 73 | 35.666667 | 0.814917 | 0 | 0 | 0 | 0 | 0 | 0.172897 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9148f9e322a2f6439ec483a026c611cd71c9eeac | 536 | py | Python | dader/model/_utils.py | tuhahaha/dader-pypi | b5867727151fe7de0f2711e8202778a901517ec0 | [
"MIT"
] | null | null | null | dader/model/_utils.py | tuhahaha/dader-pypi | b5867727151fe7de0f2711e8202778a901517ec0 | [
"MIT"
] | null | null | null | dader/model/_utils.py | tuhahaha/dader-pypi | b5867727151fe7de0f2711e8202778a901517ec0 | [
"MIT"
] | null | null | null | import torch
def shift_tokens_right(input_ids: torch.Tensor, pad_token_id: int, decoder_start_token_id: int):
    """
    Shift input ids one token to the right.
    """
    shifted_input_ids = input_ids.new_zeros(input_ids.shape)
    shifted_input_ids[:, 1:] = input_ids[:, :-1].clone()
    shifted_input_ids[:, 0] = decoder_start_token_id
    assert pad_token_id is not None, "self.model.config.pad_token_id has to be defined."
    shifted_input_ids.masked_fill_(shifted_input_ids == -100, pad_token_id)
    return shifted_input_ids
914e66f23df02c490e01b11c799489d634eeb2b2 | 1,701 | py | Python | filoc/backends/backend_pickle.py | jeromerg/filoc | ecbf7250a119eb5987662c1bf006bb36a8667ab9 | [
"MIT"
] | 2 | 2020-12-13T16:30:40.000Z | 2021-03-06T16:41:38.000Z | filoc/backends/backend_pickle.py | jeromerg/filoc | ecbf7250a119eb5987662c1bf006bb36a8667ab9 | [
"MIT"
] | 4 | 2020-07-10T13:33:46.000Z | 2021-01-27T09:50:13.000Z | filoc/backends/backend_pickle.py | jeromerg/filoc | ecbf7250a119eb5987662c1bf006bb36a8667ab9 | [
"MIT"
] | null | null | null | """ Filoc default pickle backend implementation """
import os
import pickle
from typing import Dict, Any
from fsspec import AbstractFileSystem
from filoc.contract import PropsList, BackendContract, Constraints, Props
from filoc.utils import filter_and_coerce_loaded_file_content, coerce_file_content_to_write
class PickleBackend(BackendContract):
"""
filoc backend used to read data from Pickle files and write into them. This implementation is used when you call the filoc factory with the ``backend`` argument set to ``'pickle'``. Example:
.. code-block:: python
loc = filoc('/my/locpath/{id}/data.pickle', backend='pickle')
It is recommended to read files that you wrote with filoc itself. If you want to read pickle files written by a third library, it is recommended to implement your own backend,
so that you can better handle the edge cases and print out better error messages.
"""
def __init__(self, is_singleton) -> None:
super().__init__()
self.is_singleton = is_singleton
def read(self, fs: AbstractFileSystem, path: str, path_props : Props, constraints: Constraints) -> PropsList:
"""(see BackendContract contract) """
with fs.open(path, 'rb') as f:
return filter_and_coerce_loaded_file_content(path, pickle.load(f), path_props, constraints, self.is_singleton)
def write(self, fs: AbstractFileSystem, path: str, props_list: PropsList) -> None:
"""(see BackendContract contract)"""
fs.makedirs(os.path.dirname(path), exist_ok=True)
with fs.open(path, 'wb') as f:
return pickle.dump(coerce_file_content_to_write(path, props_list, self.is_singleton), f)
| 45.972973 | 194 | 0.720165 | 232 | 1,701 | 5.125 | 0.439655 | 0.046257 | 0.050463 | 0.035324 | 0.146341 | 0.053827 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188713 | 1,701 | 36 | 195 | 47.25 | 0.861594 | 0.380952 | 0 | 0 | 0 | 0 | 0.004 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0 | 0.352941 | 0 | 0.705882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
e66eb27d58bf0403a40f7c15cd97af0ce4021f3b | 1,068 | py | Python | clickhouse_driver/defines.py | dourvaris/clickhouse-driver | 059bba7632a44fe14228bb72518b794c67779ca8 | [
"MIT"
] | 1 | 2021-03-10T09:28:00.000Z | 2021-03-10T09:28:00.000Z | clickhouse_driver/defines.py | dourvaris/clickhouse-driver | 059bba7632a44fe14228bb72518b794c67779ca8 | [
"MIT"
] | null | null | null | clickhouse_driver/defines.py | dourvaris/clickhouse-driver | 059bba7632a44fe14228bb72518b794c67779ca8 | [
"MIT"
] | null | null | null |
DEFAULT_PORT = 9000
DEFAULT_SECURE_PORT = 9440
DBMS_MIN_REVISION_WITH_TEMPORARY_TABLES = 50264
DBMS_MIN_REVISION_WITH_TOTAL_ROWS_IN_PROGRESS = 51554
DBMS_MIN_REVISION_WITH_BLOCK_INFO = 51903
# Legacy above.
DBMS_MIN_REVISION_WITH_CLIENT_INFO = 54032
DBMS_MIN_REVISION_WITH_SERVER_TIMEZONE = 54058
DBMS_MIN_REVISION_WITH_QUOTA_KEY_IN_CLIENT_INFO = 54060
DBMS_MIN_REVISION_WITH_SERVER_DISPLAY_NAME = 54372
DBMS_MIN_REVISION_WITH_VERSION_PATCH = 54401
DBMS_MIN_REVISION_WITH_SERVER_LOGS = 54406
DBMS_MIN_REVISION_WITH_COLUMN_DEFAULTS_METADATA = 54410
DBMS_MIN_REVISION_WITH_CLIENT_WRITE_INFO = 54420
DBMS_MIN_REVISION_WITH_SETTINGS_SERIALIZED_AS_STRINGS = 54429
# Timeouts
DBMS_DEFAULT_CONNECT_TIMEOUT_SEC = 10
DBMS_DEFAULT_TIMEOUT_SEC = 300
DBMS_DEFAULT_SYNC_REQUEST_TIMEOUT_SEC = 5
DEFAULT_COMPRESS_BLOCK_SIZE = 1048576
DEFAULT_INSERT_BLOCK_SIZE = 1048576
DBMS_NAME = 'ClickHouse'
CLIENT_NAME = 'python-driver'
CLIENT_VERSION_MAJOR = 18
CLIENT_VERSION_MINOR = 10
CLIENT_VERSION_PATCH = 3
CLIENT_REVISION = 54429
BUFFER_SIZE = 1048576
STRINGS_ENCODING = 'utf-8'
| 28.105263 | 61 | 0.876404 | 161 | 1,068 | 5.180124 | 0.453416 | 0.100719 | 0.215827 | 0.273381 | 0.14988 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10883 | 0.088015 | 1,068 | 37 | 62 | 28.864865 | 0.747433 | 0.020599 | 0 | 0 | 0 | 0 | 0.026871 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e66f2ff0fe65f47d61331543fe31a8fe2f6e3d0c | 442 | py | Python | 2021/Day_01/part_2.py | Adilius/adventofcode | d0d3ad1a0430c3732d108ad8ef2b4d218a37944b | [
"MIT"
] | 2 | 2020-12-01T14:50:51.000Z | 2020-12-03T17:08:43.000Z | 2021/Day_01/part_2.py | Adilius/adventofcode | d0d3ad1a0430c3732d108ad8ef2b4d218a37944b | [
"MIT"
] | null | null | null | 2021/Day_01/part_2.py | Adilius/adventofcode | d0d3ad1a0430c3732d108ad8ef2b4d218a37944b | [
"MIT"
] | null | null | null | input_file = open("input.txt", "r")
entriesArray = input_file.read().split("\n")
depth_measure_increase = 0
for i in range(3, len(entriesArray), 1):
    first_window = int(entriesArray[i-1]) + int(entriesArray[i-2]) + int(entriesArray[i-3])
    second_window = int(entriesArray[i]) + int(entriesArray[i-1]) + int(entriesArray[i-2])
    if second_window > first_window:
        depth_measure_increase += 1
print(f'{depth_measure_increase=}') | 40.181818 | 91 | 0.701357 | 65 | 442 | 4.584615 | 0.430769 | 0.302013 | 0.322148 | 0.147651 | 0.228188 | 0.228188 | 0.228188 | 0.228188 | 0 | 0 | 0 | 0.023499 | 0.133484 | 442 | 11 | 92 | 40.181818 | 0.754569 | 0 | 0 | 0 | 0 | 0 | 0.083521 | 0.056433 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
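Since consecutive three-measurement windows share two terms, the window comparison reduces to comparing each entry with the one three positions earlier. A sketch of that shortcut (the function name is illustrative):

```python
def count_window_increases(depths, window=3):
    """Count how often a sliding-window sum exceeds the previous window's sum.

    sum(depths[i-w+1 : i+1]) > sum(depths[i-w : i]) reduces to
    depths[i] > depths[i-w], because the w-1 shared terms cancel.
    """
    return sum(1 for i in range(window, len(depths)) if depths[i] > depths[i - window])

# Worked on the well-known Advent of Code 2021 Day 1 sample input:
print(count_window_increases([199, 200, 208, 210, 200, 207, 240, 269, 260, 263]))  # 5
```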
e68bbfefc5641782101c78637163dfda3ddfad16 | 326 | py | Python | aula#10/desafio035.py | daramariabs/exercicios-python | 0d9785a9cccd5442a190572c58ab8dd6e2fe0cce | [
"MIT"
] | null | null | null | aula#10/desafio035.py | daramariabs/exercicios-python | 0d9785a9cccd5442a190572c58ab8dd6e2fe0cce | [
"MIT"
] | null | null | null | aula#10/desafio035.py | daramariabs/exercicios-python | 0d9785a9cccd5442a190572c58ab8dd6e2fe0cce | [
"MIT"
] | null | null | null | """ Write a program that reads the lengths of three line segments
and tells the user whether or not they can form a triangle."""
a = float(input('Segment A:'))
b = float(input('Segment B:'))
c = float(input('Segment C:'))
if a < b + c and b < a + c and c < a + b:
    print('These segments form a triangle')
else:
    print('These segments do not form a triangle')
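The three comparisons above spell out the triangle inequality; equivalently, only the longest side needs to be tested against the sum of the other two. A small illustrative sketch (the helper name is not from the original exercise):

```python
def can_form_triangle(a, b, c):
    """A triangle exists iff the longest side is strictly shorter
    than the sum of the other two."""
    a, b, c = sorted((a, b, c))  # after sorting, c is the longest side
    return c < a + b

print(can_form_triangle(3, 4, 5))   # True
print(can_form_triangle(1, 2, 10))  # False
```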
e699fa1ed93de2837ee78138e6d28a5d1f013dc6 | 818 | py | Python | tests/conftest.py | blurrcat/debot | 21f78e8a29e607ccb860fa8d12a0a36afc87bdea | [
"MIT"
] | null | null | null | tests/conftest.py | blurrcat/debot | 21f78e8a29e607ccb860fa8d12a0a36afc87bdea | [
"MIT"
] | null | null | null | tests/conftest.py | blurrcat/debot | 21f78e8a29e607ccb860fa8d12a0a36afc87bdea | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import pytest
from debot.app import create_app
@pytest.fixture
def slack_token():
    return 'slack_token'
@pytest.fixture
def app(request, slack_token):
    os.environ['DEBOT_SLACK_TOKEN'] = slack_token
    os.environ['DEBOT_DEBUG'] = 'True'
    _app = create_app()
    context = _app.app_context()
    context.push()
    def clean():
        context.pop()
    request.addfinalizer(clean)
    return _app
@pytest.fixture
def dispatcher(app):
    return app.extensions['dispatcher']
@pytest.fixture
def echo_command(app, dispatcher):
    def echo(what):
        """
        echo a string.
        :param what: what to echo
        """
        return what
    dispatcher.add_hook('test_plugins', 'echo', echo)
    dispatcher._gen_help()
    return echo
| 18.590909 | 53 | 0.649144 | 103 | 818 | 4.980583 | 0.417476 | 0.097466 | 0.124756 | 0.074074 | 0.093567 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001582 | 0.227384 | 818 | 43 | 54 | 19.023256 | 0.810127 | 0.101467 | 0 | 0.148148 | 0 | 0 | 0.098291 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.074074 | 0.518519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
e69a68151368558ed4de860300ffbbd90b4c5061 | 1,225 | py | Python | docs/autogen.py | Techtonique/GPopt | 37eb7cbd55679b67b0f0f39dddb310309531e5ca | [
"BSD-3-Clause-Clear"
] | 1 | 2021-07-14T11:56:32.000Z | 2021-07-14T11:56:32.000Z | docs/autogen.py | Techtonique/GPopt | 37eb7cbd55679b67b0f0f39dddb310309531e5ca | [
"BSD-3-Clause-Clear"
] | null | null | null | docs/autogen.py | Techtonique/GPopt | 37eb7cbd55679b67b0f0f39dddb310309531e5ca | [
"BSD-3-Clause-Clear"
] | null | null | null | # -*- coding: utf-8 -*-
import pathlib
import shutil
import keras_autodoc
PAGES = {
    'documentation/gpopt.md': [
        'GPopt.GPOpt.GPOpt.GPOpt',
        'GPopt.GPOpt.GPOpt.GPOpt.optimize',
        'GPopt.GPOpt.GPOpt.GPOpt.load',
        'GPopt.GPOpt.GPOpt.GPOpt.close_shelve'
    ]
}
GPopt_dir = pathlib.Path(__file__).resolve().parents[1]
def generate(dest_dir):
    template_dir = GPopt_dir / 'docs' / 'templates'
    doc_generator = keras_autodoc.DocumentationGenerator(
        pages=PAGES,
        # project_url = 'https://github.com/Techtonique/GPopt',
        template_dir=template_dir,
        # GPopt_dir / 'examples'
    )
    doc_generator.generate(dest_dir)
    readme = (GPopt_dir / 'README.md').read_text()
    index = (template_dir / 'index.md').read_text()
    index = index.replace('{{autogenerated}}', readme[readme.find('##'):])
    (dest_dir / 'index.md').write_text(index, encoding='utf-8')
    shutil.copyfile(GPopt_dir / 'CONTRIBUTING.md',
                    dest_dir / 'contributing.md')
    # shutil.copyfile(GPopt_dir / 'docs' / 'extra.css',
    #                 dest_dir / 'extra.css')
if __name__ == '__main__':
    generate(GPopt_dir / 'docs' / 'sources')
e6b34aed109e0fb773fcc927112667f38c52e168 | 861 | py | Python | src/moonshot/utils/image_utils.py | rpeloff/moonshot | f58ddaa15c2bea416731e3bd1f2c5de86d6aa115 | [
"MIT"
] | 4 | 2019-10-29T09:50:59.000Z | 2019-11-22T19:01:07.000Z | src/moonshot/utils/image_utils.py | rpeloff/moonshot | f58ddaa15c2bea416731e3bd1f2c5de86d6aa115 | [
"MIT"
] | null | null | null | src/moonshot/utils/image_utils.py | rpeloff/moonshot | f58ddaa15c2bea416731e3bd1f2c5de86d6aa115 | [
"MIT"
] | null | null | null | """Utility functions for manipulating image data.
Author: Ryan Eloff
Contact: ryan.peter.eloff@gmail.com
Date: July 2019
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
from skimage.io import imread
def load_image_array(image_path):
    """Read image from file to ndarray."""
    return np.asarray(imread(image_path))
# TODO(rpeloff) old code, remove if not using?
# def resize_square_crop(image_arr, size=(224, 224), resample=Image.LANCZOS):
# h, w, _ = image_arr.shape
# short_edge = min(w, h)
# h_shift = int((h - short_edge) / 2)
# w_shift = int((w - short_edge) / 2)
# image_resize = Image.fromarray(image_arr).resize(
# size, box=(w_shift, h_shift, w - w_shift, h - h_shift), resample=resample)
# return np.asarray(image_resize)
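The commented-out helper above center-crops the image to a square before resizing; the crop step alone can be sketched with plain NumPy slicing, without the Pillow dependency (the function below is illustrative, not part of moonshot):

```python
import numpy as np

def center_square_crop(image_arr):
    """Crop the largest centered square from an (H, W, C) image array."""
    h, w = image_arr.shape[:2]
    short_edge = min(h, w)
    h_shift = (h - short_edge) // 2
    w_shift = (w - short_edge) // 2
    return image_arr[h_shift:h_shift + short_edge, w_shift:w_shift + short_edge]

img = np.zeros((300, 200, 3), dtype=np.uint8)
print(center_square_crop(img).shape)  # (200, 200, 3)
```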
| 26.090909 | 84 | 0.708479 | 129 | 861 | 4.457364 | 0.511628 | 0.052174 | 0.083478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016973 | 0.178862 | 861 | 32 | 85 | 26.90625 | 0.796322 | 0.67712 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0 | 1 | 0.142857 | false | 0 | 0.714286 | 0 | 1 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e6b380263c2a3f3f89f7de066b6fedec64b09e5f | 396 | py | Python | app/admin/routes/index.py | digipointtku/viuhka-flask | e51382a306675efabebe2a47d0ae54b7abcdb884 | [
"MIT"
] | null | null | null | app/admin/routes/index.py | digipointtku/viuhka-flask | e51382a306675efabebe2a47d0ae54b7abcdb884 | [
"MIT"
] | 1 | 2019-10-24T07:28:50.000Z | 2019-10-24T07:28:50.000Z | app/admin/routes/index.py | codepointtku/viuhka-flask | e51382a306675efabebe2a47d0ae54b7abcdb884 | [
"MIT"
] | 1 | 2019-11-29T05:46:59.000Z | 2019-11-29T05:46:59.000Z | from flask import Blueprint, render_template, redirect
from flask_login import current_user
import json
from ..forms.login import LoginForm
module = Blueprint('admin', __name__)
_name_ = 'Flask Admin'
@module.route('/admin', methods=['GET'])
def index():
    if current_user.is_authenticated:
        return render_template('admin/index.html')
    else:
        return redirect('/login')
e6b78ef5d6b695363f50c706f042883ab5cad376 | 4,732 | py | Python | posthog/api/test/test_authentication.py | FarazPatankar/posthog | dddf2644376d0fd6836ed96c139f6a825c74202f | [
"MIT"
] | 1 | 2021-04-09T09:13:23.000Z | 2021-04-09T09:13:23.000Z | posthog/api/test/test_authentication.py | FarazPatankar/posthog | dddf2644376d0fd6836ed96c139f6a825c74202f | [
"MIT"
] | 1 | 2021-10-13T10:05:26.000Z | 2021-10-13T10:05:26.000Z | posthog/api/test/test_authentication.py | FarazPatankar/posthog | dddf2644376d0fd6836ed96c139f6a825c74202f | [
"MIT"
] | 1 | 2021-06-17T02:18:43.000Z | 2021-06-17T02:18:43.000Z | from unittest.mock import patch
from rest_framework import status
from posthog.models import User
from posthog.test.base import APIBaseTest
class TestAuthenticationAPI(APIBaseTest):
CONFIG_AUTO_LOGIN = False
@patch("posthoganalytics.capture")
def test_user_logs_in_with_email_and_password(self, mock_capture):
response = self.client.post("/api/login", {"email": self.CONFIG_EMAIL, "password": self.CONFIG_PASSWORD})
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {"success": True})
# Test that we're actually logged in
response = self.client.get("/api/user/")
self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.json()["email"], self.user.email)

        # Assert the event was captured.
        mock_capture.assert_called_once_with(
            self.user.distinct_id, "user logged in", properties={"social_provider": ""}
        )

    @patch("posthoganalytics.capture")
    def test_user_cant_login_with_incorrect_password(self, mock_capture):
        invalid_passwords = ["1234", "abcdefgh", "testpassword1234", "😈😈😈"]

        for password in invalid_passwords:
            response = self.client.post("/api/login", {"email": self.CONFIG_EMAIL, "password": password})
            self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
            self.assertEqual(response.json(), self.ERROR_INVALID_CREDENTIALS)

            # Assert user is not logged in
            response = self.client.get("/api/user/")
            self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
            self.assertNotIn("email", response.json())

        # Events never get reported
        mock_capture.assert_not_called()

    @patch("posthoganalytics.capture")
    def test_user_cant_login_with_incorrect_email(self, mock_capture):
        response = self.client.post("/api/login", {"email": "user2@posthog.com", "password": self.CONFIG_PASSWORD})
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(response.json(), self.ERROR_INVALID_CREDENTIALS)

        # Assert user is not logged in
        response = self.client.get("/api/user/")
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
        self.assertNotIn("email", response.json())

        # Events never get reported
        mock_capture.assert_not_called()

    def test_cant_login_without_required_attributes(self):
        required_attributes = [
            "email",
            "password",
        ]

        for attribute in required_attributes:
            body = {
                "email": self.CONFIG_EMAIL,
                "password": self.CONFIG_PASSWORD,
            }
            body.pop(attribute)

            response = self.client.post("/api/login/", body)
            self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
            self.assertEqual(
                response.json(),
                {
                    "type": "validation_error",
                    "code": "required",
                    "detail": "This field is required.",
                    "attr": attribute,
                },
            )

            # Assert user is not logged in
            response = self.client.get("/api/user/")
            self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

    def test_login_endpoint_is_protected_against_brute_force_attempts(self):
        User.objects.create(email="new_user@posthog.com", password="87654321")

        # Fill the attempt limit
        with self.settings(AXES_FAILURE_LIMIT=3):
            for _ in range(0, 2):
                response = self.client.post("/api/login", {"email": "new_user@posthog.com", "password": "invalid"})
                self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
                self.assertEqual(response.json(), self.ERROR_INVALID_CREDENTIALS)

                # Assert user is not logged in
                response = self.client.get("/api/user/")
                self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

            response = self.client.post("/api/login", {"email": "new_user@posthog.com", "password": "invalid"})
            self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
            self.assertEqual(
                response.json(),
                {
                    "type": "authentication_error",
                    "code": "too_many_failed_attempts",
                    "detail": "Too many failed login attempts. Please try again in 15 minutes.",
                    "attr": None,
                },
            )
# ---- File: spruned/__init__.py | Repo: cdecker/spruned | License: MIT ----
__version__ = '0.0.1a7'
__bitcoind_version_emulation__ = '0.16'
# ---- File: py/mapqueue/redis.py | Repo: mapqueue/mapqueue | License: MIT ----
from .base import Key, Map, int_bytes, Optional, UUID
from .config import NAME
from redis import Redis as connect
class RedisMap(Map):
    def open(self):
        self._db = connect(db=NAME)
        return self

    def _put(self, key: Key, value: bytes) -> Key:
        self._db.set(key.uuid.bytes_le + int_bytes(-key.time), value)
        return key

    def _get(self, uuid: UUID, time: int) -> Optional[bytes]:
        return self._db.get(uuid.bytes_le + int_bytes(-time))

    def close(self):
        del self._db
        return self
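The composite key `uuid.bytes_le + int_bytes(-key.time)` lets Redis's binary key ordering put the newest entry for a given UUID first. A minimal, self-contained sketch of that trick, with a hypothetical `int_bytes` (the real one lives in `mapqueue.base`, which is not shown here; the offset-binary encoding is an assumption chosen so byte order matches numeric order):

```python
import struct
import uuid


def int_bytes(n: int) -> bytes:
    # Hypothetical stand-in for mapqueue.base.int_bytes: offset-binary
    # big-endian encoding, so lexicographic byte order == numeric order.
    return struct.pack(">Q", n + 2 ** 63)


def make_key(u: uuid.UUID, time: int) -> bytes:
    # Same shape as the key built in RedisMap._put.
    return u.bytes_le + int_bytes(-time)


u = uuid.uuid4()
keys = [make_key(u, t) for t in (1, 2, 3)]
# Sorting the raw bytes puts the *largest* time first, because the
# negated times satisfy -3 < -2 < -1.
assert sorted(keys) == [make_key(u, 3), make_key(u, 2), make_key(u, 1)]
```

This is why `_get(uuid, time)` can reuse the same encoding: the key for a lookup is byte-identical to the key written by `_put`.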
# ---- File: rbc/tests/test_omnisci_array_operators.py | Repo: brenocfg/rbc | License: BSD-3-Clause ----
import pytest
import numpy as np
from rbc.omnisci_backend import Array
from rbc.tests import omnisci_fixture
from numba import types as nb_types
import operator
rbc_omnisci = pytest.importorskip('rbc.omniscidb')
available_version, reason = rbc_omnisci.is_available()
pytestmark = pytest.mark.skipif(not available_version, reason=reason)
@pytest.fixture(scope='module')
def omnisci():
    for o in omnisci_fixture(globals()):
        define(o)
        yield o
operator_methods = [
    ('abs', (6,), np.arange(6)),
    ('add', (6,), np.full(6, 5)),
    ('and_bw', (6,), [0, 0, 2, 2, 0, 0]),
    ('countOf', (6, 3, 4), 0),
    ('countOf', (6, 3, 3), 6),
    ('eq', (6, 3), [0, 0, 0, 1, 0, 0]),
    ('eq_array', (6, 3), True),
    ('floordiv', (6,), [3, 2, 2, 2, 2, 1]),
    ('floordiv2', (6,), [3.0, 2.0, 2.0, 2.0, 2.0, 1.0]),
    ('ge', (6, 3), [0, 0, 0, 1, 1, 1]),
    ('ge_array', (6, 3), True),
    ('gt', (6, 3), [0, 0, 0, 0, 1, 1]),
    ('gt_array', (6, 3), False),
    ('iadd', (6,), [1, 2, 3, 4, 5, 6]),
    ('iand', (6,), [0, 0, 2, 2, 0, 0]),
    ('ifloordiv', (6,), [3, 2, 2, 2, 2, 1]),
    ('ifloordiv2', (6,), [3, 2, 2, 2, 2, 1]),
    ('ilshift', (6,), [0, 16, 16, 12, 8, 5]),
    ('imul', (6,), [0, 4, 6, 6, 4, 0]),
    ('ior', (6,), [5, 5, 3, 3, 5, 5]),
    ('isub', (6,), [-5, -3, -1, 1, 3, 5]),
    ('ipow', (6,), [1, 32, 81, 64, 25, 6]),
    ('irshift', (6,), [0, 0, 0, 0, 2, 5]),
    ('itruediv', (6,), [3, 2, 2, 2, 2, 1]),
    ('itruediv2', (6,), [3.3333333333333335, 2.75, 2.4, 2.1666666666666665, 2.0, 1.875]),  # noqa: E501
    ('imod', (6,), [0, 4, 1, 5, 2, 6]),
    ('ixor', (6,), [5, 5, 1, 1, 5, 5]),
    ('in', (6, 3), True),
    ('is', (6, 3), True),
    ('is_not', (6, 3), False),
    ('is_not2', (6, 3), True),
    ('le', (6, 3), [1, 1, 1, 1, 0, 0]),
    ('le_array', (6, 3), True),
    ('lshift', (6,), [0, 16, 16, 12, 8, 5]),
    ('lt', (6, 3), [1, 1, 1, 0, 0, 0]),
    ('lt_array', (6, 3), False),
    ('mul', (6,), [0, 4, 6, 6, 4, 0]),
    ('mod', (6,), [0, 4, 1, 5, 2, 6]),
    ('ne', (6, 3), [1, 1, 1, 0, 1, 1]),
    ('ne_array', (6, 3), False),
    ('neg', (6,), [0, -1, -2, -3, -4, -5]),
    ('not_in', (6, 3), False),
    ('or_bw', (6,), [5, 5, 3, 3, 5, 5]),
    ('pos', (6,), [0, -1, -2, -3, -4, -5]),
    ('pow', (6,), [1, 32, 81, 64, 25, 6]),
    ('rshift', (6,), [0, 0, 0, 0, 2, 5]),
    ('sub', (6,), [-5, -3, -1, 1, 3, 5]),
    ('truediv', (6,), [3, 2, 2, 2, 2, 1]),
    ('truediv2', (6,), [3.3333333333333335, 2.75, 2.4, 2.1666666666666665, 2.0, 1.875]),  # noqa: E501
    ('xor', (6,), [5, 5, 1, 1, 5, 5]),
]
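The expected arrays in this table encode simple element-wise identities over the fills used below; for example, the `floordiv` row is `(i + 10) // (i + 3)` for `i` in `range(6)`. A quick NumPy check of two rows (an illustration of where the numbers come from, not part of the test suite):

```python
import numpy as np

a = np.arange(6) + 10  # fill used by operator_floordiv: [10, 11, 12, 13, 14, 15]
b = np.arange(6) + 3   # second operand:                 [ 3,  4,  5,  6,  7,  8]

# Matches the ('floordiv', (6,), [3, 2, 2, 2, 2, 1]) row.
assert list(a // b) == [3, 2, 2, 2, 2, 1]

# First entries of the ('truediv2', ...) row: 10/3 and 11/4.
assert list((a / b).round(2)[:2]) == [3.33, 2.75]
```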
def define(omnisci):

    @omnisci('int32[](int64)')
    def operator_abs(size):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(-i)
        return abs(a)

    @omnisci('int32[](int64)')
    def operator_add(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.add(a, b)

    @omnisci('int32[](int64)')
    def operator_and_bw(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.and_(a, b)

    @omnisci('int64(int64, int64, int64)')
    def operator_countOf(size, fill_value, b):
        a = Array(size, 'int64')
        for i in range(size):
            a[i] = fill_value
        return operator.countOf(a, b)

    @omnisci('int8[](int64, int32)')
    def operator_eq(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a == v

    @omnisci('bool(int64, int32)')
    def operator_eq_array(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a == a

    @omnisci('int32[](int64)')
    def operator_floordiv(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i+10)
            b[i] = nb_types.int32(i+3)
        return operator.floordiv(a, b)

    @omnisci('double[](int64)')
    def operator_floordiv2(size):
        a = Array(size, 'double')
        b = Array(size, 'double')
        for i in range(size):
            a[i] = nb_types.double(i+10)
            b[i] = nb_types.double(i+3)
        return operator.floordiv(a, b)

    @omnisci('int8[](int64, int32)')
    def operator_ge(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a >= v

    @omnisci('bool(int64, int32)')
    def operator_ge_array(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a >= a

    @omnisci('int8[](int64, int32)')
    def operator_gt(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a > v

    @omnisci('bool(int64, int32)')
    def operator_gt_array(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a > a

    @omnisci('int32[](int64)')
    def operator_iadd(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(1)
        operator.iadd(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_iand(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.iand(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_ifloordiv(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i+10)
            b[i] = nb_types.int32(i+3)
        operator.ifloordiv(a, b)
        return a

    @omnisci('double[](int64)')
    def operator_ifloordiv2(size):
        a = Array(size, 'double')
        b = Array(size, 'double')
        for i in range(size):
            a[i] = nb_types.double(i+10)
            b[i] = nb_types.double(i+3)
        operator.ifloordiv(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_ilshift(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.ilshift(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_imul(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.imul(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_ior(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.ior(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_isub(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.isub(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_ipow(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i+1)
            b[i] = nb_types.int32(size-i)
        operator.ipow(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_irshift(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.irshift(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_itruediv(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i+10)
            b[i] = nb_types.int32(i+3)
        operator.itruediv(a, b)
        return a

    @omnisci('double[](int64)')
    def operator_itruediv2(size):
        a = Array(size, 'double')
        b = Array(size, 'double')
        for i in range(size):
            a[i] = nb_types.double(i+10)
            b[i] = nb_types.double(i+3)
        operator.itruediv(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_imod(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i * 123)
            b[i] = nb_types.int32(7)
        operator.imod(a, b)
        return a

    @omnisci('int32[](int64)')
    def operator_ixor(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        operator.ixor(a, b)
        return a

    @omnisci('int8(int64, int32)')
    def operator_in(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return v in a

    @omnisci('int8(int64, int32)')
    def operator_is(size, v):
        a = Array(size, 'int32')
        a.fill(v)
        return a is a

    @omnisci('int8(int64, int32)')
    def operator_is_not(size, v):
        a = Array(size, 'int32')
        a.fill(v)
        return a is not a

    @omnisci('int8(int64, int32)')
    def operator_is_not2(size, v):
        a = Array(size, 'int32')
        a.fill(v)
        b = Array(size, 'int32')
        b.fill(v)
        return a is not b

    @omnisci('int8[](int64, int32)')
    def operator_le(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a <= v

    @omnisci('bool(int64, int32)')
    def operator_le_array(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a <= a

    @omnisci('int32[](int64)')
    def operator_lshift(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.lshift(a, b)

    @omnisci('int8[](int64, int32)')
    def operator_lt(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a < v

    @omnisci('bool(int64, int32)')
    def operator_lt_array(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a < a

    @omnisci('int32[](int64)')
    def operator_mul(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.mul(a, b)

    @omnisci('int32[](int64)')
    def operator_mod(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i * 123)
            b[i] = nb_types.int32(7)
        return operator.mod(a, b)

    @omnisci('int8[](int64, int32)')
    def operator_ne(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a != v

    @omnisci('bool(int64, int32)')
    def operator_ne_array(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return a != a

    @omnisci('int32[](int64)')
    def operator_neg(size):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return operator.neg(a)

    @omnisci('int8(int64, int32)')
    def operator_not_in(size, v):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
        return v not in a

    @omnisci('int32[](int64)')
    def operator_or_bw(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.or_(a, b)

    @omnisci('int32[](int64)')
    def operator_pos(size):
        a = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(-i)
        return operator.pos(a)

    @omnisci('int32[](int64)')
    def operator_pow(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i+1)
            b[i] = nb_types.int32(size-i)
        return operator.pow(a, b)

    @omnisci('int32[](int64)')
    def operator_rshift(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.rshift(a, b)

    @omnisci('int32[](int64)')
    def operator_sub(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.sub(a, b)

    @omnisci('int32[](int64)')
    def operator_truediv(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i+10)
            b[i] = nb_types.int32(i+3)
        return operator.truediv(a, b)

    @omnisci('double[](int64)')
    def operator_truediv2(size):
        a = Array(size, 'double')
        b = Array(size, 'double')
        for i in range(size):
            a[i] = nb_types.double(i+10)
            b[i] = nb_types.double(i+3)
        return operator.truediv(a, b)

    @omnisci('int32[](int64)')
    def operator_xor(size):
        a = Array(size, 'int32')
        b = Array(size, 'int32')
        for i in range(size):
            a[i] = nb_types.int32(i)
            b[i] = nb_types.int32(size-i-1)
        return operator.xor(a, b)
@pytest.mark.parametrize("suffix, args, expected", operator_methods,
                         ids=[item[0] for item in operator_methods])
def test_array_operators(omnisci, suffix, args, expected):
    if omnisci.has_cuda and suffix in ['countOf', 'in', 'not_in'] and omnisci.version < (5, 5):
        # https://github.com/xnd-project/rbc/issues/107
        pytest.skip(f'operator_{suffix}: crashes CUDA enabled omniscidb server'
                    ' [rbc issue 107]')

    if (available_version[:3] == (5, 3, 1)
            and suffix in ['abs', 'add', 'and_bw', 'eq', 'floordiv', 'floordiv2',
                           'ge', 'gt', 'iadd', 'iand', 'ifloordiv', 'ifloordiv2',
                           'ilshift', 'imul', 'ior', 'isub', 'ipow', 'irshift',
                           'itruediv', 'itruediv2', 'imod', 'ixor', 'le', 'lshift',
                           'lt', 'mul', 'mod', 'ne', 'neg', 'or_bw', 'pos', 'pow',
                           'rshift', 'sub', 'truediv', 'truediv2', 'xor']):
        pytest.skip(
            f'operator_{suffix}: crashes CPU-only omniscidb server v 5.3.1'
            ' [issue 115]')

    query = 'select operator_{suffix}'.format(**locals()) + \
            '(' + ', '.join(map(str, args)) + ')'
    _, result = omnisci.sql_execute(query)
    out = list(result)[0]

    if suffix in ['in', 'not_in']:
        assert (expected == out[0]), 'operator_' + suffix
    elif '_array' in suffix:
        assert (expected == out[0]), 'operator_' + suffix
    else:
        assert np.array_equal(expected, out[0]), 'operator_' + suffix
# ---- File: test/test_rpq_manual.py | Repo: 6851-2021/retroactive-priority-queue | License: MIT ----
import unittest
from retropq import RetroactivePriorityQueue
class PriorityQueueManualTest(unittest.TestCase):
    def test_simple(self):
        queue = RetroactivePriorityQueue()
        self.assertEqual([], list(queue))
        queue.add_insert(0, 5)
        self.assertEqual([5], list(queue))
        queue.add_insert(10, 3)
        self.assertEqual([3, 5], list(queue))
        queue.add_delete_min(5)
        self.assertEqual([3], list(queue))
        queue.add_insert(2, 7)
        self.assertEqual([3, 7], list(queue))
        queue.add_insert(3, 4)
        self.assertEqual([3, 5, 7], list(queue))
        queue.add_delete_min(7)
        self.assertEqual([3, 7], list(queue))
        # delete insert
        queue.remove(2)
        self.assertEqual([3], list(queue))
        # delete delete
        queue.remove(5)
        self.assertEqual([3, 5], list(queue))

    def test_get_min(self):
        queue = RetroactivePriorityQueue()
        self.assertEqual(None, queue.get_min())
        queue.add_insert(2, 3)
        queue.add_insert(5, 8)
        self.assertEqual(3, queue.get_min())
        queue.remove(2)
        self.assertEqual(8, queue.get_min())
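The expected values in these assertions can be reproduced with a naive replay: sort the retroactive operations by time and apply them to an ordinary heap. This sketch describes the semantics being tested, not the retropq implementation (which maintains these results without replaying the whole timeline on every update):

```python
import heapq


def replay(ops):
    """Naively recompute current queue contents from a retroactive timeline.

    ops is a list of (time, kind, value) tuples, where kind is 'insert' or
    'delete_min'; value is ignored for deletions.
    """
    heap = []
    for _, kind, value in sorted(ops, key=lambda op: op[0]):
        if kind == "insert":
            heapq.heappush(heap, value)
        else:
            heapq.heappop(heap)
    return sorted(heap)


# Mirror the first few steps of test_simple above.
ops = [(0, "insert", 5), (10, "insert", 3)]
assert replay(ops) == [3, 5]
ops.append((5, "delete_min", None))   # like queue.add_delete_min(5)
assert replay(ops) == [3]
ops.append((2, "insert", 7))          # like queue.add_insert(2, 7)
assert replay(ops) == [3, 7]
```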
# ---- File: Vokeur/website/migrations/0022_auto_20190616_1150.py | Repo: lsdr1999/Project | License: MIT ----
# Generated by Django 2.2.1 on 2019-06-16 11:50
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('website', '0021_auto_20190614_1538'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='antwoorden',
            name='antwoorden',
        ),
        migrations.AddField(
            model_name='antwoorden',
            name='eens',
            field=models.CharField(default='Eens', max_length=240),
        ),
        migrations.AddField(
            model_name='antwoorden',
            name='geenvanbeide',
            field=models.CharField(default='Geen van Beide', max_length=240),
        ),
        migrations.AddField(
            model_name='antwoorden',
            name='oneens',
            field=models.CharField(default='Oneens', max_length=240),
        ),
        migrations.AlterField(
            model_name='antwoorden',
            name='id',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, primary_key=True, serialize=False, to='website.Vragen'),
        ),
    ]
# ---- File: src/Classes/PathManager.py | Repo: erick-dsnk/Electric | License: Apache-2.0 ----
######################################################################
#                            PATH MANAGER                            #
######################################################################
import os
class PathManager:
    @staticmethod
    def get_parent_directory() -> str:
        directory = os.path.dirname(os.path.abspath(__file__))
        return directory.replace('Classes', '').replace('src', '')[:-1].replace(R'\bin', '')

    @staticmethod
    def get_current_directory() -> str:
        directory = os.path.dirname(os.path.abspath(__file__))
        return os.path.split(directory)[0]

    @staticmethod
    def get_appdata_directory() -> str:
        return os.environ['APPDATA'] + R'\electric'
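The string surgery in `get_parent_directory` is easiest to see on a concrete path. The path below is purely illustrative (not Electric's actual install location), and the snippet mirrors only the replace-chain, not the `R'\bin'` step:

```python
import os

# Assumed example location of this module on a POSIX-style filesystem.
module_path = "/opt/electric/src/Classes/PathManager.py"

current_dir = os.path.dirname(module_path)
# Same chain of replacements as get_parent_directory: strip the
# 'Classes' and 'src' components, then drop the final character.
parent_dir = current_dir.replace('Classes', '').replace('src', '')[:-1]

print(current_dir)  # /opt/electric/src/Classes
print(parent_dir)   # /opt/electric/
```

Note the result keeps a trailing separator, which is why this approach is brittle compared to calling `os.path.dirname` twice.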
# ---- File: apps/growth/migrations/0003_auto_20200506_1453.py | Repo: lsdlab/djshop_toturial | License: MIT ----
# Generated by Django 3.0.5 on 2020-05-06 14:53
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('growth', '0002_auto_20191226_2252'),
    ]

    operations = [
        migrations.AlterField(
            model_name='checkin',
            name='from_points',
            field=models.IntegerField(help_text='变更前积分'),
        ),
        migrations.AlterField(
            model_name='checkin',
            name='to_points',
            field=models.IntegerField(help_text='变更后积分'),
        ),
        migrations.AlterField(
            model_name='checkin',
            name='user',
            field=models.ForeignKey(help_text='用户', on_delete=django.db.models.deletion.CASCADE, related_name='user_checkins', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AlterField(
            model_name='invite',
            name='left',
            field=models.IntegerField(default=10, help_text='剩余邀请次数'),
        ),
        migrations.AlterField(
            model_name='invite',
            name='shortuuid',
            field=models.CharField(help_text='UUID', max_length=255),
        ),
        migrations.AlterField(
            model_name='invite',
            name='user',
            field=models.OneToOneField(help_text='用户', on_delete=django.db.models.deletion.CASCADE, related_name='user_invite', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AlterField(
            model_name='log',
            name='desc',
            field=models.CharField(default='', help_text='描述', max_length=255),
        ),
        migrations.AlterField(
            model_name='log',
            name='invite',
            field=models.ForeignKey(help_text='邀请', on_delete=django.db.models.deletion.CASCADE, related_name='invite_logs', to='growth.Invite'),
        ),
        migrations.AlterField(
            model_name='log',
            name='to_user',
            field=models.ForeignKey(help_text='被邀请用户', on_delete=django.db.models.deletion.CASCADE, related_name='to_user_logs', to=settings.AUTH_USER_MODEL),
        ),
    ]
# ---- File: qa/listings.py | Repo: imaginaryusername/openbazaar-go | License: MIT ----
import requests
import json
from collections import OrderedDict
from test_framework.test_framework import OpenBazaarTestFramework, TestFailure
class ListingsTest(OpenBazaarTestFramework):

    def __init__(self):
        super().__init__()
        self.num_nodes = 2

    def setup_network(self):
        self.setup_nodes()

    def run_test(self):
        vendor = self.nodes[0]
        browser = self.nodes[1]
        currency = "tbtc"

        # no listings POSTed
        api_url = vendor["gateway_url"] + "ob/listings"
        r = requests.get(api_url)
        if r.status_code == 200:
            if len(json.loads(r.text)) == 0:
                pass
            else:
                raise TestFailure("ListingsTest - FAIL: No listings should be returned")
        elif r.status_code == 404:
            raise TestFailure("ListingsTest - FAIL: Listings get endpoint not found")
        else:
            resp = json.loads(r.text)
            raise TestFailure("ListingsTest - FAIL: Listings GET failed. Reason: %s" % resp["reason"])

        # POST listing
        with open('testdata/listing.json') as listing_file:
            ljson = json.load(listing_file, object_pairs_hook=OrderedDict)
            ljson["metadata"]["pricingCurrency"] = "T" + self.cointype
            currency = "T" + self.cointype
        api_url = vendor["gateway_url"] + "ob/listing"
        r = requests.post(api_url, data=json.dumps(ljson, indent=4))
        if r.status_code == 200:
            pass
        elif r.status_code == 404:
            raise TestFailure("ListingsTest - FAIL: Listing post endpoint not found")
        else:
            resp = json.loads(r.text)
            raise TestFailure("ListingsTest - FAIL: Listing POST failed. Reason: %s" % resp["reason"])

        # one listing POSTed and index returning correct data
        api_url = vendor["gateway_url"] + "ob/listings"
        r = requests.get(api_url)
        if r.status_code == 404:
            raise TestFailure("ListingsTest - FAIL: Listings get endpoint not found")
        elif r.status_code != 200:
            resp = json.loads(r.text)
            raise TestFailure("ListingsTest - FAIL: Listings GET failed. Reason: %s" % resp["reason"])
        resp = json.loads(r.text)
        if len(resp) != 1:
            raise TestFailure("ListingsTest - FAIL: One listing should be returned")
        listing = resp[0]
        if currency.lower() not in listing["acceptedCurrencies"]:
            raise TestFailure("ListingsTest - FAIL: Listing should have acceptedCurrencies")

        # listing show endpoint returning correct data
        slug = listing["slug"]
        api_url = vendor["gateway_url"] + "ob/listing/" + slug
        r = requests.get(api_url)
        if r.status_code == 404:
            raise TestFailure("ListingsTest - FAIL: Listings get endpoint not found")
        elif r.status_code != 200:
            resp = json.loads(r.text)
            raise TestFailure("ListingsTest - FAIL: Listings GET failed. Reason: %s" % resp["reason"])
        resp = json.loads(r.text)
        if currency.lower() not in resp["listing"]["metadata"]["acceptedCurrencies"]:
            raise TestFailure("ListingsTest - FAIL: Listing should have acceptedCurrencies in metadata")

        # check vendor's index from another node
        api_url = browser["gateway_url"] + "ob/listings/" + vendor["peerId"]
        r = requests.get(api_url)
        if r.status_code == 404:
            raise TestFailure("ListingsTest - FAIL: Listings get endpoint not found")
        elif r.status_code != 200:
            resp = json.loads(r.text)
            raise TestFailure("ListingsTest - FAIL: Listings GET failed. Reason: %s" % resp["reason"])
        resp = json.loads(r.text)
        if len(resp) != 1:
            raise TestFailure("ListingsTest - FAIL: One listing should be returned")
        if currency.lower() not in resp[0]["acceptedCurrencies"]:
            raise TestFailure("ListingsTest - FAIL: Listing should have acceptedCurrencies")

        # check listing show page from another node
        api_url = vendor["gateway_url"] + "ob/listing/" + vendor["peerId"] + "/" + slug
        r = requests.get(api_url)
        if r.status_code == 404:
            raise TestFailure("ListingsTest - FAIL: Listings get endpoint not found")
        elif r.status_code != 200:
            resp = json.loads(r.text)
            raise TestFailure("ListingsTest - FAIL: Listings GET failed. Reason: %s" % resp["reason"])
        resp = json.loads(r.text)
        if currency.lower() not in resp["listing"]["metadata"]["acceptedCurrencies"]:
            raise TestFailure("ListingsTest - FAIL: Listing should have acceptedCurrencies in metadata")

        print("ListingsTest - PASS")


if __name__ == '__main__':
    print("Running ListingTest")
    ListingsTest().main()
# ---- File: tools/installer/cefpython3.__init__.py | Repo: donalm/cefpython | Licenses: CNRI-Python, RSA-MD, Linux-OpenIB ----
# Copyright (c) 2013 CEF Python, see the Authors file.
# All rights reserved. Licensed under BSD 3-clause license.
# Project website: https://github.com/cztomczak/cefpython
# NOTE: Template variables like {{VERSION}} are replaced with actual
# values when make_installer.py tool generates this package
# installer.
import os
import sys
import ctypes
import platform
__all__ = ["cefpython"] # Disabled: "wx"
__version__ = "{{VERSION}}"
__author__ = "The CEF Python authors"
# If package was installed using PIP or setup.py then package
# dir is here:
# /usr/local/lib/python2.7/dist-packages/cefpython3/
# If this is a debian package then package_dir returns:
# /usr/lib/pymodules/python2.7/cefpython3
# The above path consists of symbolic links to the real directory:
# /usr/share/pyshared/cefpython3
package_dir = os.path.dirname(os.path.abspath(__file__))
# This loads the libcef.so library for the subprocess executable.
# On Mac it works without setting library paths.
os.environ["LD_LIBRARY_PATH"] = package_dir
# This env variable will be returned by cefpython.GetModuleDirectory().
os.environ["CEFPYTHON3_PATH"] = package_dir
# This loads the libcef library for the main python executable.
# Loading library dynamically using ctypes.CDLL is required on Linux.
# TODO: Check if on Linux libcef.so can be linked like on Mac.
# On Mac the CEF framework dependency information is added to
# the cefpython*.so module by linking to CEF framework.
# The libffmpegsumo.so library does not need to be loaded here,
# it may cause issues to load it here in the browser process.
if platform.system() == "Linux":
libcef = os.path.join(package_dir, "libcef.so")
ctypes.CDLL(libcef, ctypes.RTLD_GLOBAL)
# Load the cefpython module for given Python version
if sys.version_info[:2] == (2, 7):
# noinspection PyUnresolvedReferences
from . import cefpython_py27 as cefpython
elif sys.version_info[:2] == (3, 4):
# noinspection PyUnresolvedReferences
from . import cefpython_py34 as cefpython
elif sys.version_info[:2] == (3, 5):
# noinspection PyUnresolvedReferences
from . import cefpython_py35 as cefpython
elif sys.version_info[:2] == (3, 6):
# noinspection PyUnresolvedReferences
from . import cefpython_py36 as cefpython
elif sys.version_info[:2] == (3, 7):
# noinspection PyUnresolvedReferences
from . import cefpython_py37 as cefpython
else:
raise Exception("Python version not supported: " + sys.version)
| 38.184615 | 71 | 0.750201 | 350 | 2,482 | 5.217143 | 0.462857 | 0.032859 | 0.038335 | 0.041073 | 0.214129 | 0.127054 | 0.067908 | 0.067908 | 0 | 0 | 0 | 0.018278 | 0.162369 | 2,482 | 64 | 72 | 38.78125 | 0.860029 | 0.599517 | 0 | 0 | 0 | 0 | 0.120457 | 0 | 0 | 0 | 0 | 0.015625 | 0 | 1 | 0 | false | 0 | 0.36 | 0 | 0.36 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
fc145411c47e506870c90f72327b7f7654df7a5c | 1,549 | py | Python | ddtrace/contrib/psycopg/__init__.py | p7g/dd-trace-py | 141ac0ab6e9962e3b3bafc9de172076075289a19 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | ddtrace/contrib/psycopg/__init__.py | p7g/dd-trace-py | 141ac0ab6e9962e3b3bafc9de172076075289a19 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | ddtrace/contrib/psycopg/__init__.py | p7g/dd-trace-py | 141ac0ab6e9962e3b3bafc9de172076075289a19 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | """
The psycopg integration instruments the psycopg2 library to trace Postgres queries.
Enabling
~~~~~~~~
The psycopg integration is enabled automatically when using
:ref:`ddtrace-run<ddtracerun>` or :ref:`patch_all()<patch_all>`.
Or use :ref:`patch()<patch>` to manually enable the integration::
from ddtrace import patch
patch(psycopg=True)
Global Configuration
~~~~~~~~~~~~~~~~~~~~
.. py:data:: ddtrace.config.psycopg["service"]
The service name reported by default for psycopg spans.
This option can also be set with the ``DD_PSYCOPG_SERVICE`` environment
variable.
Default: ``"postgres"``
.. py:data:: ddtrace.config.psycopg["trace_fetch_methods"]
Whether or not to trace fetch methods.
Can also be configured via the ``DD_PSYCOPG_TRACE_FETCH_METHODS`` environment variable.
Default: ``False``
Instance Configuration
~~~~~~~~~~~~~~~~~~~~~~
To configure the psycopg integration on a per-connection basis use the
``Pin`` API::
from ddtrace import Pin
import psycopg2
db = psycopg2.connect(connection_factory=factory)
# Use a pin to override the service name.
Pin.override(db, service="postgres-users")
cursor = db.cursor()
cursor.execute("select * from users where id = 1")
"""
from ...internal.utils.importlib import require_modules
required_modules = ["psycopg2"]
with require_modules(required_modules) as missing_modules:
if not missing_modules:
from .patch import patch
from .patch import patch_conn
__all__ = ["patch", "patch_conn"]
| 23.830769 | 87 | 0.70368 | 196 | 1,549 | 5.443878 | 0.454082 | 0.028116 | 0.059044 | 0.035614 | 0.048735 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003909 | 0.174306 | 1,549 | 64 | 88 | 24.203125 | 0.830336 | 0.805681 | 0 | 0 | 0 | 0 | 0.078498 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
fc19c655354fa4430aff4f6f7a63d0da39a57e1d | 543 | py | Python | classes/pos.py | cripplet/langmuir-hash | 5b4aa8e705b237704dbb99fbaa89af8cc2e7a8b5 | [
"MIT"
] | null | null | null | classes/pos.py | cripplet/langmuir-hash | 5b4aa8e705b237704dbb99fbaa89af8cc2e7a8b5 | [
"MIT"
] | null | null | null | classes/pos.py | cripplet/langmuir-hash | 5b4aa8e705b237704dbb99fbaa89af8cc2e7a8b5 | [
"MIT"
] | null | null | null | # classical (x, y) position vectors
class Pos:
def __init__(self, x, y):
self.x = x
self.y = y
def __add__(self, other):
return(Pos(self.x + other.x, self.y + other.y))
def __eq__(self, other):
return(
(self.x == other.x) and
(self.y == other.y))
def __mul__(self, factor):
return(Pos(factor * self.x, factor * self.y))
def __ne__(self, other):
return(not(self == other))
def __str__(self):
return("(" + str(self.x) + ", " + str(self.y) + ")")
def __sub__(self, subtrahend):
return(self + (subtrahend * -1))
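A short usage sketch of the vector arithmetic above; the class is restated minimally here so the example runs standalone:

```python
# Minimal restatement of Pos so this sketch is self-contained.
class Pos:
    def __init__(self, x, y):
        self.x = x
        self.y = y
    def __add__(self, other):
        return Pos(self.x + other.x, self.y + other.y)
    def __mul__(self, factor):
        return Pos(factor * self.x, factor * self.y)
    def __sub__(self, subtrahend):
        # Subtraction is addition of the negated subtrahend, as above.
        return self + (subtrahend * -1)
    def __eq__(self, other):
        return (self.x == other.x) and (self.y == other.y)
    def __str__(self):
        return "(" + str(self.x) + ", " + str(self.y) + ")"

a, b = Pos(1, 2), Pos(3, 4)
print(a + b)               # (4, 6)
print(b - a)               # (2, 2)
print(a * 2 == Pos(2, 4))  # True
```

Note that defining `__eq__` without `__hash__` makes instances unhashable in Python 3, which is fine as long as positions are never used as dict keys or set members.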
| 20.884615 | 54 | 0.607735 | 84 | 543 | 3.595238 | 0.27381 | 0.099338 | 0.149007 | 0.072848 | 0.092715 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002299 | 0.198895 | 543 | 25 | 55 | 21.72 | 0.691954 | 0.060773 | 0 | 0 | 0 | 0 | 0.007874 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.388889 | false | 0 | 0 | 0.333333 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
fc240e7202a4fa0058ae27b11faad8316ce84d8e | 73 | py | Python | dawncli/__init__.py | leonardossz/dawn | e656e189be509fea49e581c030353df4aec184d6 | [
"MIT"
] | 1 | 2020-09-13T13:50:52.000Z | 2020-09-13T13:50:52.000Z | tests/__init__.py | leonardossz/dawn | e656e189be509fea49e581c030353df4aec184d6 | [
"MIT"
] | null | null | null | tests/__init__.py | leonardossz/dawn | e656e189be509fea49e581c030353df4aec184d6 | [
"MIT"
] | null | null | null | __copyright__ = 'Copyright 2020 See AUTHORS'
__license__ = 'See LICENSE'
| 24.333333 | 44 | 0.780822 | 8 | 73 | 6.125 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063492 | 0.136986 | 73 | 2 | 45 | 36.5 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0.506849 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fc34bd59a415cf11f753ae4a809a28fd9cef44e8 | 198 | py | Python | custom_components/sleep_as_android/const.py | Antoni-Czaplicki/HA-SleepAsAndroid | 12d649c779604491574bb7d237a4222aa7927aea | [
"Apache-2.0"
] | null | null | null | custom_components/sleep_as_android/const.py | Antoni-Czaplicki/HA-SleepAsAndroid | 12d649c779604491574bb7d237a4222aa7927aea | [
"Apache-2.0"
] | null | null | null | custom_components/sleep_as_android/const.py | Antoni-Czaplicki/HA-SleepAsAndroid | 12d649c779604491574bb7d237a4222aa7927aea | [
"Apache-2.0"
] | null | null | null | import voluptuous as vol
DOMAIN = "sleep_as_android"
DEVICE_MACRO: str = "%%%device%%%"
DEFAULT_NAME = "SleepAsAndroid"
DEFAULT_TOPIC_TEMPLATE = "SleepAsAndroid/%s" % DEVICE_MACRO
DEFAULT_QOS = 0
| 22 | 59 | 0.762626 | 25 | 198 | 5.72 | 0.72 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005747 | 0.121212 | 198 | 8 | 60 | 24.75 | 0.816092 | 0 | 0 | 0 | 0 | 0 | 0.29798 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fc3a4402e543e17cf41fa859a414bcfd90640cc9 | 20,734 | py | Python | metagrok/pkmn/engine/test_engine.py | yuzeh/metagrok | 27f71441653611de939f1fe43e7aee6a7cdf1981 | [
"MIT"
] | 21 | 2019-06-21T05:00:40.000Z | 2021-01-26T02:07:58.000Z | metagrok/pkmn/engine/test_engine.py | yuzeh/metagrok | 27f71441653611de939f1fe43e7aee6a7cdf1981 | [
"MIT"
] | 2 | 2019-07-12T00:36:00.000Z | 2020-02-27T10:20:57.000Z | metagrok/pkmn/engine/test_engine.py | yuzeh/metagrok | 27f71441653611de939f1fe43e7aee6a7cdf1981 | [
"MIT"
] | 4 | 2019-10-07T18:22:11.000Z | 2021-11-02T10:04:21.000Z | import copy
import json
import unittest
from metagrok.pkmn.engine import core
update_with_request = core._update_with_request
get_side = core._get_side
postproc = core._postprocess_engine_state
class UpdateWithRequestTest(unittest.TestCase):
def test_begin(self):
state = copy.deepcopy(state_begin)
postproc(state)
update_with_request(state, req_begin)
side = get_side(state, state['whoami'])
poke = side['pokemon'][0]
self.assertEqual('p2: Primeape', poke['ident'])
self.assertEqual(244., poke['maxhp'])
self.assertEqual(244., poke['hp'])
self.assertEqual(
[('icepunch', 0), ('uturn', 0), ('encore', 0), ('closecombat', 0)],
list(map(tuple, poke['moveTrack'])))
self.assertEqual('lifeorb', poke['item'])
self.assertEqual('vitalspirit', poke['ability'])
self.assertEqual('vitalspirit', poke['baseAbility'])
self.assertEqual(True, poke['active'])
self.assertEqual(False, poke['fainted'])
poke = side['pokemon'][1]
self.assertEqual('p2: Zoroark', poke['ident'])
self.assertEqual(222., poke['maxhp'])
self.assertEqual(222., poke['hp'])
self.assertEqual(
[('flamethrower', 0), ('nastyplot', 0), ('suckerpunch', 0), ('darkpulse', 0)],
list(map(tuple, poke['moveTrack'])))
self.assertEqual('lifeorb', poke['item'])
self.assertEqual('illusion', poke['ability'])
self.assertEqual('illusion', poke['baseAbility'])
self.assertEqual(False, poke['active'])
self.assertEqual(False, poke['fainted'])
def test_zoroark_switchin(self):
state = copy.deepcopy(state_zoroark_switch)
postproc(state)
update_with_request(state, req_zoroark_switch)
state_begin = json.loads(r'''{
"turn": 1,
"ended": false,
"usesUpkeep": false,
"weather": "",
"pseudoWeather": [],
"weatherTimeLeft": 0,
"weatherMinTimeLeft": 0,
"mySide": {
"battle": {
"$ref": "$"
},
"name": "metagrok-random",
"id": "metagrokrandom",
"initialized": true,
"n": 0,
"foe": {
"battle": {
"$ref": "$"
},
"name": "borrel-ahorse",
"id": "borrelahorse",
"initialized": true,
"n": 1,
"foe": {
"$ref": "$[\"mySide\"]"
},
"totalPokemon": 6,
"sideConditions": {},
"wisher": null,
"active": [
{
"name": "Primeape",
"species": "Primeape",
"searchid": "p2: Primeape|Primeape, L83, M",
"side": {
"$ref": "$[\"mySide\"][\"foe\"]"
},
"fainted": false,
"hp": 244,
"maxhp": 244,
"ability": "",
"baseAbility": "",
"item": "",
"itemEffect": "",
"prevItem": "",
"prevItemEffect": "",
"boosts": {},
"status": "",
"volatiles": {},
"turnstatuses": {},
"movestatuses": {},
"lastmove": "",
"moveTrack": [],
"statusData": {
"sleepTurns": 0,
"toxicTurns": 0
},
"num": 57,
"types": [
"Fighting"
],
"baseStats": {
"hp": 65,
"atk": 105,
"def": 60,
"spa": 60,
"spd": 70,
"spe": 95
},
"abilities": {
"0": "Vital Spirit",
"1": "Anger Point",
"H": "Defiant"
},
"heightm": 1,
"weightkg": 32,
"color": "Brown",
"prevo": "mankey",
"evoLevel": 28,
"eggGroups": [
"Field"
],
"exists": true,
"id": "primeape",
"speciesid": "primeape",
"baseSpecies": "Primeape",
"forme": "",
"formeLetter": "",
"formeid": "",
"spriteid": "primeape",
"effectType": "Template",
"gen": 1,
"slot": 0,
"details": "Primeape, L83, M",
"ident": "p2: Primeape",
"level": 83,
"gender": "M",
"shiny": false
}
],
"lastPokemon": null,
"pokemon": [
{
"$ref": "$[\"mySide\"][\"foe\"][\"active\"][0]"
}
]
},
"totalPokemon": 6,
"sideConditions": {},
"wisher": null,
"active": [
{
"name": "Reshiram",
"species": "Reshiram",
"searchid": "p1: Reshiram|Reshiram, L73",
"side": {
"$ref": "$[\"mySide\"]"
},
"fainted": false,
"hp": 100,
"maxhp": 100,
"ability": "Turboblaze",
"baseAbility": "Turboblaze",
"item": "",
"itemEffect": "",
"prevItem": "",
"prevItemEffect": "",
"boosts": {},
"status": "",
"volatiles": {},
"turnstatuses": {},
"movestatuses": {},
"lastmove": "",
"moveTrack": [],
"statusData": {
"sleepTurns": 0,
"toxicTurns": 0
},
"num": 643,
"types": [
"Dragon",
"Fire"
],
"gender": "",
"baseStats": {
"hp": 100,
"atk": 120,
"def": 100,
"spa": 150,
"spd": 120,
"spe": 90
},
"abilities": {
"0": "Turboblaze"
},
"heightm": 3.2,
"weightkg": 330,
"color": "White",
"eggGroups": [
"Undiscovered"
],
"exists": true,
"id": "reshiram",
"speciesid": "reshiram",
"baseSpecies": "Reshiram",
"forme": "",
"formeLetter": "",
"formeid": "",
"spriteid": "reshiram",
"effectType": "Template",
"gen": 5,
"slot": 0,
"details": "Reshiram, L73",
"ident": "p1: Reshiram",
"level": 73,
"shiny": false
}
],
"lastPokemon": null,
"pokemon": [
{
"$ref": "$[\"mySide\"][\"active\"][0]"
}
]
},
"yourSide": {
"$ref": "$[\"mySide\"][\"foe\"]"
},
"p1": {
"$ref": "$[\"mySide\"]"
},
"p2": {
"$ref": "$[\"mySide\"][\"foe\"]"
},
"sides": [
{
"$ref": "$[\"mySide\"]"
},
{
"$ref": "$[\"mySide\"][\"foe\"]"
}
],
"lastMove": "",
"gen": 7,
"speciesClause": true,
"gameType": "singles",
"tier": "[Gen 7] Random Battle",
"lastmove": "switch-in"
}''')
req_begin = json.loads('''{
"active": [
{
"moves": [
{
"move": "Ice Punch",
"id": "icepunch",
"pp": 24,
"maxpp": 24,
"target": "normal",
"disabled": false
},
{
"move": "U-turn",
"id": "uturn",
"pp": 32,
"maxpp": 32,
"target": "normal",
"disabled": false
},
{
"move": "Encore",
"id": "encore",
"pp": 8,
"maxpp": 8,
"target": "normal",
"disabled": false
},
{
"move": "Close Combat",
"id": "closecombat",
"pp": 8,
"maxpp": 8,
"target": "normal",
"disabled": false
}
]
}
],
"side": {
"name": "borrel-ahorse",
"id": "p2",
"pokemon": [
{
"ident": "p2: Primeape",
"details": "Primeape, L83, M",
"condition": "244/244",
"active": true,
"stats": {
"atk": 222,
"def": 147,
"spa": 147,
"spd": 164,
"spe": 205
},
"moves": [
"icepunch",
"uturn",
"encore",
"closecombat"
],
"baseAbility": "vitalspirit",
"item": "lifeorb",
"pokeball": "pokeball",
"ability": "vitalspirit"
},
{
"ident": "p2: Zoroark",
"details": "Zoroark, L78, F",
"condition": "222/222",
"active": false,
"stats": {
"atk": 209,
"def": 139,
"spa": 232,
"spd": 139,
"spe": 209
},
"moves": [
"flamethrower",
"nastyplot",
"suckerpunch",
"darkpulse"
],
"baseAbility": "illusion",
"item": "lifeorb",
"pokeball": "pokeball",
"ability": "illusion"
},
{
"ident": "p2: Shiftry",
"details": "Shiftry, L83, M",
"condition": "285/285",
"active": false,
"stats": {
"atk": 214,
"def": 147,
"spa": 197,
"spd": 147,
"spe": 180
},
"moves": [
"swordsdance",
"leafblade",
"lowkick",
"suckerpunch"
],
"baseAbility": "earlybird",
"item": "lifeorb",
"pokeball": "pokeball",
"ability": "earlybird"
},
{
"ident": "p2: Tornadus",
"details": "Tornadus, L78, M",
"condition": "251/251",
"active": false,
"stats": {
"atk": 184,
"def": 154,
"spa": 240,
"spd": 170,
"spe": 218
},
"moves": [
"tailwind",
"heatwave",
"taunt",
"hurricane"
],
"baseAbility": "prankster",
"item": "leftovers",
"pokeball": "pokeball",
"ability": "prankster"
},
{
"ident": "p2: Steelix",
"details": "Steelix, L79, F",
"condition": "248/248",
"active": false,
"stats": {
"atk": 180,
"def": 362,
"spa": 132,
"spd": 148,
"spe": 93
},
"moves": [
"stealthrock",
"earthquake",
"toxic",
"dragontail"
],
"baseAbility": "sturdy",
"item": "steelixite",
"pokeball": "pokeball",
"ability": "sturdy"
},
{
"ident": "p2: Scrafty",
"details": "Scrafty, L81, F",
"condition": "238/238",
"active": false,
"stats": {
"atk": 192,
"def": 233,
"spa": 120,
"spd": 233,
"spe": 141
},
"moves": [
"rest",
"highjumpkick",
"dragondance",
"icepunch"
],
"baseAbility": "intimidate",
"item": "chestoberry",
"pokeball": "pokeball",
"ability": "intimidate"
}
]
},
"rqid": 3
}''')
state_zoroark_switch = json.loads(r'''{"turn":9,"ended":false,"usesUpkeep":true,"weather":"","p
seudoWeather":[],"weatherTimeLeft":0,"weatherMinTimeLeft":0,"mySide":{"battle":{"$ref":"$"},"na
me":"metagrok-random","id":"metagrokrandom","initialized":true,"n":0,"foe":{"battle":{"$ref":"$
"},"name":"borrel-ahorse","id":"borrelahorse","initialized":true,"n":1,"foe":{"$ref":"$[\"mySid
e\"]"},"totalPokemon":6,"sideConditions":{},"wisher":null,"active":[{"name":"Steelix","species"
:"Steelix","searchid":"p2: Steelix|Steelix, L79, F","side":{"$ref":"$[\"mySide\"][\"foe\"]"},"f
ainted":false,"hp":222,"maxhp":222,"ability":"","baseAbility":"","item":"","itemEffect":"","pre
vItem":"","prevItemEffect":"","boosts":{},"status":"","volatiles":{},"turnstatuses":{},"movesta
tuses":{},"lastmove":"","moveTrack":[],"statusData":{"sleepTurns":0,"toxicTurns":0},"num":208,"
types":["Steel","Ground"],"baseStats":{"hp":75,"atk":85,"def":200,"spa":55,"spd":65,"spe":30},"
abilities":{"0":"Rock Head","1":"Sturdy","H":"Sheer Force"},"heightm":9.2,"weightkg":400,"color
":"Gray","prevo":"onix","evoLevel":1,"eggGroups":["Mineral"],"otherFormes":["steelixmega"],"exi
sts":true,"id":"steelix","speciesid":"steelix","baseSpecies":"Steelix","forme":"","formeLetter"
:"","formeid":"","spriteid":"steelix","effectType":"Template","gen":2,"slot":0,"details":"Steel
ix, L79, F","ident":"p2: Steelix","level":79,"gender":"F","shiny":false}],"lastPokemon":{"name"
:"Scrafty","species":"Scrafty","searchid":"p2: Scrafty|Scrafty, L81, F","side":{"$ref":"$[\"myS
ide\"][\"foe\"]"},"fainted":true,"hp":0,"maxhp":238,"ability":"","baseAbility":"","item":"","it
emEffect":"","prevItem":"","prevItemEffect":"","boosts":{},"status":"","volatiles":{},"turnstat
uses":{},"movestatuses":{},"lastmove":"highjumpkick","moveTrack":[["Rest",2],["Dragon Dance",2]
,["Ice Punch",1],["High Jump Kick",1]],"statusData":{"sleepTurns":0,"toxicTurns":0},"num":560,"
types":["Dark","Fighting"],"baseStats":{"hp":65,"atk":90,"def":115,"spa":45,"spd":115,"spe":58}
,"abilities":{"0":"Shed Skin","1":"Moxie","H":"Intimidate"},"heightm":1.1,"weightkg":30,"color"
:"Red","prevo":"scraggy","evoLevel":39,"eggGroups":["Field","Dragon"],"exists":true,"id":"scraf
ty","speciesid":"scrafty","baseSpecies":"Scrafty","forme":"","formeLetter":"","formeid":"","spr
iteid":"scrafty","effectType":"Template","gen":5,"slot":0,"details":"Scrafty, L81, F","ident":"
p2: Scrafty","level":81,"gender":"F","shiny":false},"pokemon":[{"name":"Primeape","species":"Pr
imeape","searchid":"p2: Primeape|Primeape, L83, M","side":{"$ref":"$[\"mySide\"][\"foe\"]"},"fa
inted":true,"hp":0,"maxhp":244,"ability":"","baseAbility":"","item":"Life Orb","itemEffect":"",
"prevItem":"","prevItemEffect":"","boosts":{},"status":"","volatiles":{},"turnstatuses":{},"mov
estatuses":{},"lastmove":"closecombat","moveTrack":[["Ice Punch",1],["Close Combat",1]],"status
Data":{"sleepTurns":0,"toxicTurns":0},"num":57,"types":["Fighting"],"baseStats":{"hp":65,"atk":
105,"def":60,"spa":60,"spd":70,"spe":95},"abilities":{"0":"Vital Spirit","1":"Anger Point","H":
"Defiant"},"heightm":1,"weightkg":32,"color":"Brown","prevo":"mankey","evoLevel":28,"eggGroups"
:["Field"],"exists":true,"id":"primeape","speciesid":"primeape","baseSpecies":"Primeape","forme
":"","formeLetter":"","formeid":"","spriteid":"primeape","effectType":"Template","gen":1,"slot"
:0,"details":"Primeape, L83, M","ident":"p2: Primeape","level":83,"gender":"M","shiny":false},{
"$ref":"$[\"mySide\"][\"foe\"][\"lastPokemon\"]"},{"$ref":"$[\"mySide\"][\"foe\"][\"active\"][0
]"}]},"totalPokemon":6,"sideConditions":{},"wisher":null,"active":[{"name":"Huntail","species":
"Huntail","searchid":"p1: Huntail|Huntail, L83, F","side":{"$ref":"$[\"mySide\"]"},"fainted":fa
lse,"hp":5,"maxhp":100,"ability":"","baseAbility":"","item":"","itemEffect":"","prevItem":"","p
revItemEffect":"","boosts":{},"status":"","volatiles":{},"turnstatuses":{},"movestatuses":{},"l
astmove":"waterfall","moveTrack":[["Waterfall",1]],"statusData":{"sleepTurns":0,"toxicTurns":0}
,"num":367,"types":["Water"],"baseStats":{"hp":55,"atk":104,"def":105,"spa":94,"spd":75,"spe":5
2},"abilities":{"0":"Swift Swim","H":"Water Veil"},"heightm":1.7,"weightkg":27,"color":"Blue","
prevo":"clamperl","evoLevel":1,"eggGroups":["Water 1"],"exists":true,"id":"huntail","speciesid"
:"huntail","baseSpecies":"Huntail","forme":"","formeLetter":"","formeid":"","spriteid":"huntail
","effectType":"Template","gen":3,"slot":0,"details":"Huntail, L83, F","ident":"p1: Huntail","l
evel":83,"gender":"F","shiny":false}],"lastPokemon":{"name":"Krookodile","species":"Krookodile"
,"searchid":"p1: Krookodile|Krookodile, L77, M","side":{"$ref":"$[\"mySide\"]"},"fainted":false
,"hp":91,"maxhp":100,"ability":"","baseAbility":"","item":"Life Orb","itemEffect":"","prevItem"
:"","prevItemEffect":"","boosts":{},"status":"","volatiles":{},"turnstatuses":{},"movestatuses"
:{},"lastmove":"superpower","moveTrack":[["Superpower",1]],"statusData":{"sleepTurns":0,"toxicT
urns":0},"num":553,"types":["Ground","Dark"],"baseStats":{"hp":95,"atk":117,"def":80,"spa":65,"
spd":70,"spe":92},"abilities":{"0":"Intimidate","1":"Moxie","H":"Anger Point"},"heightm":1.5,"w
eightkg":96.3,"color":"Red","prevo":"krokorok","evoLevel":40,"eggGroups":["Field"],"exists":tru
e,"id":"krookodile","speciesid":"krookodile","baseSpecies":"Krookodile","forme":"","formeLetter
":"","formeid":"","spriteid":"krookodile","effectType":"Template","gen":5,"slot":0,"details":"K
rookodile, L77, M","ident":"p1: Krookodile","level":77,"gender":"M","shiny":false},"pokemon":[{
"name":"Reshiram","species":"Reshiram","searchid":"p1: Reshiram|Reshiram, L73","side":{"$ref":"
$[\"mySide\"]"},"fainted":true,"hp":0,"maxhp":100,"ability":"Turboblaze","baseAbility":"Turbobl
aze","item":"Leftovers","itemEffect":"","prevItem":"","prevItemEffect":"","boosts":{},"status":
"","volatiles":{},"turnstatuses":{},"movestatuses":{},"lastmove":"","moveTrack":[["Blue Flare",
1],["Flame Charge",1]],"statusData":{"sleepTurns":0,"toxicTurns":0},"num":643,"types":["Dragon"
,"Fire"],"gender":"","baseStats":{"hp":100,"atk":120,"def":100,"spa":150,"spd":120,"spe":90},"a
bilities":{"0":"Turboblaze"},"heightm":3.2,"weightkg":330,"color":"White","eggGroups":["Undisco
vered"],"exists":true,"id":"reshiram","speciesid":"reshiram","baseSpecies":"Reshiram","forme":"
","formeLetter":"","formeid":"","spriteid":"reshiram","effectType":"Template","gen":5,"slot":0,
"details":"Reshiram, L73","ident":"p1: Reshiram","level":73,"shiny":false},{"name":"Grumpig","s
pecies":"Grumpig","searchid":"p1: Grumpig|Grumpig, L83, F","side":{"$ref":"$[\"mySide\"]"},"fai
nted":false,"hp":100,"maxhp":100,"ability":"","baseAbility":"","item":"","itemEffect":"","prevI
tem":"","prevItemEffect":"","boosts":{},"status":"","volatiles":{},"turnstatuses":{},"movestatu
ses":{},"lastmove":"","moveTrack":[],"statusData":{"sleepTurns":0,"toxicTurns":0},"num":326,"ty
pes":["Psychic"],"baseStats":{"hp":80,"atk":45,"def":65,"spa":90,"spd":110,"spe":80},"abilities
":{"0":"Thick Fat","1":"Own Tempo","H":"Gluttony"},"heightm":0.9,"weightkg":71.5,"color":"Purpl
e","prevo":"spoink","evoLevel":32,"eggGroups":["Field"],"exists":true,"id":"grumpig","speciesid
":"grumpig","baseSpecies":"Grumpig","forme":"","formeLetter":"","formeid":"","spriteid":"grumpi
g","effectType":"Template","gen":3,"slot":0,"details":"Grumpig, L83, F","ident":"p1: Grumpig","
level":83,"gender":"F","shiny":false},{"$ref":"$[\"mySide\"][\"lastPokemon\"]"},{"$ref":"$[\"my
Side\"][\"active\"][0]"}]},"yourSide":{"$ref":"$[\"mySide\"][\"foe\"]"},"p1":{"$ref":"$[\"mySid
e\"]"},"p2":{"$ref":"$[\"mySide\"][\"foe\"]"},"sides":[{"$ref":"$[\"mySide\"]"},{"$ref":"$[\"my
Side\"][\"foe\"]"}],"lastMove":"","gen":7,"speciesClause":true,"gameType":"singles","tier":"[Ge
n 7] Random Battle","lastmove":"switch-in"}
'''.replace('\n', '').replace('\r', ''))
req_zoroark_switch = json.loads(r'''{"active":[{"moves":[{"move":"Flamethrower","id":"flamethro
wer","pp":24,"maxpp":24,"target":"normal","disabled":false},{"move":"Nasty Plot","id":"nastyplo
t","pp":32,"maxpp":32,"target":"self","disabled":false},{"move":"Sucker Punch","id":"suckerpunc
h","pp":8,"maxpp":8,"target":"normal","disabled":false},{"move":"Dark Pulse","id":"darkpulse","
pp":24,"maxpp":24,"target":"any","disabled":false}]}],"side":{"name":"borrel-ahorse","id":"p2",
"pokemon":[{"ident":"p2: Zoroark","details":"Zoroark, L78, F","condition":"222/222","active":tr
ue,"stats":{"atk":209,"def":139,"spa":232,"spd":139,"spe":209},"moves":["flamethrower","nastypl
ot","suckerpunch","darkpulse"],"baseAbility":"illusion","item":"lifeorb","pokeball":"pokeball",
"ability":"illusion"},{"ident":"p2: Scrafty","details":"Scrafty, L81, F","condition":"0 fnt","a
ctive":false,"stats":{"atk":192,"def":233,"spa":120,"spd":233,"spe":141},"moves":["rest","highj
umpkick","dragondance","icepunch"],"baseAbility":"intimidate","item":"chestoberry","pokeball":"
pokeball","ability":"intimidate"},{"ident":"p2: Shiftry","details":"Shiftry, L83, M","condition
":"285/285","active":false,"stats":{"atk":214,"def":147,"spa":197,"spd":147,"spe":180},"moves":
["swordsdance","leafblade","lowkick","suckerpunch"],"baseAbility":"earlybird","item":"lifeorb",
"pokeball":"pokeball","ability":"earlybird"},{"ident":"p2: Tornadus","details":"Tornadus, L78,
M","condition":"251/251","active":false,"stats":{"atk":184,"def":154,"spa":240,"spd":170,"spe":
218},"moves":["tailwind","heatwave","taunt","hurricane"],"baseAbility":"prankster","item":"left
overs","pokeball":"pokeball","ability":"prankster"},{"ident":"p2: Steelix","details":"Steelix,
L79, F","condition":"248/248","active":false,"stats":{"atk":180,"def":362,"spa":132,"spd":148,"
spe":93},"moves":["stealthrock","earthquake","toxic","dragontail"],"baseAbility":"sturdy","item
":"steelixite","pokeball":"pokeball","ability":"sturdy"},{"ident":"p2: Primeape","details":"Pri
meape, L83, M","condition":"0 fnt","active":false,"stats":{"atk":222,"def":147,"spa":147,"spd":
164,"spe":205},"moves":["icepunch","uturn","encore","closecombat"],"baseAbility":"vitalspirit",
"item":"lifeorb","pokeball":"pokeball","ability":"vitalspirit"}]},"rqid":25}
'''.replace('\n', '').replace('\r', ''))
if __name__ == '__main__':
unittest.main()
| 37.025 | 95 | 0.517749 | 2,032 | 20,734 | 5.26378 | 0.206201 | 0.018512 | 0.025804 | 0.015987 | 0.641081 | 0.585733 | 0.561051 | 0.503646 | 0.473822 | 0.447831 | 0 | 0.044987 | 0.207726 | 20,734 | 559 | 96 | 37.091234 | 0.606136 | 0 | 0 | 0.296703 | 0 | 0.120879 | 0.920179 | 0.454664 | 0 | 0 | 0 | 0 | 0.032967 | 1 | 0.003663 | false | 0 | 0.007326 | 0 | 0.012821 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fc44de3876338e03afd87e823f6a799bc6d828e2 | 1,802 | py | Python | Tools/Dev/rbuild/askapdev/rbuild/utils/get_release_version.py | rtobar/askapsoft | 6bae06071d7d24f41abe3f2b7f9ee06cb0a9445e | [
"BSL-1.0",
"Apache-2.0",
"OpenSSL"
] | 1 | 2020-06-18T08:37:43.000Z | 2020-06-18T08:37:43.000Z | Tools/Dev/rbuild/askapdev/rbuild/utils/get_release_version.py | ATNF/askapsoft | d839c052d5c62ad8a511e58cd4b6548491a6006f | [
"BSL-1.0",
"Apache-2.0",
"OpenSSL"
] | null | null | null | Tools/Dev/rbuild/askapdev/rbuild/utils/get_release_version.py | ATNF/askapsoft | d839c052d5c62ad8a511e58cd4b6548491a6006f | [
"BSL-1.0",
"Apache-2.0",
"OpenSSL"
] | null | null | null | ## Package for various utility functions to execute build and shell commands
#
# @copyright (c) 2011 CSIRO
# Australia Telescope National Facility (ATNF)
# Commonwealth Scientific and Industrial Research Organisation (CSIRO)
# PO Box 76, Epping NSW 1710, Australia
# atnf-enquiries@csiro.au
#
# This file is part of the ASKAP software distribution.
#
# The ASKAP software distribution is free software: you can redistribute it
# and/or modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; either version 2 of the License
# or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA.
#
# @author Malte Marquarding <Malte.Marquarding@csiro.au>
#
import os.path
import datetime
from get_package_name import get_package_name
from get_svn_revision import get_svn_revision
from get_svn_branch_info import get_svn_branch_info
def get_release_version():
'''Return a release version string for this package, combining the
package name, the ASKAPsoft branch path, the svn revision and today's
date, e.g. "mypkg; ASKAPsoft==releases/0.3; r1234; 2011-06-01".
'''
currentrev = get_svn_revision()
bi = get_svn_branch_info()
items = [get_package_name()]
items.append("==".join(["ASKAPsoft", os.path.join(*bi)]))
items.append("r" + currentrev)
items.append(str(datetime.date.today()))
return "; ".join(items)
if __name__ == '__main__':
print(get_release_version())
| 36.04 | 76 | 0.743618 | 266 | 1,802 | 4.913534 | 0.567669 | 0.027544 | 0.029839 | 0.043611 | 0.062739 | 0.042846 | 0 | 0 | 0 | 0 | 0 | 0.017976 | 0.166482 | 1,802 | 49 | 77 | 36.77551 | 0.852197 | 0.597114 | 0 | 0 | 0 | 0 | 0.038801 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
fc5938fea9b2cb111985d71ff93f31a55e4d112a | 561 | py | Python | multauth/api/services/phone/urls.py | andrenerd/django-multiform-authentication | 4a8b94ebd660cc7afc7dcdedcc12344ef85e6615 | [
"MIT"
] | 7 | 2020-08-28T16:17:02.000Z | 2021-11-11T18:01:20.000Z | multauth/api/services/phone/urls.py | andrenerd/django-multiform-authentication | 4a8b94ebd660cc7afc7dcdedcc12344ef85e6615 | [
"MIT"
] | null | null | null | multauth/api/services/phone/urls.py | andrenerd/django-multiform-authentication | 4a8b94ebd660cc7afc7dcdedcc12344ef85e6615 | [
"MIT"
] | 2 | 2021-01-06T04:11:28.000Z | 2021-05-19T14:43:52.000Z | from django.urls import include, path
from .me import views as me_views
from .auth import views as auth_views
urlpatterns = [
# path('me/phone/hardcode/', me_views.MeHardcodeView.as_view(), name='me-phone-hardcode'),
path('me/phone/pushcode/', me_views.MePhonePushcodeView.as_view(), name='me-phone-pushcode'),
path('signup/verification/phone/', auth_views.SignupVerificationPhoneView.as_view(), name='signup-verification-phone'),
path('signin/passcode/phone/', auth_views.SigninPasscodePhoneView.as_view(), name='signin-passcode-phone'),
]
| 40.071429 | 123 | 0.754011 | 73 | 561 | 5.657534 | 0.328767 | 0.067797 | 0.096852 | 0.058111 | 0.082324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096257 | 561 | 13 | 124 | 43.153846 | 0.814596 | 0.156863 | 0 | 0 | 0 | 0 | 0.274468 | 0.2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.125 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
fc59530a25d169aca3aff73d09ebe11d0fab5ba6 | 1,075 | py | Python | profiles/migrations/0002_auto_20170301_1823.py | pyladieshre/pyladies | 5cbea02a48ac64b1194d6329c5bc55183142f2ab | [
"MIT"
] | 1 | 2020-10-19T17:25:40.000Z | 2020-10-19T17:25:40.000Z | profiles/migrations/0002_auto_20170301_1823.py | pyladieshre/pyladies | 5cbea02a48ac64b1194d6329c5bc55183142f2ab | [
"MIT"
] | 3 | 2017-01-22T17:36:36.000Z | 2017-03-07T09:24:21.000Z | profiles/migrations/0002_auto_20170301_1823.py | herambchaudhari4121/pyladies | 5cbea02a48ac64b1194d6329c5bc55183142f2ab | [
"MIT"
] | 9 | 2017-01-21T11:16:04.000Z | 2020-10-19T04:14:34.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.10.3 on 2017-03-01 16:23
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('profiles', '0001_initial'),
]
operations = [
migrations.RemoveField(
model_name='profile',
name='phone_number',
),
migrations.AddField(
model_name='profile',
name='birth_date',
field=models.DateField(blank=True, null=True),
),
migrations.AddField(
model_name='profile',
name='contact_number',
field=models.CharField(blank=True, max_length=16, null=True),
),
migrations.AddField(
model_name='profile',
name='location',
field=models.CharField(blank=True, max_length=30),
),
migrations.AlterField(
model_name='profile',
name='bio',
field=models.TextField(blank=True, max_length=500),
),
]
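The net effect of this migration on the profile table can be replayed against a plain dict. Only the operations listed above come from the source; the starting fields are an assumption about `0001_initial`, and the value strings are shorthand, not real Django field objects.

```python
# Replay the migration's operations on a dict-based stand-in for the
# Profile schema. Starting fields are assumed, not read from 0001_initial.
schema = {"phone_number": "CharField", "bio": "TextField"}

schema.pop("phone_number")                             # RemoveField
schema["birth_date"] = "DateField(null=True)"          # AddField
schema["contact_number"] = "CharField(max_length=16)"  # AddField
schema["location"] = "CharField(max_length=30)"        # AddField
schema["bio"] = "TextField(max_length=500)"            # AlterField

print(sorted(schema))  # -> ['bio', 'birth_date', 'contact_number', 'location']
```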
| 26.875 | 73 | 0.563721 | 106 | 1,075 | 5.556604 | 0.509434 | 0.076401 | 0.135823 | 0.169779 | 0.349745 | 0.349745 | 0.285229 | 0.156197 | 0 | 0 | 0 | 0.038095 | 0.316279 | 1,075 | 39 | 74 | 27.564103 | 0.763265 | 0.063256 | 0 | 0.40625 | 1 | 0 | 0.101594 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.15625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fc7492b4f44a0e3dba8aa1d9ca79d4a8146c49a0 | 452 | py | Python | jenkins-exporter.py | iamabhishek-dubey/jenkins-exporter | 755bdabcc4f9b1219c7a9c44a3e1bf3d826cd01b | [
"Apache-2.0"
] | null | null | null | jenkins-exporter.py | iamabhishek-dubey/jenkins-exporter | 755bdabcc4f9b1219c7a9c44a3e1bf3d826cd01b | [
"Apache-2.0"
] | null | null | null | jenkins-exporter.py | iamabhishek-dubey/jenkins-exporter | 755bdabcc4f9b1219c7a9c44a3e1bf3d826cd01b | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python3
from httplib2 import Http
import base64
import json
import time
import os
import signal
import faulthandler
from threading import Lock
from urllib.parse import urlencode, quote_plus
from prometheus_client import start_http_server
from prometheus_client.core import GaugeMetricFamily, REGISTRY
import logging
from pythonjsonlogger import jsonlogger
faulthandler.enable()
class JenkinsApiClient():
    def __init__(self, config):
        self.config = config  # keep the scrape configuration for later use
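The file is truncated here, but the `base64` import above suggests the client builds an HTTP Basic auth header for the Jenkins API. A self-contained sketch of that construction; the function name and shape are assumptions, not recovered from the truncated class:

```python
import base64

def basic_auth_header(username, token):
    """Build an HTTP Basic auth header of the kind a Jenkins API client
    typically sends. Illustrative only; not taken from the class above."""
    creds = base64.b64encode(
        "{}:{}".format(username, token).encode("utf-8")).decode("ascii")
    return {"Authorization": "Basic " + creds}

print(basic_auth_header("user", "pass"))  # -> {'Authorization': 'Basic dXNlcjpwYXNz'}
```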
| 22.6 | 62 | 0.834071 | 58 | 452 | 6.344828 | 0.637931 | 0.076087 | 0.108696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010127 | 0.126106 | 452 | 20 | 63 | 22.6 | 0.921519 | 0.037611 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.8125 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
fc938bada0189acfd429b837e4defb597e4f6b44 | 7,450 | py | Python | src/build/simics/hb-simdebug.py | mabaiocchi/hostboot | 38d5e9300b57a3469793dac12851c96fc82b728a | [
"ECL-2.0",
"Apache-2.0"
] | 57 | 2015-01-28T06:16:56.000Z | 2021-12-26T07:46:31.000Z | src/build/simics/hb-simdebug.py | mabaiocchi/hostboot | 38d5e9300b57a3469793dac12851c96fc82b728a | [
"ECL-2.0",
"Apache-2.0"
] | 185 | 2015-01-05T09:23:25.000Z | 2022-03-17T19:47:06.000Z | src/build/simics/hb-simdebug.py | mabaiocchi/hostboot | 38d5e9300b57a3469793dac12851c96fc82b728a | [
"ECL-2.0",
"Apache-2.0"
] | 99 | 2015-01-12T22:20:29.000Z | 2021-09-16T15:02:03.000Z | # IBM_PROLOG_BEGIN_TAG
# This is an automatically generated prolog.
#
# $Source: src/build/simics/hb-simdebug.py $
#
# OpenPOWER HostBoot Project
#
# Contributors Listed Below - COPYRIGHT 2011,2018
# [+] International Business Machines Corp.
#
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied. See the License for the specific language governing
# permissions and limitations under the License.
#
# IBM_PROLOG_END_TAG
import os,sys
import conf
import configuration
import cli
import binascii
import datetime
import commands ## getoutput, getstatusoutput
import random
#===============================================================================
# HOSTBOOT Commands
#===============================================================================
default_syms = "hbicore.syms"
default_stringFile = "hbotStringFile"
#------------------------------------------------
#------------------------------------------------
new_command("hb-bltrace",
lambda: run_hb_debug_framework("BlTrace", outputFile = "hb-bltrace.output"),
#alias = "hbt",
type = ["hostboot-commands"],
#see_also = ["hb-trace"],
see_also = [ ],
short = "Display the Bootloader trace buffer",
doc = """
Parameters: \n
Defaults: \n
'syms' = './hbibl.syms' \n\n
Examples: \n
hb-bltrace \n\n
NOTE: Results are unpredictable after control is passed to Hostboot Base.\n
""")
#------------------------------------------------
#------------------------------------------------
new_command("hb-trace",
(lambda comp: run_hb_debug_framework("Trace",
("components="+comp) if comp else "",
outputFile = "hb-trace.output")),
[arg(str_t, "comp", "?", None),
],
#alias = "hbt",
type = ["hostboot-commands"],
#see_also = ["hb_printk"],
see_also = [ ],
short = "Display the hostboot trace",
doc = """
Parameters: \n
in = component name(s) \n
Defaults: \n
'comp' = all buffers \n
'syms' = './hbicore.syms' \n
'stringFile' = './hbotStringFile' \n\n
Examples: \n
hb-trace \n
hb-trace ERRL\n
hb-trace "ERRL,INITSERVICE" \n
""")
#------------------------------------------------
#------------------------------------------------
new_command("hb-bldata",
lambda: run_hb_debug_framework("BlData", outputFile = "hb-bldata.output"),
#alias = "hbt",
type = ["hostboot-commands"],
#see_also = ["hb-trace"],
see_also = [ ],
short = "Display Bootloader data",
doc = """
Parameters: \n
Defaults: \n
'syms' = './hbibl.syms' \n\n
Examples: \n
hb-bldata \n\n
NOTE: Results are unpredictable after control is passed to Hostboot Base.\n
""")
#------------------------------------------------
#------------------------------------------------
new_command("hb-printk",
lambda: run_hb_debug_framework("Printk", outputFile = "hb-printk.output"),
#alias = "hbt",
type = ["hostboot-commands"],
#see_also = ["hb-trace"],
see_also = [ ],
short = "Display the kernel printk buffer",
doc = """
Parameters: \n
Defaults: \n
'syms' = './hbicore.syms' \n\n
Examples: \n
hb-printk \n
""")
#------------------------------------------------
#------------------------------------------------
new_command("hb-dump",
lambda: run_hb_debug_framework("Dump", outputFile = "hb-dump.output"),
#alias = "hbt",
type = ["hostboot-commands"],
#see_also = ["hb-trace"],
see_also = [ ],
short = "Dumps HB memory to hbdump.<timestamp>",
doc = """
Parameters: \n
Defaults: \n
Examples: \n
hb-dump \n
""")
#------------------------------------------------
# Disable for now, need to pass in lots of options
#------------------------------------------------
new_command("hb-istep",
lambda istep: run_hb_debug_framework("Istep", istep,
outputFile = "hb-istep.output"),
[ arg( str_t, "istep", "?", "") ],
type = ["hostboot-commands"],
see_also = [ ],
short = "Run IStep commands",
doc = """
Parameters: \n
Defaults: \n
Examples: \n
hb-istep \n
hb-istep list \n
hb-istep splessmode \n
hb-istep fspmode \n
hb-istep clear-trace \n
hb-istep resume \n
hb-istep s4 \n
hb-istep s4..N \n
hb-istep poweron \n
hb-istep poweron..clock_frequency_set \n
""")
#------------------------------------------------
#------------------------------------------------
new_command("hb-errl",
(lambda logid, logidStr, flg_l, flg_d:
run_hb_debug_framework("Errl",
("display="+(("0x%x" % logid) if logid else logidStr) if flg_d else ""
),
outputFile = "hb-errl.output")),
[ arg(int_t, "logid", "?", None),
arg(str_t, "logidStr", "?", None),
arg(flag_t, "-l"),
arg(flag_t, "-d"),
],
#alias = "hbt",
type = ["hostboot-commands"],
#see_also = ["hb_printk"],
see_also = [ ],
short = "Display the hostboot error logs",
doc = """
Parameters: \n
in = option for dumping error logs\n
Defaults: \n
'flag' = '-l' \n
Examples: \n
hb_errl [-l]\n
hb-errl -d 1\n
hb-errl -d [all]\n
""")
#------------------------------------------------
#------------------------------------------------
def hb_singlethread():
# Note: will default to using the currently selected cpu
# emulates the SBE thread count register
run_command("($hb_cpu).write-reg scratch7 0x0000800000000000");
return
new_command("hb-singlethread",
hb_singlethread,
[],
alias = "hb-st",
type = ["hostboot-commands"],
short = "Disable all threads except 1 - Must be run before starting sim.")
#------------------------------------------------
#------------------------------------------------
def hb_get_objects_by_class(classname):
obj_list=[]
obj_dict={}
# Put objects into a dictionary, indexed by object name
for obj in SIM_get_all_objects():
if (obj.classname == classname):
obj_dict[obj.name]=obj
# Sort the dictionary by key (object name)
obj_names=obj_dict.keys()
obj_names.sort()
for obj_name in obj_names:
obj_list.append(obj_dict[obj_name])
#print "object name=%s" % obj_name
return obj_list
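Outside simics, the "filter by classname, then sort by object name" dance above (a Python 2 dict/keys/sort idiom) collapses to one `sorted()` call. A Python 3 sketch with a namedtuple standing in for simics objects; `SimObj` and the sample data are ours:

```python
from collections import namedtuple

# Stand-in for a simics object: only the attributes the helper reads.
SimObj = namedtuple("SimObj", ["name", "classname"])

def objects_by_class(objects, classname):
    """Filter objects by classname and return them sorted by name,
    matching what hb_get_objects_by_class does above."""
    return sorted((o for o in objects if o.classname == classname),
                  key=lambda o: o.name)

objs = [SimObj("core1", "cpu"), SimObj("core0", "cpu"), SimObj("mem0", "ram")]
print([o.name for o in objects_by_class(objs, "cpu")])  # -> ['core0', 'core1']
```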
def hb_getallregs(regname):
proc_list=[]
proc_list=hb_get_objects_by_class("ppc_power9_mambo_core")
for proc in proc_list:
output = run_command("%s.read-reg %s"%(proc.name,regname))
print ">> %s : " %(proc.name) + "%x" %output
new_command("hb-getallregs",
(lambda reg: hb_getallregs(reg)),
[ arg(str_t, "reg", "?", None),
],
alias = "hb-gar",
type = ["hostboot-commands"],
short = "Read a reg from all cores.",
doc = """
Examples: \n
hb-getallregs <regname>\n
hb-getallregs pc\n
""")
#------------------------------------------------
#------------------------------------------------
| 28.326996 | 86 | 0.511812 | 817 | 7,450 | 4.541004 | 0.29743 | 0.01779 | 0.021563 | 0.035849 | 0.272237 | 0.212129 | 0.207547 | 0.197035 | 0.178706 | 0.178706 | 0 | 0.006156 | 0.215034 | 7,450 | 262 | 87 | 28.435115 | 0.628249 | 0.326443 | 0 | 0.371951 | 1 | 0 | 0.452929 | 0.009899 | 0 | 0 | 0.003636 | 0 | 0 | 0 | null | null | 0.012195 | 0.04878 | null | null | 0.030488 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fc9ced3800e246b21215b47ecd3cded23d87f45b | 183 | py | Python | face_alignment/__init__.py | NovemberJoy/VTuber_Unity | 14655337842193655f41c6e8ff91aada989768d1 | [
"MIT",
"BSD-3-Clause"
] | 669 | 2019-10-15T15:25:38.000Z | 2022-03-31T22:31:18.000Z | face_alignment/__init__.py | NovemberJoy/VTuber_Unity | 14655337842193655f41c6e8ff91aada989768d1 | [
"MIT",
"BSD-3-Clause"
] | 21 | 2020-01-15T10:14:24.000Z | 2021-07-06T13:29:25.000Z | face_alignment/__init__.py | NovemberJoy/VTuber_Unity | 14655337842193655f41c6e8ff91aada989768d1 | [
"MIT",
"BSD-3-Clause"
] | 107 | 2019-10-31T11:24:46.000Z | 2022-03-26T06:25:23.000Z | # -*- coding: utf-8 -*-
__author__ = """Adrian Bulat"""
__email__ = 'adrian.bulat@nottingham.ac.uk'
__version__ = '1.0.0'
from .api import FaceAlignment, LandmarksType, NetworkSize
| 22.875 | 58 | 0.704918 | 22 | 183 | 5.318182 | 0.863636 | 0.188034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.125683 | 183 | 7 | 59 | 26.142857 | 0.70625 | 0.114754 | 0 | 0 | 0 | 0 | 0.2875 | 0.18125 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5d85eb3831600bd43057146320d04c96c90edc9f | 849 | py | Python | hippybot/plugins/wave.py | 1stvamp/hippybot | 931fb1accae295da3ae94184ef138aeedd5a726e | [
"BSD-2-Clause-FreeBSD"
] | 33 | 2015-03-03T08:41:56.000Z | 2022-02-16T12:05:30.000Z | hippybot/plugins/wave.py | 1stvamp/hippybot | 931fb1accae295da3ae94184ef138aeedd5a726e | [
"BSD-2-Clause-FreeBSD"
] | 9 | 2015-01-09T00:29:33.000Z | 2016-06-21T13:09:54.000Z | hippybot/plugins/wave.py | 1stvamp/hippybot | 931fb1accae295da3ae94184ef138aeedd5a726e | [
"BSD-2-Clause-FreeBSD"
] | 18 | 2015-01-07T22:40:45.000Z | 2018-04-04T18:58:50.000Z | from collections import Counter
from hippybot.decorators import botcmd
class Plugin(object):
"""HippyBot plugin to make the bot complete a wave if 3 people in a
row do the action "\o/".
"""
global_commands = ['\o/', 'wave']
command_aliases = {'\o/': 'wave'}
counts = Counter()
def __init__(self, config):
pass
@botcmd
def wave(self, mess, args):
"""
        If enough people \o/, techbot will too.
        Everyone loves a follower; techbot is here to fulfill that need.
"""
channel = unicode(mess.getFrom()).split('/')[0]
self.bot.log.info("\o/ %s" %self.counts[channel])
if not self.bot.from_bot(mess):
self.counts[channel] += 1
if self.counts[channel] == 3:
self.counts[channel] = 0
return r'\o/'
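The plugin's core rule (count `\o/` per channel, answer on the third, then reset) can be exercised without HipChat. A plain-Python sketch; the `WaveCounter` class is ours:

```python
from collections import Counter

class WaveCounter:
    """Sketch of the plugin's rule: the third '\\o/' seen in a channel
    completes the wave and resets that channel's count."""
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.counts = Counter()

    def record(self, channel):
        self.counts[channel] += 1
        if self.counts[channel] == self.threshold:
            self.counts[channel] = 0
            return r'\o/'
        return None

w = WaveCounter()
print([w.record("room") for _ in range(4)])  # -> [None, None, '\\o/', None]
```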
| 29.275862 | 77 | 0.57126 | 109 | 849 | 4.385321 | 0.568807 | 0.083682 | 0.142259 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008375 | 0.29682 | 849 | 28 | 78 | 30.321429 | 0.792295 | 0.234393 | 0 | 0 | 0 | 0 | 0.039735 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0.058824 | 0.117647 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
5d8b658aebc8d6bd9033454507e3ff4c840a5a86 | 192 | py | Python | demo/credit_card_demo/unzip_utils.py | sungkyu-kim/keras-anomaly-detection | 2941e18f23fa74969aa5710d9b66ed34f32e5487 | [
"MIT"
] | 370 | 2018-01-31T13:31:20.000Z | 2022-03-24T13:01:56.000Z | demo/credit_card_demo/unzip_utils.py | sungkyu-kim/keras-anomaly-detection | 2941e18f23fa74969aa5710d9b66ed34f32e5487 | [
"MIT"
] | 5 | 2018-05-06T18:43:30.000Z | 2019-08-21T15:59:30.000Z | demo/credit_card_demo/unzip_utils.py | sungkyu-kim/keras-anomaly-detection | 2941e18f23fa74969aa5710d9b66ed34f32e5487 | [
"MIT"
] | 170 | 2018-02-02T12:26:01.000Z | 2022-01-25T20:27:25.000Z | import zipfile
def unzip(path_to_zip_file, directory_to_extract_to):
zip_ref = zipfile.ZipFile(path_to_zip_file, 'r')
zip_ref.extractall(directory_to_extract_to)
zip_ref.close()
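A round-trip usage example: build a small archive in a temporary directory, extract it with the helper (restated here so the sketch runs standalone), and read the member back.

```python
import os
import tempfile
import zipfile

def unzip(path_to_zip_file, directory_to_extract_to):
    # Same helper as above, repeated so this example is self-contained.
    zip_ref = zipfile.ZipFile(path_to_zip_file, 'r')
    zip_ref.extractall(directory_to_extract_to)
    zip_ref.close()

with tempfile.TemporaryDirectory() as tmp:
    archive = os.path.join(tmp, "demo.zip")
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("hello.txt", "hi")
    unzip(archive, os.path.join(tmp, "out"))
    with open(os.path.join(tmp, "out", "hello.txt")) as fh:
        extracted = fh.read()

print(extracted)  # -> hi
```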
| 24 | 53 | 0.78125 | 31 | 192 | 4.354839 | 0.451613 | 0.148148 | 0.133333 | 0.192593 | 0.385185 | 0.385185 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130208 | 192 | 7 | 54 | 27.428571 | 0.808383 | 0 | 0 | 0 | 0 | 0 | 0.005208 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5d8f59c5dff9a5c219ce9554efe7d216bbcf33dd | 26,237 | py | Python | influxdb/influxdb08/client.py | phalenor/influxdb-python | ccbffdd9b0126017aeae1e78e6703bdeafad44b3 | [
"MIT"
] | null | null | null | influxdb/influxdb08/client.py | phalenor/influxdb-python | ccbffdd9b0126017aeae1e78e6703bdeafad44b3 | [
"MIT"
] | null | null | null | influxdb/influxdb08/client.py | phalenor/influxdb-python | ccbffdd9b0126017aeae1e78e6703bdeafad44b3 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Python client for InfluxDB
"""
import json
import socket
import requests
import requests.exceptions
import warnings
from sys import version_info
from influxdb import chunked_json
try:
xrange
except NameError:
xrange = range
if version_info[0] == 3:
from urllib.parse import urlparse
else:
from urlparse import urlparse
session = requests.Session()
class InfluxDBClientError(Exception):
"""Raised when an error occurs in the request"""
def __init__(self, content, code=-1):
super(InfluxDBClientError, self).__init__(
"{0}: {1}".format(code, content))
self.content = content
self.code = code
class InfluxDBClient(object):
"""
The ``InfluxDBClient`` object holds information necessary to connect
to InfluxDB. Requests can be made to InfluxDB directly through the client.
:param host: hostname to connect to InfluxDB, defaults to 'localhost'
:type host: string
:param port: port to connect to InfluxDB, defaults to 'localhost'
:type port: int
:param username: user to connect, defaults to 'root'
:type username: string
:param password: password of the user, defaults to 'root'
:type password: string
:param database: database name to connect to, defaults is None
:type database: string
:param ssl: use https instead of http to connect to InfluxDB, defaults is
False
:type ssl: boolean
:param verify_ssl: verify SSL certificates for HTTPS requests, defaults is
False
:type verify_ssl: boolean
:param timeout: number of seconds Requests will wait for your client to
establish a connection, defaults to None
:type timeout: int
:param use_udp: use UDP to connect to InfluxDB, defaults is False
:type use_udp: int
:param udp_port: UDP port to connect to InfluxDB, defaults is 4444
:type udp_port: int
"""
def __init__(self,
host='localhost',
port=8086,
username='root',
password='root',
database=None,
ssl=False,
verify_ssl=False,
timeout=None,
use_udp=False,
udp_port=4444):
"""
Construct a new InfluxDBClient object.
"""
self._host = host
self._port = port
self._username = username
self._password = password
self._database = database
self._timeout = timeout
self._verify_ssl = verify_ssl
self.use_udp = use_udp
self.udp_port = udp_port
if use_udp:
self.udp_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
self._scheme = "http"
if ssl is True:
self._scheme = "https"
self._baseurl = "{0}://{1}:{2}".format(
self._scheme,
self._host,
self._port)
self._headers = {
'Content-type': 'application/json',
'Accept': 'text/plain'}
@staticmethod
def from_DSN(dsn, **kwargs):
"""
Returns an instance of InfluxDBClient from the provided data source
name. Supported schemes are "influxdb", "https+influxdb",
"udp+influxdb". Parameters for the InfluxDBClient constructor may be
also be passed to this function.
Examples:
>> cli = InfluxDBClient.from_DSN('influxdb://username:password@\
... localhost:8086/databasename', timeout=5)
>> type(cli)
<class 'influxdb.client.InfluxDBClient'>
>> cli = InfluxDBClient.from_DSN('udp+influxdb://username:pass@\
... localhost:8086/databasename', timeout=5, udp_port=159)
>> print('{0._baseurl} - {0.use_udp} {0.udp_port}'.format(cli))
http://localhost:8086 - True 159
:param dsn: data source name
:type dsn: string
:param **kwargs: additional parameters for InfluxDBClient.
:type **kwargs: dict
:note: parameters provided in **kwargs may override dsn parameters.
:note: when using "udp+influxdb" the specified port (if any) will be
used for the TCP connection; specify the udp port with the additional
udp_port parameter (cf. examples).
:raise ValueError: if the provided DSN has any unexpected value.
"""
init_args = {}
conn_params = urlparse(dsn)
scheme_info = conn_params.scheme.split('+')
if len(scheme_info) == 1:
scheme = scheme_info[0]
modifier = None
else:
modifier, scheme = scheme_info
if scheme != 'influxdb':
raise ValueError('Unknown scheme "{}".'.format(scheme))
if modifier:
if modifier == 'udp':
init_args['use_udp'] = True
elif modifier == 'https':
init_args['ssl'] = True
else:
raise ValueError('Unknown modifier "{}".'.format(modifier))
if conn_params.hostname:
init_args['host'] = conn_params.hostname
if conn_params.port:
init_args['port'] = conn_params.port
if conn_params.username:
init_args['username'] = conn_params.username
if conn_params.password:
init_args['password'] = conn_params.password
if conn_params.path and len(conn_params.path) > 1:
init_args['database'] = conn_params.path[1:]
init_args.update(kwargs)
return InfluxDBClient(**init_args)
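The scheme handling above (plain `influxdb` plus optional `udp`/`https` modifiers) can be exercised standalone with `urlparse`. A minimal restatement returning a dict instead of a client; `parse_influx_dsn` is our name for it:

```python
from urllib.parse import urlparse

def parse_influx_dsn(dsn):
    """Restate from_DSN's scheme handling: split an optional 'udp'/'https'
    modifier off the 'influxdb' scheme and collect connection params."""
    parts = urlparse(dsn)
    pieces = parts.scheme.split('+')
    modifier, scheme = (None, pieces[0]) if len(pieces) == 1 else pieces
    if scheme != 'influxdb':
        raise ValueError('Unknown scheme "{}".'.format(scheme))
    return {
        'use_udp': modifier == 'udp',
        'ssl': modifier == 'https',
        'host': parts.hostname,
        'port': parts.port,
        'username': parts.username,
        'password': parts.password,
        'database': parts.path[1:] or None,
    }

print(parse_influx_dsn('udp+influxdb://usr:pwd@localhost:8086/db'))
```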
# Change member variables
def switch_database(self, database):
"""
switch_database()
Change client database.
:param database: the new database name to switch to
:type database: string
"""
self._database = database
def switch_db(self, database):
"""
DEPRECATED. Change client database.
"""
warnings.warn(
"switch_db is deprecated, and will be removed "
"in future versions. Please use "
"``InfluxDBClient.switch_database(database)`` instead.",
FutureWarning)
return self.switch_database(database)
def switch_user(self, username, password):
"""
switch_user()
Change client username.
:param username: the new username to switch to
:type username: string
:param password: the new password to switch to
:type password: string
"""
self._username = username
self._password = password
def request(self, url, method='GET', params=None, data=None,
expected_response_code=200):
"""
Make a http request to API
"""
url = "{0}/{1}".format(self._baseurl, url)
if params is None:
params = {}
auth = {
'u': self._username,
'p': self._password
}
params.update(auth)
if data is not None and not isinstance(data, str):
data = json.dumps(data)
# Try to send the request a maximum of three times. (see #103)
# TODO (aviau): Make this configurable.
for i in range(0, 3):
try:
response = session.request(
method=method,
url=url,
params=params,
data=data,
headers=self._headers,
verify=self._verify_ssl,
timeout=self._timeout
)
break
except (requests.exceptions.ConnectionError,
requests.exceptions.Timeout) as e:
if i < 2:
continue
else:
raise e
if response.status_code == expected_response_code:
return response
else:
raise InfluxDBClientError(response.content, response.status_code)
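`request()` retries transient failures up to three times before re-raising (the loop referencing issue #103 above). The pattern can be isolated as a small helper; `with_retries` and `flaky` are illustrative names, and the built-in `ConnectionError` stands in for the requests exceptions:

```python
def with_retries(fn, attempts=3, retry_on=(ConnectionError,)):
    """Retry pattern as in request() above: try up to `attempts` times,
    re-raising the error from the final attempt."""
    for i in range(attempts):
        try:
            return fn()
        except retry_on:
            if i == attempts - 1:
                raise

calls = {"n": 0}

def flaky():
    # Fails twice, then succeeds: exercises the retry path.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky))  # -> ok
```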
def write(self, data):
""" Provided as convenience for influxdb v0.9.0, this may change. """
self.request(
url="write",
method='POST',
params=None,
data=data,
expected_response_code=200
)
return True
# Writing Data
#
# Assuming you have a database named foo_production you can write data
# by doing a POST to /db/foo_production/series?u=some_user&p=some_password
# with a JSON body of points.
def write_points(self, data, time_precision='s', *args, **kwargs):
"""
Write to multiple time series names. An example data blob is:
data = [
{
"points": [
[
12
]
],
"name": "cpu_load_short",
"columns": [
"value"
]
}
]
:param data: A list of dicts in InfluxDB 0.8.x data format.
:param time_precision: [Optional, default 's'] Either 's', 'm', 'ms'
or 'u'.
:param batch_size: [Optional] Value to write the points in batches
instead of all at one time. Useful for when doing data dumps from
one database to another or when doing a massive write operation
:type batch_size: int
"""
def list_chunks(l, n):
""" Yield successive n-sized chunks from l.
"""
for i in xrange(0, len(l), n):
yield l[i:i + n]
batch_size = kwargs.get('batch_size')
if batch_size and batch_size > 0:
for item in data:
name = item.get('name')
columns = item.get('columns')
point_list = item.get('points', [])
for batch in list_chunks(point_list, batch_size):
item = [{
"points": batch,
"name": name,
"columns": columns
}]
self._write_points(
data=item,
time_precision=time_precision)
return True
else:
return self._write_points(data=data,
time_precision=time_precision)
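The batch path above leans on the inner `list_chunks` generator. Pulled out on its own (with `range` replacing the module's `xrange` shim):

```python
def list_chunks(l, n):
    """Yield successive n-sized chunks from l, as in write_points above."""
    for i in range(0, len(l), n):
        yield l[i:i + n]

points = list(range(7))
print(list(list_chunks(points, 3)))  # -> [[0, 1, 2], [3, 4, 5], [6]]
```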
def write_points_with_precision(self, data, time_precision='s'):
"""
DEPRECATED. Write to multiple time series names
"""
warnings.warn(
"write_points_with_precision is deprecated, and will be removed "
"in future versions. Please use "
"``InfluxDBClient.write_points(time_precision='..')`` instead.",
FutureWarning)
return self._write_points(data=data, time_precision=time_precision)
def _write_points(self, data, time_precision):
if time_precision not in ['s', 'm', 'ms', 'u']:
raise Exception(
"Invalid time precision is given. (use 's', 'm', 'ms' or 'u')")
if self.use_udp and time_precision != 's':
raise Exception(
"InfluxDB only supports seconds precision for udp writes"
)
url = "db/{0}/series".format(self._database)
params = {
'time_precision': time_precision
}
if self.use_udp:
self.send_packet(data)
else:
self.request(
url=url,
method='POST',
params=params,
data=data,
expected_response_code=200
)
return True
# One Time Deletes
def delete_points(self, name):
"""
Delete an entire series
"""
url = "db/{0}/series/{1}".format(self._database, name)
self.request(
url=url,
method='DELETE',
expected_response_code=204
)
return True
# Regularly Scheduled Deletes
def create_scheduled_delete(self, json_body):
"""
TODO: Create scheduled delete
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
# get list of deletes
# curl http://localhost:8086/db/site_dev/scheduled_deletes
#
# remove a regularly scheduled delete
# curl -X DELETE http://localhost:8086/db/site_dev/scheduled_deletes/:id
def get_list_scheduled_delete(self):
"""
TODO: Get list of scheduled deletes
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
def remove_scheduled_delete(self, delete_id):
"""
TODO: Remove scheduled delete
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
def query(self, query, time_precision='s', chunked=False):
"""
        Querying data
:param time_precision: [Optional, default 's'] Either 's', 'm', 'ms'
or 'u'.
:param chunked: [Optional, default=False] True if the data shall be
retrieved in chunks, False otherwise.
"""
return self._query(query, time_precision=time_precision,
chunked=chunked)
# Querying Data
#
# GET db/:name/series. It takes five parameters
def _query(self, query, time_precision='s', chunked=False):
if time_precision not in ['s', 'm', 'ms', 'u']:
raise Exception(
"Invalid time precision is given. (use 's', 'm', 'ms' or 'u')")
if chunked is True:
chunked_param = 'true'
else:
chunked_param = 'false'
# Build the URL of the serie to query
url = "db/{0}/series".format(self._database)
params = {
'q': query,
'time_precision': time_precision,
'chunked': chunked_param
}
response = self.request(
url=url,
method='GET',
params=params,
expected_response_code=200
)
if chunked:
decoded = {}
try:
decoded = chunked_json.loads(response.content.decode())
except UnicodeDecodeError:
decoded = chunked_json.loads(response.content.decode('utf-8'))
finally:
return list(decoded)
else:
return response.json()
# Creating and Dropping Databases
#
# ### create a database
# curl -X POST http://localhost:8086/db -d '{"name": "site_development"}'
#
# ### drop a database
# curl -X DELETE http://localhost:8086/db/site_development
def create_database(self, database):
"""
create_database()
Create a database on the InfluxDB server.
:param database: the name of the database to create
:type database: string
:rtype: boolean
"""
url = "db"
data = {'name': database}
self.request(
url=url,
method='POST',
data=data,
expected_response_code=201
)
return True
def delete_database(self, database):
"""
delete_database()
Drop a database on the InfluxDB server.
:param database: the name of the database to delete
:type database: string
:rtype: boolean
"""
url = "db/{0}".format(database)
self.request(
url=url,
method='DELETE',
expected_response_code=204
)
return True
# ### get list of databases
# curl -X GET http://localhost:8086/db
def get_list_database(self):
"""
Get the list of databases
"""
url = "db"
response = self.request(
url=url,
method='GET',
expected_response_code=200
)
return response.json()
def get_database_list(self):
"""
DEPRECATED. Get the list of databases
"""
warnings.warn(
"get_database_list is deprecated, and will be removed "
"in future versions. Please use "
"``InfluxDBClient.get_list_database`` instead.",
FutureWarning)
return self.get_list_database()
def delete_series(self, series):
"""
delete_series()
Drop a series on the InfluxDB server.
:param series: the name of the series to delete
:type series: string
:rtype: boolean
"""
url = "db/{0}/series/{1}".format(
self._database,
series
)
self.request(
url=url,
method='DELETE',
expected_response_code=204
)
return True
def get_list_series(self):
"""
Get a list of all time series in a database
"""
response = self._query('list series')
series_list = []
for series in response[0]['points']:
series_list.append(series[1])
return series_list
def get_list_continuous_queries(self):
"""
Get a list of continuous queries
"""
response = self._query('list continuous queries')
queries_list = []
for query in response[0]['points']:
queries_list.append(query[2])
return queries_list
# Security
# get list of cluster admins
# curl http://localhost:8086/cluster_admins?u=root&p=root
# add cluster admin
# curl -X POST http://localhost:8086/cluster_admins?u=root&p=root \
# -d '{"name": "paul", "password": "i write teh docz"}'
# update cluster admin password
# curl -X POST http://localhost:8086/cluster_admins/paul?u=root&p=root \
# -d '{"password": "new pass"}'
# delete cluster admin
# curl -X DELETE http://localhost:8086/cluster_admins/paul?u=root&p=root
# Database admins, with a database name of site_dev
# get list of database admins
# curl http://localhost:8086/db/site_dev/admins?u=root&p=root
# add database admin
# curl -X POST http://localhost:8086/db/site_dev/admins?u=root&p=root \
# -d '{"name": "paul", "password": "i write teh docz"}'
# update database admin password
# curl -X POST http://localhost:8086/db/site_dev/admins/paul?u=root&p=root\
# -d '{"password": "new pass"}'
# delete database admin
# curl -X DELETE \
# http://localhost:8086/db/site_dev/admins/paul?u=root&p=root
def get_list_cluster_admins(self):
"""
Get list of cluster admins
"""
response = self.request(
url="cluster_admins",
method='GET',
expected_response_code=200
)
return response.json()
def add_cluster_admin(self, new_username, new_password):
"""
Add cluster admin
"""
data = {
'name': new_username,
'password': new_password
}
self.request(
url="cluster_admins",
method='POST',
data=data,
expected_response_code=200
)
return True
def update_cluster_admin_password(self, username, new_password):
"""
Update cluster admin password
"""
url = "cluster_admins/{0}".format(username)
data = {
'password': new_password
}
self.request(
url=url,
method='POST',
data=data,
expected_response_code=200
)
return True
def delete_cluster_admin(self, username):
"""
Delete cluster admin
"""
url = "cluster_admins/{0}".format(username)
self.request(
url=url,
method='DELETE',
expected_response_code=200
)
return True
def set_database_admin(self, username):
"""
Set user as database admin
"""
return self.alter_database_admin(username, True)
def unset_database_admin(self, username):
"""
Unset user as database admin
"""
return self.alter_database_admin(username, False)
def alter_database_admin(self, username, is_admin):
url = "db/{0}/users/{1}".format(self._database, username)
data = {'admin': is_admin}
self.request(
url=url,
method='POST',
data=data,
expected_response_code=200
)
return True
def get_list_database_admins(self):
"""
TODO: Get list of database admins
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
def add_database_admin(self, new_username, new_password):
"""
TODO: Add cluster admin
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
def update_database_admin_password(self, username, new_password):
"""
TODO: Update database admin password
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
def delete_database_admin(self, username):
"""
TODO: Delete database admin
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
###
# Limiting User Access
# Database users
# get list of database users
# curl http://localhost:8086/db/site_dev/users?u=root&p=root
# add database user
# curl -X POST http://localhost:8086/db/site_dev/users?u=root&p=root \
# -d '{"name": "paul", "password": "i write teh docz"}'
# update database user password
# curl -X POST http://localhost:8086/db/site_dev/users/paul?u=root&p=root \
# -d '{"password": "new pass"}'
# delete database user
# curl -X DELETE http://localhost:8086/db/site_dev/users/paul?u=root&p=root
def get_database_users(self):
"""
Get list of database users
"""
url = "db/{0}/users".format(self._database)
response = self.request(
url=url,
method='GET',
expected_response_code=200
)
return response.json()
def add_database_user(self, new_username, new_password, permissions=None):
"""
Add database user
:param permissions: A ``(readFrom, writeTo)`` tuple
"""
url = "db/{0}/users".format(self._database)
data = {
'name': new_username,
'password': new_password
}
if permissions:
try:
data['readFrom'], data['writeTo'] = permissions
except (ValueError, TypeError):
raise TypeError(
"'permissions' must be (readFrom, writeTo) tuple"
)
self.request(
url=url,
method='POST',
data=data,
expected_response_code=200
)
return True
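Both `add_database_user` and `alter_database_user` rely on tuple unpacking to validate the `(readFrom, writeTo)` pair: a wrong-length or non-iterable value raises `ValueError` or `TypeError`, which is re-raised as a uniform `TypeError`. The pattern in isolation (`unpack_permissions` is a hypothetical helper, not part of the client):

```python
def unpack_permissions(permissions):
    # Validate a (readFrom, writeTo) pair the way the client methods do:
    # unpacking fails with ValueError (wrong length) or TypeError (not iterable).
    data = {}
    try:
        data['readFrom'], data['writeTo'] = permissions
    except (ValueError, TypeError):
        raise TypeError("'permissions' must be (readFrom, writeTo) tuple")
    return data

print(unpack_permissions(('.*', 'my_series')))
# {'readFrom': '.*', 'writeTo': 'my_series'}
```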
def update_database_user_password(self, username, new_password):
"""
Update password
"""
return self.alter_database_user(username, new_password)
def alter_database_user(self, username, password=None, permissions=None):
"""
Alters a database user and/or their permissions.
:param permissions: A ``(readFrom, writeTo)`` tuple
:raise TypeError: if permissions cannot be read.
:raise ValueError: if neither password nor permissions provided.
"""
url = "db/{0}/users/{1}".format(self._database, username)
if not password and not permissions:
raise ValueError("Nothing to alter for user {}.".format(username))
data = {}
if password:
data['password'] = password
if permissions:
try:
data['readFrom'], data['writeTo'] = permissions
except (ValueError, TypeError):
raise TypeError(
"'permissions' must be (readFrom, writeTo) tuple"
)
self.request(
url=url,
method='POST',
data=data,
expected_response_code=200
)
if password and username == self._username:
self._password = password
return True
def delete_database_user(self, username):
"""
Delete database user
"""
url = "db/{0}/users/{1}".format(self._database, username)
self.request(
url=url,
method='DELETE',
expected_response_code=200
)
return True
# update the user by POSTing to db/site_dev/users/paul
def update_permission(self, username, json_body):
"""
TODO: Update read/write permission
2013-11-08: This endpoint has not been implemented yet in ver0.0.8,
but it is documented in http://influxdb.org/docs/api/http.html.
See also: src/api/http/api.go:l57
"""
raise NotImplementedError()
def send_packet(self, packet):
data = json.dumps(packet)
byte = data.encode('utf-8')
self.udp_socket.sendto(byte, (self._host, self.udp_port))
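`send_packet` serializes the packet to JSON and ships the UTF-8 bytes over UDP. A self-contained sketch of the same encode step (the payload shape, host, and port are assumptions for illustration):

```python
import json

def encode_packet(packet):
    # Same serialization step as send_packet: JSON text, then UTF-8 bytes
    # suitable for socket.sendto().
    return json.dumps(packet).encode('utf-8')

payload = encode_packet({'name': 'cpu_load', 'points': [[0.64]]})
print(payload)
# A real client would then send it over UDP (host/port are assumptions):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(payload, ('localhost', 4444))
```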

# core/settings/prod_worker.py (Christiaanvdm/django-bims, MIT)
from .prod_docker import *  # noqa
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
'LOCATION': 'cache:11211',
}
}
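For local development without a memcached container, a per-process local-memory backend is a common stand-in. This override is a sketch and not part of this repository; the `LOCATION` name is arbitrary:

```python
# Hypothetical development override: an in-process local-memory cache
# instead of the memcached container at cache:11211.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'dev-local-cache',
    }
}
print(CACHES['default']['BACKEND'])
```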

# api/urls_api.py (sunlightlabs/realtimecongress-server-old, BSD-3-Clause)
from django.conf.urls.defaults import *
from piston.emitters import Emitter
from piston.resource import Resource
from realtimecongress_server.api.handlers import *
urlpatterns = patterns('',
url(r'^legislators.(?P<emitter_format>.+)$', Resource(LegislatorHandler)),
url(r'^legislation.(?P<emitter_format>.+)$', Resource(LegislationHandler)),
url(r'^rollcalls.(?P<emitter_format>.+)$', Resource(RollCallHandler)),
)
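Each route captures the serialization format from the URL suffix via the named group. Note that the dot before the group is unescaped in the original patterns, so it matches any character, not only a literal period. The capture in isolation, using the stdlib `re`:

```python
import re

# Same pattern text as the legislators route above; the unescaped dot
# matches any single character.
pattern = re.compile(r'^legislators.(?P<emitter_format>.+)$')

print(pattern.match('legislators.json').group('emitter_format'))  # json
```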

# pages/themes/beginners/basicIOFormatedStrings/examples/input_example_1.py (ProgressBG-Python-Course/ProgressBG-VC2-Python, MIT)
user_name = input("hi, what's your name: ")
user_surname = input("will you tell me your sur name?:")
print("Nice to meet you, ", user_name.capitalize() + " " + user_surname.capitalize() + "!")

# python/osmosis_proto/tendermint/types/__init__.py (fabio-nukui/osmosis.proto, Apache-2.0)
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: tendermint/types/block.proto, tendermint/types/evidence.proto, tendermint/types/params.proto, tendermint/types/types.proto, tendermint/types/validator.proto
# plugin: python-betterproto
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List
import betterproto
from betterproto.grpc.grpclib_server import ServiceBase
class BlockIdFlag(betterproto.Enum):
"""BlockIdFlag indicates which BlcokID the signature is for"""
BLOCK_ID_FLAG_UNKNOWN = 0
BLOCK_ID_FLAG_ABSENT = 1
BLOCK_ID_FLAG_COMMIT = 2
BLOCK_ID_FLAG_NIL = 3
class SignedMsgType(betterproto.Enum):
"""SignedMsgType is a type of signed message in the consensus."""
SIGNED_MSG_TYPE_UNKNOWN = 0
# Votes
SIGNED_MSG_TYPE_PREVOTE = 1
SIGNED_MSG_TYPE_PRECOMMIT = 2
# Proposals
SIGNED_MSG_TYPE_PROPOSAL = 32
@dataclass(eq=False, repr=False)
class ValidatorSet(betterproto.Message):
validators: List["Validator"] = betterproto.message_field(1)
proposer: "Validator" = betterproto.message_field(2)
total_voting_power: int = betterproto.int64_field(3)
@dataclass(eq=False, repr=False)
class Validator(betterproto.Message):
address: bytes = betterproto.bytes_field(1)
pub_key: "_crypto__.PublicKey" = betterproto.message_field(2)
voting_power: int = betterproto.int64_field(3)
proposer_priority: int = betterproto.int64_field(4)
@dataclass(eq=False, repr=False)
class SimpleValidator(betterproto.Message):
pub_key: "_crypto__.PublicKey" = betterproto.message_field(1)
voting_power: int = betterproto.int64_field(2)
@dataclass(eq=False, repr=False)
class PartSetHeader(betterproto.Message):
"""PartsetHeader"""
total: int = betterproto.uint32_field(1)
hash: bytes = betterproto.bytes_field(2)
@dataclass(eq=False, repr=False)
class Part(betterproto.Message):
index: int = betterproto.uint32_field(1)
bytes_: bytes = betterproto.bytes_field(2)
proof: "_crypto__.Proof" = betterproto.message_field(3)
@dataclass(eq=False, repr=False)
class BlockId(betterproto.Message):
"""BlockID"""
hash: bytes = betterproto.bytes_field(1)
part_set_header: "PartSetHeader" = betterproto.message_field(2)
@dataclass(eq=False, repr=False)
class Header(betterproto.Message):
"""Header defines the structure of a Tendermint block header."""
# basic block info
version: "_version__.Consensus" = betterproto.message_field(1)
chain_id: str = betterproto.string_field(2)
height: int = betterproto.int64_field(3)
time: datetime = betterproto.message_field(4)
# prev block info
last_block_id: "BlockId" = betterproto.message_field(5)
# hashes of block data
last_commit_hash: bytes = betterproto.bytes_field(6)
data_hash: bytes = betterproto.bytes_field(7)
# hashes from the app output from the prev block
validators_hash: bytes = betterproto.bytes_field(8)
next_validators_hash: bytes = betterproto.bytes_field(9)
consensus_hash: bytes = betterproto.bytes_field(10)
app_hash: bytes = betterproto.bytes_field(11)
last_results_hash: bytes = betterproto.bytes_field(12)
# consensus info
evidence_hash: bytes = betterproto.bytes_field(13)
proposer_address: bytes = betterproto.bytes_field(14)
@dataclass(eq=False, repr=False)
class Data(betterproto.Message):
"""Data contains the set of transactions included in the block"""
# Txs that will be applied by state @ block.Height+1. NOTE: not all txs here
# are valid. We're just agreeing on the order first. This means that
# block.AppHash does not include these txs.
txs: List[bytes] = betterproto.bytes_field(1)
@dataclass(eq=False, repr=False)
class Vote(betterproto.Message):
"""
Vote represents a prevote, precommit, or commit vote from validators for
consensus.
"""
type: "SignedMsgType" = betterproto.enum_field(1)
height: int = betterproto.int64_field(2)
round: int = betterproto.int32_field(3)
block_id: "BlockId" = betterproto.message_field(4)
timestamp: datetime = betterproto.message_field(5)
validator_address: bytes = betterproto.bytes_field(6)
validator_index: int = betterproto.int32_field(7)
signature: bytes = betterproto.bytes_field(8)
@dataclass(eq=False, repr=False)
class Commit(betterproto.Message):
"""
Commit contains the evidence that a block was committed by a set of
validators.
"""
height: int = betterproto.int64_field(1)
round: int = betterproto.int32_field(2)
block_id: "BlockId" = betterproto.message_field(3)
signatures: List["CommitSig"] = betterproto.message_field(4)
@dataclass(eq=False, repr=False)
class CommitSig(betterproto.Message):
"""CommitSig is a part of the Vote included in a Commit."""
block_id_flag: "BlockIdFlag" = betterproto.enum_field(1)
validator_address: bytes = betterproto.bytes_field(2)
timestamp: datetime = betterproto.message_field(3)
signature: bytes = betterproto.bytes_field(4)
@dataclass(eq=False, repr=False)
class Proposal(betterproto.Message):
type: "SignedMsgType" = betterproto.enum_field(1)
height: int = betterproto.int64_field(2)
round: int = betterproto.int32_field(3)
pol_round: int = betterproto.int32_field(4)
block_id: "BlockId" = betterproto.message_field(5)
timestamp: datetime = betterproto.message_field(6)
signature: bytes = betterproto.bytes_field(7)
@dataclass(eq=False, repr=False)
class SignedHeader(betterproto.Message):
header: "Header" = betterproto.message_field(1)
commit: "Commit" = betterproto.message_field(2)
@dataclass(eq=False, repr=False)
class LightBlock(betterproto.Message):
signed_header: "SignedHeader" = betterproto.message_field(1)
validator_set: "ValidatorSet" = betterproto.message_field(2)
@dataclass(eq=False, repr=False)
class BlockMeta(betterproto.Message):
block_id: "BlockId" = betterproto.message_field(1)
block_size: int = betterproto.int64_field(2)
header: "Header" = betterproto.message_field(3)
num_txs: int = betterproto.int64_field(4)
@dataclass(eq=False, repr=False)
class TxProof(betterproto.Message):
"""
TxProof represents a Merkle proof of the presence of a transaction in the
Merkle tree.
"""
root_hash: bytes = betterproto.bytes_field(1)
data: bytes = betterproto.bytes_field(2)
proof: "_crypto__.Proof" = betterproto.message_field(3)
@dataclass(eq=False, repr=False)
class ConsensusParams(betterproto.Message):
"""
ConsensusParams contains consensus critical parameters that determine the
validity of blocks.
"""
block: "BlockParams" = betterproto.message_field(1)
evidence: "EvidenceParams" = betterproto.message_field(2)
validator: "ValidatorParams" = betterproto.message_field(3)
version: "VersionParams" = betterproto.message_field(4)
@dataclass(eq=False, repr=False)
class BlockParams(betterproto.Message):
"""BlockParams contains limits on the block size."""
# Max block size, in bytes. Note: must be greater than 0
max_bytes: int = betterproto.int64_field(1)
# Max gas per block. Note: must be greater or equal to -1
max_gas: int = betterproto.int64_field(2)
# Minimum time increment between consecutive blocks (in milliseconds) If the
# block header timestamp is ahead of the system clock, decrease this value.
# Not exposed to the application.
time_iota_ms: int = betterproto.int64_field(3)
@dataclass(eq=False, repr=False)
class EvidenceParams(betterproto.Message):
"""EvidenceParams determine how we handle evidence of malfeasance."""
# Max age of evidence, in blocks. The basic formula for calculating this is:
# MaxAgeDuration / {average block time}.
max_age_num_blocks: int = betterproto.int64_field(1)
# Max age of evidence, in time. It should correspond with an app's "unbonding
# period" or other similar mechanism for handling [Nothing-At-Stake
# attacks](https://github.com/ethereum/wiki/wiki/Proof-of-Stake-FAQ#what-is-
# the-nothing-at-stake-problem-and-how-can-it-be-fixed).
max_age_duration: timedelta = betterproto.message_field(2)
# This sets the maximum size of total evidence in bytes that can be committed
# in a single block. and should fall comfortably under the max block bytes.
# Default is 1048576 or 1MB
max_bytes: int = betterproto.int64_field(3)
@dataclass(eq=False, repr=False)
class ValidatorParams(betterproto.Message):
"""
ValidatorParams restrict the public key types validators can use. NOTE:
uses ABCI pubkey naming, not Amino names.
"""
pub_key_types: List[str] = betterproto.string_field(1)
@dataclass(eq=False, repr=False)
class VersionParams(betterproto.Message):
"""VersionParams contains the ABCI application version."""
app_version: int = betterproto.uint64_field(1)
@dataclass(eq=False, repr=False)
class HashedParams(betterproto.Message):
"""
HashedParams is a subset of ConsensusParams. It is hashed into the
Header.ConsensusHash.
"""
block_max_bytes: int = betterproto.int64_field(1)
block_max_gas: int = betterproto.int64_field(2)
@dataclass(eq=False, repr=False)
class Evidence(betterproto.Message):
duplicate_vote_evidence: "DuplicateVoteEvidence" = betterproto.message_field(
1, group="sum"
)
light_client_attack_evidence: "LightClientAttackEvidence" = (
betterproto.message_field(2, group="sum")
)
@dataclass(eq=False, repr=False)
class DuplicateVoteEvidence(betterproto.Message):
"""
DuplicateVoteEvidence contains evidence of a validator signed two
conflicting votes.
"""
vote_a: "Vote" = betterproto.message_field(1)
vote_b: "Vote" = betterproto.message_field(2)
total_voting_power: int = betterproto.int64_field(3)
validator_power: int = betterproto.int64_field(4)
timestamp: datetime = betterproto.message_field(5)
@dataclass(eq=False, repr=False)
class LightClientAttackEvidence(betterproto.Message):
"""
LightClientAttackEvidence contains evidence of a set of validators
attempting to mislead a light client.
"""
conflicting_block: "LightBlock" = betterproto.message_field(1)
common_height: int = betterproto.int64_field(2)
byzantine_validators: List["Validator"] = betterproto.message_field(3)
total_voting_power: int = betterproto.int64_field(4)
timestamp: datetime = betterproto.message_field(5)
@dataclass(eq=False, repr=False)
class EvidenceList(betterproto.Message):
evidence: List["Evidence"] = betterproto.message_field(1)
@dataclass(eq=False, repr=False)
class Block(betterproto.Message):
header: "Header" = betterproto.message_field(1)
data: "Data" = betterproto.message_field(2)
evidence: "EvidenceList" = betterproto.message_field(3)
last_commit: "Commit" = betterproto.message_field(4)
from .. import crypto as _crypto__
from .. import version as _version__
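The generated betterproto enums map directly onto integer wire values. When betterproto is unavailable, the same mapping can be mirrored with the stdlib `enum` (an illustration only; the generated classes above are canonical):

```python
from enum import IntEnum

class BlockIdFlagMirror(IntEnum):
    # Mirrors tendermint.types.BlockIdFlag wire values.
    BLOCK_ID_FLAG_UNKNOWN = 0
    BLOCK_ID_FLAG_ABSENT = 1
    BLOCK_ID_FLAG_COMMIT = 2
    BLOCK_ID_FLAG_NIL = 3

print(BlockIdFlagMirror(2).name)  # BLOCK_ID_FLAG_COMMIT
```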

# parsetest.py (alexanderkunz/AEC-Webserver, Unlicense)
from binascii import unhexlify
from aeconversion import aeconvcmd_parse, aeconvcmd_parse_response, aeconvcmd_parse_request
if __name__ == "__main__":
print("Testing Script for aeconversion protocol libary.")
print("\nParsing Request: 03A603FD5B")
print(aeconvcmd_parse_request(unhexlify("03A603FD5B")))
print("\nParsing Request: 00E603F015")
print(aeconvcmd_parse_request(unhexlify("00E603F015")))
print("\nParsing Response: 27170023AAE000BC1913EF")
print(aeconvcmd_parse_response(unhexlify("27170023AAE000BC1913EF")))
print("\nParsing AutoDetect: 03A703ED4A")
print(aeconvcmd_parse(unhexlify("03A703ED4A")))
print("\nParsing AutoDetect: 27170023AAE000BC1913EF")
print(aeconvcmd_parse(unhexlify("27170023AAE000BC1913EF")))

# hypernetworks/core/HTMeronymy.py (rdchar/HypernetworkTheory, MIT)
import numpy as np
MERONYMY = [
"component",
"member",
"portion",
"stuff",
"feature",
"place",
"in",
"is-a",
"attribute",
"attached",
"belongs-to"
]
M_UNKNOWN = -1
M_COMPONENT = 0
M_MEMBER = 1
M_PORTION = 2
M_STUFF = 3
M_FEATURE = 4
M_PLACE = 5
M_IN = 6
M_IS_A = 7
M_ATTRIBUTE = 8
M_ATTACHED = 9
M_BELONGS_TO = 10
# Meronymy compatibility matrix
meronymy_matrix = np.full((11, 11), False)
# every meronymy relation is compatible with itself
np.fill_diagonal(meronymy_matrix, True)
meronymy_matrix[M_IS_A, M_COMPONENT] = True
meronymy_matrix[M_COMPONENT, M_IS_A] = True
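With the matrix built, a compatibility check is a single indexed lookup. A stdlib-only sketch of the same structure (the `compatible` helper name is an assumption, and plain lists stand in for the numpy array):

```python
MERONYMY = ["component", "member", "portion", "stuff", "feature", "place",
            "in", "is-a", "attribute", "attached", "belongs-to"]

# Plain-list equivalent of the compatibility matrix: diagonal True,
# plus the is-a <-> component pair set above.
n = len(MERONYMY)
matrix = [[i == j for j in range(n)] for i in range(n)]
matrix[MERONYMY.index("is-a")][MERONYMY.index("component")] = True
matrix[MERONYMY.index("component")][MERONYMY.index("is-a")] = True

def compatible(a, b):
    # Hypothetical helper: look two relation names up in the matrix.
    return matrix[MERONYMY.index(a)][MERONYMY.index(b)]

print(compatible("is-a", "component"))  # True
```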

# runAM/read/yaml_file.py (arista-netdevops-community/runAM, BSD-3-Clause)
import yaml
def yaml_file(filename, load_all=False):
with open(filename, mode='r') as f:
if not load_all:
yaml_data = yaml.load(f, Loader=yaml.FullLoader)
else:
yaml_data = list(yaml.load_all(f, Loader=yaml.FullLoader)) # convert generator to list before returning
return yaml_data
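The `list(...)` conversion matters: `yaml.load_all` returns a lazy generator that reads from `f`, and once the `with` block closes the file, consuming the generator would fail. The same pitfall demonstrated with a stdlib file reader (hypothetical helpers):

```python
import os
import tempfile

def read_lines_lazy(filename):
    with open(filename) as f:
        return (line.rstrip() for line in f)   # generator: file already closed when consumed

def read_lines_eager(filename):
    with open(filename) as f:
        return [line.rstrip() for line in f]   # materialized before the file closes

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as f:
    f.write("a\nb\n")

eager = read_lines_eager(path)         # ['a', 'b']
lazy_failed = False
try:
    list(read_lines_lazy(path))
except ValueError:                     # "I/O operation on closed file"
    lazy_failed = True
os.remove(path)
print(eager, lazy_failed)
```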

# infinite_iterator.py (ysharma1126/ssl_identifiability, MIT)
from typing import Iterable
class InfiniteIterator:
"""Infinitely repeat the iterable."""
def __init__(self, iterable: Iterable):
self._iterable = iterable
self.iterator = iter(self._iterable)
def __iter__(self):
return self
def __next__(self):
for _ in range(2):
try:
return next(self.iterator)
except StopIteration:
# reset iterator
del self.iterator
self.iterator = iter(self._iterable)
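For a finite iterable, drawing from `InfiniteIterator` behaves like `itertools.cycle`; the expected draw order can be checked with the stdlib alone:

```python
from itertools import cycle, islice

# Seven draws from a three-element iterable wrap around, just as
# InfiniteIterator resets its inner iterator on StopIteration.
print(list(islice(cycle([1, 2, 3]), 7)))  # [1, 2, 3, 1, 2, 3, 1]
```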

# tests/test_loaders.py (sakost/kutana, MIT)
from os.path import dirname
from kutana.loaders import load_plugins
def test_load_plugins():
plugins = load_plugins(dirname(__file__) + "/assets", verbose=True)
assert len(plugins) == 3
assert {"echo", "hello 1", "hello 2"} == set(p.name for p in plugins)

# quantrocket/countdown.py (Jay-Jay-D/quantrocket-client, Apache-2.0)
# Copyright 2017 QuantRocket - All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from quantrocket.houston import houston
from quantrocket.cli.utils.output import json_to_cli
def _load_or_show_crontab(service, filename=None):
if filename:
return json_to_cli(load_crontab, service, filename)
else:
exit_code = 0
return get_crontab(service), exit_code
def get_crontab(service):
"""
Return the current crontab.
Parameters
----------
service : str, required
the name of the service, e.g. ``countdown-usa``
Returns
-------
str
string representation of crontab
"""
response = houston.get("/{0}/crontab".format(service))
houston.raise_for_status_with_json(response)
return response.text
def load_crontab(service, filename):
"""
Upload a new crontab.
Parameters
----------
service : str, required
the name of the service, e.g. ``countdown-usa``
filename : str, required
the crontab file to upload to the countdown service
Returns
-------
dict
status message
"""
with open(filename) as file:
response = houston.put("/{0}/crontab".format(service), data=file.read())
houston.raise_for_status_with_json(response)
return response.json()
def get_timezone(service):
"""
Return the service timezone.
Parameters
----------
service : str, required
the name of the service, e.g. ``countdown-usa``
Returns
-------
dict
dict with key timezone
"""
response = houston.get("/{0}/timezone".format(service))
houston.raise_for_status_with_json(response)
return response.json()
def _cli_get_timezone(*args, **kwargs):
return json_to_cli(get_timezone, *args, **kwargs)

# mayan/apps/documents/tests/test_recently_created_document_views.py (CMU-313/fall-2021-hw2-451-unavailable-for-legal-reasons, Apache-2.0)
from ..permissions import permission_document_view
from .base import GenericDocumentViewTestCase
from .mixins.recently_created_document_mixins import RecentlyCreatedDocumentViewTestMixin
class RecentlyCreatedDocumentViewTestCase(
RecentlyCreatedDocumentViewTestMixin, GenericDocumentViewTestCase
):
def test_recently_created_document_list_view_no_permission(self):
response = self._request_test_recently_created_document_list_view()
self.assertNotContains(
response=response, text=self.test_document.label, status_code=200
)
def test_recently_created_document_list_view_with_access(self):
self.grant_access(
obj=self.test_document, permission=permission_document_view
)
response = self._request_test_recently_created_document_list_view()
self.assertContains(
response=response, text=self.test_document.label, status_code=200
)
def test_trashed_recently_created_document_list_view_with_access(self):
self.grant_access(
obj=self.test_document, permission=permission_document_view
)
self.test_document.delete()
response = self._request_test_recently_created_document_list_view()
self.assertNotContains(
response=response, text=self.test_document.label, status_code=200
)

# util/gen_sinewave.py (ghsecuritylab/BleFox, Apache-2.0)
import math
def frange(x, y, jump):
while x < y:
yield x
x += jump
print([int(round(math.sin(x)*255)) for x in frange(0, math.pi, math.pi/254)])
# print([int(round(((math.exp(math.sin(x/2000.0*math.pi)) - 0.36787944)*108.0)*255) - 17400) for x in frange(0, 2*math.pi, math.pi/254)])
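`frange` is a float-step analogue of `range()`; restated here for a standalone check. Because it accumulates the step with `x += jump`, expect rounding drift when the step is not exactly representable in binary (`math.pi/254` is not, though the drift is far too small to affect the rounded sine table):

```python
import math

def frange(x, y, jump):
    # float-step analogue of range(): yields x, x+jump, ...
    # while the running value stays below y
    while x < y:
        yield x
        x += jump

sine_table = [int(round(math.sin(x) * 255)) for x in frange(0, math.pi, math.pi / 254)]
```

With a step of 0.25 (exactly representable), `list(frange(0, 1, 0.25))` is exactly `[0, 0.25, 0.5, 0.75]`.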
| 26.909091 | 135 | 0.621622 | 60 | 296 | 3.066667 | 0.416667 | 0.163043 | 0.141304 | 0.184783 | 0.304348 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155102 | 0.172297 | 296 | 10 | 136 | 29.6 | 0.595918 | 0.452703 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.166667 | null | null | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5dcf08312909f2b378bf9a9242f9af92477dd39d | 2,948 | py | Python | scripts/dreadnot-deploy.py | muffinresearch/zamboni | 045a6f07c775b99672af6d9857d295ed02fe5dd9 | [
"BSD-3-Clause"
] | null | null | null | scripts/dreadnot-deploy.py | muffinresearch/zamboni | 045a6f07c775b99672af6d9857d295ed02fe5dd9 | [
"BSD-3-Clause"
] | null | null | null | scripts/dreadnot-deploy.py | muffinresearch/zamboni | 045a6f07c775b99672af6d9857d295ed02fe5dd9 | [
"BSD-3-Clause"
] | null | null | null | # commandline deployer for dreadnot
# Based on the original at
# https://github.com/jasonthomas/random/blob/master/dreadnot.deploy
import getpass
from configparser import ConfigParser
from optparse import OptionParser
from urllib.parse import urljoin

import requests

try:
    import keyring
except ImportError:
    keyring = None
    print('Keyring module not found, "pip install keyring" if you want your'
          ' password remembered')


# config file should be ini format
def configure(config_file, env):
    config = {}
    conf = ConfigParser()
    passwd = None
    if conf.read(config_file):
        config['username'] = conf.get(env, 'username')
        config['dreadnot'] = conf.get(env, 'dreadnot')
        config['region'] = conf.get(env, 'region')
    else:
        config['dreadnot'] = input('Dreadnot URL:')
        config['region'] = input('Deployment region:')
        config['username'] = input('Dreadnot username:')
        conf.add_section(env)
        for name in ('dreadnot', 'region', 'username'):
            conf.set(env, name, config[name])
        with open(config_file, 'w') as f:
            conf.write(f)
    if keyring:
        passwd = keyring.get_password('dreadnot', config['dreadnot'])
    if not passwd:
        passwd = getpass.getpass('Dreadnot password:')
        if keyring:
            keyring.set_password('dreadnot', config['dreadnot'], passwd)
    config['password'] = passwd
    return config


# deploy the goodness
def deploy(dreadnot, username, password, region, app_name, revision,
           ssl_verify=False):
    DREADNOT_DEPLOY = urljoin(
        dreadnot,
        '/api/1.0/stacks/%s/regions/%s/deployments' % (app_name, region))
    TO_REVISION = {'to_revision': revision}
    r = requests.post(DREADNOT_DEPLOY, data=TO_REVISION,
                      auth=(username, password), verify=ssl_verify)
    print("%s - %s" % (r.status_code, r.content))


def main():
    parser = OptionParser(usage="usage: %prog [options] app_name ...")
    parser.add_option("-c", "--conf",
                      default='dreadnot.ini',
                      type='string',
                      help="Configuration File")
    parser.add_option("-e", "--environment",
                      default='dev',
                      type='string',
                      help="Environment you want to deploy to")
    parser.add_option("-r", "--revision",
                      default='origin/master',
                      type='string',
                      help="Git Revision")
    (options, args) = parser.parse_args()
    if len(args) < 1:
        parser.error("wrong number of arguments")
    config = configure(options.conf, options.environment)
    for app_name in args:
        print("Deploying " + app_name)
deploy(config['dreadnot'], config['username'], config['password'],
config['region'], app_name, options.revision)
if __name__ == '__main__':
main()
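`configure()` round-trips an ini file keyed by environment name (`dev` by default). A sketch of what `dreadnot.ini` would contain — the values here are hypothetical, not from the original repo:

```ini
[dev]
dreadnot = https://dreadnot.example.com
region = us-east-1
username = deployer
```

The password is deliberately kept out of the file; it is stored in the system keyring (if available) or prompted for at run time.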
| 32.755556 | 77 | 0.605495 | 326 | 2,948 | 5.368098 | 0.377301 | 0.024 | 0.017143 | 0.034286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001388 | 0.266621 | 2,948 | 89 | 78 | 33.123596 | 0.808048 | 0.060041 | 0 | 0.072464 | 0 | 0 | 0.213743 | 0.014828 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.15942 | 0.101449 | null | null | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
5de4452c8ab1e96cdefd9ce11e0d04b7296cb83b | 3,302 | py | Python | beginner/lesson016/pyxl/shapes.py | tmck-code/Syllabus | 150dd08d3a165f4df2adb09cb0197f4a8e363fc5 | [
"MIT"
] | 3 | 2019-01-08T09:55:35.000Z | 2021-12-02T01:13:48.000Z | beginner/lesson016/pyxl/shapes.py | tmck-code/Syllabus | 150dd08d3a165f4df2adb09cb0197f4a8e363fc5 | [
"MIT"
] | null | null | null | beginner/lesson016/pyxl/shapes.py | tmck-code/Syllabus | 150dd08d3a165f4df2adb09cb0197f4a8e363fc5 | [
"MIT"
] | null | null | null | from dataclasses import dataclass
import math
from pyxl.pixels import BinaryPixels
def construct(*shape_classes):
'''e.g.:
C = shapes.construct(shapes.HollowSquare, shapes.DebugShape)
C(5, 5)
'''
class ConstructedShape(*shape_classes): pass
return ConstructedShape
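`construct` builds a throwaway subclass so callers can stack behavior classes and let Python's MRO resolve method lookup. A standalone sketch of the same composition pattern (`Base` and `LoudMixin` are illustrative names, not part of pyxl):

```python
def construct(*shape_classes):
    # compose the given classes into one; the MRO orders method lookup
    class ConstructedShape(*shape_classes):
        pass
    return ConstructedShape

class Base:
    def describe(self):
        return 'base'

class LoudMixin:
    def describe(self):
        # cooperative super() call reaches the next class in the MRO
        return super().describe().upper()

Loud = construct(LoudMixin, Base)
```

`Loud().describe()` returns `'BASE'`: the mixin's method runs first and delegates to `Base` via `super()`.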
# Superclasses ----------------------------------
@dataclass
class Shape:
width: int
height: int
pixels: BinaryPixels = BinaryPixels()
def __post_init__(self): pass
def fill_calculation(self, x, y) -> bool:
raise NotImplementedError('Must implement fill_calculation(self, x, y) method')
def should_fill(self, x, y):
return bool(self.fill_calculation(x, y))
def pixel_to_draw(self, x, y):
return self.pixels.full
def draw(self):
for y in range(0, self.height):
for x in range(0, self.width):
if self.should_fill(x, y):
yield x, y, self.pixel_to_draw(x, y)
    def __str__(self):
        return f'{self.__class__.__name__}, coords: {list(self.draw())}'
class FuzzyShape(Shape):
def __init__(self, *args, tolerance=2, **kwargs):
super().__init__(*args, **kwargs)
self.tolerance = tolerance
def should_fill(self, x, y):
return self.fill_calculation(x, y) < self.tolerance
def fill_calculation(self, x, y):
'''This produces a number that is compared against the tolerance.
If it is below (<) the required tolerance, the should_fill method
will return True'''
raise NotImplementedError('Must implement fill_calculation')
class DebugShape(FuzzyShape):
def __init__(self, *args, debug=False, **kwargs):
super().__init__(*args, **kwargs)
self.debug = debug
def pixel_to_draw(self, x, y):
'If debug=True, then the pixel will be the value of fill_calculation'
if self.debug:
return str(self.fill_calculation(x, y))
else:
return self.pixels.full
class DebugCanvasShape(DebugShape):
def should_fill(self, x, y):
return True
def pixel_to_draw(self, x, y):
'If debug=True, then the pixel will be the value of fill_calculation'
if self.debug:
return str(int(self.fill_calculation(x, y)))
else:
return self.pixels.full
# Squares ---------------------------------------
class FilledSquare(Shape):
def fill_calculation(self, x, y):
return 0 <= x <= self.width and 0 <= y <= self.height
class HollowSquare(Shape):
def fill_calculation(self, x, y):
return 0 in (y % (self.width-1), x % (self.width-1))
# Circles ---------------------------------------
class FilledCircle(FuzzyShape):
    '''(x - h)^2 + (y - k)^2 = r^2
    where (h, k) are the center coordinates, and r is the radius.'''
def fill_calculation(self, x, y):
h, k = int(self.width/2), int(self.height/2)
r = int(self.width/2)
return int((x-h)**2 + (y-k)**2) - r**2
class HollowCircle(FilledCircle):
def should_fill(self, x, y):
return -self.tolerance <= self.fill_calculation(x, y) < self.tolerance
# Curves ----------------------------------------
class Exponential(FuzzyShape):
def fill_calculation(self, x, y):
return abs((x**2) - y)
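The `draw()`/`fill_calculation` split above is easy to exercise without pyxl. This sketch stubs a minimal pixels object in place of `pyxl.pixels.BinaryPixels` (whose `full` attribute is the only part the draw loop touches) and mirrors `HollowSquare`'s fill rule:

```python
from dataclasses import dataclass

@dataclass
class StubPixels:
    # stand-in for pyxl.pixels.BinaryPixels; only `full` is used by draw()
    full: str = '#'

class HollowSquareDemo:
    # mirrors Shape.draw() plus HollowSquare.fill_calculation()
    def __init__(self, width, height, pixels=StubPixels()):
        self.width, self.height, self.pixels = width, height, pixels

    def fill_calculation(self, x, y):
        # true on the border rows/columns of the square
        return 0 in (y % (self.width - 1), x % (self.width - 1))

    def draw(self):
        for y in range(self.height):
            for x in range(self.width):
                if self.fill_calculation(x, y):
                    yield x, y, self.pixels.full

square = HollowSquareDemo(3, 3)
coords = [(x, y) for x, y, _ in square.draw()]
```

For a 3x3 hollow square, every cell except the interior point `(1, 1)` is drawn, so `coords` holds 8 entries.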
| 29.482143 | 87 | 0.597214 | 431 | 3,302 | 4.431555 | 0.241299 | 0.023037 | 0.043979 | 0.050262 | 0.441361 | 0.42199 | 0.299476 | 0.213613 | 0.183246 | 0.14555 | 0 | 0.007997 | 0.24258 | 3,302 | 111 | 88 | 29.747748 | 0.754898 | 0.192308 | 0 | 0.333333 | 0 | 0 | 0.091374 | 0.017544 | 0 | 0 | 0 | 0 | 0 | 1 | 0.275362 | false | 0.028986 | 0.043478 | 0.130435 | 0.724638 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
5dfbcad2df8ea14619ba0584ea55d5e4e1a26df0 | 997 | py | Python | compressai/zoo/pretrained.py | micmic123/CompressAI | a8605baba61d5cdcd433fb9b3ed8ff5522f09e9c | [
"Apache-2.0"
] | null | null | null | compressai/zoo/pretrained.py | micmic123/CompressAI | a8605baba61d5cdcd433fb9b3ed8ff5522f09e9c | [
"Apache-2.0"
] | null | null | null | compressai/zoo/pretrained.py | micmic123/CompressAI | a8605baba61d5cdcd433fb9b3ed8ff5522f09e9c | [
"Apache-2.0"
] | null | null | null | # Copyright 2020 InterDigital Communications, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
def rename_key(key):
"""Rename state_dict key."""
# ResidualBlockWithStride: 'downsample' -> 'skip'
if ".downsample.bias" in key or ".downsample.weight" in key:
return key.replace("downsample", "skip")
return key
def load_pretrained(state_dict):
"""Convert state_dict keys."""
state_dict = {rename_key(k): v for k, v in state_dict.items()}
return state_dict
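Both helpers above are pure dict/string transforms, so they are easy to check in isolation. Restated here with a toy state_dict (the key names are merely shaped like checkpoint keys; they are illustrative, not taken from a real model):

```python
def rename_key(key):
    """Rename state_dict key (ResidualBlockWithStride: 'downsample' -> 'skip')."""
    if ".downsample.bias" in key or ".downsample.weight" in key:
        return key.replace("downsample", "skip")
    return key

state = {"g_a.0.downsample.weight": 1, "g_a.0.conv.weight": 2}
renamed = {rename_key(k): v for k, v in state.items()}
```

Only keys ending in `.downsample.weight`/`.downsample.bias` are rewritten; everything else passes through untouched.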
| 33.233333 | 74 | 0.724173 | 144 | 997 | 4.951389 | 0.583333 | 0.084151 | 0.036466 | 0.044881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009804 | 0.181545 | 997 | 29 | 75 | 34.37931 | 0.863971 | 0.669007 | 0 | 0 | 0 | 0 | 0.156863 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
5dfd94dba3549da9f6d86e814241197c4f32d4c3 | 179 | py | Python | 7_kyu/Find_the_divisors!.py | JoaoVitorLeite/CodeWars | 156feda7273b37fdc90d007e1f638cf0dc73959f | [
"MIT"
] | null | null | null | 7_kyu/Find_the_divisors!.py | JoaoVitorLeite/CodeWars | 156feda7273b37fdc90d007e1f638cf0dc73959f | [
"MIT"
] | null | null | null | 7_kyu/Find_the_divisors!.py | JoaoVitorLeite/CodeWars | 156feda7273b37fdc90d007e1f638cf0dc73959f | [
"MIT"
] | null | null | null | def divisors(integer):
aux = [i for i in range(2, integer) if integer % i == 0]
if len(aux) == 0:
return "{} is prime".format(integer)
else:
return aux | 29.833333 | 60 | 0.558659 | 27 | 179 | 3.703704 | 0.62963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024 | 0.301676 | 179 | 6 | 61 | 29.833333 | 0.776 | 0 | 0 | 0 | 0 | 0 | 0.061111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
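Restated for a standalone check: the kata solution collects the proper divisors between 2 and `integer - 1`, returning the list when any exist and a prime message otherwise:

```python
def divisors(integer):
    # proper divisors, excluding 1 and the number itself
    aux = [i for i in range(2, integer) if integer % i == 0]
    if len(aux) == 0:
        return "{} is prime".format(integer)
    else:
        return aux
```

For example, `divisors(12)` yields `[2, 3, 4, 6]`, while `divisors(13)` yields `"13 is prime"`.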
b901e337f05c9e1da0b5afa6b2b225e6396f156a | 1,005 | py | Python | python/ray/ml/preprocessors/__init__.py | BearerPipelineTest/ray | c1054a0baaea0903a7bcc858fdfa3b5f4f583567 | [
"Apache-2.0"
] | null | null | null | python/ray/ml/preprocessors/__init__.py | BearerPipelineTest/ray | c1054a0baaea0903a7bcc858fdfa3b5f4f583567 | [
"Apache-2.0"
] | null | null | null | python/ray/ml/preprocessors/__init__.py | BearerPipelineTest/ray | c1054a0baaea0903a7bcc858fdfa3b5f4f583567 | [
"Apache-2.0"
] | null | null | null | from ray.ml.preprocessors.batch_mapper import BatchMapper
from ray.ml.preprocessors.chain import Chain
from ray.ml.preprocessors.encoder import OrdinalEncoder, OneHotEncoder, LabelEncoder
from ray.ml.preprocessors.hasher import FeatureHasher
from ray.ml.preprocessors.imputer import SimpleImputer
from ray.ml.preprocessors.normalizer import Normalizer
from ray.ml.preprocessors.scaler import (
StandardScaler,
MinMaxScaler,
MaxAbsScaler,
RobustScaler,
)
from ray.ml.preprocessors.tokenizer import Tokenizer
from ray.ml.preprocessors.transformer import PowerTransformer
from ray.ml.preprocessors.vectorizer import CountVectorizer, HashingVectorizer
__all__ = [
"BatchMapper",
"CountVectorizer",
"Chain",
"FeatureHasher",
"HashingVectorizer",
"LabelEncoder",
"MaxAbsScaler",
"MinMaxScaler",
"Normalizer",
"OneHotEncoder",
"OrdinalEncoder",
"PowerTransformer",
"RobustScaler",
"SimpleImputer",
"StandardScaler",
"Tokenizer",
]
| 28.714286 | 84 | 0.762189 | 94 | 1,005 | 8.095745 | 0.319149 | 0.091984 | 0.118265 | 0.289093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148259 | 1,005 | 34 | 85 | 29.558824 | 0.889019 | 0 | 0 | 0 | 0 | 0 | 0.197015 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.30303 | 0 | 0.30303 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b9052b0c2f94c0bb5256efa11aba7d840bf490dc | 456 | py | Python | data-processing/common/s2_data.py | hhchi13/scholarphi | 5683e68d5934a2f461aa674acbf4a5e9db3b5dbb | [
"Apache-2.0"
] | 285 | 2020-09-30T23:52:56.000Z | 2022-03-17T09:01:19.000Z | data-processing/common/s2_data.py | hhchi13/scholarphi | 5683e68d5934a2f461aa674acbf4a5e9db3b5dbb | [
"Apache-2.0"
] | 116 | 2019-12-02T17:15:01.000Z | 2020-09-03T12:12:23.000Z | data-processing/common/s2_data.py | hhchi13/scholarphi | 5683e68d5934a2f461aa674acbf4a5e9db3b5dbb | [
"Apache-2.0"
] | 35 | 2020-10-01T09:11:41.000Z | 2022-01-26T15:51:46.000Z | import logging
import os.path
from typing import Optional
from common import directories
def get_s2_id(arxiv_id: str) -> Optional[str]:
s2_id_path = os.path.join(
directories.arxiv_subdir("s2-metadata", arxiv_id), "s2_id"
)
if not os.path.exists(s2_id_path):
logging.warning("Could not find S2 ID file for %s. Skipping", arxiv_id)
return None
with open(s2_id_path) as s2_id_file:
return s2_id_file.read()
| 26.823529 | 79 | 0.695175 | 74 | 456 | 4.054054 | 0.459459 | 0.106667 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.210526 | 456 | 16 | 80 | 28.5 | 0.808333 | 0 | 0 | 0 | 0 | 0 | 0.127193 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b90a80c45bd0535964159cbaa462e13f3e764202 | 158 | py | Python | pyslim/_version.py | andrewkern/pyslim | 55d25fb9021d3d475842b11a97d8fe238e34974d | [
"MIT"
] | null | null | null | pyslim/_version.py | andrewkern/pyslim | 55d25fb9021d3d475842b11a97d8fe238e34974d | [
"MIT"
] | null | null | null | pyslim/_version.py | andrewkern/pyslim | 55d25fb9021d3d475842b11a97d8fe238e34974d | [
"MIT"
] | null | null | null | # coding: utf-8
pyslim_version = '0.700'
slim_file_version = '0.7'
# other file versions that require no modification
compatible_slim_file_versions = ['0.7']
| 26.333333 | 50 | 0.759494 | 25 | 158 | 4.56 | 0.68 | 0.140351 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065217 | 0.126582 | 158 | 5 | 51 | 31.6 | 0.76087 | 0.392405 | 0 | 0 | 0 | 0 | 0.11828 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f8dc1de369cd72313b82ac05b471169d5c514c7d | 627 | py | Python | pless/passwordless/mail.py | omab/psa-passwordless | acd116f253c0b2d4be74bd545cf3e500612b56f3 | [
"MIT"
] | 19 | 2015-02-02T23:44:33.000Z | 2021-02-23T02:30:21.000Z | pless/passwordless/mail.py | omab/psa-passwordless | acd116f253c0b2d4be74bd545cf3e500612b56f3 | [
"MIT"
] | null | null | null | pless/passwordless/mail.py | omab/psa-passwordless | acd116f253c0b2d4be74bd545cf3e500612b56f3 | [
"MIT"
] | 2 | 2016-03-09T22:56:38.000Z | 2017-01-25T18:37:43.000Z | from django.conf import settings
from django.core.mail import send_mail
from django.core.urlresolvers import reverse
# Send a validation email to the user; the email should include a link to
# continue the auth process. This is a simple example; it could easily be
# extended to render a template and send a fancy HTML email instead.
def send_validation(strategy, backend, code):
url = reverse('token_login', args=(code.code,))
url = strategy.request.build_absolute_uri(url)
send_mail('Passwordless Login', 'Use this URL to login {0}'.format(url),
settings.EMAIL_FROM, [code.email], fail_silently=False)
| 39.1875 | 79 | 0.749601 | 95 | 627 | 4.863158 | 0.589474 | 0.064935 | 0.060606 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001912 | 0.165869 | 627 | 15 | 80 | 41.8 | 0.881453 | 0.323764 | 0 | 0 | 0 | 0 | 0.128571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0.375 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
f8ecc7dd1ff51b0c127b7bd137800d2286c2664a | 229 | py | Python | pyvsystems_rewards/format.py | belovachap/pyvsystems_rewards | 70786b08ef8ce8f96a91ac0983fd1e61c3db86cc | [
"BSD-3-Clause"
] | 3 | 2020-02-11T10:56:04.000Z | 2020-06-03T08:10:42.000Z | pyvsystems_rewards/format.py | belovachap/pyvsystems_rewards | 70786b08ef8ce8f96a91ac0983fd1e61c3db86cc | [
"BSD-3-Clause"
] | 1 | 2020-03-17T14:12:09.000Z | 2020-03-17T14:26:17.000Z | pyvsystems_rewards/format.py | virtualeconomy/pyvsystems_rewards | cd2fd3380195933c573f55700147e30b2c0f7789 | [
"BSD-3-Clause"
] | 2 | 2020-02-04T02:39:02.000Z | 2020-06-03T08:10:32.000Z |
def format_as_vsys(amount):
abs_amount = abs(amount)
whole = int(abs_amount / 100000000)
fraction = abs_amount % 100000000
if amount < 0:
whole *= -1
return f'{whole}.{str(fraction).rjust(8, "0")}'
| 20.818182 | 51 | 0.615721 | 31 | 229 | 4.387097 | 0.580645 | 0.264706 | 0.220588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127168 | 0.244541 | 229 | 10 | 52 | 22.9 | 0.65896 | 0 | 0 | 0 | 0 | 0 | 0.162281 | 0.135965 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f8f704e32ff2a93014bbaf86f8073b87d7f1220b | 174 | py | Python | fernet/configuration.py | heroku/fernet-py | 02a739408b4eec774102c3588ae31bd28b708b57 | [
"MIT"
] | 2 | 2015-11-05T07:41:12.000Z | 2016-02-03T13:26:34.000Z | fernet/configuration.py | heroku/fernet-py | 02a739408b4eec774102c3588ae31bd28b708b57 | [
"MIT"
] | null | null | null | fernet/configuration.py | heroku/fernet-py | 02a739408b4eec774102c3588ae31bd28b708b57 | [
"MIT"
] | 2 | 2016-10-08T19:20:02.000Z | 2019-10-08T17:33:54.000Z | __author__ = 'spersinger'
class Configuration:
@staticmethod
def run():
Configuration.enforce_ttl = True
Configuration.ttl = 60
Configuration.run()
| 17.4 | 40 | 0.678161 | 16 | 174 | 7.0625 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015038 | 0.235632 | 174 | 9 | 41 | 19.333333 | 0.834586 | 0 | 0 | 0 | 0 | 0 | 0.057471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5d05a85a2d19382988a8508cf7434817a426fae2 | 24,297 | py | Python | terra_sdk/protobuf/ibc/core/client/v1/tx_pb2.py | sejalsahni/terra.py | 0fd84969441c58427a21448520697c3ab3ec2d0c | [
"MIT"
] | 24 | 2021-05-30T05:48:33.000Z | 2021-10-07T04:47:15.000Z | terra_sdk/protobuf/ibc/core/client/v1/tx_pb2.py | sejalsahni/terra.py | 0fd84969441c58427a21448520697c3ab3ec2d0c | [
"MIT"
] | 18 | 2021-05-30T09:05:26.000Z | 2021-10-17T07:12:12.000Z | terra_sdk/protobuf/ibc/core/client/v1/tx_pb2.py | sejalsahni/terra.py | 0fd84969441c58427a21448520697c3ab3ec2d0c | [
"MIT"
] | 10 | 2021-02-11T00:56:04.000Z | 2021-05-27T08:37:49.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: ibc/core/client/v1/tx.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from gogoproto import gogo_pb2 as gogoproto_dot_gogo__pb2
from google.protobuf import any_pb2 as google_dot_protobuf_dot_any__pb2
from ibc.core.client.v1 import (
client_pb2 as ibc_dot_core_dot_client_dot_v1_dot_client__pb2,
)
DESCRIPTOR = _descriptor.FileDescriptor(
name="ibc/core/client/v1/tx.proto",
package="ibc.core.client.v1",
syntax="proto3",
serialized_options=b"Z5github.com/cosmos/ibc-go/modules/core/02-client/types",
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x1bibc/core/client/v1/tx.proto\x12\x12ibc.core.client.v1\x1a\x14gogoproto/gogo.proto\x1a\x19google/protobuf/any.proto\x1a\x1fibc/core/client/v1/client.proto"\xbb\x01\n\x0fMsgCreateClient\x12\x43\n\x0c\x63lient_state\x18\x01 \x01(\x0b\x32\x14.google.protobuf.AnyB\x17\xf2\xde\x1f\x13yaml:"client_state"\x12I\n\x0f\x63onsensus_state\x18\x02 \x01(\x0b\x32\x14.google.protobuf.AnyB\x1a\xf2\xde\x1f\x16yaml:"consensus_state"\x12\x0e\n\x06signer\x18\x03 \x01(\t:\x08\xe8\xa0\x1f\x00\x88\xa0\x1f\x00"\x19\n\x17MsgCreateClientResponse"z\n\x0fMsgUpdateClient\x12\'\n\tclient_id\x18\x01 \x01(\tB\x14\xf2\xde\x1f\x10yaml:"client_id"\x12$\n\x06header\x18\x02 \x01(\x0b\x32\x14.google.protobuf.Any\x12\x0e\n\x06signer\x18\x03 \x01(\t:\x08\xe8\xa0\x1f\x00\x88\xa0\x1f\x00"\x19\n\x17MsgUpdateClientResponse"\xf5\x02\n\x10MsgUpgradeClient\x12\'\n\tclient_id\x18\x01 \x01(\tB\x14\xf2\xde\x1f\x10yaml:"client_id"\x12\x43\n\x0c\x63lient_state\x18\x02 \x01(\x0b\x32\x14.google.protobuf.AnyB\x17\xf2\xde\x1f\x13yaml:"client_state"\x12I\n\x0f\x63onsensus_state\x18\x03 \x01(\x0b\x32\x14.google.protobuf.AnyB\x1a\xf2\xde\x1f\x16yaml:"consensus_state"\x12=\n\x14proof_upgrade_client\x18\x04 \x01(\x0c\x42\x1f\xf2\xde\x1f\x1byaml:"proof_upgrade_client"\x12O\n\x1dproof_upgrade_consensus_state\x18\x05 \x01(\x0c\x42(\xf2\xde\x1f$yaml:"proof_upgrade_consensus_state"\x12\x0e\n\x06signer\x18\x06 \x01(\t:\x08\xe8\xa0\x1f\x00\x88\xa0\x1f\x00"\x1a\n\x18MsgUpgradeClientResponse"\x86\x01\n\x15MsgSubmitMisbehaviour\x12\'\n\tclient_id\x18\x01 \x01(\tB\x14\xf2\xde\x1f\x10yaml:"client_id"\x12*\n\x0cmisbehaviour\x18\x02 \x01(\x0b\x32\x14.google.protobuf.Any\x12\x0e\n\x06signer\x18\x03 
\x01(\t:\x08\xe8\xa0\x1f\x00\x88\xa0\x1f\x00"\x1f\n\x1dMsgSubmitMisbehaviourResponse2\xa2\x03\n\x03Msg\x12`\n\x0c\x43reateClient\x12#.ibc.core.client.v1.MsgCreateClient\x1a+.ibc.core.client.v1.MsgCreateClientResponse\x12`\n\x0cUpdateClient\x12#.ibc.core.client.v1.MsgUpdateClient\x1a+.ibc.core.client.v1.MsgUpdateClientResponse\x12\x63\n\rUpgradeClient\x12$.ibc.core.client.v1.MsgUpgradeClient\x1a,.ibc.core.client.v1.MsgUpgradeClientResponse\x12r\n\x12SubmitMisbehaviour\x12).ibc.core.client.v1.MsgSubmitMisbehaviour\x1a\x31.ibc.core.client.v1.MsgSubmitMisbehaviourResponseB7Z5github.com/cosmos/ibc-go/modules/core/02-client/typesb\x06proto3',
dependencies=[
gogoproto_dot_gogo__pb2.DESCRIPTOR,
google_dot_protobuf_dot_any__pb2.DESCRIPTOR,
ibc_dot_core_dot_client_dot_v1_dot_client__pb2.DESCRIPTOR,
],
)
_MSGCREATECLIENT = _descriptor.Descriptor(
name="MsgCreateClient",
full_name="ibc.core.client.v1.MsgCreateClient",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name="client_state",
full_name="ibc.core.client.v1.MsgCreateClient.client_state",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\023yaml:"client_state"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="consensus_state",
full_name="ibc.core.client.v1.MsgCreateClient.consensus_state",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\026yaml:"consensus_state"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="signer",
full_name="ibc.core.client.v1.MsgCreateClient.signer",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=b"\350\240\037\000\210\240\037\000",
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=134,
serialized_end=321,
)
_MSGCREATECLIENTRESPONSE = _descriptor.Descriptor(
name="MsgCreateClientResponse",
full_name="ibc.core.client.v1.MsgCreateClientResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=323,
serialized_end=348,
)
_MSGUPDATECLIENT = _descriptor.Descriptor(
name="MsgUpdateClient",
full_name="ibc.core.client.v1.MsgUpdateClient",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name="client_id",
full_name="ibc.core.client.v1.MsgUpdateClient.client_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\020yaml:"client_id"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="header",
full_name="ibc.core.client.v1.MsgUpdateClient.header",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="signer",
full_name="ibc.core.client.v1.MsgUpdateClient.signer",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=b"\350\240\037\000\210\240\037\000",
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=350,
serialized_end=472,
)
_MSGUPDATECLIENTRESPONSE = _descriptor.Descriptor(
name="MsgUpdateClientResponse",
full_name="ibc.core.client.v1.MsgUpdateClientResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=474,
serialized_end=499,
)
_MSGUPGRADECLIENT = _descriptor.Descriptor(
name="MsgUpgradeClient",
full_name="ibc.core.client.v1.MsgUpgradeClient",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name="client_id",
full_name="ibc.core.client.v1.MsgUpgradeClient.client_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\020yaml:"client_id"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="client_state",
full_name="ibc.core.client.v1.MsgUpgradeClient.client_state",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\023yaml:"client_state"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="consensus_state",
full_name="ibc.core.client.v1.MsgUpgradeClient.consensus_state",
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\026yaml:"consensus_state"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="proof_upgrade_client",
full_name="ibc.core.client.v1.MsgUpgradeClient.proof_upgrade_client",
index=3,
number=4,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\033yaml:"proof_upgrade_client"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="proof_upgrade_consensus_state",
full_name="ibc.core.client.v1.MsgUpgradeClient.proof_upgrade_consensus_state",
index=4,
number=5,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037$yaml:"proof_upgrade_consensus_state"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="signer",
full_name="ibc.core.client.v1.MsgUpgradeClient.signer",
index=5,
number=6,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=b"\350\240\037\000\210\240\037\000",
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=502,
serialized_end=875,
)
_MSGUPGRADECLIENTRESPONSE = _descriptor.Descriptor(
name="MsgUpgradeClientResponse",
full_name="ibc.core.client.v1.MsgUpgradeClientResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=877,
serialized_end=903,
)
_MSGSUBMITMISBEHAVIOUR = _descriptor.Descriptor(
name="MsgSubmitMisbehaviour",
full_name="ibc.core.client.v1.MsgSubmitMisbehaviour",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name="client_id",
full_name="ibc.core.client.v1.MsgSubmitMisbehaviour.client_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=b'\362\336\037\020yaml:"client_id"',
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="misbehaviour",
full_name="ibc.core.client.v1.MsgSubmitMisbehaviour.misbehaviour",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
_descriptor.FieldDescriptor(
name="signer",
full_name="ibc.core.client.v1.MsgSubmitMisbehaviour.signer",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=b"\350\240\037\000\210\240\037\000",
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=906,
serialized_end=1040,
)
_MSGSUBMITMISBEHAVIOURRESPONSE = _descriptor.Descriptor(
name="MsgSubmitMisbehaviourResponse",
full_name="ibc.core.client.v1.MsgSubmitMisbehaviourResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1042,
serialized_end=1073,
)
_MSGCREATECLIENT.fields_by_name[
"client_state"
].message_type = google_dot_protobuf_dot_any__pb2._ANY
_MSGCREATECLIENT.fields_by_name[
"consensus_state"
].message_type = google_dot_protobuf_dot_any__pb2._ANY
_MSGUPDATECLIENT.fields_by_name[
"header"
].message_type = google_dot_protobuf_dot_any__pb2._ANY
_MSGUPGRADECLIENT.fields_by_name[
"client_state"
].message_type = google_dot_protobuf_dot_any__pb2._ANY
_MSGUPGRADECLIENT.fields_by_name[
"consensus_state"
].message_type = google_dot_protobuf_dot_any__pb2._ANY
_MSGSUBMITMISBEHAVIOUR.fields_by_name[
"misbehaviour"
].message_type = google_dot_protobuf_dot_any__pb2._ANY
DESCRIPTOR.message_types_by_name["MsgCreateClient"] = _MSGCREATECLIENT
DESCRIPTOR.message_types_by_name["MsgCreateClientResponse"] = _MSGCREATECLIENTRESPONSE
DESCRIPTOR.message_types_by_name["MsgUpdateClient"] = _MSGUPDATECLIENT
DESCRIPTOR.message_types_by_name["MsgUpdateClientResponse"] = _MSGUPDATECLIENTRESPONSE
DESCRIPTOR.message_types_by_name["MsgUpgradeClient"] = _MSGUPGRADECLIENT
DESCRIPTOR.message_types_by_name["MsgUpgradeClientResponse"] = _MSGUPGRADECLIENTRESPONSE
DESCRIPTOR.message_types_by_name["MsgSubmitMisbehaviour"] = _MSGSUBMITMISBEHAVIOUR
DESCRIPTOR.message_types_by_name[
"MsgSubmitMisbehaviourResponse"
] = _MSGSUBMITMISBEHAVIOURRESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
MsgCreateClient = _reflection.GeneratedProtocolMessageType(
"MsgCreateClient",
(_message.Message,),
{
"DESCRIPTOR": _MSGCREATECLIENT,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgCreateClient)
},
)
_sym_db.RegisterMessage(MsgCreateClient)
MsgCreateClientResponse = _reflection.GeneratedProtocolMessageType(
"MsgCreateClientResponse",
(_message.Message,),
{
"DESCRIPTOR": _MSGCREATECLIENTRESPONSE,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgCreateClientResponse)
},
)
_sym_db.RegisterMessage(MsgCreateClientResponse)
MsgUpdateClient = _reflection.GeneratedProtocolMessageType(
"MsgUpdateClient",
(_message.Message,),
{
"DESCRIPTOR": _MSGUPDATECLIENT,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgUpdateClient)
},
)
_sym_db.RegisterMessage(MsgUpdateClient)
MsgUpdateClientResponse = _reflection.GeneratedProtocolMessageType(
"MsgUpdateClientResponse",
(_message.Message,),
{
"DESCRIPTOR": _MSGUPDATECLIENTRESPONSE,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgUpdateClientResponse)
},
)
_sym_db.RegisterMessage(MsgUpdateClientResponse)
MsgUpgradeClient = _reflection.GeneratedProtocolMessageType(
"MsgUpgradeClient",
(_message.Message,),
{
"DESCRIPTOR": _MSGUPGRADECLIENT,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgUpgradeClient)
},
)
_sym_db.RegisterMessage(MsgUpgradeClient)
MsgUpgradeClientResponse = _reflection.GeneratedProtocolMessageType(
"MsgUpgradeClientResponse",
(_message.Message,),
{
"DESCRIPTOR": _MSGUPGRADECLIENTRESPONSE,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgUpgradeClientResponse)
},
)
_sym_db.RegisterMessage(MsgUpgradeClientResponse)
MsgSubmitMisbehaviour = _reflection.GeneratedProtocolMessageType(
"MsgSubmitMisbehaviour",
(_message.Message,),
{
"DESCRIPTOR": _MSGSUBMITMISBEHAVIOUR,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgSubmitMisbehaviour)
},
)
_sym_db.RegisterMessage(MsgSubmitMisbehaviour)
MsgSubmitMisbehaviourResponse = _reflection.GeneratedProtocolMessageType(
"MsgSubmitMisbehaviourResponse",
(_message.Message,),
{
"DESCRIPTOR": _MSGSUBMITMISBEHAVIOURRESPONSE,
"__module__": "ibc.core.client.v1.tx_pb2"
# @@protoc_insertion_point(class_scope:ibc.core.client.v1.MsgSubmitMisbehaviourResponse)
},
)
_sym_db.RegisterMessage(MsgSubmitMisbehaviourResponse)
DESCRIPTOR._options = None
_MSGCREATECLIENT.fields_by_name["client_state"]._options = None
_MSGCREATECLIENT.fields_by_name["consensus_state"]._options = None
_MSGCREATECLIENT._options = None
_MSGUPDATECLIENT.fields_by_name["client_id"]._options = None
_MSGUPDATECLIENT._options = None
_MSGUPGRADECLIENT.fields_by_name["client_id"]._options = None
_MSGUPGRADECLIENT.fields_by_name["client_state"]._options = None
_MSGUPGRADECLIENT.fields_by_name["consensus_state"]._options = None
_MSGUPGRADECLIENT.fields_by_name["proof_upgrade_client"]._options = None
_MSGUPGRADECLIENT.fields_by_name["proof_upgrade_consensus_state"]._options = None
_MSGUPGRADECLIENT._options = None
_MSGSUBMITMISBEHAVIOUR.fields_by_name["client_id"]._options = None
_MSGSUBMITMISBEHAVIOUR._options = None
_MSG = _descriptor.ServiceDescriptor(
name="Msg",
full_name="ibc.core.client.v1.Msg",
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=1076,
serialized_end=1494,
methods=[
_descriptor.MethodDescriptor(
name="CreateClient",
full_name="ibc.core.client.v1.Msg.CreateClient",
index=0,
containing_service=None,
input_type=_MSGCREATECLIENT,
output_type=_MSGCREATECLIENTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name="UpdateClient",
full_name="ibc.core.client.v1.Msg.UpdateClient",
index=1,
containing_service=None,
input_type=_MSGUPDATECLIENT,
output_type=_MSGUPDATECLIENTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name="UpgradeClient",
full_name="ibc.core.client.v1.Msg.UpgradeClient",
index=2,
containing_service=None,
input_type=_MSGUPGRADECLIENT,
output_type=_MSGUPGRADECLIENTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name="SubmitMisbehaviour",
full_name="ibc.core.client.v1.Msg.SubmitMisbehaviour",
index=3,
containing_service=None,
input_type=_MSGSUBMITMISBEHAVIOUR,
output_type=_MSGSUBMITMISBEHAVIOURRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
],
)
_sym_db.RegisterServiceDescriptor(_MSG)
DESCRIPTOR.services_by_name["Msg"] = _MSG
# @@protoc_insertion_point(module_scope)
# File: boa3_test/test_sc/built_in_methods_test/ClearTuple.py (hal0x2328/neo3-boa, Apache-2.0)
from typing import Tuple
def Main(op: str, args: list) -> Tuple[int]:
    a = (1, 2, 3)
    # Intentionally unsupported: tuple is immutable and has no clear();
    # this smart-contract test case expects the boa3 compiler to reject it.
    a.clear()
    return a
# File: examples/courseware/strings.py (LettError/drawbot, BSD-2-Clause)
print 'this is a so called "string"'
print "this is a so called 'string'"
print "this is a so called \"string\""
print "one string " + "another string"
a = "one string"
b = "another string"
print a + " " + b
print "many " * 10
print "non-ascii should generally work:"
print "Åbenrå © Ђ ק"
print "and now an error:"
print "many " * 10.0
# string multiplication really wants an
# integer number; a float that happens to
# be a whole number is not good enough
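The courseware above is Python 2 (statement-form `print`). A rough Python 3 equivalent of the same demonstrations, with the intentional final error caught explicitly, might look like this (a sketch, not part of the original courseware):

```python
# Python 3 sketch of the same string demonstrations.
print('this is a so called "string"')
print("this is a so called 'string'")
print("this is a so called \"string\"")
print("one string " + "another string")

a = "one string"
b = "another string"
print(a + " " + b)

print("many " * 10)

print("non-ascii should generally work:")
print("Åbenrå © Ђ ק")

print("and now an error:")
try:
    print("many " * 10.0)  # still a TypeError in Python 3
except TypeError as err:
    # string repetition requires an int; a whole-number float is rejected
    print("TypeError:", err)
```

The final lesson is unchanged in Python 3: `"many " * 10.0` raises `TypeError` even though `10.0` is a whole number.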
# File: Helper/Player/Player.py (jingege315/gobang, MIT)
from ..Base import *
class Player(object):
"""
the player playing gobang can be human or AI
"""
def __init__(self, chess_self: Chess):
self._chess_self = chess_self
    def get_next(self, board: BoardSave) -> (int, int):
        """
        decide the next move from the current board state

        :param board: the saved board state to examine
        :return: (x, y), the move for the next step;
            None means keep waiting for a human click
        """
        raise NotImplementedError()

    @staticmethod
    def is_auto() -> bool:
        """
        :return: whether this player is an AI that moves automatically
        """
        raise NotImplementedError()

    def get_chess_color(self) -> Chess:
        """
        :return: the chess color this player plays
        """
        return self._chess_self
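A minimal concrete subclass illustrates the contract above. The `SimpleAI` name and its fixed-move policy are invented for illustration, and lightweight stand-ins replace the `Chess` / `BoardSave` types that `..Base` would normally provide (their real shapes are not shown here):

```python
from enum import Enum


# Stand-ins for the Chess / BoardSave types imported from ..Base (assumed shapes).
class Chess(Enum):
    BLACK = 1
    WHITE = 2


class BoardSave:
    def __init__(self):
        self.moves = []


class Player(object):
    def __init__(self, chess_self: Chess):
        self._chess_self = chess_self

    def get_next(self, board: BoardSave) -> (int, int):
        raise NotImplementedError()

    @staticmethod
    def is_auto() -> bool:
        raise NotImplementedError()

    def get_chess_color(self) -> Chess:
        return self._chess_self


class SimpleAI(Player):
    """A toy AI that always claims the next point in sequence (illustrative only)."""

    def get_next(self, board: BoardSave) -> (int, int):
        # returns a move immediately instead of None (no human click to wait for)
        return (len(board.moves), 0)

    @staticmethod
    def is_auto() -> bool:
        return True


player = SimpleAI(Chess.BLACK)
print(player.is_auto())              # True
print(player.get_next(BoardSave()))  # (0, 0)
```

Because `is_auto()` returns True, a game loop would call `get_next` directly instead of waiting for a click event.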
# File: tests/test_loaders.py (HolmesNL/python-configuration, Apache-2.0)
from itertools import chain, groupby
from confidence import DEFAULT_LOAD_ORDER, loaders, Locality
from confidence.io import _LOADERS
def test_default_load_order_all_loaders():
all_loaders = set(chain.from_iterable(_LOADERS.values()))
assert len(all_loaders) == len(DEFAULT_LOAD_ORDER)
assert all(loader in DEFAULT_LOAD_ORDER for loader in all_loaders)
def test_default_load_order_locality():
localities = {loader: locality for locality, local_loaders in _LOADERS.items() for loader in local_loaders}
localities = map(localities.get, DEFAULT_LOAD_ORDER)
assert tuple(key for key, _ in groupby(localities)) == tuple(sorted(Locality))
def test_no_loaders():
assert tuple(*loaders()) == ()
def test_locality_loaders():
assert tuple(loaders(Locality.USER)) == _LOADERS[Locality.USER]
assert tuple(loaders(Locality.SYSTEM, Locality.APPLICATION)) == tuple(chain(_LOADERS[Locality.SYSTEM], _LOADERS[Locality.APPLICATION]))
assert tuple(loaders(Locality.ENVIRONMENT, Locality.ENVIRONMENT)) == tuple(chain(_LOADERS[Locality.ENVIRONMENT], _LOADERS[Locality.ENVIRONMENT]))
def test_loaders_mixed():
def function():
pass
assert tuple(loaders('just a string')) == ('just a string',)
assert tuple(loaders('just a string', function)) == ('just a string', function)
assert tuple(loaders(function, Locality.ENVIRONMENT, '{name}.{extension}')) == tuple(chain([function], _LOADERS[Locality.ENVIRONMENT], ['{name}.{extension}']))
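The `groupby` trick in `test_default_load_order_locality` — collapse consecutive duplicates, then compare against the sorted key order — is a general way to assert that a sequence is grouped and ordered by some key. A standalone sketch of the same idea (function name is illustrative, not from the library under test):

```python
from itertools import groupby


def is_grouped_in_order(seq, key=None):
    """True if seq's items form contiguous groups whose keys appear in sorted order."""
    keys = [k for k, _ in groupby(seq, key=key)]
    # groupby collapses consecutive duplicates, so the collapsed key sequence
    # must equal the sorted unique keys if the grouping is intact and ordered
    return keys == sorted(set(keys))


print(is_grouped_in_order([1, 1, 2, 2, 3]))  # True
print(is_grouped_in_order([1, 2, 1]))        # False: the group of 1s is split
```

The test above applies exactly this check, using each loader's locality as the key and `sorted(Locality)` as the expected order.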
# File: py/40.py (higgsd/euler, BSD-2-Clause)
# 210
def fdigit(n):
n -= 1
p = 9
d = 1
while n >= d * p:
n -= d * p
p *= 10
d += 1
v = (10 ** (d - 1)) + n / d
return int(str(v)[n % d])
p = 1
for i in xrange(7):
p *= fdigit(10 ** i)
print p
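The solution above is Python 2 (`xrange`, statement-form `print`, integer `/`). A Python 3 port of the same Champernowne-digit computation, preserving the digit-block arithmetic:

```python
def fdigit(n):
    """Return the n-th digit (1-indexed) of Champernowne's constant 0.123456789101112..."""
    n -= 1          # switch to a 0-indexed position
    p = 9           # count of d-digit numbers: 9, 90, 900, ...
    d = 1           # current digit length
    while n >= d * p:
        n -= d * p  # skip the whole block of d-digit numbers
        p *= 10
        d += 1
    v = 10 ** (d - 1) + n // d  # the number that contains the digit
    return int(str(v)[n % d])


p = 1
for i in range(7):
    p *= fdigit(10 ** i)
print(p)  # 210
```

The product d(1) x d(10) x d(100) x ... x d(1000000) = 1 x 1 x 5 x 3 x 7 x 2 x 1 = 210, matching the answer noted in the header comment.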
#!/usr/bin/env python
# File: zmk/nyokaBase/PMML43Ext.py (frenebo/ZMOD, Apache-2.0)
#
# Generated Wed Mar 13 13:31:45 2019 by generateDS.py version 2.28a.
#
# Command line options:
# ('--no-warnings', '')
# ('--export', 'write literal etree')
# ('--super', 'nyoka.PMML43ExtSuper')
# ('--subclass-suffix', '')
# ('-o', 'nyoka.PMML43ExtSuper.py')
# ('-s', 'nyoka.PMML43Ext.py')
# ('-b', 'behaviorsDir.xml')
# ('-f', '')
#
# Command line arguments:
# ..\nyoka.PMML43Ext.xsd
#
# Command line:
# C:\Projects\nyoka\nyoka\PMML43Ext\gds_local.py --no-warnings --export="write literal etree" --super="nyoka.PMML43ExtSuper" --subclass-suffix -o "nyoka.PMML43ExtSuper.py" -s "nyoka.PMML43Ext.py" -b "behaviorsDir.xml" -f ..\nyoka.PMML43Ext.xsd
#
# Current working directory (os.getcwd()):
# PMML43Ext
#
import sys
from lxml import etree as etree_
import nyokaBase.PMML43ExtSuper as supermod
def parsexml_(infile, parser=None, **kwargs):
if parser is None:
# Use the lxml ElementTree compatible parser so that, e.g.,
# we ignore comments.
parser = etree_.ETCompatXMLParser(huge_tree=True)
doc = etree_.parse(infile, parser=parser, **kwargs)
return doc
#
# Globals
#
ExternalEncoding = 'utf-8'
#
# Data representation classes
#
class AssociationModel(supermod.AssociationModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, numberOfTransactions=None, maxNumberOfItemsPerTA=None, avgNumberOfItemsPerTA=None, minimumSupport=None, minimumConfidence=None, lengthLimit=None, numberOfItems=None, numberOfItemsets=None, numberOfRules=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, LocalTransformations=None, Item=None, Itemset=None, AssociationRule=None, ModelVerification=None, Extension=None):
super(AssociationModel, self).__init__(modelName, functionName, algorithmName, numberOfTransactions, maxNumberOfItemsPerTA, avgNumberOfItemsPerTA, minimumSupport, minimumConfidence, lengthLimit, numberOfItems, numberOfItemsets, numberOfRules, isScorable, MiningSchema, Output, ModelStats, LocalTransformations, Item, Itemset, AssociationRule, ModelVerification, Extension, )
#
# XMLBehaviors
#
def set_Item(self, Item, *args):
self.Item = Item
self.numberOfItems = len(self.Item)
def set_Item_wrapper(self, Item, *args):
result = self.set_Item(Item, *args)
return result
def add_Item(self, value, *args):
self.Item.append(value)
self.numberOfItems = len(self.Item)
def add_Item_wrapper(self, value, *args):
result = self.add_Item(value, *args)
return result
def insert_Item_at(self, index, value, *args):
self.Item.insert(index, value)
self.numberOfItems = len(self.Item)
def insert_Item_at_wrapper(self, index, value, *args):
result = self.insert_Item_at(index, value, *args)
return result
def set_Itemset(self, Itemset, *args):
self.Itemset = Itemset
self.numberOfItemsets = len(self.Itemset)
def set_Itemset_wrapper(self, Itemset, *args):
result = self.set_Itemset(Itemset, *args)
return result
def add_Itemset(self, value, *args):
self.Itemset.append(value)
self.numberOfItemsets = len(self.Itemset)
def add_Itemset_wrapper(self, value, *args):
result = self.add_Itemset(value, *args)
return result
def insert_Itemset_at(self, index, value, *args):
self.Itemset.insert(index, value)
self.numberOfItemsets = len(self.Itemset)
def insert_Itemset_at_wrapper(self, index, value, *args):
result = self.insert_Itemset_at(index, value, *args)
return result
def set_AssociationRule(self, Rules, *args):
pass
def set_AssociationRule_wrapper(self, Rules, *args):
result = self.set_AssociationRule(Rules, *args)
return result
def add_AssociationRule(self, value, *args):
self.AssociationRule.append(value)
self.numberOfRules = len(self.AssociationRule)
def add_AssociationRule_wrapper(self, value, *args):
result = self.add_AssociationRule(value, *args)
return result
def insert_AssociationRule_at(self, index, value, *args):
self.AssociationRule.insert(index, value)
self.numberOfRules = len(self.AssociationRule)
def insert_AssociationRule_at_wrapper(self, index, value, *args):
result = self.insert_AssociationRule_at(index, value, *args)
return result
supermod.AssociationModel.subclass = AssociationModel
# end class AssociationModel
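The generated behaviors above keep `numberOfItems`, `numberOfItemsets`, and `numberOfRules` in sync with their list attributes whenever the lists are mutated through the set/add/insert API. That pattern, reduced to a self-contained sketch (the `CountedList` class and its names are illustrative, not part of the generated module):

```python
class CountedList:
    """Keeps a count attribute in sync with its backing list, mirroring the
    set_Item / add_Item / insert_Item_at behaviors generated above."""

    def __init__(self):
        self.items = []
        self.numberOfItems = 0

    def set_items(self, items):
        self.items = items
        self.numberOfItems = len(self.items)

    def add_item(self, value):
        self.items.append(value)
        self.numberOfItems = len(self.items)

    def insert_item_at(self, index, value):
        self.items.insert(index, value)
        self.numberOfItems = len(self.items)


c = CountedList()
c.add_item("beer")
c.insert_item_at(0, "chips")
print(c.numberOfItems)  # 2
```

Recomputing the count from `len()` after every mutation (rather than incrementing) keeps the attribute correct even when the backing list is replaced wholesale, which is why the generated `set_*` behaviors do the same.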
class Item(supermod.Item):
def __init__(self, id=None, value=None, field=None, category=None, mappedValue=None, weight=None, Extension=None):
super(Item, self).__init__(id, value, field, category, mappedValue, weight, Extension, )
#
# XMLBehaviors
#
supermod.Item.subclass = Item
# end class Item
class Itemset(supermod.Itemset):
def __init__(self, id=None, support=None, numberOfItems=None, Extension=None, ItemRef=None):
super(Itemset, self).__init__(id, support, numberOfItems, Extension, ItemRef, )
#
# XMLBehaviors
#
def set_ItemRef(self, ItemRef, *args):
self.ItemRef = ItemRef
self.numberOfItems = len(self.ItemRef)
def set_ItemRef_wrapper(self, ItemRef, *args):
result = self.set_ItemRef(ItemRef, *args)
return result
def add_ItemRef(self, value, *args):
self.ItemRef.append(value)
self.numberOfItems = len(self.ItemRef)
def add_ItemRef_wrapper(self, value, *args):
result = self.add_ItemRef(value, *args)
return result
def insert_ItemRef_at(self, index, value, *args):
self.ItemRef.insert(index, value)
self.numberOfItems = len(self.ItemRef)
def insert_ItemRef_at_wrapper(self, index, value, *args):
result = self.insert_ItemRef_at(index, value, *args)
return result
supermod.Itemset.subclass = Itemset
# end class Itemset
class ItemRef(supermod.ItemRef):
def __init__(self, itemRef=None, Extension=None):
super(ItemRef, self).__init__(itemRef, Extension, )
#
# XMLBehaviors
#
supermod.ItemRef.subclass = ItemRef
# end class ItemRef
class AssociationRule(supermod.AssociationRule):
def __init__(self, antecedent=None, consequent=None, support=None, confidence=None, lift=None, leverage=None, affinity=None, id=None, Extension=None):
super(AssociationRule, self).__init__(antecedent, consequent, support, confidence, lift, leverage, affinity, id, Extension, )
#
# XMLBehaviors
#
supermod.AssociationRule.subclass = AssociationRule
# end class AssociationRule
class BaselineModel(supermod.BaselineModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, TestDistributions=None, ModelVerification=None, Extension=None):
super(BaselineModel, self).__init__(modelName, functionName, algorithmName, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, TestDistributions, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.BaselineModel.subclass = BaselineModel
# end class BaselineModel
class TestDistributions(supermod.TestDistributions):
def __init__(self, field=None, testStatistic=None, resetValue='0.0', windowSize='0', weightField=None, normalizationScheme=None, Baseline=None, Alternate=None, Extension=None):
super(TestDistributions, self).__init__(field, testStatistic, resetValue, windowSize, weightField, normalizationScheme, Baseline, Alternate, Extension, )
#
# XMLBehaviors
#
supermod.TestDistributions.subclass = TestDistributions
# end class TestDistributions
class Baseline(supermod.Baseline):
def __init__(self, AnyDistribution=None, GaussianDistribution=None, PoissonDistribution=None, UniformDistribution=None, Extension=None, CountTable=None, NormalizedCountTable=None, FieldRef=None):
super(Baseline, self).__init__(AnyDistribution, GaussianDistribution, PoissonDistribution, UniformDistribution, Extension, CountTable, NormalizedCountTable, FieldRef, )
#
# XMLBehaviors
#
supermod.Baseline.subclass = Baseline
# end class Baseline
class Alternate(supermod.Alternate):
def __init__(self, AnyDistribution=None, GaussianDistribution=None, PoissonDistribution=None, UniformDistribution=None, Extension=None):
super(Alternate, self).__init__(AnyDistribution, GaussianDistribution, PoissonDistribution, UniformDistribution, Extension, )
#
# XMLBehaviors
#
supermod.Alternate.subclass = Alternate
# end class Alternate
class AnyDistribution(supermod.AnyDistribution):
def __init__(self, mean=None, variance=None, Extension=None):
super(AnyDistribution, self).__init__(mean, variance, Extension, )
#
# XMLBehaviors
#
supermod.AnyDistribution.subclass = AnyDistribution
# end class AnyDistribution
class GaussianDistribution(supermod.GaussianDistribution):
def __init__(self, mean=None, variance=None, Extension=None):
super(GaussianDistribution, self).__init__(mean, variance, Extension, )
#
# XMLBehaviors
#
supermod.GaussianDistribution.subclass = GaussianDistribution
# end class GaussianDistribution
class PoissonDistribution(supermod.PoissonDistribution):
def __init__(self, mean=None, Extension=None):
super(PoissonDistribution, self).__init__(mean, Extension, )
#
# XMLBehaviors
#
supermod.PoissonDistribution.subclass = PoissonDistribution
# end class PoissonDistribution
class UniformDistribution(supermod.UniformDistribution):
def __init__(self, lower=None, upper=None, Extension=None):
super(UniformDistribution, self).__init__(lower, upper, Extension, )
#
# XMLBehaviors
#
supermod.UniformDistribution.subclass = UniformDistribution
# end class UniformDistribution
class COUNT_TABLE_TYPE(supermod.COUNT_TABLE_TYPE):
def __init__(self, sample=None, Extension=None, FieldValue=None, FieldValueCount=None):
super(COUNT_TABLE_TYPE, self).__init__(sample, Extension, FieldValue, FieldValueCount, )
#
# XMLBehaviors
#
supermod.COUNT_TABLE_TYPE.subclass = COUNT_TABLE_TYPE
# end class COUNT_TABLE_TYPE
class FieldValue(supermod.FieldValue):
def __init__(self, field=None, value=None, Extension=None, FieldValue_member=None, FieldValueCount=None):
super(FieldValue, self).__init__(field, value, Extension, FieldValue_member, FieldValueCount, )
#
# XMLBehaviors
#
supermod.FieldValue.subclass = FieldValue
# end class FieldValue
class FieldValueCount(supermod.FieldValueCount):
def __init__(self, field=None, value=None, count=None, Extension=None):
super(FieldValueCount, self).__init__(field, value, count, Extension, )
#
# XMLBehaviors
#
supermod.FieldValueCount.subclass = FieldValueCount
# end class FieldValueCount
class BayesianNetworkModel(supermod.BayesianNetworkModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, BayesianNetworkNodes=None, ModelVerification=None, Extension=None):
super(BayesianNetworkModel, self).__init__(modelName, functionName, algorithmName, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, BayesianNetworkNodes, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.BayesianNetworkModel.subclass = BayesianNetworkModel
# end class BayesianNetworkModel
class BayesianNetworkNodes(supermod.BayesianNetworkNodes):
def __init__(self, Extension=None, DiscreteNode=None, ContinuousNode=None):
super(BayesianNetworkNodes, self).__init__(Extension, DiscreteNode, ContinuousNode, )
#
# XMLBehaviors
#
supermod.BayesianNetworkNodes.subclass = BayesianNetworkNodes
# end class BayesianNetworkNodes
class DiscreteNode(supermod.DiscreteNode):
def __init__(self, name=None, count=None, Extension=None, DerivedField=None, DiscreteConditionalProbability=None, ValueProbability=None):
super(DiscreteNode, self).__init__(name, count, Extension, DerivedField, DiscreteConditionalProbability, ValueProbability, )
#
# XMLBehaviors
#
supermod.DiscreteNode.subclass = DiscreteNode
# end class DiscreteNode
class ContinuousNode(supermod.ContinuousNode):
def __init__(self, name=None, count=None, Extension=None, DerivedField=None, ContinuousConditionalProbability=None, ContinuousDistribution=None):
super(ContinuousNode, self).__init__(name, count, Extension, DerivedField, ContinuousConditionalProbability, ContinuousDistribution, )
#
# XMLBehaviors
#
supermod.ContinuousNode.subclass = ContinuousNode
# end class ContinuousNode
class DiscreteConditionalProbability(supermod.DiscreteConditionalProbability):
def __init__(self, count=None, Extension=None, ParentValue=None, ValueProbability=None):
super(DiscreteConditionalProbability, self).__init__(count, Extension, ParentValue, ValueProbability, )
#
# XMLBehaviors
#
supermod.DiscreteConditionalProbability.subclass = DiscreteConditionalProbability
# end class DiscreteConditionalProbability
class ParentValue(supermod.ParentValue):
def __init__(self, parent=None, value=None, Extension=None):
super(ParentValue, self).__init__(parent, value, Extension, )
#
# XMLBehaviors
#
supermod.ParentValue.subclass = ParentValue
# end class ParentValue
class ValueProbability(supermod.ValueProbability):
def __init__(self, value=None, probability=None, Extension=None):
super(ValueProbability, self).__init__(value, probability, Extension, )
#
# XMLBehaviors
#
supermod.ValueProbability.subclass = ValueProbability
# end class ValueProbability
class ContinuousConditionalProbability(supermod.ContinuousConditionalProbability):
def __init__(self, count=None, Extension=None, ParentValue=None, ContinuousDistribution=None):
super(ContinuousConditionalProbability, self).__init__(count, Extension, ParentValue, ContinuousDistribution, )
#
# XMLBehaviors
#
supermod.ContinuousConditionalProbability.subclass = ContinuousConditionalProbability
# end class ContinuousConditionalProbability
class ContinuousDistribution(supermod.ContinuousDistribution):
def __init__(self, Extension=None, TriangularDistributionForBN=None, NormalDistributionForBN=None, LognormalDistributionForBN=None, UniformDistributionForBN=None):
super(ContinuousDistribution, self).__init__(Extension, TriangularDistributionForBN, NormalDistributionForBN, LognormalDistributionForBN, UniformDistributionForBN, )
#
# XMLBehaviors
#
supermod.ContinuousDistribution.subclass = ContinuousDistribution
# end class ContinuousDistribution
class TriangularDistributionForBN(supermod.TriangularDistributionForBN):
def __init__(self, Extension=None, Mean=None, Lower=None, Upper=None):
super(TriangularDistributionForBN, self).__init__(Extension, Mean, Lower, Upper, )
#
# XMLBehaviors
#
supermod.TriangularDistributionForBN.subclass = TriangularDistributionForBN
# end class TriangularDistributionForBN
class NormalDistributionForBN(supermod.NormalDistributionForBN):
def __init__(self, Extension=None, Mean=None, Variance=None):
super(NormalDistributionForBN, self).__init__(Extension, Mean, Variance, )
#
# XMLBehaviors
#
supermod.NormalDistributionForBN.subclass = NormalDistributionForBN
# end class NormalDistributionForBN
class LognormalDistributionForBN(supermod.LognormalDistributionForBN):
def __init__(self, Extension=None, Mean=None, Variance=None):
super(LognormalDistributionForBN, self).__init__(Extension, Mean, Variance, )
#
# XMLBehaviors
#
supermod.LognormalDistributionForBN.subclass = LognormalDistributionForBN
# end class LognormalDistributionForBN
class UniformDistributionForBN(supermod.UniformDistributionForBN):
def __init__(self, Extension=None, Lower=None, Upper=None):
super(UniformDistributionForBN, self).__init__(Extension, Lower, Upper, )
#
# XMLBehaviors
#
supermod.UniformDistributionForBN.subclass = UniformDistributionForBN
# end class UniformDistributionForBN
class Mean(supermod.Mean):
def __init__(self, Extension=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(Mean, self).__init__(Extension, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.Mean.subclass = Mean
# end class Mean
class Lower(supermod.Lower):
def __init__(self, Extension=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(Lower, self).__init__(Extension, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.Lower.subclass = Lower
# end class Lower
class Upper(supermod.Upper):
def __init__(self, Extension=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(Upper, self).__init__(Extension, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.Upper.subclass = Upper
# end class Upper
class Variance(supermod.Variance):
def __init__(self, Extension=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(Variance, self).__init__(Extension, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.Variance.subclass = Variance
# end class Variance
class ClusteringModel(supermod.ClusteringModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, modelClass=None, numberOfClusters=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, LocalTransformations=None, ComparisonMeasure=None, ClusteringField=None, MissingValueWeights=None, Cluster=None, ModelVerification=None, Extension=None):
super(ClusteringModel, self).__init__(modelName, functionName, algorithmName, modelClass, numberOfClusters, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, LocalTransformations, ComparisonMeasure, ClusteringField, MissingValueWeights, Cluster, ModelVerification, Extension, )
#
# XMLBehaviors
#
    def set_Cluster(self, Cluster, *args):
        self.Cluster = Cluster
        # Guard against a None assignment so the count attribute stays consistent
        self.numberOfClusters = len(self.Cluster) if self.Cluster is not None else None
def set_Cluster_wrapper(self, Cluster, *args):
result = self.set_Cluster(Cluster, *args)
return result
def add_Cluster(self, value, *args):
self.Cluster.append(value)
self.numberOfClusters = len(self.Cluster)
def add_Cluster_wrapper(self, value, *args):
result = self.add_Cluster(value, *args)
return result
def insert_Cluster_at(self, index, value, *args):
self.Cluster.insert(index, value)
self.numberOfClusters = len(self.Cluster)
def insert_Cluster_at_wrapper(self, index, value, *args):
result = self.insert_Cluster_at(index, value, *args)
return result
supermod.ClusteringModel.subclass = ClusteringModel
# end class ClusteringModel
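# A minimal usage sketch of the cluster-count bookkeeping above (illustrative
# only; the Cluster ids are made up):
#
#     model = ClusteringModel(modelName='km', functionName='clustering')
#     model.set_Cluster([Cluster(id='1'), Cluster(id='2')])
#     model.add_Cluster(Cluster(id='3'))
#     # numberOfClusters now tracks len(model.Cluster), i.e. 3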
class MissingValueWeights(supermod.MissingValueWeights):
def __init__(self, Extension=None, Array=None):
super(MissingValueWeights, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.MissingValueWeights.subclass = MissingValueWeights
# end class MissingValueWeights
class Cluster(supermod.Cluster):
def __init__(self, id=None, name=None, size=None, Extension=None, KohonenMap=None, Array=None, Partition=None, Covariances=None):
super(Cluster, self).__init__(id, name, size, Extension, KohonenMap, Array, Partition, Covariances, )
#
# XMLBehaviors
#
supermod.Cluster.subclass = Cluster
# end class Cluster
class KohonenMap(supermod.KohonenMap):
def __init__(self, coord1=None, coord2=None, coord3=None, Extension=None):
super(KohonenMap, self).__init__(coord1, coord2, coord3, Extension, )
#
# XMLBehaviors
#
supermod.KohonenMap.subclass = KohonenMap
# end class KohonenMap
class Covariances(supermod.Covariances):
def __init__(self, Extension=None, Matrix=None):
super(Covariances, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.Covariances.subclass = Covariances
# end class Covariances
class ClusteringField(supermod.ClusteringField):
def __init__(self, field=None, isCenterField='true', fieldWeight='1', similarityScale=None, compareFunction=None, Extension=None, Comparisons=None):
super(ClusteringField, self).__init__(field, isCenterField, fieldWeight, similarityScale, compareFunction, Extension, Comparisons, )
#
# XMLBehaviors
#
supermod.ClusteringField.subclass = ClusteringField
# end class ClusteringField
class Comparisons(supermod.Comparisons):
def __init__(self, Extension=None, Matrix=None):
super(Comparisons, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.Comparisons.subclass = Comparisons
# end class Comparisons
class ComparisonMeasure(supermod.ComparisonMeasure):
def __init__(self, kind=None, compareFunction='absDiff', minimum=None, maximum=None, Extension=None, euclidean=None, squaredEuclidean=None, chebychev=None, cityBlock=None, minkowski=None, simpleMatching=None, jaccard=None, tanimoto=None, binarySimilarity=None):
super(ComparisonMeasure, self).__init__(kind, compareFunction, minimum, maximum, Extension, euclidean, squaredEuclidean, chebychev, cityBlock, minkowski, simpleMatching, jaccard, tanimoto, binarySimilarity, )
#
# XMLBehaviors
#
supermod.ComparisonMeasure.subclass = ComparisonMeasure
# end class ComparisonMeasure
class euclidean(supermod.euclidean):
def __init__(self, Extension=None):
super(euclidean, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.euclidean.subclass = euclidean
# end class euclidean
class squaredEuclidean(supermod.squaredEuclidean):
def __init__(self, Extension=None):
super(squaredEuclidean, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.squaredEuclidean.subclass = squaredEuclidean
# end class squaredEuclidean
class cityBlock(supermod.cityBlock):
def __init__(self, Extension=None):
super(cityBlock, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.cityBlock.subclass = cityBlock
# end class cityBlock
class chebychev(supermod.chebychev):
def __init__(self, Extension=None):
super(chebychev, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.chebychev.subclass = chebychev
# end class chebychev
class minkowski(supermod.minkowski):
def __init__(self, p_parameter=None, Extension=None):
super(minkowski, self).__init__(p_parameter, Extension, )
#
# XMLBehaviors
#
supermod.minkowski.subclass = minkowski
# end class minkowski
class simpleMatching(supermod.simpleMatching):
def __init__(self, Extension=None):
super(simpleMatching, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.simpleMatching.subclass = simpleMatching
# end class simpleMatching
class jaccard(supermod.jaccard):
def __init__(self, Extension=None):
super(jaccard, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.jaccard.subclass = jaccard
# end class jaccard
class tanimoto(supermod.tanimoto):
def __init__(self, Extension=None):
super(tanimoto, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.tanimoto.subclass = tanimoto
# end class tanimoto
class binarySimilarity(supermod.binarySimilarity):
def __init__(self, c00_parameter=None, c01_parameter=None, c10_parameter=None, c11_parameter=None, d00_parameter=None, d01_parameter=None, d10_parameter=None, d11_parameter=None, Extension=None):
super(binarySimilarity, self).__init__(c00_parameter, c01_parameter, c10_parameter, c11_parameter, d00_parameter, d01_parameter, d10_parameter, d11_parameter, Extension, )
#
# XMLBehaviors
#
supermod.binarySimilarity.subclass = binarySimilarity
# end class binarySimilarity
class DataDictionary(supermod.DataDictionary):
def __init__(self, numberOfFields=None, Extension=None, DataField=None, Taxonomy=None):
super(DataDictionary, self).__init__(numberOfFields, Extension, DataField, Taxonomy, )
#
# XMLBehaviors
#
    def set_DataField(self, DataField, *args):
        self.DataField = DataField
        # Guard against a None assignment so the count attribute stays consistent
        self.numberOfFields = len(self.DataField) if self.DataField is not None else None
def set_DataField_wrapper(self, DataField, *args):
result = self.set_DataField(DataField, *args)
return result
def add_DataField(self, value, *args):
self.DataField.append(value)
self.numberOfFields = len(self.DataField)
def add_DataField_wrapper(self, value, *args):
result = self.add_DataField(value, *args)
return result
def insert_DataField_at(self, index, value, *args):
self.DataField.insert(index, value)
self.numberOfFields = len(self.DataField)
def insert_DataField_at_wrapper(self, index, value, *args):
result = self.insert_DataField_at(index, value, *args)
return result
supermod.DataDictionary.subclass = DataDictionary
# end class DataDictionary
class DataField(supermod.DataField):
def __init__(self, name=None, displayName=None, optype=None, dataType=None, mimeType=None, taxonomy=None, isCyclic='0', Extension=None, Interval=None, Value=None):
super(DataField, self).__init__(name, displayName, optype, dataType, mimeType, taxonomy, isCyclic, Extension, Interval, Value, )
#
# XMLBehaviors
#
supermod.DataField.subclass = DataField
# end class DataField
class Value(supermod.Value):
def __init__(self, value=None, displayValue=None, property='valid', Extension=None):
super(Value, self).__init__(value, displayValue, property, Extension, )
#
# XMLBehaviors
#
supermod.Value.subclass = Value
# end class Value
class Interval(supermod.Interval):
def __init__(self, closure=None, leftMargin=None, rightMargin=None, Extension=None):
super(Interval, self).__init__(closure, leftMargin, rightMargin, Extension, )
#
# XMLBehaviors
#
supermod.Interval.subclass = Interval
# end class Interval
class DefineFunction(supermod.DefineFunction):
def __init__(self, name=None, optype=None, dataType=None, Extension=None, ParameterField=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(DefineFunction, self).__init__(name, optype, dataType, Extension, ParameterField, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.DefineFunction.subclass = DefineFunction
# end class DefineFunction
class ParameterField(supermod.ParameterField):
def __init__(self, name=None, optype=None, dataType=None):
super(ParameterField, self).__init__(name, optype, dataType, )
#
# XMLBehaviors
#
supermod.ParameterField.subclass = ParameterField
# end class ParameterField
class Apply(supermod.Apply):
def __init__(self, function=None, mapMissingTo=None, defaultValue=None, invalidValueTreatment='returnInvalid', Extension=None, Apply_member=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(Apply, self).__init__(function, mapMissingTo, defaultValue, invalidValueTreatment, Extension, Apply_member, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.Apply.subclass = Apply
# end class Apply
class DeepNetwork(supermod.DeepNetwork):
def __init__(self, modelName=None, functionName=None, algorithmName=None, normalizationMethod='none', numberOfLayers=None, isScorable=True, Extension=None, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, TrainingParameters=None, NetworkLayer=None, NeuralOutputs=None, ModelVerification=None):
super(DeepNetwork, self).__init__(modelName, functionName, algorithmName, normalizationMethod, numberOfLayers, isScorable, Extension, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, TrainingParameters, NetworkLayer, NeuralOutputs, ModelVerification, )
#
# XMLBehaviors
#
    def set_NetworkLayer(self, NetworkLayer, *args):
        self.NetworkLayer = NetworkLayer
        # Guard against a None assignment so the count attribute stays consistent
        self.numberOfLayers = len(self.NetworkLayer) if self.NetworkLayer is not None else None
def set_NetworkLayer_wrapper(self, NetworkLayer, *args):
result = self.set_NetworkLayer(NetworkLayer, *args)
return result
def add_NetworkLayer(self, value, *args):
self.NetworkLayer.append(value)
self.numberOfLayers = len(self.NetworkLayer)
def add_NetworkLayer_wrapper(self, value, *args):
result = self.add_NetworkLayer(value, *args)
return result
def insert_NetworkLayer_at(self, index, value, *args):
self.NetworkLayer.insert(index, value)
self.numberOfLayers = len(self.NetworkLayer)
def insert_NetworkLayer_at_wrapper(self, index, value, *args):
result = self.insert_NetworkLayer_at(index, value, *args)
return result
supermod.DeepNetwork.subclass = DeepNetwork
# end class DeepNetwork
class NetworkLayer(supermod.NetworkLayer):
def __init__(self, normalizationMethod='none', layerType=None, layerId=None, connectionLayerId=None, inputFieldName=None, Extension=None, NetworkLayer_member=None, LayerParameters=None, LayerWeights=None, LayerBias=None):
super(NetworkLayer, self).__init__(normalizationMethod, layerType, layerId, connectionLayerId, inputFieldName, Extension, NetworkLayer_member, LayerParameters, LayerWeights, LayerBias, )
#
# XMLBehaviors
#
supermod.NetworkLayer.subclass = NetworkLayer
# end class NetworkLayer
class TrainingParameters(supermod.TrainingParameters):
def __init__(self, architectureName=None, dataset=None, framework=None, Extension=None, Losses=None, Metrics=None, Optimizers=None):
super(TrainingParameters, self).__init__(architectureName, dataset, framework, Extension, Losses, Metrics, Optimizers, )
#
# XMLBehaviors
#
supermod.TrainingParameters.subclass = TrainingParameters
# end class TrainingParameters
class Metrics(supermod.Metrics):
def __init__(self, top_k_categories_for_accuracy=None, metric=None, Extension=None):
super(Metrics, self).__init__(top_k_categories_for_accuracy, metric, Extension, )
#
# XMLBehaviors
#
supermod.Metrics.subclass = Metrics
# end class Metrics
class Optimizers(supermod.Optimizers):
def __init__(self, clipnorm=None, clipvalue=None, Extension=None, SGD=None, RMSprop=None, Adagrad=None, Adadelta=None, Adam=None, Adamax=None, Nadam=None):
super(Optimizers, self).__init__(clipnorm, clipvalue, Extension, SGD, RMSprop, Adagrad, Adadelta, Adam, Adamax, Nadam, )
#
# XMLBehaviors
#
supermod.Optimizers.subclass = Optimizers
# end class Optimizers
class Losses(supermod.Losses):
def __init__(self, loss=None, Extension=None):
super(Losses, self).__init__(loss, Extension, )
#
# XMLBehaviors
#
supermod.Losses.subclass = Losses
# end class Losses
class SGD(supermod.SGD):
def __init__(self, learningRate=None, momentum=None, decayRate=None, nesterov=None, Extension=None):
super(SGD, self).__init__(learningRate, momentum, decayRate, nesterov, Extension, )
#
# XMLBehaviors
#
supermod.SGD.subclass = SGD
# end class SGD
class RMSprop(supermod.RMSprop):
def __init__(self, learningRate=None, rho=None, decayRate=None, epsilon=None, Extension=None):
super(RMSprop, self).__init__(learningRate, rho, decayRate, epsilon, Extension, )
#
# XMLBehaviors
#
supermod.RMSprop.subclass = RMSprop
# end class RMSprop
class Adagrad(supermod.Adagrad):
def __init__(self, learningRate=None, decayRate=None, epsilon=None, Extension=None):
super(Adagrad, self).__init__(learningRate, decayRate, epsilon, Extension, )
#
# XMLBehaviors
#
supermod.Adagrad.subclass = Adagrad
# end class Adagrad
class Adadelta(supermod.Adadelta):
def __init__(self, learningRate=None, rho=None, decayRate=None, epsilon=None, Extension=None):
super(Adadelta, self).__init__(learningRate, rho, decayRate, epsilon, Extension, )
#
# XMLBehaviors
#
supermod.Adadelta.subclass = Adadelta
# end class Adadelta
class Adam(supermod.Adam):
def __init__(self, learningRate=None, beta_1=None, beta_2=None, decayRate=None, epsilon=None, Extension=None):
super(Adam, self).__init__(learningRate, beta_1, beta_2, decayRate, epsilon, Extension, )
#
# XMLBehaviors
#
supermod.Adam.subclass = Adam
# end class Adam
class Adamax(supermod.Adamax):
def __init__(self, learningRate=None, beta_1=None, beta_2=None, decayRate=None, epsilon=None, Extension=None):
super(Adamax, self).__init__(learningRate, beta_1, beta_2, decayRate, epsilon, Extension, )
#
# XMLBehaviors
#
supermod.Adamax.subclass = Adamax
# end class Adamax
class Nadam(supermod.Nadam):
def __init__(self, learningRate=None, beta_1=None, beta_2=None, schedule_decay=None, epsilon=None, Extension=None):
super(Nadam, self).__init__(learningRate, beta_1, beta_2, schedule_decay, epsilon, Extension, )
#
# XMLBehaviors
#
supermod.Nadam.subclass = Nadam
# end class Nadam
class LayerWeights(supermod.LayerWeights):
def __init__(self, weightsShape=None, weightsFlattenAxis=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
super(LayerWeights, self).__init__(weightsShape, weightsFlattenAxis, Extension, valueOf_, mixedclass_, content_, )
#
# XMLBehaviors
#
def export(self, outfile, level, namespace_='', name_='LayerWeights', namespacedef_='', pretty_print=True, *args):
imported_ns_def_ = supermod.GenerateDSNamespaceDefs_.get('LayerWeights')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None:
name_ = self.original_tagname_
supermod.showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespace_, name_='LayerWeights')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
            if not pretty_print:
                # Strip layout whitespace only when the corresponding slot is populated;
                # an unguarded .replace() raises on None or an empty content_ list.
                if self.content_ and self.content_[0].value is not None:
                    self.content_[0].value = self.content_[0].value.replace('\t', '').replace(' ', '')
                if self.valueOf_ is not None:
                    self.valueOf_ = self.valueOf_.replace('\t', '').replace(' ', '')
self.exportChildren(outfile, level + 1, namespace_='', name_='LayerWeights', pretty_print=pretty_print)
outfile.write(eol_)
supermod.showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
    def export_wrapper(self, outfile, level, namespace_='', name_='LayerWeights', namespacedef_='', pretty_print=True, *args):
        # Forward the caller's arguments instead of re-passing the defaults
        result = self.export(outfile, level, namespace_, name_, namespacedef_, pretty_print, *args)
        return result
    def __init__(self, src=None, embedded=False, Extension=None, valueOf_=None, mixedclass_=None, content_=None, *args):
        self.original_tagname_ = None
        self.src = supermod._cast(None, src)
        if Extension is None:
            self.Extension = []
        else:
            # Store the argument, not the supermod.Extension class
            self.Extension = Extension
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = supermod.MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
    def __init___wrapper(self, src=None, embedded=False, Extension=None, valueOf_=None, mixedclass_=None, content_=None, *args):
        # Forward the caller's arguments instead of re-passing the defaults
        result = self.__init__(src, embedded, Extension, valueOf_, mixedclass_, content_, *args)
        return result
    def weights(self, *args):
        # Read the encoded payload either from the external file referenced by
        # src or from the element's mixed content; bail out if neither is set.
        if self.src is not None:
            with open(self.src, "r") as src_file:
                raw_content = src_file.read()
        elif self.content_ and self.content_[0].value is not None:
            raw_content = self.content_[0].value
        else:
            return None
        raw_content = raw_content.replace(' ', '')
        raw_content = raw_content.replace('\t', '')
        raw_content = raw_content.replace('\n', '')
        if raw_content.startswith(("data:float32;base64,", "data:float64;base64,", "data:float16;base64,")):
            raw_content = raw_content[20:] + "=="
        elif raw_content.startswith("data:float;base64,"):
            raw_content = raw_content[18:] + "=="
        else:
            return None
        from nyokaBase.Base64 import FloatBase64
        if raw_content.find("+") > 0:
            return FloatBase64.to_floatArray_urlsafe(raw_content)
        else:
            return FloatBase64.to_floatArray(raw_content)
def weights_wrapper(self, *args):
result = self.weights(*args)
return result
supermod.LayerWeights.subclass = LayerWeights
# end class LayerWeights
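# Usage sketch for LayerWeights.weights() (illustrative; 'weights.txt' is a
# hypothetical file holding a "data:float32;base64,..." payload):
#
#     lw = LayerWeights(src='weights.txt')
#     floats = lw.weights()  # decoded float array, or None for other encodings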
class LayerBias(supermod.LayerBias):
def __init__(self, biasShape=None, biasFlattenAxis=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
super(LayerBias, self).__init__(biasShape, biasFlattenAxis, Extension, valueOf_, mixedclass_, content_, )
#
# XMLBehaviors
#
def export(self, outfile, level, namespace_='', name_='LayerBias', namespacedef_='', pretty_print=True, *args):
imported_ns_def_ = supermod.GenerateDSNamespaceDefs_.get('LayerBias')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None:
name_ = self.original_tagname_
supermod.showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespace_, name_='LayerBias')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
            if not pretty_print:
                # Strip layout whitespace only when the corresponding slot is populated;
                # an unguarded .replace() raises on None or an empty content_ list.
                if self.content_ and self.content_[0].value is not None:
                    self.content_[0].value = self.content_[0].value.replace('\t', '').replace(' ', '')
                if self.valueOf_ is not None:
                    self.valueOf_ = self.valueOf_.replace('\t', '').replace(' ', '')
self.exportChildren(outfile, level + 1, namespace_='', name_='LayerBias', pretty_print=pretty_print)
outfile.write(eol_)
supermod.showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
    def export_wrapper(self, outfile, level, namespace_='', name_='LayerBias', namespacedef_='', pretty_print=True, *args):
        # Forward the caller's arguments instead of re-passing the defaults
        result = self.export(outfile, level, namespace_, name_, namespacedef_, pretty_print, *args)
        return result
    def weights(self, *args):
        # Read the encoded payload either from the external file referenced by
        # src or from the element's mixed content; bail out if neither is set.
        if self.src is not None:
            with open(self.src, "r") as src_file:
                raw_content = src_file.read()
        elif self.content_ and self.content_[0].value is not None:
            raw_content = self.content_[0].value
        else:
            return None
        raw_content = raw_content.replace(' ', '')
        raw_content = raw_content.replace('\t', '')
        raw_content = raw_content.replace('\n', '')
        if raw_content.startswith(("data:float32;base64,", "data:float64;base64,", "data:float16;base64,")):
            raw_content = raw_content[20:] + "=="
        elif raw_content.startswith("data:float;base64,"):
            raw_content = raw_content[18:] + "=="
        else:
            return None
        from nyokaBase.Base64 import FloatBase64
        if raw_content.find("+") > 0:
            return FloatBase64.to_floatArray_urlsafe(raw_content)
        else:
            return FloatBase64.to_floatArray(raw_content)
def weights_wrapper(self, *args):
result = self.weights(*args)
return result
    def __init__(self, src=None, embedded=False, Extension=None, valueOf_=None, mixedclass_=None, content_=None, *args):
        self.original_tagname_ = None
        self.src = supermod._cast(None, src)
        if Extension is None:
            self.Extension = []
        else:
            # Store the argument, not the supermod.Extension class
            self.Extension = Extension
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = supermod.MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
    def __init___wrapper(self, src=None, embedded=False, Extension=None, valueOf_=None, mixedclass_=None, content_=None, *args):
        # Forward the caller's arguments instead of re-passing the defaults
        result = self.__init__(src, embedded, Extension, valueOf_, mixedclass_, content_, *args)
        return result
supermod.LayerBias.subclass = LayerBias
# end class LayerBias
class LayerParameters(supermod.LayerParameters):
def __init__(self, activationFunction=None, inputDimension=None, outputDimension=None, featureMaps=None, kernel=None, paddingType=None, stride=None, dilationRate=None, poolSize=None, depthMultiplier=None, paddingDims=None, croppingDims=None, upsamplingSize=None, return_sequences=None, return_state=None, stateful=None, inputLength=None, recurrentUnits=None, recurrentActivation=None, recurrentDropout=None, go_backwards=None, batchNormalizationEpsilon=None, flattenAxis=None, batchNormalizationAxis=None, batchNormalizationMomentum=None, batchNormalizationCenter=None, batchNormalizationScale=None, gaussianNoiseStdev=None, gaussianDropoutRate=None, alphaDropoutRate=None, alphaDropoutSeed=None, betaInitializer=None, gammaInitializer=None, movingMeanInitializer=None, movingVarianceInitializer=None, recurrentInitializer=None, betaRegularizer=None, gammaRegularizer=None, betaConstraint=None, gammaConstraint=None, kernelInitializer=None, biasInitializer=None, kernelRegularizer=None, biasRegularizer=None, kernelConstraint=None, biasConstraint=None, depthwiseConstraint=None, pointwiseConstraint=None, recurrentConstraint=None, batchSize=None, dropoutRate=None, dropoutNoiseShape=None, dropoutSeed=None, generalLUAlpha=None, reshapeTarget=None, permuteDims=None, repeatVectorTimes=None, activityRegularizerL1=None, activityRegularizerL2=None, maskValue=None, mergeLayerOp=None, mergeLayerDotOperationAxis=None, mergeLayerDotNormalize=None, mergeLayerConcatOperationAxes=None, slicingAxis=None, max_value=None, trainable=None, units=None, function=None, pool_shape=None, proposal_count=None, nms_threshold=None, Extension=None):
super(LayerParameters, self).__init__(activationFunction, inputDimension, outputDimension, featureMaps, kernel, paddingType, stride, dilationRate, poolSize, depthMultiplier, paddingDims, croppingDims, upsamplingSize, return_sequences, return_state, stateful, inputLength, recurrentUnits, recurrentActivation, recurrentDropout, go_backwards, batchNormalizationEpsilon, flattenAxis, batchNormalizationAxis, batchNormalizationMomentum, batchNormalizationCenter, batchNormalizationScale, gaussianNoiseStdev, gaussianDropoutRate, alphaDropoutRate, alphaDropoutSeed, betaInitializer, gammaInitializer, movingMeanInitializer, movingVarianceInitializer, recurrentInitializer, betaRegularizer, gammaRegularizer, betaConstraint, gammaConstraint, kernelInitializer, biasInitializer, kernelRegularizer, biasRegularizer, kernelConstraint, biasConstraint, depthwiseConstraint, pointwiseConstraint, recurrentConstraint, batchSize, dropoutRate, dropoutNoiseShape, dropoutSeed, generalLUAlpha, reshapeTarget, permuteDims, repeatVectorTimes, activityRegularizerL1, activityRegularizerL2, maskValue, mergeLayerOp, mergeLayerDotOperationAxis, mergeLayerDotNormalize, mergeLayerConcatOperationAxes, slicingAxis, max_value, trainable, units, function, pool_shape, proposal_count, nms_threshold, Extension, )
#
# XMLBehaviors
#
supermod.LayerParameters.subclass = LayerParameters
# end class LayerParameters
class GaussianProcessModel(supermod.GaussianProcessModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, optimizer=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, RadialBasisKernel=None, ARDSquaredExponentialKernel=None, AbsoluteExponentialKernel=None, GeneralizedExponentialKernel=None, TrainingInstances=None, ModelVerification=None, Extension=None):
super(GaussianProcessModel, self).__init__(modelName, functionName, algorithmName, optimizer, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, RadialBasisKernel, ARDSquaredExponentialKernel, AbsoluteExponentialKernel, GeneralizedExponentialKernel, TrainingInstances, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.GaussianProcessModel.subclass = GaussianProcessModel
# end class GaussianProcessModel
class RadialBasisKernel(supermod.RadialBasisKernel):
def __init__(self, description=None, gamma='1', noiseVariance='1', lambda_='1', Extension=None):
super(RadialBasisKernel, self).__init__(description, gamma, noiseVariance, lambda_, Extension, )
#
# XMLBehaviors
#
supermod.RadialBasisKernel.subclass = RadialBasisKernel
# end class RadialBasisKernel
class ARDSquaredExponentialKernel(supermod.ARDSquaredExponentialKernel):
def __init__(self, description=None, gamma='1', noiseVariance='1', Extension=None, Lambda=None):
super(ARDSquaredExponentialKernel, self).__init__(description, gamma, noiseVariance, Extension, Lambda, )
#
# XMLBehaviors
#
supermod.ARDSquaredExponentialKernel.subclass = ARDSquaredExponentialKernel
# end class ARDSquaredExponentialKernel
class AbsoluteExponentialKernel(supermod.AbsoluteExponentialKernel):
def __init__(self, description=None, gamma='1', noiseVariance='1', Extension=None, Lambda=None):
super(AbsoluteExponentialKernel, self).__init__(description, gamma, noiseVariance, Extension, Lambda, )
#
# XMLBehaviors
#
supermod.AbsoluteExponentialKernel.subclass = AbsoluteExponentialKernel
# end class AbsoluteExponentialKernel
class GeneralizedExponentialKernel(supermod.GeneralizedExponentialKernel):
def __init__(self, description=None, gamma='1', noiseVariance='1', degree='1', Extension=None, Lambda=None):
super(GeneralizedExponentialKernel, self).__init__(description, gamma, noiseVariance, degree, Extension, Lambda, )
#
# XMLBehaviors
#
supermod.GeneralizedExponentialKernel.subclass = GeneralizedExponentialKernel
# end class GeneralizedExponentialKernel
class Lambda(supermod.Lambda):
def __init__(self, Extension=None, Array=None):
super(Lambda, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.Lambda.subclass = Lambda
# end class Lambda
class GeneralRegressionModel(supermod.GeneralRegressionModel):
def __init__(self, targetVariableName=None, modelType=None, modelName=None, functionName=None, algorithmName=None, targetReferenceCategory=None, cumulativeLink=None, linkFunction=None, linkParameter=None, trialsVariable=None, trialsValue=None, distribution=None, distParameter=None, offsetVariable=None, offsetValue=None, modelDF=None, endTimeVariable=None, startTimeVariable=None, subjectIDVariable=None, statusVariable=None, baselineStrataVariable=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, ParameterList=None, FactorList=None, CovariateList=None, PPMatrix=None, PCovMatrix=None, ParamMatrix=None, EventValues=None, BaseCumHazardTables=None, ModelVerification=None, Extension=None):
super(GeneralRegressionModel, self).__init__(targetVariableName, modelType, modelName, functionName, algorithmName, targetReferenceCategory, cumulativeLink, linkFunction, linkParameter, trialsVariable, trialsValue, distribution, distParameter, offsetVariable, offsetValue, modelDF, endTimeVariable, startTimeVariable, subjectIDVariable, statusVariable, baselineStrataVariable, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, ParameterList, FactorList, CovariateList, PPMatrix, PCovMatrix, ParamMatrix, EventValues, BaseCumHazardTables, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.GeneralRegressionModel.subclass = GeneralRegressionModel
# end class GeneralRegressionModel
class ParameterList(supermod.ParameterList):
def __init__(self, Extension=None, Parameter=None):
super(ParameterList, self).__init__(Extension, Parameter, )
#
# XMLBehaviors
#
supermod.ParameterList.subclass = ParameterList
# end class ParameterList
class Parameter(supermod.Parameter):
def __init__(self, name=None, value=None, Extension=None):
super(Parameter, self).__init__(name, value, Extension, )
#
# XMLBehaviors
#
supermod.Parameter.subclass = Parameter
# end class Parameter
class FactorList(supermod.FactorList):
def __init__(self, Extension=None, Predictor=None):
super(FactorList, self).__init__(Extension, Predictor, )
#
# XMLBehaviors
#
supermod.FactorList.subclass = FactorList
# end class FactorList
class CovariateList(supermod.CovariateList):
def __init__(self, Extension=None, Predictor=None):
super(CovariateList, self).__init__(Extension, Predictor, )
#
# XMLBehaviors
#
supermod.CovariateList.subclass = CovariateList
# end class CovariateList
class Predictor(supermod.Predictor):
def __init__(self, name=None, contrastMatrixType=None, Extension=None, Categories=None, Matrix=None):
super(Predictor, self).__init__(name, contrastMatrixType, Extension, Categories, Matrix, )
#
# XMLBehaviors
#
supermod.Predictor.subclass = Predictor
# end class Predictor
class Categories(supermod.Categories):
def __init__(self, Extension=None, Category=None):
super(Categories, self).__init__(Extension, Category, )
#
# XMLBehaviors
#
supermod.Categories.subclass = Categories
# end class Categories
class Category(supermod.Category):
def __init__(self, value=None, Extension=None):
super(Category, self).__init__(value, Extension, )
#
# XMLBehaviors
#
supermod.Category.subclass = Category
# end class Category
class PPMatrix(supermod.PPMatrix):
def __init__(self, Extension=None, PPCell=None):
super(PPMatrix, self).__init__(Extension, PPCell, )
#
# XMLBehaviors
#
supermod.PPMatrix.subclass = PPMatrix
# end class PPMatrix
class PPCell(supermod.PPCell):
def __init__(self, value=None, predictorName=None, parameterName=None, targetCategory=None, Extension=None):
super(PPCell, self).__init__(value, predictorName, parameterName, targetCategory, Extension, )
#
# XMLBehaviors
#
supermod.PPCell.subclass = PPCell
# end class PPCell
class PCovMatrix(supermod.PCovMatrix):
def __init__(self, type_=None, Extension=None, PCovCell=None):
super(PCovMatrix, self).__init__(type_, Extension, PCovCell, )
#
# XMLBehaviors
#
supermod.PCovMatrix.subclass = PCovMatrix
# end class PCovMatrix
class PCovCell(supermod.PCovCell):
def __init__(self, pRow=None, pCol=None, tRow=None, tCol=None, value=None, targetCategory=None, Extension=None):
super(PCovCell, self).__init__(pRow, pCol, tRow, tCol, value, targetCategory, Extension, )
#
# XMLBehaviors
#
supermod.PCovCell.subclass = PCovCell
# end class PCovCell
class ParamMatrix(supermod.ParamMatrix):
def __init__(self, Extension=None, PCell=None):
super(ParamMatrix, self).__init__(Extension, PCell, )
#
# XMLBehaviors
#
supermod.ParamMatrix.subclass = ParamMatrix
# end class ParamMatrix
class PCell(supermod.PCell):
def __init__(self, targetCategory=None, parameterName=None, beta=None, df=None, Extension=None):
super(PCell, self).__init__(targetCategory, parameterName, beta, df, Extension, )
#
# XMLBehaviors
#
supermod.PCell.subclass = PCell
# end class PCell
class BaseCumHazardTables(supermod.BaseCumHazardTables):
def __init__(self, maxTime=None, Extension=None, BaselineStratum=None, BaselineCell=None):
super(BaseCumHazardTables, self).__init__(maxTime, Extension, BaselineStratum, BaselineCell, )
#
# XMLBehaviors
#
supermod.BaseCumHazardTables.subclass = BaseCumHazardTables
# end class BaseCumHazardTables
class BaselineStratum(supermod.BaselineStratum):
def __init__(self, value=None, label=None, maxTime=None, Extension=None, BaselineCell=None):
super(BaselineStratum, self).__init__(value, label, maxTime, Extension, BaselineCell, )
#
# XMLBehaviors
#
supermod.BaselineStratum.subclass = BaselineStratum
# end class BaselineStratum
class BaselineCell(supermod.BaselineCell):
def __init__(self, time=None, cumHazard=None, Extension=None):
super(BaselineCell, self).__init__(time, cumHazard, Extension, )
#
# XMLBehaviors
#
supermod.BaselineCell.subclass = BaselineCell
# end class BaselineCell
class EventValues(supermod.EventValues):
def __init__(self, Extension=None, Value=None, Interval=None):
super(EventValues, self).__init__(Extension, Value, Interval, )
#
# XMLBehaviors
#
supermod.EventValues.subclass = EventValues
# end class EventValues
class PMML(supermod.PMML):
def __init__(self, version=None, Header=None, script=None, MiningBuildTask=None, DataDictionary=None, TransformationDictionary=None, AssociationModel=None, AnomalyDetectionModel=None, BayesianNetworkModel=None, BaselineModel=None, ClusteringModel=None, DeepNetwork=None, GaussianProcessModel=None, GeneralRegressionModel=None, MiningModel=None, NaiveBayesModel=None, NearestNeighborModel=None, NeuralNetwork=None, RegressionModel=None, RuleSetModel=None, SequenceModel=None, Scorecard=None, SupportVectorMachineModel=None, TextModel=None, TimeSeriesModel=None, TreeModel=None, Extension=None):
super(PMML, self).__init__(version, Header, script, MiningBuildTask, DataDictionary, TransformationDictionary, AssociationModel, AnomalyDetectionModel, BayesianNetworkModel, BaselineModel, ClusteringModel, DeepNetwork, GaussianProcessModel, GeneralRegressionModel, MiningModel, NaiveBayesModel, NearestNeighborModel, NeuralNetwork, RegressionModel, RuleSetModel, SequenceModel, Scorecard, SupportVectorMachineModel, TextModel, TimeSeriesModel, TreeModel, Extension, )
#
# XMLBehaviors
#
def export(self, outfile, level, namespace_='', name_='PMML', namespacedef_='', pretty_print=True, *args):
        imported_ns_def_ = supermod.GenerateDSNamespaceDefs_.get('PMML')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None:
name_ = self.original_tagname_
supermod.showIndent(outfile, level, pretty_print)
outfile.write('<?xml version="1.0" encoding="UTF-8"?>' + eol_)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
outfile.write(' xmlns="http://www.dmg.org/PMML-4_3"')
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='PMML')
        if self.hasContent_():
            outfile.write('>%s' % (eol_, ))
            self.exportChildren(outfile, level + 1, namespace_='', name_='PMML', pretty_print=pretty_print)
supermod.showIndent(outfile, 0, pretty_print)
outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def export_wrapper(self, outfile, level, namespace_='', name_='PMML', namespacedef_='', pretty_print=True, *args):
        result = self.export(outfile, level, namespace_, name_, namespacedef_, pretty_print, *args)
return result
supermod.PMML.subclass = PMML
# end class PMML
class MiningBuildTask(supermod.MiningBuildTask):
def __init__(self, Extension=None):
super(MiningBuildTask, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.MiningBuildTask.subclass = MiningBuildTask
# end class MiningBuildTask
class Extension(supermod.Extension):
def __init__(self, extender=None, name=None, value=None, anytypeobjs_=None):
super(Extension, self).__init__(extender, name, value, anytypeobjs_, )
#
# XMLBehaviors
#
def build(self, node, *args):
already_processed = set()
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = supermod.Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
if self.anytypeobjs_ == []:
if node.text is not None:
self.anytypeobjs_ = list(filter(None, [obj_.lstrip(' ') for obj_ in node.text.split('\n')]))
return self
supermod.Extension.subclass = Extension
# end class Extension
class ArrayType(supermod.ArrayType):
def __init__(self, n=None, type_=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
super(ArrayType, self).__init__(n, type_, Extension, valueOf_, mixedclass_, content_, )
#
# XMLBehaviors
#
def export(self, outfile, level, namespace_='', name_='ArrayType', namespacedef_='', pretty_print=True, *args):
imported_ns_def_ = supermod.GenerateDSNamespaceDefs_.get('ArrayType')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None:
name_ = self.original_tagname_
supermod.showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespace_, name_='ArrayType')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
            if not pretty_print and self.content_:
                self.content_[0].value = self.content_[0].value.replace('\t', '').replace(' ', '')
                self.valueOf_ = self.valueOf_.replace('\t', '').replace(' ', '')
self.exportChildren(outfile, level + 1, namespace_='', name_='ArrayType', pretty_print=pretty_print)
outfile.write(eol_)
supermod.showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def export_wrapper(self, outfile, level, namespace_='', name_='ArrayType', namespacedef_='', pretty_print=True, *args):
        result = self.export(outfile, level, namespace_, name_, namespacedef_, pretty_print, *args)
return result
supermod.ArrayType.subclass = ArrayType
# end class ArrayType
class INT_SparseArray(supermod.INT_SparseArray):
def __init__(self, n=None, defaultValue='0', Indices=None, INT_Entries=None):
super(INT_SparseArray, self).__init__(n, defaultValue, Indices, INT_Entries, )
#
# XMLBehaviors
#
supermod.INT_SparseArray.subclass = INT_SparseArray
# end class INT_SparseArray
class REAL_SparseArray(supermod.REAL_SparseArray):
def __init__(self, n=None, defaultValue='0', Indices=None, REAL_Entries=None):
super(REAL_SparseArray, self).__init__(n, defaultValue, Indices, REAL_Entries, )
#
# XMLBehaviors
#
supermod.REAL_SparseArray.subclass = REAL_SparseArray
# end class REAL_SparseArray
class Matrix(supermod.Matrix):
def __init__(self, kind='any', nbRows=None, nbCols=None, diagDefault=None, offDiagDefault=None, Array=None, MatCell=None):
super(Matrix, self).__init__(kind, nbRows, nbCols, diagDefault, offDiagDefault, Array, MatCell, )
#
# XMLBehaviors
#
supermod.Matrix.subclass = Matrix
# end class Matrix
class MatCell(supermod.MatCell):
def __init__(self, row=None, col=None, valueOf_=None):
super(MatCell, self).__init__(row, col, valueOf_, )
#
# XMLBehaviors
#
supermod.MatCell.subclass = MatCell
# end class MatCell
class Header(supermod.Header):
def __init__(self, copyright=None, description=None, modelVersion=None, Extension=None, Application=None, Annotation=None, Timestamp=None):
super(Header, self).__init__(copyright, description, modelVersion, Extension, Application, Annotation, Timestamp, )
#
# XMLBehaviors
#
def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Header', *args):
from datetime import datetime
if self.copyright is not None and 'copyright' not in already_processed:
if not self.copyright.endswith("Software AG"):
self.copyright += ", exported to PMML by Nyoka (c) " + str(datetime.now().year) + " Software AG"
already_processed.add('copyright')
outfile.write(' copyright=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.copyright), input_name='copyright')), ))
if self.description is not None and 'description' not in already_processed:
already_processed.add('description')
outfile.write(' description=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.description), input_name='description')), ))
if self.modelVersion is not None and 'modelVersion' not in already_processed:
already_processed.add('modelVersion')
outfile.write(' modelVersion=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.modelVersion), input_name='modelVersion')), ))
def exportAttributes_wrapper(self, outfile, level, already_processed, namespace_='', name_='Header', *args):
        result = self.exportAttributes(outfile, level, already_processed, namespace_, name_, *args)
return result
supermod.Header.subclass = Header
# end class Header
class script(supermod.script):
def __init__(self, for_=None, class_=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
super(script, self).__init__(for_, class_, Extension, valueOf_, mixedclass_, content_, )
#
# XMLBehaviors
#
def export(self, outfile, level, namespace_='', name_='script', namespacedef_='', pretty_print=True, *args):
imported_ns_def_ = supermod.GenerateDSNamespaceDefs_.get('script')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None:
name_ = self.original_tagname_
        supermod.showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespace_, name_='script')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
if pretty_print:
lines = []
code = self.valueOf_.lstrip('\n')
leading_spaces = len(code) - len(code.lstrip(' '))
for line in code.split('\n'):
lines.append(line[leading_spaces:])
code = '\n'.join(lines)
indent = " " * (level + 1)
count = code.count('\n')
indented = indent + code.replace("\n", "\n" + indent, count - 1)
self.content_ = [supermod.MixedContainer(1, 2, "", str(indented))]
self.valueOf_ = str(indented)
self.exportChildren(outfile, level + 1, namespace_='', name_='script', pretty_print=pretty_print)
            supermod.showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def export_wrapper(self, outfile, level, namespace_='', name_='script', namespacedef_='', pretty_print=True, *args):
        result = self.export(outfile, level, namespace_, name_, namespacedef_, pretty_print, *args)
return result
supermod.script.subclass = script
# end class script
class Application(supermod.Application):
def __init__(self, name=None, version=None, Extension=None):
super(Application, self).__init__(name, version, Extension, )
#
# XMLBehaviors
#
supermod.Application.subclass = Application
# end class Application
class Annotation(supermod.Annotation):
def __init__(self, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
super(Annotation, self).__init__(Extension, valueOf_, mixedclass_, content_, )
#
# XMLBehaviors
#
supermod.Annotation.subclass = Annotation
# end class Annotation
class Timestamp(supermod.Timestamp):
def __init__(self, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
super(Timestamp, self).__init__(Extension, valueOf_, mixedclass_, content_, )
#
# XMLBehaviors
#
def export(self, outfile, level, namespace_='', name_='Timestamp', namespacedef_='', pretty_print=True, *args):
imported_ns_def_ = supermod.GenerateDSNamespaceDefs_.get('Timestamp')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None:
name_ = self.original_tagname_
supermod.showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespace_, name_='Timestamp')
if self.hasContent_():
            outfile.write('>')  # deliberately no eol_: the Timestamp value stays inline
self.exportChildren(outfile, level + 1, namespace_='', name_='Timestamp', pretty_print=pretty_print)
supermod.showIndent(outfile, 0, pretty_print)
outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def export_wrapper(self, outfile, level, namespace_='', name_='Timestamp', namespacedef_='', pretty_print=True, *args):
        result = self.export(outfile, level, namespace_, name_, namespacedef_, pretty_print, *args)
return result
supermod.Timestamp.subclass = Timestamp
# end class Timestamp
class NearestNeighborModel(supermod.NearestNeighborModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, numberOfNeighbors=None, continuousScoringMethod='average', categoricalScoringMethod='majorityVote', instanceIdVariable=None, threshold='0.001', isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, TrainingInstances=None, ComparisonMeasure=None, KNNInputs=None, ModelVerification=None, Extension=None):
super(NearestNeighborModel, self).__init__(modelName, functionName, algorithmName, numberOfNeighbors, continuousScoringMethod, categoricalScoringMethod, instanceIdVariable, threshold, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, TrainingInstances, ComparisonMeasure, KNNInputs, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.NearestNeighborModel.subclass = NearestNeighborModel
# end class NearestNeighborModel
class TrainingInstances(supermod.TrainingInstances):
def __init__(self, isTransformed=False, recordCount=None, fieldCount=None, Extension=None, InstanceFields=None, TableLocator=None, InlineTable=None):
super(TrainingInstances, self).__init__(isTransformed, recordCount, fieldCount, Extension, InstanceFields, TableLocator, InlineTable, )
#
# XMLBehaviors
#
supermod.TrainingInstances.subclass = TrainingInstances
# end class TrainingInstances
class InstanceFields(supermod.InstanceFields):
def __init__(self, Extension=None, InstanceField=None):
super(InstanceFields, self).__init__(Extension, InstanceField, )
#
# XMLBehaviors
#
supermod.InstanceFields.subclass = InstanceFields
# end class InstanceFields
class InstanceField(supermod.InstanceField):
def __init__(self, field=None, column=None, Extension=None):
super(InstanceField, self).__init__(field, column, Extension, )
#
# XMLBehaviors
#
supermod.InstanceField.subclass = InstanceField
# end class InstanceField
class KNNInputs(supermod.KNNInputs):
def __init__(self, Extension=None, KNNInput=None):
super(KNNInputs, self).__init__(Extension, KNNInput, )
#
# XMLBehaviors
#
supermod.KNNInputs.subclass = KNNInputs
# end class KNNInputs
class KNNInput(supermod.KNNInput):
def __init__(self, field=None, fieldWeight='1', compareFunction=None, Extension=None):
super(KNNInput, self).__init__(field, fieldWeight, compareFunction, Extension, )
#
# XMLBehaviors
#
supermod.KNNInput.subclass = KNNInput
# end class KNNInput
class MiningSchema(supermod.MiningSchema):
def __init__(self, Extension=None, MiningField=None):
super(MiningSchema, self).__init__(Extension, MiningField, )
#
# XMLBehaviors
#
supermod.MiningSchema.subclass = MiningSchema
# end class MiningSchema
class MiningField(supermod.MiningField):
def __init__(self, name=None, usageType='active', optype=None, importance=None, outliers='asIs', lowValue=None, highValue=None, missingValueReplacement=None, missingValueTreatment=None, invalidValueTreatment='returnInvalid', Extension=None):
super(MiningField, self).__init__(name, usageType, optype, importance, outliers, lowValue, highValue, missingValueReplacement, missingValueTreatment, invalidValueTreatment, Extension, )
#
# XMLBehaviors
#
def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='MiningField', *args):
if self.name is not None and 'name' not in already_processed:
already_processed.add('name')
outfile.write(' name=%s' % (supermod.quote_attrib(self.name), ))
if self.usageType is not None and 'usageType' not in already_processed:
already_processed.add('usageType')
outfile.write(' usageType=%s' % (supermod.quote_attrib(self.usageType), ))
if self.optype is not None and 'optype' not in already_processed:
already_processed.add('optype')
outfile.write(' optype=%s' % (supermod.quote_attrib(self.optype), ))
if self.importance is not None and 'importance' not in already_processed:
already_processed.add('importance')
outfile.write(' importance=%s' % (supermod.quote_attrib(self.importance), ))
if self.outliers != "asIs" and 'outliers' not in already_processed:
already_processed.add('outliers')
outfile.write(' outliers=%s' % (supermod.quote_attrib(self.outliers), ))
if self.lowValue is not None and 'lowValue' not in already_processed:
already_processed.add('lowValue')
outfile.write(' lowValue=%s' % (supermod.quote_attrib(self.lowValue), ))
if self.highValue is not None and 'highValue' not in already_processed:
already_processed.add('highValue')
outfile.write(' highValue=%s' % (supermod.quote_attrib(self.highValue), ))
if self.missingValueReplacement is not None and 'missingValueReplacement' not in already_processed:
already_processed.add('missingValueReplacement')
outfile.write(' missingValueReplacement=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.missingValueReplacement), input_name='missingValueReplacement')), ))
if self.missingValueTreatment is not None and 'missingValueTreatment' not in already_processed:
already_processed.add('missingValueTreatment')
outfile.write(' missingValueTreatment=%s' % (supermod.quote_attrib(self.missingValueTreatment), ))
if self.invalidValueTreatment != "returnInvalid" and 'invalidValueTreatment' not in already_processed:
already_processed.add('invalidValueTreatment')
outfile.write(' invalidValueTreatment=%s' % (supermod.quote_attrib(self.invalidValueTreatment), ))
def exportAttributes_wrapper(self, outfile, level, already_processed, namespace_='', name_='MiningField', *args):
        result = self.exportAttributes(outfile, level, already_processed, namespace_, name_, *args)
return result
supermod.MiningField.subclass = MiningField
# end class MiningField
class ModelExplanation(supermod.ModelExplanation):
def __init__(self, Extension=None, PredictiveModelQuality=None, ClusteringModelQuality=None, Correlations=None):
super(ModelExplanation, self).__init__(Extension, PredictiveModelQuality, ClusteringModelQuality, Correlations, )
#
# XMLBehaviors
#
supermod.ModelExplanation.subclass = ModelExplanation
# end class ModelExplanation
class PredictiveModelQuality(supermod.PredictiveModelQuality):
def __init__(self, targetField=None, dataName=None, dataUsage='training', meanError=None, meanAbsoluteError=None, meanSquaredError=None, rootMeanSquaredError=None, r_squared=None, adj_r_squared=None, sumSquaredError=None, sumSquaredRegression=None, numOfRecords=None, numOfRecordsWeighted=None, numOfPredictors=None, degreesOfFreedom=None, fStatistic=None, AIC=None, BIC=None, AICc=None, Extension=None, ConfusionMatrix=None, LiftData=None, ROC=None):
super(PredictiveModelQuality, self).__init__(targetField, dataName, dataUsage, meanError, meanAbsoluteError, meanSquaredError, rootMeanSquaredError, r_squared, adj_r_squared, sumSquaredError, sumSquaredRegression, numOfRecords, numOfRecordsWeighted, numOfPredictors, degreesOfFreedom, fStatistic, AIC, BIC, AICc, Extension, ConfusionMatrix, LiftData, ROC, )
#
# XMLBehaviors
#
supermod.PredictiveModelQuality.subclass = PredictiveModelQuality
# end class PredictiveModelQuality
class ClusteringModelQuality(supermod.ClusteringModelQuality):
def __init__(self, dataName=None, SSE=None, SSB=None):
super(ClusteringModelQuality, self).__init__(dataName, SSE, SSB, )
#
# XMLBehaviors
#
supermod.ClusteringModelQuality.subclass = ClusteringModelQuality
# end class ClusteringModelQuality
class LiftData(supermod.LiftData):
def __init__(self, targetFieldValue=None, targetFieldDisplayValue=None, rankingQuality=None, Extension=None, ModelLiftGraph=None, OptimumLiftGraph=None, RandomLiftGraph=None):
super(LiftData, self).__init__(targetFieldValue, targetFieldDisplayValue, rankingQuality, Extension, ModelLiftGraph, OptimumLiftGraph, RandomLiftGraph, )
#
# XMLBehaviors
#
supermod.LiftData.subclass = LiftData
# end class LiftData
class ModelLiftGraph(supermod.ModelLiftGraph):
def __init__(self, Extension=None, LiftGraph=None):
super(ModelLiftGraph, self).__init__(Extension, LiftGraph, )
#
# XMLBehaviors
#
supermod.ModelLiftGraph.subclass = ModelLiftGraph
# end class ModelLiftGraph
class OptimumLiftGraph(supermod.OptimumLiftGraph):
def __init__(self, Extension=None, LiftGraph=None):
super(OptimumLiftGraph, self).__init__(Extension, LiftGraph, )
#
# XMLBehaviors
#
supermod.OptimumLiftGraph.subclass = OptimumLiftGraph
# end class OptimumLiftGraph
class RandomLiftGraph(supermod.RandomLiftGraph):
def __init__(self, Extension=None, LiftGraph=None):
super(RandomLiftGraph, self).__init__(Extension, LiftGraph, )
#
# XMLBehaviors
#
supermod.RandomLiftGraph.subclass = RandomLiftGraph
# end class RandomLiftGraph
class LiftGraph(supermod.LiftGraph):
def __init__(self, Extension=None, XCoordinates=None, YCoordinates=None, BoundaryValues=None, BoundaryValueMeans=None):
super(LiftGraph, self).__init__(Extension, XCoordinates, YCoordinates, BoundaryValues, BoundaryValueMeans, )
#
# XMLBehaviors
#
supermod.LiftGraph.subclass = LiftGraph
# end class LiftGraph
class XCoordinates(supermod.XCoordinates):
def __init__(self, Extension=None, Array=None):
super(XCoordinates, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.XCoordinates.subclass = XCoordinates
# end class XCoordinates
class YCoordinates(supermod.YCoordinates):
def __init__(self, Extension=None, Array=None):
super(YCoordinates, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.YCoordinates.subclass = YCoordinates
# end class YCoordinates
class BoundaryValues(supermod.BoundaryValues):
def __init__(self, Extension=None, Array=None):
super(BoundaryValues, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.BoundaryValues.subclass = BoundaryValues
# end class BoundaryValues
class BoundaryValueMeans(supermod.BoundaryValueMeans):
def __init__(self, Extension=None, Array=None):
super(BoundaryValueMeans, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.BoundaryValueMeans.subclass = BoundaryValueMeans
# end class BoundaryValueMeans
class ROC(supermod.ROC):
def __init__(self, positiveTargetFieldValue=None, positiveTargetFieldDisplayValue=None, negativeTargetFieldValue=None, negativeTargetFieldDisplayValue=None, Extension=None, ROCGraph=None):
super(ROC, self).__init__(positiveTargetFieldValue, positiveTargetFieldDisplayValue, negativeTargetFieldValue, negativeTargetFieldDisplayValue, Extension, ROCGraph, )
#
# XMLBehaviors
#
supermod.ROC.subclass = ROC
# end class ROC
class ROCGraph(supermod.ROCGraph):
def __init__(self, Extension=None, XCoordinates=None, YCoordinates=None, BoundaryValues=None):
super(ROCGraph, self).__init__(Extension, XCoordinates, YCoordinates, BoundaryValues, )
#
# XMLBehaviors
#
supermod.ROCGraph.subclass = ROCGraph
# end class ROCGraph
class ConfusionMatrix(supermod.ConfusionMatrix):
def __init__(self, Extension=None, ClassLabels=None, Matrix=None):
super(ConfusionMatrix, self).__init__(Extension, ClassLabels, Matrix, )
#
# XMLBehaviors
#
supermod.ConfusionMatrix.subclass = ConfusionMatrix
# end class ConfusionMatrix
class ClassLabels(supermod.ClassLabels):
def __init__(self, Extension=None, Array=None):
super(ClassLabels, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.ClassLabels.subclass = ClassLabels
# end class ClassLabels
class Correlations(supermod.Correlations):
def __init__(self, Extension=None, CorrelationFields=None, CorrelationValues=None, CorrelationMethods=None):
super(Correlations, self).__init__(Extension, CorrelationFields, CorrelationValues, CorrelationMethods, )
#
# XMLBehaviors
#
supermod.Correlations.subclass = Correlations
# end class Correlations
class CorrelationFields(supermod.CorrelationFields):
def __init__(self, Extension=None, Array=None):
super(CorrelationFields, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.CorrelationFields.subclass = CorrelationFields
# end class CorrelationFields
class CorrelationValues(supermod.CorrelationValues):
def __init__(self, Extension=None, Matrix=None):
super(CorrelationValues, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.CorrelationValues.subclass = CorrelationValues
# end class CorrelationValues
class CorrelationMethods(supermod.CorrelationMethods):
def __init__(self, Extension=None, Matrix=None):
super(CorrelationMethods, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.CorrelationMethods.subclass = CorrelationMethods
# end class CorrelationMethods
class ModelVerification(supermod.ModelVerification):
def __init__(self, recordCount=None, fieldCount=None, Extension=None, VerificationFields=None, InlineTable=None):
super(ModelVerification, self).__init__(recordCount, fieldCount, Extension, VerificationFields, InlineTable, )
#
# XMLBehaviors
#
supermod.ModelVerification.subclass = ModelVerification
# end class ModelVerification
class VerificationFields(supermod.VerificationFields):
def __init__(self, Extension=None, VerificationField=None):
super(VerificationFields, self).__init__(Extension, VerificationField, )
#
# XMLBehaviors
#
supermod.VerificationFields.subclass = VerificationFields
# end class VerificationFields
class VerificationField(supermod.VerificationField):
def __init__(self, field=None, column=None, precision=1E-6, zeroThreshold=1E-16, Extension=None):
super(VerificationField, self).__init__(field, column, precision, zeroThreshold, Extension, )
#
# XMLBehaviors
#
supermod.VerificationField.subclass = VerificationField
# end class VerificationField
class MiningModel(supermod.MiningModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, Regression=None, DecisionTree=None, Segmentation=None, ModelVerification=None, Extension=None):
super(MiningModel, self).__init__(modelName, functionName, algorithmName, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, Regression, DecisionTree, Segmentation, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.MiningModel.subclass = MiningModel
# end class MiningModel
class Segmentation(supermod.Segmentation):
def __init__(self, multipleModelMethod=None, Extension=None, Segment=None):
super(Segmentation, self).__init__(multipleModelMethod, Extension, Segment, )
#
# XMLBehaviors
#
supermod.Segmentation.subclass = Segmentation
# end class Segmentation
class Segment(supermod.Segment):
def __init__(self, id=None, weight='1', Extension=None, SimplePredicate=None, CompoundPredicate=None, SimpleSetPredicate=None, True_=None, False_=None, AssociationModel=None, AnomalyDetectionModel=None, BayesianNetworkModel=None, BaselineModel=None, ClusteringModel=None, DeepNetwork=None, GaussianProcessModel=None, GeneralRegressionModel=None, MiningModel=None, NaiveBayesModel=None, NearestNeighborModel=None, NeuralNetwork=None, RegressionModel=None, RuleSetModel=None, SequenceModel=None, Scorecard=None, SupportVectorMachineModel=None, TextModel=None, TimeSeriesModel=None, TreeModel=None):
super(Segment, self).__init__(id, weight, Extension, SimplePredicate, CompoundPredicate, SimpleSetPredicate, True_, False_, AssociationModel, AnomalyDetectionModel, BayesianNetworkModel, BaselineModel, ClusteringModel, DeepNetwork, GaussianProcessModel, GeneralRegressionModel, MiningModel, NaiveBayesModel, NearestNeighborModel, NeuralNetwork, RegressionModel, RuleSetModel, SequenceModel, Scorecard, SupportVectorMachineModel, TextModel, TimeSeriesModel, TreeModel, )
#
# XMLBehaviors
#
supermod.Segment.subclass = Segment
# end class Segment
class ResultField(supermod.ResultField):
def __init__(self, name=None, displayName=None, optype=None, dataType=None, feature=None, value=None, Extension=None):
super(ResultField, self).__init__(name, displayName, optype, dataType, feature, value, Extension, )
#
# XMLBehaviors
#
supermod.ResultField.subclass = ResultField
# end class ResultField
class Regression(supermod.Regression):
def __init__(self, modelName=None, functionName=None, algorithmName=None, normalizationMethod='none', Extension=None, Output=None, ModelStats=None, Targets=None, LocalTransformations=None, ResultField=None, RegressionTable=None):
super(Regression, self).__init__(modelName, functionName, algorithmName, normalizationMethod, Extension, Output, ModelStats, Targets, LocalTransformations, ResultField, RegressionTable, )
#
# XMLBehaviors
#
supermod.Regression.subclass = Regression
# end class Regression
class DecisionTree(supermod.DecisionTree):
def __init__(self, modelName=None, functionName=None, algorithmName=None, missingValueStrategy='none', missingValuePenalty='1.0', noTrueChildStrategy='returnNullPrediction', splitCharacteristic='multiSplit', Extension=None, Output=None, ModelStats=None, Targets=None, LocalTransformations=None, ResultField=None, Node=None):
super(DecisionTree, self).__init__(modelName, functionName, algorithmName, missingValueStrategy, missingValuePenalty, noTrueChildStrategy, splitCharacteristic, Extension, Output, ModelStats, Targets, LocalTransformations, ResultField, Node, )
#
# XMLBehaviors
#
supermod.DecisionTree.subclass = DecisionTree
# end class DecisionTree
class NaiveBayesModel(supermod.NaiveBayesModel):
def __init__(self, modelName=None, threshold=None, functionName=None, algorithmName=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, BayesInputs=None, BayesOutput=None, ModelVerification=None, Extension=None):
super(NaiveBayesModel, self).__init__(modelName, threshold, functionName, algorithmName, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, BayesInputs, BayesOutput, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.NaiveBayesModel.subclass = NaiveBayesModel
# end class NaiveBayesModel
class BayesInputs(supermod.BayesInputs):
def __init__(self, Extension=None, BayesInput=None):
super(BayesInputs, self).__init__(Extension, BayesInput, )
#
# XMLBehaviors
#
supermod.BayesInputs.subclass = BayesInputs
# end class BayesInputs
class BayesInput(supermod.BayesInput):
def __init__(self, fieldName=None, Extension=None, TargetValueStats=None, DerivedField=None, PairCounts=None):
super(BayesInput, self).__init__(fieldName, Extension, TargetValueStats, DerivedField, PairCounts, )
#
# XMLBehaviors
#
supermod.BayesInput.subclass = BayesInput
# end class BayesInput
class BayesOutput(supermod.BayesOutput):
def __init__(self, fieldName=None, Extension=None, TargetValueCounts=None):
super(BayesOutput, self).__init__(fieldName, Extension, TargetValueCounts, )
#
# XMLBehaviors
#
supermod.BayesOutput.subclass = BayesOutput
# end class BayesOutput
class TargetValueStats(supermod.TargetValueStats):
def __init__(self, Extension=None, TargetValueStat=None):
super(TargetValueStats, self).__init__(Extension, TargetValueStat, )
#
# XMLBehaviors
#
supermod.TargetValueStats.subclass = TargetValueStats
# end class TargetValueStats
class TargetValueStat(supermod.TargetValueStat):
def __init__(self, value=None, AnyDistribution=None, GaussianDistribution=None, PoissonDistribution=None, UniformDistribution=None, Extension=None):
super(TargetValueStat, self).__init__(value, AnyDistribution, GaussianDistribution, PoissonDistribution, UniformDistribution, Extension, )
#
# XMLBehaviors
#
supermod.TargetValueStat.subclass = TargetValueStat
# end class TargetValueStat
class PairCounts(supermod.PairCounts):
def __init__(self, value=None, Extension=None, TargetValueCounts=None):
super(PairCounts, self).__init__(value, Extension, TargetValueCounts, )
#
# XMLBehaviors
#
supermod.PairCounts.subclass = PairCounts
# end class PairCounts
class TargetValueCounts(supermod.TargetValueCounts):
def __init__(self, Extension=None, TargetValueCount=None):
super(TargetValueCounts, self).__init__(Extension, TargetValueCount, )
#
# XMLBehaviors
#
supermod.TargetValueCounts.subclass = TargetValueCounts
# end class TargetValueCounts
class TargetValueCount(supermod.TargetValueCount):
def __init__(self, value=None, count=None, Extension=None):
super(TargetValueCount, self).__init__(value, count, Extension, )
#
# XMLBehaviors
#
supermod.TargetValueCount.subclass = TargetValueCount
# end class TargetValueCount
class NeuralNetwork(supermod.NeuralNetwork):
def __init__(self, modelName=None, functionName=None, algorithmName=None, activationFunction=None, normalizationMethod='none', threshold='0', width=None, altitude='1.0', numberOfLayers=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, NeuralInputs=None, NeuralLayer=None, NeuralOutputs=None, ModelVerification=None, Extension=None):
super(NeuralNetwork, self).__init__(modelName, functionName, algorithmName, activationFunction, normalizationMethod, threshold, width, altitude, numberOfLayers, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, NeuralInputs, NeuralLayer, NeuralOutputs, ModelVerification, Extension, )
#
# XMLBehaviors
#
def set_NeuralLayer(self, NeuralLayer, *args):
self.NeuralLayer = NeuralLayer
self.numberOfLayers = len(self.NeuralLayer)
def set_NeuralLayer_wrapper(self, NeuralLayer, *args):
result = self.set_NeuralLayer(NeuralLayer, *args)
return result
def add_NeuralLayer(self, value, *args):
self.NeuralLayer.append(value)
self.numberOfLayers = len(self.NeuralLayer)
def add_NeuralLayer_wrapper(self, value, *args):
result = self.add_NeuralLayer(value, *args)
return result
def insert_NeuralLayer_at(self, index, value, *args):
self.NeuralLayer.insert(index, value)
self.numberOfLayers = len(self.NeuralLayer)
def insert_NeuralLayer_at_wrapper(self, index, value, *args):
result = self.insert_NeuralLayer_at(index, value, *args)
return result
supermod.NeuralNetwork.subclass = NeuralNetwork
# end class NeuralNetwork
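#
# Illustrative usage sketch for the layer-count behaviors above: the
# set/add/insert overrides keep numberOfLayers in sync with the NeuralLayer
# list automatically. Construction details depend on the generated
# superclass module, and the variable names below are hypothetical:
#
#     net = NeuralNetwork()
#     net.set_NeuralLayer([NeuralLayer(), NeuralLayer()])
#     assert net.numberOfLayers == 2
#     net.add_NeuralLayer(NeuralLayer())
#     assert net.numberOfLayers == 3
#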
class NeuralInputs(supermod.NeuralInputs):
def __init__(self, numberOfInputs=None, Extension=None, NeuralInput=None):
super(NeuralInputs, self).__init__(numberOfInputs, Extension, NeuralInput, )
#
# XMLBehaviors
#
def set_NeuralInput(self, NeuralInput, *args):
self.NeuralInput = NeuralInput
        self.numberOfInputs = len(self.NeuralInput)
def set_NeuralInput_wrapper(self, NeuralInput, *args):
result = self.set_NeuralInput(NeuralInput, *args)
return result
def add_NeuralInput(self, value, *args):
self.NeuralInput.append(value)
self.numberOfInputs = len(self.NeuralInput)
def add_NeuralInput_wrapper(self, value, *args):
result = self.add_NeuralInput(value, *args)
return result
def insert_NeuralInput_at(self, index, value, *args):
self.NeuralInput.insert(index, value)
self.numberOfInputs = len(self.NeuralInput)
def insert_NeuralInput_at_wrapper(self, index, value, *args):
result = self.insert_NeuralInput_at(index, value, *args)
return result
supermod.NeuralInputs.subclass = NeuralInputs
# end class NeuralInputs
class NeuralLayer(supermod.NeuralLayer):
def __init__(self, numberOfNeurons=None, activationFunction=None, threshold=None, width=None, altitude=None, normalizationMethod=None, Extension=None, Neuron=None):
super(NeuralLayer, self).__init__(numberOfNeurons, activationFunction, threshold, width, altitude, normalizationMethod, Extension, Neuron, )
#
# XMLBehaviors
#
def set_Neuron(self, Neuron, *args):
self.Neuron = Neuron
self.numberOfNeurons = len(self.Neuron)
def set_Neuron_wrapper(self, Neuron, *args):
result = self.set_Neuron(Neuron, *args)
return result
def add_Neuron(self, value, *args):
self.Neuron.append(value)
self.numberOfNeurons = len(self.Neuron)
def add_Neuron_wrapper(self, value, *args):
result = self.add_Neuron(value, *args)
return result
def insert_Neuron_at(self, index, value, *args):
self.Neuron.insert(index, value)
self.numberOfNeurons = len(self.Neuron)
def insert_Neuron_at_wrapper(self, index, value, *args):
result = self.insert_Neuron_at(index, value, *args)
return result
supermod.NeuralLayer.subclass = NeuralLayer
# end class NeuralLayer
class NeuralOutputs(supermod.NeuralOutputs):
def __init__(self, numberOfOutputs=None, Extension=None, NeuralOutput=None):
super(NeuralOutputs, self).__init__(numberOfOutputs, Extension, NeuralOutput, )
#
# XMLBehaviors
#
    def set_NeuralOutput(self, NeuralOutput, *args):
        self.NeuralOutput = NeuralOutput
        self.numberOfOutputs = len(self.NeuralOutput)
def set_NeuralOutput_wrapper(self, NeuralOutput, *args):
result = self.set_NeuralOutput(NeuralOutput, *args)
return result
def add_NeuralOutput(self, value, *args):
self.NeuralOutput.append(value)
self.numberOfOutputs = len(self.NeuralOutput)
def add_NeuralOutput_wrapper(self, value, *args):
result = self.add_NeuralOutput(value, *args)
return result
def insert_NeuralOutput_at(self, index, value, *args):
self.NeuralOutput.insert(index, value)
self.numberOfOutputs = len(self.NeuralOutput)
def insert_NeuralOutput_at_wrapper(self, index, value, *args):
result = self.insert_NeuralOutput_at(index, value, *args)
return result
supermod.NeuralOutputs.subclass = NeuralOutputs
# end class NeuralOutputs
class NeuralInput(supermod.NeuralInput):
def __init__(self, id=None, Extension=None, DerivedField=None):
super(NeuralInput, self).__init__(id, Extension, DerivedField, )
#
# XMLBehaviors
#
supermod.NeuralInput.subclass = NeuralInput
# end class NeuralInput
class Neuron(supermod.Neuron):
def __init__(self, id=None, bias=None, width=None, altitude=None, Extension=None, Con=None):
super(Neuron, self).__init__(id, bias, width, altitude, Extension, Con, )
#
# XMLBehaviors
#
supermod.Neuron.subclass = Neuron
# end class Neuron
class Con(supermod.Con):
def __init__(self, from_=None, weight=None, Extension=None):
super(Con, self).__init__(from_, weight, Extension, )
#
# XMLBehaviors
#
supermod.Con.subclass = Con
# end class Con
class NeuralOutput(supermod.NeuralOutput):
def __init__(self, outputNeuron=None, Extension=None, DerivedField=None):
super(NeuralOutput, self).__init__(outputNeuron, Extension, DerivedField, )
#
# XMLBehaviors
#
supermod.NeuralOutput.subclass = NeuralOutput
# end class NeuralOutput
class Output(supermod.Output):
def __init__(self, Extension=None, OutputField=None):
super(Output, self).__init__(Extension, OutputField, )
#
# XMLBehaviors
#
supermod.Output.subclass = Output
# end class Output
class OutputField(supermod.OutputField):
def __init__(self, name=None, displayName=None, optype=None, dataType=None, targetField=None, feature='predictedValue', value=None, numTopCategories=None, threshold=None, ruleFeature='consequent', algorithm='exclusiveRecommendation', rank='1', rankBasis='confidence', rankOrder='descending', isMultiValued='0', segmentId=None, isFinalResult=True, Extension=None, Decisions=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None, Value=None):
super(OutputField, self).__init__(name, displayName, optype, dataType, targetField, feature, value, numTopCategories, threshold, ruleFeature, algorithm, rank, rankBasis, rankOrder, isMultiValued, segmentId, isFinalResult, Extension, Decisions, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, Value, )
#
# XMLBehaviors
#
def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='OutputFields', *args):
if self.name is not None and 'name' not in already_processed:
already_processed.add('name')
outfile.write(' name=%s' % (supermod.quote_attrib(self.name), ))
if self.displayName is not None and 'displayName' not in already_processed:
already_processed.add('displayName')
outfile.write(' displayName=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.displayName), input_name='displayName')), ))
if self.optype is not None and 'optype' not in already_processed:
already_processed.add('optype')
outfile.write(' optype=%s' % (supermod.quote_attrib(self.optype), ))
if self.dataType is not None and 'dataType' not in already_processed:
already_processed.add('dataType')
outfile.write(' dataType=%s' % (supermod.quote_attrib(self.dataType), ))
if self.targetField is not None and 'targetField' not in already_processed:
already_processed.add('targetField')
outfile.write(' targetField=%s' % (supermod.quote_attrib(self.targetField), ))
if self.feature is not None and 'feature' not in already_processed:
already_processed.add('feature')
outfile.write(' feature=%s' % (supermod.quote_attrib(self.feature), ))
if self.value is not None and 'value' not in already_processed:
already_processed.add('value')
outfile.write(' value=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.value), input_name='value')), ))
if self.ruleFeature != "consequent" and 'ruleFeature' not in already_processed:
already_processed.add('ruleFeature')
outfile.write(' ruleFeature=%s' % (supermod.quote_attrib(self.ruleFeature), ))
if self.algorithm != "exclusiveRecommendation" and 'algorithm' not in already_processed:
already_processed.add('algorithm')
outfile.write(' algorithm=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.algorithm), input_name='algorithm')), ))
# if self.rank is not None and 'rank' not in already_processed:
# already_processed.add('rank')
# outfile.write(' rank=%s' % (supermod.quote_attrib(self.rank), ))
if self.rankBasis != "confidence" and 'rankBasis' not in already_processed:
already_processed.add('rankBasis')
outfile.write(' rankBasis=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.rankBasis), input_name='rankBasis')), ))
if self.rankOrder != "descending" and 'rankOrder' not in already_processed:
already_processed.add('rankOrder')
outfile.write(' rankOrder=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.rankOrder), input_name='rankOrder')), ))
if self.isMultiValued != "0" and 'isMultiValued' not in already_processed:
already_processed.add('isMultiValued')
outfile.write(' isMultiValued=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.isMultiValued), input_name='isMultiValued')), ))
if self.segmentId is not None and 'segmentId' not in already_processed:
already_processed.add('segmentId')
outfile.write(' segmentId=%s' % (self.gds_encode(self.gds_format_string(supermod.quote_attrib(self.segmentId), input_name='segmentId')), ))
if not self.isFinalResult and 'isFinalResult' not in already_processed:
already_processed.add('isFinalResult')
outfile.write(' isFinalResult="%s"' % self.gds_format_boolean(self.isFinalResult, input_name='isFinalResult'))
if self.numTopCategories is not None and 'numTopCategories' not in already_processed:
already_processed.add('numTopCategories')
outfile.write(' numTopCategories=%s' % (supermod.quote_attrib(self.numTopCategories), ))
if self.threshold is not None and 'threshold' not in already_processed:
already_processed.add('threshold')
outfile.write(' threshold=%s' % (supermod.quote_attrib(self.threshold), ))
def exportAttributes_wrapper(self, outfile, level, already_processed, namespace_='', name_='OutputFields', *args):
        result = self.exportAttributes(outfile, level, already_processed, namespace_, name_, *args)
return result
supermod.OutputField.subclass = OutputField
# end class OutputField
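#
# Note on the exportAttributes override above: attributes with PMML default
# values (e.g. ruleFeature="consequent", rankOrder="descending",
# isMultiValued="0", isFinalResult=true) are emitted only when they differ
# from the default, which keeps the serialized XML minimal. A hedged call
# sketch (argument values are hypothetical):
#
#     import sys
#     field = OutputField(name='predicted_y', feature='predictedValue')
#     field.exportAttributes(sys.stdout, 0, set())
#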
class Decisions(supermod.Decisions):
def __init__(self, businessProblem=None, description=None, Extension=None, Decision=None):
super(Decisions, self).__init__(businessProblem, description, Extension, Decision, )
#
# XMLBehaviors
#
supermod.Decisions.subclass = Decisions
# end class Decisions
class Decision(supermod.Decision):
def __init__(self, value=None, displayValue=None, description=None, Extension=None):
super(Decision, self).__init__(value, displayValue, description, Extension, )
#
# XMLBehaviors
#
supermod.Decision.subclass = Decision
# end class Decision
class AnomalyDetectionModel(supermod.AnomalyDetectionModel):
def __init__(self, modelName=None, sampleDataSize=None, functionName=None, algorithmType=None, MiningSchema=None, Output=None, LocalTransformations=None, ParameterList=None, ModelVerification=None, AssociationModel=None, AnomalyDetectionModel_member=None, BayesianNetworkModel=None, BaselineModel=None, ClusteringModel=None, DeepNetwork=None, GaussianProcessModel=None, GeneralRegressionModel=None, MiningModel=None, NaiveBayesModel=None, NearestNeighborModel=None, NeuralNetwork=None, RegressionModel=None, RuleSetModel=None, SequenceModel=None, Scorecard=None, SupportVectorMachineModel=None, TextModel=None, TimeSeriesModel=None, TreeModel=None, Extension=None):
super(AnomalyDetectionModel, self).__init__(modelName, sampleDataSize, functionName, algorithmType, MiningSchema, Output, LocalTransformations, ParameterList, ModelVerification, AssociationModel, AnomalyDetectionModel_member, BayesianNetworkModel, BaselineModel, ClusteringModel, DeepNetwork, GaussianProcessModel, GeneralRegressionModel, MiningModel, NaiveBayesModel, NearestNeighborModel, NeuralNetwork, RegressionModel, RuleSetModel, SequenceModel, Scorecard, SupportVectorMachineModel, TextModel, TimeSeriesModel, TreeModel, Extension, )
#
# XMLBehaviors
#
supermod.AnomalyDetectionModel.subclass = AnomalyDetectionModel
# end class AnomalyDetectionModel
class RegressionModel(supermod.RegressionModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, modelType=None, targetFieldName=None, normalizationMethod='none', isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, RegressionTable=None, ModelVerification=None, Extension=None):
super(RegressionModel, self).__init__(modelName, functionName, algorithmName, modelType, targetFieldName, normalizationMethod, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, RegressionTable, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.RegressionModel.subclass = RegressionModel
# end class RegressionModel
class RegressionTable(supermod.RegressionTable):
def __init__(self, intercept=None, targetCategory=None, Extension=None, NumericPredictor=None, CategoricalPredictor=None, PredictorTerm=None):
super(RegressionTable, self).__init__(intercept, targetCategory, Extension, NumericPredictor, CategoricalPredictor, PredictorTerm, )
#
# XMLBehaviors
#
supermod.RegressionTable.subclass = RegressionTable
# end class RegressionTable
class NumericPredictor(supermod.NumericPredictor):
def __init__(self, name=None, exponent='1', coefficient=None, Extension=None):
super(NumericPredictor, self).__init__(name, exponent, coefficient, Extension, )
#
# XMLBehaviors
#
supermod.NumericPredictor.subclass = NumericPredictor
# end class NumericPredictor
class CategoricalPredictor(supermod.CategoricalPredictor):
def __init__(self, name=None, value=None, coefficient=None, Extension=None):
super(CategoricalPredictor, self).__init__(name, value, coefficient, Extension, )
#
# XMLBehaviors
#
supermod.CategoricalPredictor.subclass = CategoricalPredictor
# end class CategoricalPredictor
class PredictorTerm(supermod.PredictorTerm):
def __init__(self, name=None, coefficient=None, Extension=None, FieldRef=None):
super(PredictorTerm, self).__init__(name, coefficient, Extension, FieldRef, )
#
# XMLBehaviors
#
supermod.PredictorTerm.subclass = PredictorTerm
# end class PredictorTerm
class RuleSetModel(supermod.RuleSetModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, RuleSet=None, ModelVerification=None, Extension=None):
super(RuleSetModel, self).__init__(modelName, functionName, algorithmName, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, RuleSet, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.RuleSetModel.subclass = RuleSetModel
# end class RuleSetModel
class RuleSet(supermod.RuleSet):
def __init__(self, recordCount=None, nbCorrect=None, defaultScore=None, defaultConfidence=None, Extension=None, RuleSelectionMethod=None, ScoreDistribution=None, SimpleRule=None, CompoundRule=None):
super(RuleSet, self).__init__(recordCount, nbCorrect, defaultScore, defaultConfidence, Extension, RuleSelectionMethod, ScoreDistribution, SimpleRule, CompoundRule, )
#
# XMLBehaviors
#
supermod.RuleSet.subclass = RuleSet
# end class RuleSet
class RuleSelectionMethod(supermod.RuleSelectionMethod):
def __init__(self, criterion=None, Extension=None):
super(RuleSelectionMethod, self).__init__(criterion, Extension, )
#
# XMLBehaviors
#
supermod.RuleSelectionMethod.subclass = RuleSelectionMethod
# end class RuleSelectionMethod
class SimpleRule(supermod.SimpleRule):
def __init__(self, id=None, score=None, recordCount=None, nbCorrect=None, confidence='1', weight='1', Extension=None, SimplePredicate=None, CompoundPredicate=None, SimpleSetPredicate=None, True_=None, False_=None, ScoreDistribution=None):
super(SimpleRule, self).__init__(id, score, recordCount, nbCorrect, confidence, weight, Extension, SimplePredicate, CompoundPredicate, SimpleSetPredicate, True_, False_, ScoreDistribution, )
#
# XMLBehaviors
#
supermod.SimpleRule.subclass = SimpleRule
# end class SimpleRule
class CompoundRule(supermod.CompoundRule):
def __init__(self, Extension=None, SimplePredicate=None, CompoundPredicate=None, SimpleSetPredicate=None, True_=None, False_=None, SimpleRule=None, CompoundRule_member=None):
super(CompoundRule, self).__init__(Extension, SimplePredicate, CompoundPredicate, SimpleSetPredicate, True_, False_, SimpleRule, CompoundRule_member, )
#
# XMLBehaviors
#
supermod.CompoundRule.subclass = CompoundRule
# end class CompoundRule
class Scorecard(supermod.Scorecard):
def __init__(self, modelName=None, functionName=None, algorithmName=None, initialScore='0', useReasonCodes=True, reasonCodeAlgorithm='pointsBelow', baselineScore=None, baselineMethod='other', isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, Characteristics=None, ModelVerification=None, Extension=None):
super(Scorecard, self).__init__(modelName, functionName, algorithmName, initialScore, useReasonCodes, reasonCodeAlgorithm, baselineScore, baselineMethod, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, Characteristics, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.Scorecard.subclass = Scorecard
# end class Scorecard
class Characteristics(supermod.Characteristics):
def __init__(self, Extension=None, Characteristic=None):
super(Characteristics, self).__init__(Extension, Characteristic, )
#
# XMLBehaviors
#
supermod.Characteristics.subclass = Characteristics
# end class Characteristics
class Characteristic(supermod.Characteristic):
def __init__(self, name=None, reasonCode=None, baselineScore=None, Extension=None, Attribute=None):
super(Characteristic, self).__init__(name, reasonCode, baselineScore, Extension, Attribute, )
#
# XMLBehaviors
#
supermod.Characteristic.subclass = Characteristic
# end class Characteristic
class Attribute(supermod.Attribute):
def __init__(self, reasonCode=None, partialScore=None, Extension=None, SimplePredicate=None, CompoundPredicate=None, SimpleSetPredicate=None, True_=None, False_=None, ComplexPartialScore=None):
super(Attribute, self).__init__(reasonCode, partialScore, Extension, SimplePredicate, CompoundPredicate, SimpleSetPredicate, True_, False_, ComplexPartialScore, )
#
# XMLBehaviors
#
supermod.Attribute.subclass = Attribute
# end class Attribute
class ComplexPartialScore(supermod.ComplexPartialScore):
def __init__(self, Extension=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None):
super(ComplexPartialScore, self).__init__(Extension, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.ComplexPartialScore.subclass = ComplexPartialScore
# end class ComplexPartialScore
class SequenceModel(supermod.SequenceModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, numberOfTransactions=None, maxNumberOfItemsPerTransaction=None, avgNumberOfItemsPerTransaction=None, numberOfTransactionGroups=None, maxNumberOfTAsPerTAGroup=None, avgNumberOfTAsPerTAGroup=None, isScorable=True, MiningSchema=None, ModelStats=None, LocalTransformations=None, Constraints=None, Item=None, Itemset=None, SetPredicate=None, Sequence=None, SequenceRule=None, Extension=None):
super(SequenceModel, self).__init__(modelName, functionName, algorithmName, numberOfTransactions, maxNumberOfItemsPerTransaction, avgNumberOfItemsPerTransaction, numberOfTransactionGroups, maxNumberOfTAsPerTAGroup, avgNumberOfTAsPerTAGroup, isScorable, MiningSchema, ModelStats, LocalTransformations, Constraints, Item, Itemset, SetPredicate, Sequence, SequenceRule, Extension, )
#
# XMLBehaviors
#
supermod.SequenceModel.subclass = SequenceModel
# end class SequenceModel
class Constraints(supermod.Constraints):
def __init__(self, minimumNumberOfItems='1', maximumNumberOfItems=None, minimumNumberOfAntecedentItems='1', maximumNumberOfAntecedentItems=None, minimumNumberOfConsequentItems='1', maximumNumberOfConsequentItems=None, minimumSupport='0', minimumConfidence='0', minimumLift='0', minimumTotalSequenceTime='0', maximumTotalSequenceTime=None, minimumItemsetSeparationTime='0', maximumItemsetSeparationTime=None, minimumAntConsSeparationTime='0', maximumAntConsSeparationTime=None, Extension=None):
super(Constraints, self).__init__(minimumNumberOfItems, maximumNumberOfItems, minimumNumberOfAntecedentItems, maximumNumberOfAntecedentItems, minimumNumberOfConsequentItems, maximumNumberOfConsequentItems, minimumSupport, minimumConfidence, minimumLift, minimumTotalSequenceTime, maximumTotalSequenceTime, minimumItemsetSeparationTime, maximumItemsetSeparationTime, minimumAntConsSeparationTime, maximumAntConsSeparationTime, Extension, )
#
# XMLBehaviors
#
supermod.Constraints.subclass = Constraints
# end class Constraints
class SetPredicate(supermod.SetPredicate):
def __init__(self, id=None, field=None, operator=None, Extension=None, Array=None):
super(SetPredicate, self).__init__(id, field, operator, Extension, Array, )
#
# XMLBehaviors
#
supermod.SetPredicate.subclass = SetPredicate
# end class SetPredicate
class Delimiter(supermod.Delimiter):
def __init__(self, delimiter=None, gap=None, Extension=None):
super(Delimiter, self).__init__(delimiter, gap, Extension, )
#
# XMLBehaviors
#
supermod.Delimiter.subclass = Delimiter
# end class Delimiter
class Time(supermod.Time):
def __init__(self, min=None, max=None, mean=None, standardDeviation=None, Extension=None):
super(Time, self).__init__(min, max, mean, standardDeviation, Extension, )
#
# XMLBehaviors
#
supermod.Time.subclass = Time
# end class Time
class Sequence(supermod.Sequence):
def __init__(self, id=None, numberOfSets=None, occurrence=None, support=None, Extension=None, Delimiter=None, SetReference=None, Time=None):
super(Sequence, self).__init__(id, numberOfSets, occurrence, support, Extension, Delimiter, SetReference, Time, )
#
# XMLBehaviors
#
supermod.Sequence.subclass = Sequence
# end class Sequence
class SetReference(supermod.SetReference):
def __init__(self, setId=None, Extension=None):
super(SetReference, self).__init__(setId, Extension, )
#
# XMLBehaviors
#
supermod.SetReference.subclass = SetReference
# end class SetReference
class SequenceRule(supermod.SequenceRule):
def __init__(self, id=None, numberOfSets=None, occurrence=None, support=None, confidence=None, lift=None, Extension=None, AntecedentSequence=None, Delimiter=None, ConsequentSequence=None, Time=None):
super(SequenceRule, self).__init__(id, numberOfSets, occurrence, support, confidence, lift, Extension, AntecedentSequence, Delimiter, ConsequentSequence, Time, )
#
# XMLBehaviors
#
supermod.SequenceRule.subclass = SequenceRule
# end class SequenceRule
class SequenceReference(supermod.SequenceReference):
def __init__(self, seqId=None, Extension=None):
super(SequenceReference, self).__init__(seqId, Extension, )
#
# XMLBehaviors
#
supermod.SequenceReference.subclass = SequenceReference
# end class SequenceReference
class AntecedentSequence(supermod.AntecedentSequence):
def __init__(self, Extension=None, SequenceReference=None, Time=None):
super(AntecedentSequence, self).__init__(Extension, SequenceReference, Time, )
#
# XMLBehaviors
#
supermod.AntecedentSequence.subclass = AntecedentSequence
# end class AntecedentSequence
class ConsequentSequence(supermod.ConsequentSequence):
def __init__(self, Extension=None, SequenceReference=None, Time=None):
super(ConsequentSequence, self).__init__(Extension, SequenceReference, Time, )
#
# XMLBehaviors
#
supermod.ConsequentSequence.subclass = ConsequentSequence
# end class ConsequentSequence
class ModelStats(supermod.ModelStats):
def __init__(self, Extension=None, UnivariateStats=None, MultivariateStats=None):
super(ModelStats, self).__init__(Extension, UnivariateStats, MultivariateStats, )
#
# XMLBehaviors
#
supermod.ModelStats.subclass = ModelStats
# end class ModelStats
class UnivariateStats(supermod.UnivariateStats):
def __init__(self, field=None, weighted='0', Extension=None, Counts=None, NumericInfo=None, DiscrStats=None, ContStats=None, Anova=None):
super(UnivariateStats, self).__init__(field, weighted, Extension, Counts, NumericInfo, DiscrStats, ContStats, Anova, )
#
# XMLBehaviors
#
supermod.UnivariateStats.subclass = UnivariateStats
# end class UnivariateStats
class Counts(supermod.Counts):
def __init__(self, totalFreq=None, missingFreq=None, invalidFreq=None, cardinality=None, Extension=None):
super(Counts, self).__init__(totalFreq, missingFreq, invalidFreq, cardinality, Extension, )
#
# XMLBehaviors
#
supermod.Counts.subclass = Counts
# end class Counts
class NumericInfo(supermod.NumericInfo):
def __init__(self, minimum=None, maximum=None, mean=None, standardDeviation=None, median=None, interQuartileRange=None, Extension=None, Quantile=None):
super(NumericInfo, self).__init__(minimum, maximum, mean, standardDeviation, median, interQuartileRange, Extension, Quantile, )
#
# XMLBehaviors
#
supermod.NumericInfo.subclass = NumericInfo
# end class NumericInfo
class Quantile(supermod.Quantile):
def __init__(self, quantileLimit=None, quantileValue=None, Extension=None):
super(Quantile, self).__init__(quantileLimit, quantileValue, Extension, )
#
# XMLBehaviors
#
supermod.Quantile.subclass = Quantile
# end class Quantile
class DiscrStats(supermod.DiscrStats):
def __init__(self, modalValue=None, Extension=None, Array=None):
super(DiscrStats, self).__init__(modalValue, Extension, Array, )
#
# XMLBehaviors
#
supermod.DiscrStats.subclass = DiscrStats
# end class DiscrStats
class ContStats(supermod.ContStats):
def __init__(self, totalValuesSum=None, totalSquaresSum=None, Extension=None, Interval=None, NUM_ARRAY=None):
super(ContStats, self).__init__(totalValuesSum, totalSquaresSum, Extension, Interval, NUM_ARRAY, )
#
# XMLBehaviors
#
supermod.ContStats.subclass = ContStats
# end class ContStats
class MultivariateStats(supermod.MultivariateStats):
def __init__(self, targetCategory=None, Extension=None, MultivariateStat=None):
super(MultivariateStats, self).__init__(targetCategory, Extension, MultivariateStat, )
#
# XMLBehaviors
#
supermod.MultivariateStats.subclass = MultivariateStats
# end class MultivariateStats
class MultivariateStat(supermod.MultivariateStat):
def __init__(self, name=None, category=None, exponent='1', isIntercept=False, importance=None, stdError=None, tValue=None, chiSquareValue=None, fStatistic=None, dF=None, pValueAlpha=None, pValueInitial=None, pValueFinal=None, confidenceLevel='0.95', confidenceLowerBound=None, confidenceUpperBound=None, Extension=None):
super(MultivariateStat, self).__init__(name, category, exponent, isIntercept, importance, stdError, tValue, chiSquareValue, fStatistic, dF, pValueAlpha, pValueInitial, pValueFinal, confidenceLevel, confidenceLowerBound, confidenceUpperBound, Extension, )
#
# XMLBehaviors
#
supermod.MultivariateStat.subclass = MultivariateStat
# end class MultivariateStat
class Anova(supermod.Anova):
def __init__(self, target=None, Extension=None, AnovaRow=None):
super(Anova, self).__init__(target, Extension, AnovaRow, )
#
# XMLBehaviors
#
supermod.Anova.subclass = Anova
# end class Anova
class AnovaRow(supermod.AnovaRow):
def __init__(self, type_=None, sumOfSquares=None, degreesOfFreedom=None, meanOfSquares=None, fValue=None, pValue=None, Extension=None):
super(AnovaRow, self).__init__(type_, sumOfSquares, degreesOfFreedom, meanOfSquares, fValue, pValue, Extension, )
#
# XMLBehaviors
#
supermod.AnovaRow.subclass = AnovaRow
# end class AnovaRow
class Partition(supermod.Partition):
def __init__(self, name=None, size=None, Extension=None, PartitionFieldStats=None):
super(Partition, self).__init__(name, size, Extension, PartitionFieldStats, )
#
# XMLBehaviors
#
supermod.Partition.subclass = Partition
# end class Partition
class PartitionFieldStats(supermod.PartitionFieldStats):
def __init__(self, field=None, weighted='0', Extension=None, Counts=None, NumericInfo=None, Array=None):
super(PartitionFieldStats, self).__init__(field, weighted, Extension, Counts, NumericInfo, Array, )
#
# XMLBehaviors
#
supermod.PartitionFieldStats.subclass = PartitionFieldStats
# end class PartitionFieldStats
class SupportVectorMachineModel(supermod.SupportVectorMachineModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, threshold='0', svmRepresentation='SupportVectors', classificationMethod='OneAgainstAll', maxWins=False, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, LinearKernelType=None, PolynomialKernelType=None, RadialBasisKernelType=None, SigmoidKernelType=None, VectorDictionary=None, SupportVectorMachine=None, ModelVerification=None, Extension=None):
super(SupportVectorMachineModel, self).__init__(modelName, functionName, algorithmName, threshold, svmRepresentation, classificationMethod, maxWins, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, LinearKernelType, PolynomialKernelType, RadialBasisKernelType, SigmoidKernelType, VectorDictionary, SupportVectorMachine, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.SupportVectorMachineModel.subclass = SupportVectorMachineModel
# end class SupportVectorMachineModel
class LinearKernelType(supermod.LinearKernelType):
def __init__(self, description=None, Extension=None):
super(LinearKernelType, self).__init__(description, Extension, )
#
# XMLBehaviors
#
supermod.LinearKernelType.subclass = LinearKernelType
# end class LinearKernelType
class PolynomialKernelType(supermod.PolynomialKernelType):
def __init__(self, description=None, gamma='1', coef0='1', degree='1', Extension=None):
super(PolynomialKernelType, self).__init__(description, gamma, coef0, degree, Extension, )
#
# XMLBehaviors
#
supermod.PolynomialKernelType.subclass = PolynomialKernelType
# end class PolynomialKernelType
class RadialBasisKernelType(supermod.RadialBasisKernelType):
def __init__(self, description=None, gamma='1', Extension=None):
super(RadialBasisKernelType, self).__init__(description, gamma, Extension, )
#
# XMLBehaviors
#
supermod.RadialBasisKernelType.subclass = RadialBasisKernelType
# end class RadialBasisKernelType
class SigmoidKernelType(supermod.SigmoidKernelType):
def __init__(self, description=None, gamma='1', coef0='1', Extension=None):
super(SigmoidKernelType, self).__init__(description, gamma, coef0, Extension, )
#
# XMLBehaviors
#
supermod.SigmoidKernelType.subclass = SigmoidKernelType
# end class SigmoidKernelType
class VectorDictionary(supermod.VectorDictionary):
def __init__(self, numberOfVectors=None, Extension=None, VectorFields=None, VectorInstance=None):
super(VectorDictionary, self).__init__(numberOfVectors, Extension, VectorFields, VectorInstance, )
#
# XMLBehaviors
#
def set_VectorInstance(self, VectorInstance, *args):
self.VectorInstance = VectorInstance
self.numberOfVectors = len(self.VectorInstance)
def set_VectorInstance_wrapper(self, VectorInstance, *args):
result = self.set_VectorInstance(VectorInstance, *args)
return result
def add_VectorInstance(self, value, *args):
self.VectorInstance.append(value)
self.numberOfVectors = len(self.VectorInstance)
def add_VectorInstance_wrapper(self, value, *args):
result = self.add_VectorInstance(value, *args)
return result
def insert_VectorInstance_at(self, index, value, *args):
self.VectorInstance.insert(index, value)
self.numberOfVectors = len(self.VectorInstance)
def insert_VectorInstance_at_wrapper(self, index, value, *args):
result = self.insert_VectorInstance_at(index, value, *args)
return result
supermod.VectorDictionary.subclass = VectorDictionary
# end class VectorDictionary
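The `set_`/`add_`/`insert_` overrides above exist to keep the PMML count attribute (`numberOfVectors`) consistent with the actual list after every mutation. A standalone sketch of that pattern (`Container`, `items`, and `count` are illustrative names, not tied to `supermod`):

```python
# Sketch of the count-syncing mutator pattern used by VectorDictionary,
# SupportVectors, and Coefficients: every mutation recomputes the count
# attribute from the list, so the two can never drift apart.
class Container(object):
    def __init__(self, items=None):
        self.items = items if items is not None else []
        self.count = len(self.items)

    def set_items(self, items):
        self.items = items
        self.count = len(self.items)

    def add_item(self, value):
        self.items.append(value)
        self.count = len(self.items)

    def insert_item_at(self, index, value):
        self.items.insert(index, value)
        self.count = len(self.items)

c = Container()
c.set_items(["v1", "v2"])
c.add_item("v3")
c.insert_item_at(0, "v0")
```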
class VectorFields(supermod.VectorFields):
def __init__(self, numberOfFields=None, Extension=None, FieldRef=None, CategoricalPredictor=None):
super(VectorFields, self).__init__(numberOfFields, Extension, FieldRef, CategoricalPredictor, )
#
# XMLBehaviors
#
supermod.VectorFields.subclass = VectorFields
# end class VectorFields
class VectorInstance(supermod.VectorInstance):
def __init__(self, id=None, Extension=None, REAL_SparseArray=None, Array=None):
super(VectorInstance, self).__init__(id, Extension, REAL_SparseArray, Array, )
#
# XMLBehaviors
#
supermod.VectorInstance.subclass = VectorInstance
# end class VectorInstance
class SupportVectorMachine(supermod.SupportVectorMachine):
def __init__(self, targetCategory=None, alternateTargetCategory=None, threshold=None, Extension=None, SupportVectors=None, Coefficients=None):
super(SupportVectorMachine, self).__init__(targetCategory, alternateTargetCategory, threshold, Extension, SupportVectors, Coefficients, )
#
# XMLBehaviors
#
supermod.SupportVectorMachine.subclass = SupportVectorMachine
# end class SupportVectorMachine
class SupportVectors(supermod.SupportVectors):
def __init__(self, numberOfSupportVectors=None, numberOfAttributes=None, Extension=None, SupportVector=None):
super(SupportVectors, self).__init__(numberOfSupportVectors, numberOfAttributes, Extension, SupportVector, )
#
# XMLBehaviors
#
    def set_SupportVector(self, SupportVector, *args):
        self.SupportVector = SupportVector
        # Keep the declared count in sync with the actual list of vectors
        # (the attribute is numberOfSupportVectors, per __init__ above).
        self.numberOfSupportVectors = len(self.SupportVector)
    def set_SupportVector_wrapper(self, SupportVector, *args):
        result = self.set_SupportVector(SupportVector, *args)
        return result
    def add_SupportVector(self, value, *args):
        self.SupportVector.append(value)
        self.numberOfSupportVectors = len(self.SupportVector)
    def add_SupportVector_wrapper(self, value, *args):
        result = self.add_SupportVector(value, *args)
        return result
    def insert_SupportVector_at(self, index, value, *args):
        self.SupportVector.insert(index, value)
        self.numberOfSupportVectors = len(self.SupportVector)
    def insert_SupportVector_at_wrapper(self, index, value, *args):
        result = self.insert_SupportVector_at(index, value, *args)
        return result
supermod.SupportVectors.subclass = SupportVectors
# end class SupportVectors
class SupportVector(supermod.SupportVector):
def __init__(self, vectorId=None, Extension=None):
super(SupportVector, self).__init__(vectorId, Extension, )
#
# XMLBehaviors
#
supermod.SupportVector.subclass = SupportVector
# end class SupportVector
class Coefficients(supermod.Coefficients):
def __init__(self, numberOfCoefficients=None, absoluteValue='0', Extension=None, Coefficient=None):
super(Coefficients, self).__init__(numberOfCoefficients, absoluteValue, Extension, Coefficient, )
#
# XMLBehaviors
#
def set_Coefficient(self, Coefficient, *args):
self.Coefficient = Coefficient
self.numberOfCoefficients = len(self.Coefficient)
def set_Coefficient_wrapper(self, Coefficient, *args):
result = self.set_Coefficient(Coefficient, *args)
return result
def add_Coefficient(self, value, *args):
self.Coefficient.append(value)
self.numberOfCoefficients = len(self.Coefficient)
def add_Coefficient_wrapper(self, value, *args):
result = self.add_Coefficient(value, *args)
return result
def insert_Coefficient_at(self, index, value, *args):
self.Coefficient.insert(index, value)
self.numberOfCoefficients = len(self.Coefficient)
def insert_Coefficient_at_wrapper(self, index, value, *args):
result = self.insert_Coefficient_at(index, value, *args)
return result
supermod.Coefficients.subclass = Coefficients
# end class Coefficients
class Coefficient(supermod.Coefficient):
def __init__(self, value='0', Extension=None):
super(Coefficient, self).__init__(value, Extension, )
#
# XMLBehaviors
#
supermod.Coefficient.subclass = Coefficient
# end class Coefficient
class Targets(supermod.Targets):
def __init__(self, Extension=None, Target=None):
super(Targets, self).__init__(Extension, Target, )
#
# XMLBehaviors
#
supermod.Targets.subclass = Targets
# end class Targets
class Target(supermod.Target):
def __init__(self, field=None, optype=None, castInteger=None, min=None, max=None, rescaleConstant=0, rescaleFactor=1, Extension=None, TargetValue=None):
super(Target, self).__init__(field, optype, castInteger, min, max, rescaleConstant, rescaleFactor, Extension, TargetValue, )
#
# XMLBehaviors
#
supermod.Target.subclass = Target
# end class Target
class TargetValue(supermod.TargetValue):
def __init__(self, value=None, displayValue=None, priorProbability=None, defaultValue=None, Extension=None, Partition=None):
super(TargetValue, self).__init__(value, displayValue, priorProbability, defaultValue, Extension, Partition, )
#
# XMLBehaviors
#
supermod.TargetValue.subclass = TargetValue
# end class TargetValue
class Taxonomy(supermod.Taxonomy):
def __init__(self, name=None, Extension=None, ChildParent=None):
super(Taxonomy, self).__init__(name, Extension, ChildParent, )
#
# XMLBehaviors
#
supermod.Taxonomy.subclass = Taxonomy
# end class Taxonomy
class ChildParent(supermod.ChildParent):
def __init__(self, childField=None, parentField=None, parentLevelField=None, isRecursive='no', Extension=None, FieldColumnPair=None, TableLocator=None, InlineTable=None):
super(ChildParent, self).__init__(childField, parentField, parentLevelField, isRecursive, Extension, FieldColumnPair, TableLocator, InlineTable, )
#
# XMLBehaviors
#
supermod.ChildParent.subclass = ChildParent
# end class ChildParent
class TableLocator(supermod.TableLocator):
def __init__(self, Extension=None):
super(TableLocator, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.TableLocator.subclass = TableLocator
# end class TableLocator
class InlineTable(supermod.InlineTable):
def __init__(self, Extension=None, row=None):
super(InlineTable, self).__init__(Extension, row, )
#
# XMLBehaviors
#
supermod.InlineTable.subclass = InlineTable
# end class InlineTable
class row(supermod.row):
def __init__(self, anytypeobjs_=None):
super(row, self).__init__(anytypeobjs_, )
#
# XMLBehaviors
#
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, *args):
        # Collect arbitrarily named child elements of an InlineTable row as
        # attributes: the first occurrence is stored as a scalar, and later
        # occurrences promote the attribute to a list and append to it.
        if not hasattr(self, "elementobjs_"):
            self.elementobjs_ = []
        if hasattr(self, nodeName_) and nodeName_ not in self.elementobjs_:
            # Avoid clobbering a pre-existing attribute of the same name.
            nodeName_ += '_'
        if nodeName_ not in self.elementobjs_:
            self.elementobjs_.append(nodeName_)
        # Take the first non-empty line of the element text; eval() turns
        # numeric literals into Python values, falling back to the raw string.
        nodeVal = list(filter(None, [obj_.lstrip(' ') for obj_ in child_.text.split('\n')]))[0]
        try:
            nodeVal = eval(nodeVal)
        except Exception:
            pass
        if not hasattr(self, nodeName_):
            setattr(self, nodeName_, nodeVal)
        else:
            current = getattr(self, nodeName_)
            if not isinstance(current, list):
                # Promote a scalar first occurrence (str, number, ...) to a list.
                current = [current]
            current.append(nodeVal)
            setattr(self, nodeName_, current)
    def buildChildren_wrapper(self, child_, node, nodeName_, fromsubclass_=False, *args):
        result = self.buildChildren(child_, node, nodeName_, fromsubclass_, *args)
        return result
supermod.row.subclass = row
# end class row
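`row.buildChildren` accumulates child elements with arbitrary names as attributes: the first occurrence of a name becomes a scalar, and any later occurrence promotes the attribute to a list and appends. A simplified standalone sketch of that accumulation rule (`Record` and `add_value` are illustrative names; the real method also tracks seen names in `elementobjs_`):

```python
# Sketch of row.buildChildren's accumulation rule: first occurrence of a
# child name is stored as a scalar; later occurrences promote it to a list.
class Record(object):
    def add_value(self, name, text):
        try:
            value = eval(text)      # mirror the generated code's literal parsing
        except Exception:
            value = text            # non-literal text stays a plain string
        if not hasattr(self, name):
            setattr(self, name, value)          # first occurrence: scalar
        else:
            current = getattr(self, name)
            if not isinstance(current, list):
                current = [current]             # promote scalar to list
            current.append(value)
            setattr(self, name, current)

r = Record()
r.add_value("x", "1")
r.add_value("x", "2")       # second "x": r.x becomes [1, 2]
r.add_value("label", "abc") # single occurrence stays scalar
```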
class TextModel(supermod.TextModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, numberOfTerms=None, numberOfDocuments=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, TextDictionary=None, TextCorpus=None, DocumentTermMatrix=None, TextModelNormalization=None, TextModelSimiliarity=None, ModelVerification=None, Extension=None):
super(TextModel, self).__init__(modelName, functionName, algorithmName, numberOfTerms, numberOfDocuments, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, TextDictionary, TextCorpus, DocumentTermMatrix, TextModelNormalization, TextModelSimiliarity, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.TextModel.subclass = TextModel
# end class TextModel
class TextDictionary(supermod.TextDictionary):
def __init__(self, Extension=None, Taxonomy=None, Array=None):
super(TextDictionary, self).__init__(Extension, Taxonomy, Array, )
#
# XMLBehaviors
#
supermod.TextDictionary.subclass = TextDictionary
# end class TextDictionary
class TextCorpus(supermod.TextCorpus):
def __init__(self, Extension=None, TextDocument=None):
super(TextCorpus, self).__init__(Extension, TextDocument, )
#
# XMLBehaviors
#
supermod.TextCorpus.subclass = TextCorpus
# end class TextCorpus
class TextDocument(supermod.TextDocument):
def __init__(self, id=None, name=None, length=None, file=None, Extension=None):
super(TextDocument, self).__init__(id, name, length, file, Extension, )
#
# XMLBehaviors
#
supermod.TextDocument.subclass = TextDocument
# end class TextDocument
class DocumentTermMatrix(supermod.DocumentTermMatrix):
def __init__(self, Extension=None, Matrix=None):
super(DocumentTermMatrix, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.DocumentTermMatrix.subclass = DocumentTermMatrix
# end class DocumentTermMatrix
class TextModelNormalization(supermod.TextModelNormalization):
def __init__(self, localTermWeights='termFrequency', globalTermWeights='inverseDocumentFrequency', documentNormalization='none', Extension=None):
super(TextModelNormalization, self).__init__(localTermWeights, globalTermWeights, documentNormalization, Extension, )
#
# XMLBehaviors
#
supermod.TextModelNormalization.subclass = TextModelNormalization
# end class TextModelNormalization
class TextModelSimiliarity(supermod.TextModelSimiliarity):
def __init__(self, similarityType=None, Extension=None):
super(TextModelSimiliarity, self).__init__(similarityType, Extension, )
#
# XMLBehaviors
#
supermod.TextModelSimiliarity.subclass = TextModelSimiliarity
# end class TextModelSimiliarity
class TimeSeriesModel(supermod.TimeSeriesModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, bestFit=None, isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, LocalTransformations=None, TimeSeries=None, SpectralAnalysis=None, ARIMA=None, ExponentialSmoothing=None, SeasonalTrendDecomposition=None, StateSpaceModel=None, GARCH=None, ModelVerification=None, Extension=None):
super(TimeSeriesModel, self).__init__(modelName, functionName, algorithmName, bestFit, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, LocalTransformations, TimeSeries, SpectralAnalysis, ARIMA, ExponentialSmoothing, SeasonalTrendDecomposition, StateSpaceModel, GARCH, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.TimeSeriesModel.subclass = TimeSeriesModel
# end class TimeSeriesModel
class TimeSeries(supermod.TimeSeries):
def __init__(self, usage='original', startTime=None, endTime=None, interpolationMethod='none', TimeAnchor=None, TimeValue=None):
super(TimeSeries, self).__init__(usage, startTime, endTime, interpolationMethod, TimeAnchor, TimeValue, )
#
# XMLBehaviors
#
supermod.TimeSeries.subclass = TimeSeries
# end class TimeSeries
class TimeValue(supermod.TimeValue):
def __init__(self, index=None, time=None, value=None, standardError=None, Timestamp=None):
super(TimeValue, self).__init__(index, time, value, standardError, Timestamp, )
#
# XMLBehaviors
#
supermod.TimeValue.subclass = TimeValue
# end class TimeValue
class TimeAnchor(supermod.TimeAnchor):
def __init__(self, type_=None, offset=None, stepsize=None, displayName=None, TimeCycle=None, TimeException=None):
super(TimeAnchor, self).__init__(type_, offset, stepsize, displayName, TimeCycle, TimeException, )
#
# XMLBehaviors
#
supermod.TimeAnchor.subclass = TimeAnchor
# end class TimeAnchor
class TimeCycle(supermod.TimeCycle):
def __init__(self, length=None, type_=None, displayName=None, Array=None):
super(TimeCycle, self).__init__(length, type_, displayName, Array, )
#
# XMLBehaviors
#
supermod.TimeCycle.subclass = TimeCycle
# end class TimeCycle
class TimeException(supermod.TimeException):
def __init__(self, type_=None, count=None, Array=None):
super(TimeException, self).__init__(type_, count, Array, )
#
# XMLBehaviors
#
supermod.TimeException.subclass = TimeException
# end class TimeException
class ExponentialSmoothing(supermod.ExponentialSmoothing):
def __init__(self, RMSE=None, transformation='none', Level=None, Trend_ExpoSmooth=None, Seasonality_ExpoSmooth=None, TimeValue=None):
super(ExponentialSmoothing, self).__init__(RMSE, transformation, Level, Trend_ExpoSmooth, Seasonality_ExpoSmooth, TimeValue, )
#
# XMLBehaviors
#
supermod.ExponentialSmoothing.subclass = ExponentialSmoothing
# end class ExponentialSmoothing
class Level(supermod.Level):
def __init__(self, alpha=None, initialLevelValue=None, smoothedValue=None):
super(Level, self).__init__(alpha, initialLevelValue, smoothedValue, )
#
# XMLBehaviors
#
supermod.Level.subclass = Level
# end class Level
class Trend_ExpoSmooth(supermod.Trend_ExpoSmooth):
def __init__(self, trend='additive', gamma=None, initialTrendValue=None, phi='1', smoothedValue=None, Array=None):
super(Trend_ExpoSmooth, self).__init__(trend, gamma, initialTrendValue, phi, smoothedValue, Array, )
#
# XMLBehaviors
#
supermod.Trend_ExpoSmooth.subclass = Trend_ExpoSmooth
# end class Trend_ExpoSmooth
class Seasonality_ExpoSmooth(supermod.Seasonality_ExpoSmooth):
def __init__(self, type_=None, period=None, initialSeasonalTrendValue=None, unit=None, phase=None, delta=None, Array=None):
super(Seasonality_ExpoSmooth, self).__init__(type_, period, initialSeasonalTrendValue, unit, phase, delta, Array, )
#
# XMLBehaviors
#
supermod.Seasonality_ExpoSmooth.subclass = Seasonality_ExpoSmooth
# end class Seasonality_ExpoSmooth
class ARIMA(supermod.ARIMA):
def __init__(self, RMSE=None, transformation='none', constantTerm='0', predictionMethod='conditionalLeastSquares', Extension=None, NonseasonalComponent=None, SeasonalComponent=None, DynamicRegressor=None, MaximumLikelihoodStat=None, OutlierEffect=None):
super(ARIMA, self).__init__(RMSE, transformation, constantTerm, predictionMethod, Extension, NonseasonalComponent, SeasonalComponent, DynamicRegressor, MaximumLikelihoodStat, OutlierEffect, )
#
# XMLBehaviors
#
supermod.ARIMA.subclass = ARIMA
# end class ARIMA
class NonseasonalComponent(supermod.NonseasonalComponent):
def __init__(self, p=None, d=None, q=None, Extension=None, AR=None, MA=None):
super(NonseasonalComponent, self).__init__(p, d, q, Extension, AR, MA, )
#
# XMLBehaviors
#
supermod.NonseasonalComponent.subclass = NonseasonalComponent
# end class NonseasonalComponent
class SeasonalComponent(supermod.SeasonalComponent):
def __init__(self, P=None, D=None, Q=None, period=None, Extension=None, AR=None, MA=None):
super(SeasonalComponent, self).__init__(P, D, Q, period, Extension, AR, MA, )
#
# XMLBehaviors
#
supermod.SeasonalComponent.subclass = SeasonalComponent
# end class SeasonalComponent
class AR(supermod.AR):
def __init__(self, Extension=None, Array=None):
super(AR, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.AR.subclass = AR
# end class AR
class MA(supermod.MA):
def __init__(self, Extension=None, Coefficients=None, Residuals=None):
super(MA, self).__init__(Extension, Coefficients, Residuals, )
#
# XMLBehaviors
#
supermod.MA.subclass = MA
# end class MA
class Residuals(supermod.Residuals):
def __init__(self, Extension=None, Array=None):
super(Residuals, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.Residuals.subclass = Residuals
# end class Residuals
class DynamicRegressor(supermod.DynamicRegressor):
def __init__(self, field=None, transformation='none', delay='0', futureValuesMethod='constant', targetField=None, Extension=None, Numerator=None, Denominator=None, RegressorValues=None):
super(DynamicRegressor, self).__init__(field, transformation, delay, futureValuesMethod, targetField, Extension, Numerator, Denominator, RegressorValues, )
#
# XMLBehaviors
#
supermod.DynamicRegressor.subclass = DynamicRegressor
# end class DynamicRegressor
class Numerator(supermod.Numerator):
def __init__(self, Extension=None, NonseasonalFactor=None, SeasonalFactor=None):
super(Numerator, self).__init__(Extension, NonseasonalFactor, SeasonalFactor, )
#
# XMLBehaviors
#
supermod.Numerator.subclass = Numerator
# end class Numerator
class Denominator(supermod.Denominator):
def __init__(self, Extension=None, NonseasonalFactor=None, SeasonalFactor=None):
super(Denominator, self).__init__(Extension, NonseasonalFactor, SeasonalFactor, )
#
# XMLBehaviors
#
supermod.Denominator.subclass = Denominator
# end class Denominator
class SeasonalFactor(supermod.SeasonalFactor):
def __init__(self, difference='0', maximumOrder=None, Extension=None, Array=None):
super(SeasonalFactor, self).__init__(difference, maximumOrder, Extension, Array, )
#
# XMLBehaviors
#
supermod.SeasonalFactor.subclass = SeasonalFactor
# end class SeasonalFactor
class NonseasonalFactor(supermod.NonseasonalFactor):
def __init__(self, difference='0', maximumOrder=None, Extension=None, Array=None):
super(NonseasonalFactor, self).__init__(difference, maximumOrder, Extension, Array, )
#
# XMLBehaviors
#
supermod.NonseasonalFactor.subclass = NonseasonalFactor
# end class NonseasonalFactor
class RegressorValues(supermod.RegressorValues):
def __init__(self, Extension=None, TimeSeries=None, TrendCoefficients=None, TransferFunctionValues=None):
super(RegressorValues, self).__init__(Extension, TimeSeries, TrendCoefficients, TransferFunctionValues, )
#
# XMLBehaviors
#
supermod.RegressorValues.subclass = RegressorValues
# end class RegressorValues
class TrendCoefficients(supermod.TrendCoefficients):
def __init__(self, Extension=None, REAL_SparseArray=None):
super(TrendCoefficients, self).__init__(Extension, REAL_SparseArray, )
#
# XMLBehaviors
#
supermod.TrendCoefficients.subclass = TrendCoefficients
# end class TrendCoefficients
class TransferFunctionValues(supermod.TransferFunctionValues):
def __init__(self, Array=None):
super(TransferFunctionValues, self).__init__(Array, )
#
# XMLBehaviors
#
supermod.TransferFunctionValues.subclass = TransferFunctionValues
# end class TransferFunctionValues
class MaximumLikelihoodStat(supermod.MaximumLikelihoodStat):
def __init__(self, method=None, periodDeficit='0', KalmanState=None, ThetaRecursionState=None):
super(MaximumLikelihoodStat, self).__init__(method, periodDeficit, KalmanState, ThetaRecursionState, )
#
# XMLBehaviors
#
supermod.MaximumLikelihoodStat.subclass = MaximumLikelihoodStat
# end class MaximumLikelihoodStat
class KalmanState(supermod.KalmanState):
def __init__(self, FinalOmega=None, FinalStateVector=None, HVector=None):
super(KalmanState, self).__init__(FinalOmega, FinalStateVector, HVector, )
#
# XMLBehaviors
#
supermod.KalmanState.subclass = KalmanState
# end class KalmanState
class FinalOmega(supermod.FinalOmega):
def __init__(self, Matrix=None):
super(FinalOmega, self).__init__(Matrix, )
#
# XMLBehaviors
#
supermod.FinalOmega.subclass = FinalOmega
# end class FinalOmega
class FinalStateVector(supermod.FinalStateVector):
def __init__(self, Array=None):
super(FinalStateVector, self).__init__(Array, )
#
# XMLBehaviors
#
supermod.FinalStateVector.subclass = FinalStateVector
# end class FinalStateVector
class HVector(supermod.HVector):
def __init__(self, Array=None):
super(HVector, self).__init__(Array, )
#
# XMLBehaviors
#
supermod.HVector.subclass = HVector
# end class HVector
class ThetaRecursionState(supermod.ThetaRecursionState):
def __init__(self, FinalNoise=None, FinalPredictedNoise=None, FinalTheta=None, FinalNu=None):
super(ThetaRecursionState, self).__init__(FinalNoise, FinalPredictedNoise, FinalTheta, FinalNu, )
#
# XMLBehaviors
#
supermod.ThetaRecursionState.subclass = ThetaRecursionState
# end class ThetaRecursionState
class FinalNoise(supermod.FinalNoise):
def __init__(self, Array=None):
super(FinalNoise, self).__init__(Array, )
#
# XMLBehaviors
#
supermod.FinalNoise.subclass = FinalNoise
# end class FinalNoise
class FinalPredictedNoise(supermod.FinalPredictedNoise):
def __init__(self, Array=None):
super(FinalPredictedNoise, self).__init__(Array, )
#
# XMLBehaviors
#
supermod.FinalPredictedNoise.subclass = FinalPredictedNoise
# end class FinalPredictedNoise
class FinalTheta(supermod.FinalTheta):
def __init__(self, Theta=None):
super(FinalTheta, self).__init__(Theta, )
#
# XMLBehaviors
#
supermod.FinalTheta.subclass = FinalTheta
# end class FinalTheta
class Theta(supermod.Theta):
def __init__(self, i=None, j=None, theta=None):
super(Theta, self).__init__(i, j, theta, )
#
# XMLBehaviors
#
supermod.Theta.subclass = Theta
# end class Theta
class FinalNu(supermod.FinalNu):
def __init__(self, Array=None):
super(FinalNu, self).__init__(Array, )
#
# XMLBehaviors
#
supermod.FinalNu.subclass = FinalNu
# end class FinalNu
class OutlierEffect(supermod.OutlierEffect):
def __init__(self, type_=None, startTime=None, magnitude=None, dampingCoefficient=None, Extension=None):
super(OutlierEffect, self).__init__(type_, startTime, magnitude, dampingCoefficient, Extension, )
#
# XMLBehaviors
#
supermod.OutlierEffect.subclass = OutlierEffect
# end class OutlierEffect
class GARCH(supermod.GARCH):
def __init__(self, Extension=None, ARMAPart=None, GARCHPart=None):
super(GARCH, self).__init__(Extension, ARMAPart, GARCHPart, )
#
# XMLBehaviors
#
supermod.GARCH.subclass = GARCH
# end class GARCH
class ARMAPart(supermod.ARMAPart):
def __init__(self, constant='0', p=None, q=None, Extension=None, AR=None, MA=None):
super(ARMAPart, self).__init__(constant, p, q, Extension, AR, MA, )
#
# XMLBehaviors
#
supermod.ARMAPart.subclass = ARMAPart
# end class ARMAPart
class GARCHPart(supermod.GARCHPart):
def __init__(self, constant='0', gp=None, gq=None, Extension=None, ResidualSquareCoefficients=None, VarianceCoefficients=None):
super(GARCHPart, self).__init__(constant, gp, gq, Extension, ResidualSquareCoefficients, VarianceCoefficients, )
#
# XMLBehaviors
#
supermod.GARCHPart.subclass = GARCHPart
# end class GARCHPart
class ResidualSquareCoefficients(supermod.ResidualSquareCoefficients):
def __init__(self, Extension=None, Residuals=None, Coefficients=None):
super(ResidualSquareCoefficients, self).__init__(Extension, Residuals, Coefficients, )
#
# XMLBehaviors
#
supermod.ResidualSquareCoefficients.subclass = ResidualSquareCoefficients
# end class ResidualSquareCoefficients
class VarianceCoefficients(supermod.VarianceCoefficients):
def __init__(self, Extension=None, PastVariances=None, Coefficients=None):
super(VarianceCoefficients, self).__init__(Extension, PastVariances, Coefficients, )
#
# XMLBehaviors
#
supermod.VarianceCoefficients.subclass = VarianceCoefficients
# end class VarianceCoefficients
class PastVariances(supermod.PastVariances):
def __init__(self, Extension=None, Array=None):
super(PastVariances, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.PastVariances.subclass = PastVariances
# end class PastVariances
class StateSpaceModel(supermod.StateSpaceModel):
def __init__(self, variance=None, period='none', intercept='0', Extension=None, StateVector=None, TransitionMatrix=None, MeasurementMatrix=None, PsiVector=None, DynamicRegressor=None):
super(StateSpaceModel, self).__init__(variance, period, intercept, Extension, StateVector, TransitionMatrix, MeasurementMatrix, PsiVector, DynamicRegressor, )
#
# XMLBehaviors
#
supermod.StateSpaceModel.subclass = StateSpaceModel
# end class StateSpaceModel
class StateVector(supermod.StateVector):
def __init__(self, Extension=None, Array=None):
super(StateVector, self).__init__(Extension, Array, )
#
# XMLBehaviors
#
supermod.StateVector.subclass = StateVector
# end class StateVector
class TransitionMatrix(supermod.TransitionMatrix):
def __init__(self, Extension=None, Matrix=None):
super(TransitionMatrix, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.TransitionMatrix.subclass = TransitionMatrix
# end class TransitionMatrix
class MeasurementMatrix(supermod.MeasurementMatrix):
def __init__(self, Extension=None, Matrix=None):
super(MeasurementMatrix, self).__init__(Extension, Matrix, )
#
# XMLBehaviors
#
supermod.MeasurementMatrix.subclass = MeasurementMatrix
# end class MeasurementMatrix
class PsiVector(supermod.PsiVector):
def __init__(self, targetField=None, variance=None, Extension=None, Array=None):
super(PsiVector, self).__init__(targetField, variance, Extension, Array, )
#
# XMLBehaviors
#
supermod.PsiVector.subclass = PsiVector
# end class PsiVector
class SpectralAnalysis(supermod.SpectralAnalysis):
def __init__(self):
super(SpectralAnalysis, self).__init__()
#
# XMLBehaviors
#
supermod.SpectralAnalysis.subclass = SpectralAnalysis
# end class SpectralAnalysis
class SeasonalTrendDecomposition(supermod.SeasonalTrendDecomposition):
def __init__(self):
super(SeasonalTrendDecomposition, self).__init__()
#
# XMLBehaviors
#
supermod.SeasonalTrendDecomposition.subclass = SeasonalTrendDecomposition
# end class SeasonalTrendDecomposition
class TransformationDictionary(supermod.TransformationDictionary):
def __init__(self, Extension=None, DefineFunction=None, DerivedField=None):
super(TransformationDictionary, self).__init__(Extension, DefineFunction, DerivedField, )
#
# XMLBehaviors
#
supermod.TransformationDictionary.subclass = TransformationDictionary
# end class TransformationDictionary
class LocalTransformations(supermod.LocalTransformations):
def __init__(self, Extension=None, DerivedField=None):
super(LocalTransformations, self).__init__(Extension, DerivedField, )
#
# XMLBehaviors
#
supermod.LocalTransformations.subclass = LocalTransformations
# end class LocalTransformations
class DerivedField(supermod.DerivedField):
def __init__(self, name=None, displayName=None, optype=None, dataType=None, datasetName=None, trainingBackend=None, architectureName=None, Extension=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex=None, Aggregate=None, Lag=None, Value=None):
super(DerivedField, self).__init__(name, displayName, optype, dataType, datasetName, trainingBackend, architectureName, Extension, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex, Aggregate, Lag, Value, )
#
# XMLBehaviors
#
supermod.DerivedField.subclass = DerivedField
# end class DerivedField
class Constant(supermod.Constant):
def __init__(self, dataType=None, valueOf_=None):
super(Constant, self).__init__(dataType, valueOf_, )
#
# XMLBehaviors
#
supermod.Constant.subclass = Constant
# end class Constant
class FieldRef(supermod.FieldRef):
def __init__(self, field=None, mapMissingTo=None, Extension=None):
super(FieldRef, self).__init__(field, mapMissingTo, Extension, )
#
# XMLBehaviors
#
supermod.FieldRef.subclass = FieldRef
# end class FieldRef
class NormContinuous(supermod.NormContinuous):
def __init__(self, mapMissingTo=None, field=None, outliers='asIs', Extension=None, LinearNorm=None):
super(NormContinuous, self).__init__(mapMissingTo, field, outliers, Extension, LinearNorm, )
#
# XMLBehaviors
#
supermod.NormContinuous.subclass = NormContinuous
# end class NormContinuous
class LinearNorm(supermod.LinearNorm):
def __init__(self, orig=None, norm=None, Extension=None):
super(LinearNorm, self).__init__(orig, norm, Extension, )
#
# XMLBehaviors
#
supermod.LinearNorm.subclass = LinearNorm
# end class LinearNorm
class NormDiscrete(supermod.NormDiscrete):
def __init__(self, field=None, value=None, mapMissingTo=None, Extension=None):
super(NormDiscrete, self).__init__(field, value, mapMissingTo, Extension, )
#
# XMLBehaviors
#
supermod.NormDiscrete.subclass = NormDiscrete
# end class NormDiscrete
class Discretize(supermod.Discretize):
def __init__(self, field=None, mapMissingTo=None, defaultValue=None, dataType=None, Extension=None, DiscretizeBin=None):
super(Discretize, self).__init__(field, mapMissingTo, defaultValue, dataType, Extension, DiscretizeBin, )
#
# XMLBehaviors
#
supermod.Discretize.subclass = Discretize
# end class Discretize
class DiscretizeBin(supermod.DiscretizeBin):
def __init__(self, binValue=None, Extension=None, Interval=None):
super(DiscretizeBin, self).__init__(binValue, Extension, Interval, )
#
# XMLBehaviors
#
supermod.DiscretizeBin.subclass = DiscretizeBin
# end class DiscretizeBin
class MapValues(supermod.MapValues):
def __init__(self, mapMissingTo=None, defaultValue=None, outputColumn=None, dataType=None, Extension=None, FieldColumnPair=None, TableLocator=None, InlineTable=None):
super(MapValues, self).__init__(mapMissingTo, defaultValue, outputColumn, dataType, Extension, FieldColumnPair, TableLocator, InlineTable, )
#
# XMLBehaviors
#
supermod.MapValues.subclass = MapValues
# end class MapValues
class FieldColumnPair(supermod.FieldColumnPair):
def __init__(self, field=None, column=None, Extension=None):
super(FieldColumnPair, self).__init__(field, column, Extension, )
#
# XMLBehaviors
#
supermod.FieldColumnPair.subclass = FieldColumnPair
# end class FieldColumnPair
class TextIndex(supermod.TextIndex):
def __init__(self, textField=None, localTermWeights='termFrequency', isCaseSensitive=False, maxLevenshteinDistance=0, countHits='allHits', wordSeparatorCharacterRE='\\s', tokenize=True, Extension=None, TextIndexNormalization=None, Apply=None, FieldRef=None, Constant=None, NormContinuous=None, NormDiscrete=None, Discretize=None, MapValues=None, TextIndex_member=None, Aggregate=None, Lag=None):
super(TextIndex, self).__init__(textField, localTermWeights, isCaseSensitive, maxLevenshteinDistance, countHits, wordSeparatorCharacterRE, tokenize, Extension, TextIndexNormalization, Apply, FieldRef, Constant, NormContinuous, NormDiscrete, Discretize, MapValues, TextIndex_member, Aggregate, Lag, )
#
# XMLBehaviors
#
supermod.TextIndex.subclass = TextIndex
# end class TextIndex
class TextIndexNormalization(supermod.TextIndexNormalization):
def __init__(self, inField='string', outField='stem', regexField='regex', recursive=False, isCaseSensitive=None, maxLevenshteinDistance=None, wordSeparatorCharacterRE=None, tokenize=None, Extension=None, TableLocator=None, InlineTable=None):
super(TextIndexNormalization, self).__init__(inField, outField, regexField, recursive, isCaseSensitive, maxLevenshteinDistance, wordSeparatorCharacterRE, tokenize, Extension, TableLocator, InlineTable, )
#
# XMLBehaviors
#
supermod.TextIndexNormalization.subclass = TextIndexNormalization
# end class TextIndexNormalization
class Aggregate(supermod.Aggregate):
def __init__(self, field=None, function=None, groupField=None, sqlWhere=None, Extension=None):
super(Aggregate, self).__init__(field, function, groupField, sqlWhere, Extension, )
#
# XMLBehaviors
#
supermod.Aggregate.subclass = Aggregate
# end class Aggregate
class Lag(supermod.Lag):
def __init__(self, field=None, n=1, Extension=None, BlockIndicator=None):
super(Lag, self).__init__(field, n, Extension, BlockIndicator, )
#
# XMLBehaviors
#
supermod.Lag.subclass = Lag
# end class Lag
class BlockIndicator(supermod.BlockIndicator):
def __init__(self, field=None):
super(BlockIndicator, self).__init__(field, )
#
# XMLBehaviors
#
supermod.BlockIndicator.subclass = BlockIndicator
# end class BlockIndicator
class TreeModel(supermod.TreeModel):
def __init__(self, modelName=None, functionName=None, algorithmName=None, missingValueStrategy='none', missingValuePenalty='1.0', noTrueChildStrategy='returnNullPrediction', splitCharacteristic='multiSplit', isScorable=True, MiningSchema=None, Output=None, ModelStats=None, ModelExplanation=None, Targets=None, LocalTransformations=None, Node=None, ModelVerification=None, Extension=None):
super(TreeModel, self).__init__(modelName, functionName, algorithmName, missingValueStrategy, missingValuePenalty, noTrueChildStrategy, splitCharacteristic, isScorable, MiningSchema, Output, ModelStats, ModelExplanation, Targets, LocalTransformations, Node, ModelVerification, Extension, )
#
# XMLBehaviors
#
supermod.TreeModel.subclass = TreeModel
# end class TreeModel
class Node(supermod.Node):
def __init__(self, id=None, score=None, recordCount=None, defaultChild=None, SimplePredicate=None, CompoundPredicate=None, SimpleSetPredicate=None, True_=None, False_=None, Partition=None, ScoreDistribution=None, Node_member=None, Extension=None, Regression=None, DecisionTree=None):
super(Node, self).__init__(id, score, recordCount, defaultChild, SimplePredicate, CompoundPredicate, SimpleSetPredicate, True_, False_, Partition, ScoreDistribution, Node_member, Extension, Regression, DecisionTree, )
#
# XMLBehaviors
#
supermod.Node.subclass = Node
# end class Node
class SimplePredicate(supermod.SimplePredicate):
def __init__(self, field=None, operator=None, value=None, Extension=None):
super(SimplePredicate, self).__init__(field, operator, value, Extension, )
#
# XMLBehaviors
#
supermod.SimplePredicate.subclass = SimplePredicate
# end class SimplePredicate
class CompoundPredicate(supermod.CompoundPredicate):
def __init__(self, booleanOperator=None, Extension=None, SimplePredicate=None, CompoundPredicate_member=None, SimpleSetPredicate=None, True_=None, False_=None):
super(CompoundPredicate, self).__init__(booleanOperator, Extension, SimplePredicate, CompoundPredicate_member, SimpleSetPredicate, True_, False_, )
#
# XMLBehaviors
#
supermod.CompoundPredicate.subclass = CompoundPredicate
# end class CompoundPredicate
class SimpleSetPredicate(supermod.SimpleSetPredicate):
def __init__(self, field=None, booleanOperator=None, Extension=None, Array=None):
super(SimpleSetPredicate, self).__init__(field, booleanOperator, Extension, Array, )
#
# XMLBehaviors
#
supermod.SimpleSetPredicate.subclass = SimpleSetPredicate
# end class SimpleSetPredicate
class True_(supermod.True_):
def __init__(self, Extension=None):
super(True_, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.True_.subclass = True_
# end class True_
class False_(supermod.False_):
def __init__(self, Extension=None):
super(False_, self).__init__(Extension, )
#
# XMLBehaviors
#
supermod.False_.subclass = False_
# end class False_
class ScoreDistribution(supermod.ScoreDistribution):
def __init__(self, value=None, recordCount=None, confidence=None, probability=None, Extension=None):
super(ScoreDistribution, self).__init__(value, recordCount, confidence, probability, Extension, )
#
# XMLBehaviors
#
supermod.ScoreDistribution.subclass = ScoreDistribution
# end class ScoreDistribution
def get_root_tag(node):
tag = supermod.Tag_pattern_.match(node.tag).groups()[-1]
rootClass = None
rootClass = supermod.GDSClassesMapping.get(tag)
if rootClass is None and hasattr(supermod, tag):
rootClass = getattr(supermod, tag)
return tag, rootClass
def parseSub(inFilename, silence=False):
parser = None
doc = parsexml_(inFilename, parser)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'AssociationModel'
rootClass = supermod.AssociationModel
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
if not silence:
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(
sys.stdout, 0, name_=rootTag,
namespacedef_='',
pretty_print=True)
return rootObj
def parseEtree(inFilename, silence=False):
parser = None
doc = parsexml_(inFilename, parser)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'AssociationModel'
rootClass = supermod.AssociationModel
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
mapping = {}
rootElement = rootObj.to_etree(None, name_=rootTag, mapping_=mapping)
reverse_mapping = rootObj.gds_reverse_node_mapping(mapping)
if not silence:
content = etree_.tostring(
rootElement, pretty_print=True,
xml_declaration=True, encoding="utf-8")
sys.stdout.write(content)
sys.stdout.write('\n')
return rootObj, rootElement, mapping, reverse_mapping
def parseString(inString, silence=False):
from StringIO import StringIO
parser = None
doc = parsexml_(StringIO(inString), parser)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'AssociationModel'
rootClass = supermod.AssociationModel
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
if not silence:
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(
sys.stdout, 0, name_=rootTag,
namespacedef_='')
return rootObj
def parseLiteral(inFilename, silence=False):
parser = None
doc = parsexml_(inFilename, parser)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'AssociationModel'
rootClass = supermod.AssociationModel
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
if not silence:
sys.stdout.write('#from nyokaBase.PMML43ExtSuper import *\n\n')
sys.stdout.write('import nyokaBase.PMML43ExtSuper as model_\n\n')
sys.stdout.write('rootObj = model_.rootClass(\n')
rootObj.exportLiteral(sys.stdout, 0, name_=rootTag)
sys.stdout.write(')\n')
return rootObj
USAGE_TEXT = """
Usage: python ???.py <infilename>
"""
def usage():
print(USAGE_TEXT)
sys.exit(1)
def main():
args = sys.argv[1:]
if len(args) != 1:
usage()
infilename = args[0]
parse(infilename)
if __name__ == '__main__':
#import pdb; pdb.set_trace()
main()
def parse(inFileName, silence=False):
orig_init()
result = parseSub(inFileName, silence)
new_init()
return result
def new_init():
def LayerWeights_init(self, weightsShape=None, weightsFlattenAxis=None, content=None, floatType="float32", floatsPerLine=12, src=None, Extension=None, mixedclass_=None):
self.original_tagname_ = None
self.weightsShape = supermod._cast(None, weightsShape)
self.weightsFlattenAxis = supermod._cast(None, weightsFlattenAxis)
self.src = supermod._cast(None, src)
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
validFloatTypes = ["float6", "float7", "float8", "float16", "float32", "float64"]
if floatType not in validFloatTypes:
floatType = "float32"
from nyokaBase.Base64 import FloatBase64
base64string = "\t\t\t\t" + "data:" + floatType + ";base64," + FloatBase64.from_floatArray(content, floatsPerLine)
base64string = base64string.replace("\n", "\n\t\t\t\t")
self.content_ = [supermod.MixedContainer(1, 2, "", base64string)]
self.valueOf_ = base64string
def LayerBias_init(self, biasShape=None, biasFlattenAxis=None, content=None, floatType="float32", floatsPerLine=12, src=None, Extension=None, mixedclass_=None):
self.original_tagname_ = None
self.biasShape = supermod._cast(None, biasShape)
self.biasFlattenAxis = supermod._cast(None, biasFlattenAxis)
self.src = supermod._cast(None, src)
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
validFloatTypes = ["float6", "float7", "float8", "float16", "float32", "float64"]
if floatType not in validFloatTypes:
floatType = "float32"
from nyokaBase.Base64 import FloatBase64
base64string = "\t\t\t\t" + "data:" + floatType + ";base64," + FloatBase64.from_floatArray(content, floatsPerLine)
base64string = base64string.replace("\n", "\n\t\t\t\t")
self.content_ = [supermod.MixedContainer(1, 2, "", base64string)]
self.valueOf_ = base64string
def ArrayType_init(self, content=None, n=None, type_=None, mixedclass_=None):
self.original_tagname_ = None
self.n = supermod._cast(None, n)
self.type_ = supermod._cast(None, type_)
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
self.content_ = [supermod.MixedContainer(1, 2, "", str(content))]
self.valueOf_ = str(content)
def Annotation_init(self, content=None, Extension=None, mixedclass_=None):
self.original_tagname_ = None
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
self.content_ = [supermod.MixedContainer(1, 2, "", str(content))]
self.valueOf_ = str(content)
def Timestamp_init(self, content=None, Extension=None, mixedclass_=None):
self.original_tagname_ = None
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
self.content_ = [supermod.MixedContainer(1, 2, "", str(content))]
self.valueOf_ = str(content)
def PMML_init(self, version='4.3', Header=None, script=None, MiningBuildTask=None, DataDictionary=None, TransformationDictionary=None, AssociationModel=None, AnomalyDetectionModel=None, BayesianNetworkModel=None, BaselineModel=None, ClusteringModel=None, DeepNetwork=None, GaussianProcessModel=None, GeneralRegressionModel=None, MiningModel=None, NaiveBayesModel=None, NearestNeighborModel=None, NeuralNetwork=None, RegressionModel=None, RuleSetModel=None, SequenceModel=None, Scorecard=None, SupportVectorMachineModel=None, TextModel=None, TimeSeriesModel=None, TreeModel=None, Extension=None):
self.original_tagname_ = None
self.version = supermod._cast(None, version)
self.Header = Header
if script is None:
self.script = []
else:
self.script = script
self.MiningBuildTask = MiningBuildTask
self.DataDictionary = DataDictionary
if AssociationModel is None:
self.AssociationModel = []
else:
self.AssociationModel = AssociationModel
if AnomalyDetectionModel is None:
self.AnomalyDetectionModel = []
else:
self.AnomalyDetectionModel = AnomalyDetectionModel
if BayesianNetworkModel is None:
self.BayesianNetworkModel = []
else:
self.BayesianNetworkModel = BayesianNetworkModel
if BaselineModel is None:
self.BaselineModel = []
else:
self.BaselineModel = BaselineModel
if ClusteringModel is None:
self.ClusteringModel = []
else:
self.ClusteringModel = ClusteringModel
if DeepNetwork is None:
self.DeepNetwork = []
else:
self.DeepNetwork = DeepNetwork
if GaussianProcessModel is None:
self.GaussianProcessModel = []
else:
self.GaussianProcessModel = GaussianProcessModel
if GeneralRegressionModel is None:
self.GeneralRegressionModel = []
else:
self.GeneralRegressionModel = GeneralRegressionModel
if MiningModel is None:
self.MiningModel = []
else:
self.MiningModel = MiningModel
if NaiveBayesModel is None:
self.NaiveBayesModel = []
else:
self.NaiveBayesModel = NaiveBayesModel
if NearestNeighborModel is None:
self.NearestNeighborModel = []
else:
self.NearestNeighborModel = NearestNeighborModel
if NeuralNetwork is None:
self.NeuralNetwork = []
else:
self.NeuralNetwork = NeuralNetwork
if RegressionModel is None:
self.RegressionModel = []
else:
self.RegressionModel = RegressionModel
if RuleSetModel is None:
self.RuleSetModel = []
else:
self.RuleSetModel = RuleSetModel
if SequenceModel is None:
self.SequenceModel = []
else:
self.SequenceModel = SequenceModel
if Scorecard is None:
self.Scorecard = []
else:
self.Scorecard = Scorecard
if SupportVectorMachineModel is None:
self.SupportVectorMachineModel = []
else:
self.SupportVectorMachineModel = SupportVectorMachineModel
if TextModel is None:
self.TextModel = []
else:
self.TextModel = TextModel
if TimeSeriesModel is None:
self.TimeSeriesModel = []
else:
self.TimeSeriesModel = TimeSeriesModel
if TransformationDictionary is None:
self.TransformationDictionary = []
else:
self.TransformationDictionary = TransformationDictionary
if TreeModel is None:
self.TreeModel = []
else:
self.TreeModel = TreeModel
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
def script_init(self, content=None, for_=None, class_=None, Extension=None):
self.original_tagname_ = None
self.for_ = supermod._cast(None, for_)
self.class_ = supermod._cast(None, class_)
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
self.anyAttributes_ = {}
self.mixedclass_ = supermod.MixedContainer
self.content_ = [supermod.MixedContainer(1, 2, "", str(content))]
self.valueOf_ = str(content)
LayerWeights.__init__ = LayerWeights_init
LayerBias.__init__ = LayerBias_init
ArrayType.__init__ = ArrayType_init
Annotation.__init__ = Annotation_init
Timestamp.__init__ = Timestamp_init
PMML.__init__ = PMML_init
script.__init__ = script_init
def orig_init():
def LayerWeights_init(self, weightsShape=None, weightsFlattenAxis=None, src=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
self.original_tagname_ = None
self.weightsShape = supermod._cast(None, weightsShape)
self.weightsFlattenAxis = supermod._cast(None, weightsFlattenAxis)
self.src = supermod._cast(None, src)
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
def LayerBias_init(self, biasShape=None, biasFlattenAxis=None, src=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
self.original_tagname_ = None
self.biasShape = supermod._cast(None, biasShape)
self.biasFlattenAxis = supermod._cast(None, biasFlattenAxis)
self.src = supermod._cast(None, src)
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
def ArrayType_init(self, n=None, type_=None, valueOf_=None, mixedclass_=None, content_=None):
self.original_tagname_ = None
self.n = supermod._cast(None, n)
self.type_ = supermod._cast(None, type_)
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
def Annotation_init(self, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
self.original_tagname_ = None
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
def Timestamp_init(self, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
self.original_tagname_ = None
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
def PMML_init(self, version=None, Header=None, script=None, MiningBuildTask=None, DataDictionary=None, TransformationDictionary=None, AssociationModel=None, AnomalyDetectionModel=None, BayesianNetworkModel=None, BaselineModel=None, ClusteringModel=None, DeepNetwork=None, GaussianProcessModel=None, GeneralRegressionModel=None, MiningModel=None, NaiveBayesModel=None, NearestNeighborModel=None, NeuralNetwork=None, RegressionModel=None, RuleSetModel=None, SequenceModel=None, Scorecard=None, SupportVectorMachineModel=None, TextModel=None, TimeSeriesModel=None, TreeModel=None, Extension=None):
self.original_tagname_ = None
self.version = supermod._cast(None, version)
self.Header = Header
if script is None:
self.script = []
else:
self.script = script
self.MiningBuildTask = MiningBuildTask
self.DataDictionary = DataDictionary
self.TransformationDictionary = TransformationDictionary
if AssociationModel is None:
self.AssociationModel = []
else:
self.AssociationModel = AssociationModel
if AnomalyDetectionModel is None:
self.AnomalyDetectionModel = []
else:
self.AnomalyDetectionModel = AnomalyDetectionModel
if BayesianNetworkModel is None:
self.BayesianNetworkModel = []
else:
self.BayesianNetworkModel = BayesianNetworkModel
if BaselineModel is None:
self.BaselineModel = []
else:
self.BaselineModel = BaselineModel
if ClusteringModel is None:
self.ClusteringModel = []
else:
self.ClusteringModel = ClusteringModel
if DeepNetwork is None:
self.DeepNetwork = []
else:
self.DeepNetwork = DeepNetwork
if GaussianProcessModel is None:
self.GaussianProcessModel = []
else:
self.GaussianProcessModel = GaussianProcessModel
if GeneralRegressionModel is None:
self.GeneralRegressionModel = []
else:
self.GeneralRegressionModel = GeneralRegressionModel
if MiningModel is None:
self.MiningModel = []
else:
self.MiningModel = MiningModel
if NaiveBayesModel is None:
self.NaiveBayesModel = []
else:
self.NaiveBayesModel = NaiveBayesModel
if NearestNeighborModel is None:
self.NearestNeighborModel = []
else:
self.NearestNeighborModel = NearestNeighborModel
if NeuralNetwork is None:
self.NeuralNetwork = []
else:
self.NeuralNetwork = NeuralNetwork
if RegressionModel is None:
self.RegressionModel = []
else:
self.RegressionModel = RegressionModel
if RuleSetModel is None:
self.RuleSetModel = []
else:
self.RuleSetModel = RuleSetModel
if SequenceModel is None:
self.SequenceModel = []
else:
self.SequenceModel = SequenceModel
if Scorecard is None:
self.Scorecard = []
else:
self.Scorecard = Scorecard
if SupportVectorMachineModel is None:
self.SupportVectorMachineModel = []
else:
self.SupportVectorMachineModel = SupportVectorMachineModel
if TextModel is None:
self.TextModel = []
else:
self.TextModel = TextModel
if TimeSeriesModel is None:
self.TimeSeriesModel = []
else:
self.TimeSeriesModel = TimeSeriesModel
if TransformationDictionary is None:
self.TransformationDictionary = []
else:
self.TransformationDictionary = TransformationDictionary
if TreeModel is None:
self.TreeModel = []
else:
self.TreeModel = TreeModel
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
def script_init(self, for_=None, class_=None, Extension=None, valueOf_=None, mixedclass_=None, content_=None):
self.original_tagname_ = None
self.for_ = supermod._cast(None, for_)
self.class_ = supermod._cast(None, class_)
if Extension is None:
self.Extension = []
else:
self.Extension = Extension
self.valueOf_ = valueOf_
self.anyAttributes_ = {}
if mixedclass_ is None:
self.mixedclass_ = supermod.MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
LayerWeights.__init__ = LayerWeights_init
LayerBias.__init__ = LayerBias_init
ArrayType.__init__ = ArrayType_init
Annotation.__init__ = Annotation_init
Timestamp.__init__ = Timestamp_init
PMML.__init__ = PMML_init
script.__init__ = script_init
new_init()
def showIndent(outfile, level, pretty_print=True):
if pretty_print:
for idx in range(level):
outfile.write('\t')
| 36.089468 | 1,634 | 0.726182 | 17,719 | 184,345 | 7.332299 | 0.059202 | 0.020074 | 0.026416 | 0.014062 | 0.421556 | 0.378184 | 0.330408 | 0.282295 | 0.269964 | 0.250044 | 0 | 0.002341 | 0.174971 | 184,345 | 5,107 | 1,635 | 36.096534 | 0.851897 | 0.068285 | 0 | 0.346506 | 1 | 0 | 0.018372 | 0.002159 | 0.000832 | 0 | 0 | 0 | 0 | 1 | 0.186772 | false | 0.000416 | 0.015807 | 0 | 0.360649 | 0.021215 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5d5b47c0a8dc24a1db06e36fcc2d3bcfe9b7f216 | 22,196 | py | Python | test/cut/test_cut_augmentation.py | pzelasko/lhotse | 41984467d2ead1dc69f418638b969e46f63308c7 | [
"Apache-2.0"
] | 64 | 2020-04-27T14:55:15.000Z | 2020-10-25T06:57:56.000Z | test/cut/test_cut_augmentation.py | pzelasko/lhotse | 41984467d2ead1dc69f418638b969e46f63308c7 | [
"Apache-2.0"
] | 85 | 2020-04-26T06:29:47.000Z | 2020-10-19T20:28:52.000Z | test/cut/test_cut_augmentation.py | pzelasko/lhotse | 41984467d2ead1dc69f418638b969e46f63308c7 | [
"Apache-2.0"
] | 17 | 2020-06-19T06:26:33.000Z | 2020-10-12T15:19:15.000Z | import numpy as np
import pytest
from lhotse import AudioSource, CutSet, MonoCut, Recording, SupervisionSegment
from lhotse.audio import RecordingSet
from lhotse.cut import PaddingCut
from lhotse.utils import fastcopy
@pytest.fixture
def file_source():
return AudioSource(type="file", channels=[0], source="test/fixtures/mono_c0.wav")
@pytest.fixture
def recording(file_source):
return Recording(
id="rec",
sources=[file_source],
sampling_rate=8000,
num_samples=4000,
duration=0.5,
)
@pytest.fixture
def rir():
return Recording.from_file("test/fixtures/rir/sim_1ch.wav")
@pytest.fixture
def multi_channel_rir():
return Recording.from_file("test/fixtures/rir/real_8ch.wav")
@pytest.fixture
def libri_recording_orig():
return Recording.from_file("test/fixtures/libri/libri-1088-134315-0000.wav")
@pytest.fixture
def libri_recording_rvb():
return Recording.from_file("test/fixtures/libri/libri-1088-134315-0000_rvb.wav")
@pytest.fixture
def cut_with_supervision(recording):
return MonoCut(
id="cut",
start=0.0,
duration=0.5,
channel=0,
supervisions=[
SupervisionSegment(id="sup", recording_id="rec", start=0.0, duration=0.5)
],
recording=recording,
)
@pytest.fixture
def libri_cut_with_supervision(libri_recording_orig):
return MonoCut(
id="libri_cut_1",
start=0,
duration=libri_recording_orig.duration,
channel=0,
supervisions=[
SupervisionSegment(
id="sup",
recording_id="rec",
start=0,
duration=libri_recording_orig.duration,
)
],
recording=libri_recording_orig,
)
def test_cut_perturb_speed11(cut_with_supervision):
    # Speed 1.1x shortens the cut: 4000 / 1.1 -> 3636 samples, i.e. 0.4545s at 8kHz.
    cut_sp = cut_with_supervision.perturb_speed(1.1)
assert cut_sp.start == 0.0
assert cut_sp.duration == 0.4545
assert cut_sp.end == 0.4545
assert cut_sp.num_samples == 3636
assert cut_sp.recording.duration == 0.4545
assert cut_sp.recording.num_samples == 3636
assert cut_sp.supervisions[0].start == 0.0
assert cut_sp.supervisions[0].duration == 0.4545
assert cut_sp.supervisions[0].end == 0.4545
cut_samples = cut_sp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 3636
recording_samples = cut_sp.recording.load_audio()
assert recording_samples.shape[0] == 1
assert recording_samples.shape[1] == 3636
def test_cut_perturb_speed09(cut_with_supervision):
    # Speed 0.9x lengthens the cut: 4000 / 0.9 -> 4444 samples, i.e. 0.5555s at 8kHz.
    cut_sp = cut_with_supervision.perturb_speed(0.9)
assert cut_sp.start == 0.0
assert cut_sp.duration == 0.5555
assert cut_sp.end == 0.5555
assert cut_sp.num_samples == 4444
assert cut_sp.recording.duration == 0.5555
assert cut_sp.recording.num_samples == 4444
assert cut_sp.supervisions[0].start == 0.0
assert cut_sp.supervisions[0].duration == 0.5555
assert cut_sp.supervisions[0].end == 0.5555
cut_samples = cut_sp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 4444
recording_samples = cut_sp.recording.load_audio()
assert recording_samples.shape[0] == 1
assert recording_samples.shape[1] == 4444
def test_cut_perturb_tempo09(cut_with_supervision):
    # Tempo perturbation rescales duration just like speed perturbation, but preserves pitch.
    cut_tp = cut_with_supervision.perturb_tempo(0.9)
assert cut_tp.start == 0.0
assert cut_tp.duration == 0.5555
assert cut_tp.end == 0.5555
assert cut_tp.num_samples == 4444
assert cut_tp.recording.duration == 0.5555
assert cut_tp.recording.num_samples == 4444
assert cut_tp.supervisions[0].start == 0.0
assert cut_tp.supervisions[0].duration == 0.5555
assert cut_tp.supervisions[0].end == 0.5555
cut_samples = cut_tp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 4444
recording_samples = cut_tp.recording.load_audio()
assert recording_samples.shape[0] == 1
assert recording_samples.shape[1] == 4444
def test_cut_perturb_tempo11(cut_with_supervision):
cut_tp = cut_with_supervision.perturb_tempo(1.1)
assert cut_tp.start == 0.0
assert cut_tp.duration == 0.4545
assert cut_tp.end == 0.4545
assert cut_tp.num_samples == 3636
assert cut_tp.recording.duration == 0.4545
assert cut_tp.recording.num_samples == 3636
assert cut_tp.supervisions[0].start == 0.0
assert cut_tp.supervisions[0].duration == 0.4545
assert cut_tp.supervisions[0].end == 0.4545
cut_samples = cut_tp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 3636
recording_samples = cut_tp.recording.load_audio()
assert recording_samples.shape[0] == 1
assert recording_samples.shape[1] == 3636
def test_cut_set_perturb_speed_doesnt_duplicate_transforms(cut_with_supervision):
cuts = CutSet.from_cuts(
[cut_with_supervision, cut_with_supervision.with_id("other-id")]
)
cuts_sp = cuts.perturb_speed(1.1)
for cut in cuts_sp:
        # Guards against a regression in which multiple cuts referencing the same
        # recording attached transforms to the same (shared) manifest.
assert len(cut.recording.transforms) == 1
def test_cut_set_perturb_volume_doesnt_duplicate_transforms(cut_with_supervision):
cuts = CutSet.from_cuts(
[cut_with_supervision, cut_with_supervision.with_id("other-id")]
)
cuts_vp = cuts.perturb_volume(2.0)
for cut in cuts_vp:
        # Guards against a regression in which multiple cuts referencing the same
        # recording attached transforms to the same (shared) manifest.
assert len(cut.recording.transforms) == 1
def test_cut_set_reverb_rir_doesnt_duplicate_transforms(cut_with_supervision, rir):
rirs = RecordingSet.from_recordings([rir])
cuts = CutSet.from_cuts(
[cut_with_supervision, cut_with_supervision.with_id("other-id")]
)
    cuts_rvb = cuts.reverb_rir(rir_recordings=rirs)
    for cut in cuts_rvb:
        # Guards against a regression in which multiple cuts referencing the same
        # recording attached transforms to the same (shared) manifest.
        assert len(cut.recording.transforms) == 1
def test_cut_set_resample_doesnt_duplicate_transforms(cut_with_supervision):
cuts = CutSet.from_cuts(
[cut_with_supervision, cut_with_supervision.with_id("other-id")]
)
cuts_res = cuts.resample(44100)
for cut in cuts_res:
        # Guards against a regression in which multiple cuts referencing the same
        # recording attached transforms to the same (shared) manifest.
assert len(cut.recording.transforms) == 1
@pytest.fixture
def cut_with_supervision_start01(recording):
return MonoCut(
id="cut_start01",
start=0.1,
duration=0.4,
channel=0,
supervisions=[
SupervisionSegment(id="sup", recording_id="rec", start=0.1, duration=0.3)
],
recording=recording,
)
def test_cut_start01_perturb_speed11(cut_with_supervision_start01):
    cut_sp = cut_with_supervision_start01.perturb_speed(1.1)
    # The perturbed offset snaps to whole samples: 0.1 * 8000 / 1.1 -> 727 samples, i.e. 0.090875s.
    assert cut_sp.start == 0.090875
assert cut_sp.duration == 0.363625
assert cut_sp.end == 0.4545
assert cut_sp.num_samples == 2909
assert cut_sp.recording.duration == 0.4545
assert cut_sp.recording.num_samples == 3636
assert cut_sp.supervisions[0].start == 0.090875
assert cut_sp.supervisions[0].duration == 0.27275
assert cut_sp.supervisions[0].end == 0.363625
cut_samples = cut_sp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 2909
recording_samples = cut_sp.recording.load_audio()
assert recording_samples.shape[0] == 1
assert recording_samples.shape[1] == 3636
def test_cut_start01_perturb_speed09(cut_with_supervision_start01):
cut_sp = cut_with_supervision_start01.perturb_speed(0.9)
assert cut_sp.start == 0.111125
assert cut_sp.duration == 0.4445
assert cut_sp.end == 0.555625
assert cut_sp.num_samples == 3556
assert cut_sp.recording.duration == 0.5555
assert cut_sp.recording.num_samples == 4444
assert cut_sp.supervisions[0].start == 0.111125
assert cut_sp.supervisions[0].duration == 0.333375
assert cut_sp.supervisions[0].end == 0.4445
cut_samples = cut_sp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 3556
recording_samples = cut_sp.recording.load_audio()
assert recording_samples.shape[0] == 1
assert recording_samples.shape[1] == 4444
def test_mixed_cut_start01_perturb_speed(cut_with_supervision_start01):
mixed_sp = cut_with_supervision_start01.append(
cut_with_supervision_start01
).perturb_speed(1.1)
assert mixed_sp.start == 0 # MixedCut always starts at 0
assert mixed_sp.duration == 0.363625 * 2
assert mixed_sp.end == 0.363625 * 2
assert mixed_sp.num_samples == 2909 * 2
assert mixed_sp.supervisions[0].start == 0.090875
assert mixed_sp.supervisions[0].duration == 0.27275
assert mixed_sp.supervisions[0].end == 0.363625
assert (
mixed_sp.supervisions[1].start == 0.4545
) # round(0.363625 + 0.090875, ndigits=8)
assert mixed_sp.supervisions[1].duration == 0.27275
assert mixed_sp.supervisions[1].end == 0.363625 * 2
cut_samples = mixed_sp.load_audio()
assert cut_samples.shape[0] == 1
assert cut_samples.shape[1] == 2909 * 2
def test_mixed_cut_start01_perturb_volume(cut_with_supervision_start01):
mixed_vp = cut_with_supervision_start01.append(
cut_with_supervision_start01
).perturb_volume(0.125)
assert mixed_vp.start == 0 # MixedCut always starts at 0
assert mixed_vp.duration == cut_with_supervision_start01.duration * 2
assert mixed_vp.end == cut_with_supervision_start01.duration * 2
assert mixed_vp.num_samples == cut_with_supervision_start01.num_samples * 2
assert (
mixed_vp.supervisions[0].start
== cut_with_supervision_start01.supervisions[0].start
)
assert (
mixed_vp.supervisions[0].duration
== cut_with_supervision_start01.supervisions[0].duration
)
assert (
mixed_vp.supervisions[0].end == cut_with_supervision_start01.supervisions[0].end
)
assert mixed_vp.supervisions[1].start == (
cut_with_supervision_start01.duration
+ cut_with_supervision_start01.supervisions[0].start
)
assert (
mixed_vp.supervisions[1].duration
== cut_with_supervision_start01.supervisions[0].duration
)
assert mixed_vp.supervisions[1].end == (
cut_with_supervision_start01.duration
+ cut_with_supervision_start01.supervisions[0].end
)
cut_samples = mixed_vp.load_audio()
cut_with_supervision_start01_samples = cut_with_supervision_start01.load_audio()
assert (
cut_samples.shape[0] == cut_with_supervision_start01_samples.shape[0]
and cut_samples.shape[1] == cut_with_supervision_start01_samples.shape[1] * 2
)
np.testing.assert_array_almost_equal(
cut_samples,
np.hstack(
(cut_with_supervision_start01_samples, cut_with_supervision_start01_samples)
)
* 0.125,
)
def test_mixed_cut_start01_reverb_rir(cut_with_supervision_start01, rir):
mixed_rvb = cut_with_supervision_start01.append(
cut_with_supervision_start01
).reverb_rir(rir_recording=rir)
assert mixed_rvb.start == 0 # MixedCut always starts at 0
assert mixed_rvb.duration == cut_with_supervision_start01.duration * 2
assert mixed_rvb.end == cut_with_supervision_start01.duration * 2
assert mixed_rvb.num_samples == cut_with_supervision_start01.num_samples * 2
assert (
mixed_rvb.supervisions[0].start
== cut_with_supervision_start01.supervisions[0].start
)
assert (
mixed_rvb.supervisions[0].duration
== cut_with_supervision_start01.supervisions[0].duration
)
assert (
mixed_rvb.supervisions[0].end
== cut_with_supervision_start01.supervisions[0].end
)
assert mixed_rvb.supervisions[1].start == (
cut_with_supervision_start01.duration
+ cut_with_supervision_start01.supervisions[0].start
)
assert (
mixed_rvb.supervisions[1].duration
== cut_with_supervision_start01.supervisions[0].duration
)
assert mixed_rvb.supervisions[1].end == (
cut_with_supervision_start01.duration
+ cut_with_supervision_start01.supervisions[0].end
)
cut_samples = mixed_rvb.load_audio()
cut_with_supervision_start01_samples = cut_with_supervision_start01.reverb_rir(
rir_recording=rir
).load_audio()
assert (
cut_samples.shape[0] == cut_with_supervision_start01_samples.shape[0]
and cut_samples.shape[1] == cut_with_supervision_start01_samples.shape[1] * 2
)
np.testing.assert_array_almost_equal(
cut_samples,
np.hstack(
(cut_with_supervision_start01_samples, cut_with_supervision_start01_samples)
),
)
@pytest.mark.parametrize(
"rir_channels, expected_num_tracks",
[([0], 2), ([0, 1], 2), ([0, 1, 2], None)],
)
def test_mixed_cut_start01_reverb_rir_multi_channel(
cut_with_supervision_start01, multi_channel_rir, rir_channels, expected_num_tracks
):
mixed_cut = cut_with_supervision_start01.append(cut_with_supervision_start01)
if expected_num_tracks is not None:
mixed_rvb = mixed_cut.reverb_rir(multi_channel_rir, rir_channels=rir_channels)
assert len(mixed_rvb.tracks) == expected_num_tracks
else:
with pytest.raises(AssertionError):
mixed_cut.reverb_rir(multi_channel_rir, rir_channels=rir_channels)
def test_padding_cut_perturb_speed():
cut = PaddingCut(
id="cut",
duration=5.75,
sampling_rate=16000,
feat_value=1e-10,
num_samples=92000,
)
cut_sp = cut.perturb_speed(1.1)
assert cut_sp.num_samples == 83636
assert cut_sp.duration == 5.22725
def test_padding_cut_perturb_volume():
cut = PaddingCut(
id="cut",
duration=5.75,
sampling_rate=16000,
feat_value=1e-10,
num_samples=92000,
)
cut_vp = cut.perturb_volume(0.125)
assert cut_vp.num_samples == cut.num_samples
assert cut_vp.duration == cut.duration
np.testing.assert_array_almost_equal(cut_vp.load_audio(), cut.load_audio())
def test_padding_cut_reverb_rir(rir):
cut = PaddingCut(
id="cut",
duration=5.75,
sampling_rate=16000,
feat_value=1e-10,
num_samples=92000,
)
cut_rvb = cut.reverb_rir(rir_recording=rir)
assert cut_rvb.num_samples == cut.num_samples
assert cut_rvb.duration == cut.duration
np.testing.assert_array_almost_equal(cut_rvb.load_audio(), cut.load_audio())
def test_cut_set_perturb_speed(cut_with_supervision, cut_with_supervision_start01):
cut_set = CutSet.from_cuts([cut_with_supervision, cut_with_supervision_start01])
cs_sp = cut_set.perturb_speed(1.1)
for cut_sp, cut in zip(cs_sp, cut_set):
samples = cut_sp.load_audio()
assert samples.shape[1] == cut_sp.num_samples
assert samples.shape[1] < cut.num_samples
@pytest.fixture()
def cut_set(cut_with_supervision, cut_with_supervision_start01):
return CutSet.from_cuts([cut_with_supervision, cut_with_supervision_start01])
@pytest.fixture()
def libri_cut_set(libri_cut_with_supervision):
cut1 = libri_cut_with_supervision
cut2 = fastcopy(cut1, id="libri_cut_2")
return CutSet.from_cuts([cut1, cut2])
@pytest.mark.parametrize("cut_id", ["cut", "cut_start01"])
def test_resample_cut(cut_set, cut_id):
original = cut_set[cut_id]
resampled = original.resample(16000)
assert original.sampling_rate == 8000
assert resampled.sampling_rate == 16000
assert resampled.num_samples == 2 * original.num_samples
samples = resampled.load_audio()
assert samples.shape[1] == resampled.num_samples
@pytest.mark.parametrize("cut_id", ["cut", "cut_start01"])
@pytest.mark.parametrize("scale", [0.125, 2.0])
def test_cut_perturb_volume(cut_set, cut_id, scale):
cut = cut_set[cut_id]
cut_vp = cut.perturb_volume(scale)
assert cut_vp.start == cut.start
assert cut_vp.duration == cut.duration
assert cut_vp.end == cut.end
assert cut_vp.num_samples == cut.num_samples
assert cut_vp.recording.duration == cut.recording.duration
assert cut_vp.recording.num_samples == cut.recording.num_samples
assert cut_vp.supervisions[0].start == cut.supervisions[0].start
assert cut_vp.supervisions[0].duration == cut.supervisions[0].duration
assert cut_vp.supervisions[0].end == cut.supervisions[0].end
assert cut_vp.load_audio().shape == cut.load_audio().shape
assert cut_vp.recording.load_audio().shape == cut.recording.load_audio().shape
np.testing.assert_array_almost_equal(cut_vp.load_audio(), cut.load_audio() * scale)
np.testing.assert_array_almost_equal(
cut_vp.recording.load_audio(), cut.recording.load_audio() * scale
)
def test_cut_reverb_rir(libri_cut_with_supervision, libri_recording_rvb, rir):
cut = libri_cut_with_supervision
cut_rvb = cut.reverb_rir(rir)
assert cut_rvb.start == cut.start
assert cut_rvb.duration == cut.duration
assert cut_rvb.end == cut.end
assert cut_rvb.num_samples == cut.num_samples
assert cut_rvb.recording.duration == cut.recording.duration
assert cut_rvb.recording.num_samples == cut.recording.num_samples
assert cut_rvb.supervisions[0].start == cut.supervisions[0].start
assert cut_rvb.supervisions[0].duration == cut.supervisions[0].duration
assert cut_rvb.supervisions[0].end == cut.supervisions[0].end
assert cut_rvb.load_audio().shape == cut.load_audio().shape
assert cut_rvb.recording.load_audio().shape == cut.recording.load_audio().shape
rvb_audio_from_fixture = libri_recording_rvb.load_audio()
np.testing.assert_array_almost_equal(cut_rvb.load_audio(), rvb_audio_from_fixture)
@pytest.mark.parametrize(
"rir_channels, expected_type, expected_num_tracks",
[
([0], "MonoCut", 1),
([1], "MonoCut", 1),
([0, 1], "MixedCut", 2),
],
)
def test_cut_reverb_multi_channel_rir(
libri_cut_with_supervision,
multi_channel_rir,
rir_channels,
expected_type,
expected_num_tracks,
):
cut = libri_cut_with_supervision
cut_rvb = cut.reverb_rir(multi_channel_rir, rir_channels=rir_channels)
assert cut_rvb.to_dict()["type"] == expected_type
if expected_type == "MixedCut":
assert len(cut_rvb.tracks) == expected_num_tracks
for track in cut_rvb.tracks:
assert track.cut.start == cut.start
assert track.cut.duration == cut.duration
assert track.cut.end == cut.end
assert track.cut.num_samples == cut.num_samples
assert np.vstack(cut_rvb.load_audio(mixed=False)).shape == (
expected_num_tracks,
cut.num_samples,
)
else:
assert cut_rvb.load_audio().shape == (expected_num_tracks, cut.num_samples)
def test_padding_cut_resample():
original = PaddingCut(
id="cut",
duration=5.75,
sampling_rate=16000,
feat_value=1e-10,
num_samples=92000,
)
resampled = original.resample(8000)
assert resampled.sampling_rate == 8000
assert resampled.num_samples == original.num_samples / 2
samples = resampled.load_audio()
assert samples.shape[1] == resampled.num_samples
def test_mixed_cut_resample(cut_with_supervision_start01):
original = cut_with_supervision_start01.append(cut_with_supervision_start01)
resampled = original.resample(16000)
assert original.sampling_rate == 8000
assert resampled.sampling_rate == 16000
assert resampled.num_samples == 2 * original.num_samples
samples = resampled.load_audio()
assert samples.shape[1] == resampled.num_samples
@pytest.mark.parametrize("affix_id", [True, False])
def test_cut_set_resample(cut_set, affix_id):
resampled_cs = cut_set.resample(16000, affix_id=affix_id)
for original, resampled in zip(cut_set, resampled_cs):
if affix_id:
assert original.id != resampled.id
assert resampled.id.endswith("_rs16000")
else:
assert original.id == resampled.id
assert original.sampling_rate == 8000
assert resampled.sampling_rate == 16000
assert resampled.num_samples == 2 * original.num_samples
samples = resampled.load_audio()
assert samples.shape[1] == resampled.num_samples
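The resampling assertions above rely on the sample count scaling with the sampling-rate ratio. A minimal sketch of that arithmetic, independent of lhotse (the helper name is mine):

```python
def resampled_num_samples(num_samples: int, old_sr: int, new_sr: int) -> int:
    """Number of samples after resampling from old_sr to new_sr."""
    return int(round(num_samples * new_sr / old_sr))

# Doubling the rate doubles the count; halving it halves the count.
doubled = resampled_num_samples(4444, 8000, 16000)   # 8888
halved = resampled_num_samples(92000, 16000, 8000)   # 46000
```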
@pytest.mark.parametrize("scale", [0.125, 2.0])
@pytest.mark.parametrize("affix_id", [True, False])
def test_cut_set_perturb_volume(cut_set, affix_id, scale):
perturbed_vp_cs = cut_set.perturb_volume(scale, affix_id=affix_id)
for original, perturbed_vp in zip(cut_set, perturbed_vp_cs):
if affix_id:
assert original.id != perturbed_vp.id
assert perturbed_vp.id.endswith(f"_vp{scale}")
else:
assert original.id == perturbed_vp.id
assert original.sampling_rate == perturbed_vp.sampling_rate
assert original.num_samples == perturbed_vp.num_samples
assert original.load_audio().shape == perturbed_vp.load_audio().shape
np.testing.assert_array_almost_equal(
perturbed_vp.load_audio(), original.load_audio() * scale
)
@pytest.mark.parametrize("affix_id", [True, False])
def test_cut_set_reverb_rir(libri_cut_set, rir, affix_id):
rirs = RecordingSet.from_recordings([rir])
perturbed_rvb_cs = libri_cut_set.reverb_rir(rirs, affix_id=affix_id)
for original, perturbed_rvb in zip(libri_cut_set, perturbed_rvb_cs):
if affix_id:
assert original.id != perturbed_rvb.id
            assert perturbed_rvb.id.endswith("_rvb")
else:
assert original.id == perturbed_rvb.id
assert original.sampling_rate == perturbed_rvb.sampling_rate
assert original.num_samples == perturbed_rvb.num_samples
assert original.load_audio().shape == perturbed_rvb.load_audio().shape
| 34.359133 | 97 | 0.708326 | 3,043 | 22,196 | 4.859021 | 0.058166 | 0.060868 | 0.109563 | 0.098066 | 0.841945 | 0.776613 | 0.695117 | 0.619978 | 0.588327 | 0.505952 | 0 | 0.051645 | 0.192197 | 22,196 | 645 | 98 | 34.412403 | 0.773006 | 0.028338 | 0 | 0.421359 | 0 | 0 | 0.023012 | 0.008351 | 0 | 0 | 0 | 0 | 0.376699 | 1 | 0.073786 | false | 0 | 0.01165 | 0.019417 | 0.106796 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
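The volume-perturbation tests above check that the loaded audio equals the original samples times the scale factor. That invariant, sketched without numpy or lhotse using a nested list in place of an audio array:

```python
def perturb_volume(samples, scale):
    """Scale every sample in a (channels x time) nested list by a constant factor."""
    return [[s * scale for s in channel] for channel in samples]

audio = [[0.5, -0.25, 1.0]]            # one channel, three samples
quieter = perturb_volume(audio, 0.125)  # scale values used in the tests above
louder = perturb_volume(audio, 2.0)
```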
5d5e2faa22593ed62da301da0114b2b99e99f783 | 2,073 | py | Python | code/1/utest.py | pwang13/AutomatedSE_Coursework | b416672d9756fcc60367143b989d29b0c905cfc3 | [
"Unlicense"
] | null | null | null | code/1/utest.py | pwang13/AutomatedSE_Coursework | b416672d9756fcc60367143b989d29b0c905cfc3 | [
"Unlicense"
] | null | null | null | code/1/utest.py | pwang13/AutomatedSE_Coursework | b416672d9756fcc60367143b989d29b0c905cfc3 | [
"Unlicense"
] | null | null | null | #!/usr/bin/python
"""
utest.py (c) 2016 tim@menzies.us, MIT licence
Part of http://tiny.cc/ase16: teaching tools for
(model-based) automated software enginering.
USAGE:
(1) If you place '@ok' before a function, then
load that file, then that function will execute and
all assertion failures will add one to a FAIL
count.
(2) To get the final counts, add 'oks()' at the end
of the source code.
For more on this kind of tool, see
https://www.youtube.com/watch?v=nIonZ6-4nuU
"""
from __future__ import division,print_function
import sys,re,traceback,random,string
sys.dont_write_bytecode=True
PASS=FAIL=0
VERBOSE=True
def oks():
global PASS, FAIL
print("\n# PASS= %s FAIL= %s %%PASS = %s%%" % (
PASS, FAIL, int(round(PASS*100/(PASS+FAIL+0.001)))))
def ok(f):
global PASS, FAIL
try:
print("\n-----| %s |-----------------------" % f.__name__)
if f.__doc__:
print("# "+ re.sub(r'\n[ \t]*',"\n# ",f.__doc__))
f()
print("# pass")
PASS += 1
  except Exception:
FAIL += 1
print(traceback.format_exc())
return f
#################################################
def same(x):
return x
def any(lst):
return random.choice(lst)
def any3(lst,a=None,b=None,c=None,it = same,retries=10):
assert retries > 0
a = a or any(lst)
b = b or any(lst)
if it(a) == it(b):
return any3(lst,a=a,b=None,it=it,retries=retries - 1)
c = any(lst)
if it(a) == it(c) or it(b) == it(c):
return any3(lst,a=a,b=b,it=it,retries=retries - 1)
return a,b,c
@ok
def _ok1():
"Can at least one test fail?"
assert 1==2, "equality failure"
@ok
def _ok2():
"Can at least one test pass?"
assert 1==1, "equality failure"
@ok
def _any3():
"""There are 2600 three letter alphanet combinations.
So if we pick just 10, there should be no repeats."""
random.seed(1)
lst=list(string.ascii_lowercase) # abcdefghijklmnopqrstuvwxyz
seen = {}
  for x in sorted([''.join(any3(lst)) for _ in range(10)]):
seen[x] = seen.get(x,0) + 1
for k,v in seen.items():
assert v < 2
print("")
oks()
| 24.104651 | 64 | 0.611674 | 343 | 2,073 | 3.623907 | 0.463557 | 0.03218 | 0.019308 | 0.01609 | 0.104586 | 0.046661 | 0 | 0 | 0 | 0 | 0 | 0.028898 | 0.198746 | 2,073 | 85 | 65 | 24.388235 | 0.719446 | 0.020743 | 0 | 0.090909 | 0 | 0 | 0.125621 | 0.017033 | 0 | 0 | 0 | 0 | 0.072727 | 0 | null | null | 0.145455 | 0.036364 | null | null | 0.127273 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
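utest.py targets Python 2. The same `@ok` pattern in Python 3, reduced to its core (module-level counters plus a decorator that runs each test the moment it is defined); a sketch, not a drop-in replacement:

```python
import traceback

PASS = FAIL = 0

def ok(f):
    """Run f as soon as it is defined; tally assertion failures instead of crashing."""
    global PASS, FAIL
    try:
        f()
        PASS += 1
    except Exception:
        FAIL += 1
        traceback.format_exc()  # the original prints this trace
    return f

@ok
def _passes():
    assert 1 == 1

@ok
def _fails():
    assert 1 == 2, "expected failure"

# Same percentage formula as oks(), including the 0.001 guard against division by zero.
percent_pass = int(round(PASS * 100 / (PASS + FAIL + 0.001)))
```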
5d600ab0ab4fad14d0e1149d8eceecd953ba4d0b | 7,796 | py | Python | cake_uni_transaction_bot-main/txns.py | hongquantq92/Pancakeswap-and-uniswap-trading-bot | 6df841e044589b2b839858dc8339145c4a8c12ee | [
"BSD-2-Clause"
] | null | null | null | cake_uni_transaction_bot-main/txns.py | hongquantq92/Pancakeswap-and-uniswap-trading-bot | 6df841e044589b2b839858dc8339145c4a8c12ee | [
"BSD-2-Clause"
] | null | null | null | cake_uni_transaction_bot-main/txns.py | hongquantq92/Pancakeswap-and-uniswap-trading-bot | 6df841e044589b2b839858dc8339145c4a8c12ee | [
"BSD-2-Clause"
] | null | null | null | from web3 import Web3, IPCProvider
from web3.middleware import geth_poa_middleware
import json
import time
import keys
import sys
class Txn_bot(object):
def __init__(self, token_address, quantity, net, slippage, gas_price):
self.net = net
self.w3 = self.connect()
print("Access to Infura node: {}".format((self.w3.isConnected())))
self.address, self.private_key = self.set_address()
print("Address: {}".format(self.address))
print("Current balance of WETH/WBNB: {}".format(self.w3.fromWei(self.w3.eth.get_balance(self.address), 'ether')))
self.token_address = Web3.toChecksumAddress(token_address)
self.token_contract = self.set_token_contract()
print("Current balance of {}: {}".format(self.token_contract.functions.symbol().call() ,self.token_contract.functions.balanceOf(self.address).call() / (10 ** self.token_contract.functions.decimals().call())))
self.router_address, self.router = self.set_router()
self.quantity = quantity
self.slippage = 1 - (slippage/100)
self.gas_price = gas_price
# def __init__(self, token_address, net):
# self.net = net
# self.w3 = self.connect()
# print("Access to Infura node: {}".format((self.w3.isConnected())))
# self.address, self.private_key = self.set_address()
# print("Address: {}".format(self.address))
# print("Current balance of WETH/WBNB: {}".format(self.w3.fromWei(self.w3.eth.get_balance(self.address), 'ether')))
# self.token_address = Web3.toChecksumAddress(token_address)
# self.token_contract = self.set_token_contract()
# print("Current balance of {}: {}".format(self.token_contract.functions.symbol().call() ,self.token_contract.functions.balanceOf(self.address).call() / (10 ** self.token_contract.functions.decimals().call())))
# self.router_address, self.router = self.set_router()
def connect(self):
if self.net=="eth-mainnet":
w3 = Web3(Web3.HTTPProvider("https://mainnet.infura.io/v3/{}".format(keys.infura_project_id)))
w3.middleware_onion.inject(geth_poa_middleware, layer=0)
elif self.net=="eth-rinkeby":
w3 = Web3(Web3.HTTPProvider("https://rinkeby.infura.io/v3/{}".format(keys.infura_project_id)))
w3.middleware_onion.inject(geth_poa_middleware, layer=0)
elif self.net=="bsc-mainnet":
w3 = Web3(Web3.HTTPProvider("https://bsc-dataseed.binance.org/"))
            # TODO: Add bsc-testnet. Cake testing problems.
else:
print("Not a valid network...\nSupported networks: eth-mainnet, eth-rinkeby, bsc-mainnet")
sys.exit()
return w3
def set_address(self):
return(keys.metamask_address, keys.metamask_private_key)
def set_router(self): #TODO: Refactor functions into shorter ones?
if "eth" in self.net:
router_address = Web3.toChecksumAddress("0x7a250d5630B4cF539739dF2C5dAcb4c659F2488D")
with open("./abis/IUniswapV2Router02.json") as f:
contract_abi = json.load(f)['abi']
router = self.w3.eth.contract(address=router_address, abi=contract_abi)
else:
router_address = Web3.toChecksumAddress("0x05fF2B0DB69458A0750badebc4f9e13aDd608C7F") # mainnet router
with open("./abis/pancakeRouter.json") as f:
contract_abi = json.load(f)['abi']
router = self.w3.eth.contract(address=router_address, abi=contract_abi)
return (router_address, router)
def set_token_contract(self): #TODO: Refactor functions into shorter ones?
if "eth" in self.net:
token_address = Web3.toChecksumAddress(self.token_address)
with open("./abis/erc20_abi.json") as f:
contract_abi = json.load(f)
token_contract = self.w3.eth.contract(address=token_address, abi=contract_abi)
else:
token_address = Web3.toChecksumAddress(self.token_address)
with open("./abis/bep20_abi_token.json") as f:
contract_abi = json.load(f)
token_contract = self.w3.eth.contract(address=token_address, abi=contract_abi)
return token_contract
def get_amounts_out_buy(self):
print(self.router.functions.WETH().call(), self.token_address)
return self.router.functions.getAmountsOut(
int(self.quantity * self.slippage),
[self.router.functions.WETH().call(), self.token_address]
).call()
def get_amounts_out_sell(self):
return self.router.functions.getAmountsOut(
self.token_contract.functions.balanceOf(self.address).call(),
[self.token_address, self.router.functions.WETH().call()]
).call()
def approve(self):
txn = self.token_contract.functions.approve(
self.router_address,
2**256 - 1
).buildTransaction(
{'from': self.address,
'gas': 250000,
'gasPrice': self.gas_price,
'nonce': self.w3.eth.getTransactionCount(self.address),
'value': 0}
)
signed_txn = self.w3.eth.account.sign_transaction(
txn,
self.private_key
)
txn = self.w3.eth.sendRawTransaction(signed_txn.rawTransaction)
print(txn.hex())
txn_receipt = self.w3.eth.waitForTransactionReceipt(txn)
print(txn_receipt)
def buy_token(self):
txn = self.router.functions.swapExactETHForTokens(
self.get_amounts_out_buy()[-1],
[self.router.functions.WETH().call(), self.token_address],
bytes.fromhex(self.address[2:]),
int(time.time()) + 10 * 60 # 10 min limit
).buildTransaction(
{'from': self.address,
'gas': 250000,
'gasPrice': self.gas_price,
'nonce': self.w3.eth.getTransactionCount(self.address),
'value': self.quantity}
)
signed_txn = self.w3.eth.account.sign_transaction(
txn,
self.private_key
)
txn = self.w3.eth.sendRawTransaction(signed_txn.rawTransaction)
print(txn.hex())
txn_receipt = self.w3.eth.waitForTransactionReceipt(txn)
print(txn_receipt)
def sell_token(self):
txn = self.router.functions.swapExactTokensForETH(
self.token_contract.functions.balanceOf(self.address).call(),
int(self.get_amounts_out_sell()[-1] * self.slippage),
[self.token_address, self.router.functions.WETH().call()],
bytes.fromhex(self.address[2:]),
int(time.time()) + 10 * 60 # 10 min limit
).buildTransaction(
{'from': self.address,
'gas': 250000,
'gasPrice': self.gas_price,
'nonce': self.w3.eth.getTransactionCount(self.address),
'value': 0}
)
signed_txn = self.w3.eth.account.sign_transaction(
txn,
self.private_key
)
txn = self.w3.eth.sendRawTransaction(signed_txn.rawTransaction)
print(txn.hex())
txn_receipt = self.w3.eth.waitForTransactionReceipt(txn)
print(txn_receipt)
def check_price_busd_usdc(self):
if (self.net == "eth-mainnet"):
return self.router.functions.getAmountsOut(
int(1*10**18),
[self.token_address, "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48"]
).call()[1]
elif (self.net == "bsc-mainnet"):
return self.router.functions.getAmountsOut(
int(1*10**18),
[self.token_address, "0xe9e7CEA3DedcA5984780Bafc599bD69ADd087D56"]
).call()[1]
| 44.548571 | 218 | 0.621986 | 881 | 7,796 | 5.353008 | 0.165721 | 0.045802 | 0.034351 | 0.049618 | 0.755513 | 0.724131 | 0.681086 | 0.681086 | 0.614292 | 0.614292 | 0 | 0.036264 | 0.250128 | 7,796 | 174 | 219 | 44.804598 | 0.770441 | 0.119805 | 0 | 0.493056 | 0 | 0 | 0.10038 | 0.042665 | 0 | 0 | 0.024547 | 0.005747 | 0 | 1 | 0.076389 | false | 0 | 0.041667 | 0.013889 | 0.173611 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
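Txn_bot stores slippage as a multiplier (`1 - slippage/100`) and floors swap amounts with `int(...)`. That calculation in isolation, with a function name of my own choosing:

```python
def min_amount_out(expected_out: int, slippage_pct: float) -> int:
    """Smallest swap output the caller will accept (the router's amountOutMin)."""
    multiplier = 1 - slippage_pct / 100    # e.g. 2% tolerance -> 0.98
    return int(expected_out * multiplier)  # int() truncates, as in the bot

no_tolerance = min_amount_out(1_000_000, 0)  # exactly 1_000_000
two_percent = min_amount_out(1_000_000, 2)   # about 980_000; float truncation may shave a unit
```

Note that because `int()` truncates rather than rounds, binary floating-point error can lower the floor by one token unit; for on-chain amounts that is harmless, but integer arithmetic (`expected_out * (100 - pct) // 100`) avoids it entirely.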
5d70b05b6883f56fd7f6f73ea4a08e55ff2b89d3 | 493 | py | Python | zerogercrnn/lib/health.py | zerogerc/rnn-autocomplete | 39dc8dd7c431cb8ac9e15016388ec823771388e4 | [
"Apache-2.0"
] | 7 | 2019-02-27T09:48:39.000Z | 2021-11-30T19:01:01.000Z | zerogercrnn/lib/health.py | ZeRoGerc/rnn-autocomplete | 39dc8dd7c431cb8ac9e15016388ec823771388e4 | [
"Apache-2.0"
] | null | null | null | zerogercrnn/lib/health.py | ZeRoGerc/rnn-autocomplete | 39dc8dd7c431cb8ac9e15016388ec823771388e4 | [
"Apache-2.0"
] | null | null | null | from abc import abstractmethod
class HealthCheck:
"""Class that do some check on the model. Usually it prints some info about model at the end of epoch."""
@abstractmethod
def do_check(self):
pass
class AlphaBetaSumHealthCheck(HealthCheck):
def __init__(self, module):
super().__init__()
self.module = module
def do_check(self):
print('Alpha: {}'.format(self.module.mult_alpha))
print('Beta: {}'.format(self.module.mult_beta)) | 24.65 | 109 | 0.665314 | 62 | 493 | 5.096774 | 0.548387 | 0.126582 | 0.063291 | 0.088608 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225152 | 493 | 20 | 110 | 24.65 | 0.827225 | 0.200811 | 0 | 0.166667 | 0 | 0 | 0.043702 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.083333 | 0.083333 | 0 | 0.5 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
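A self-contained sketch of driving these checks at the end of an epoch, re-declaring minimal versions of the classes above and standing in a `SimpleNamespace` for a real model module (the `mult_alpha`/`mult_beta` values are made up):

```python
from abc import abstractmethod
from types import SimpleNamespace

class HealthCheck:
    @abstractmethod
    def do_check(self):
        pass

class AlphaBetaSumHealthCheck(HealthCheck):
    def __init__(self, module):
        super().__init__()
        self.module = module

    def do_check(self):
        # Report the two mixing coefficients held by the module.
        print('Alpha: {}'.format(self.module.mult_alpha))
        print('Beta: {}'.format(self.module.mult_beta))

# Stand-in for a model module exposing the two multipliers.
model = SimpleNamespace(mult_alpha=0.75, mult_beta=0.25)
checks = [AlphaBetaSumHealthCheck(model)]

for check in checks:  # typically invoked once at the end of each epoch
    check.do_check()

alpha_beta_sum = model.mult_alpha + model.mult_beta
```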
5d70c121f29466da9e905f0428da0c90082215a1 | 4,058 | py | Python | startup/users/30-user-Luo.py | NSLS-II-SMI/profile_collection | c1e2236a7520f605ac85e7591f05682add06357c | [
"BSD-3-Clause"
] | null | null | null | startup/users/30-user-Luo.py | NSLS-II-SMI/profile_collection | c1e2236a7520f605ac85e7591f05682add06357c | [
"BSD-3-Clause"
] | 13 | 2018-09-25T19:35:08.000Z | 2021-01-15T20:42:26.000Z | startup/users/30-user-Luo.py | NSLS-II-SMI/profile_collection | c1e2236a7520f605ac85e7591f05682add06357c | [
"BSD-3-Clause"
] | 3 | 2019-09-06T01:40:59.000Z | 2020-07-01T20:27:39.000Z | def mapping_Luo(t=1):
names = [ 'Yao_6_2', 'AB_TMA_2', 'ABA_TMA_2',
'ABAB_TMA_2', 'ABABA_TMA_2', 'ABABA_IPrMeP_2']
xlocs = [ 21400, 5500, -7200,
-11700, -17200, -30700]
ylocs = [ 0, 0, 0,
0, 0, 0]
zlocs = [ 2700, 2700, 2700,
2700, 2700, 2700]
x_range=[[0, 500, 21], [0, 500, 21], [0, 500, 21],
[0, 500, 21], [0, 500, 21], [0, 500, 21]]
y_range=[[0, 500, 101],[0, 500, 101],[0, 500, 101],
[0, 500, 101],[0, 500, 101],[0, 500, 101]]
wa_range=[ [0, 13, 3], [0, 13, 3], [0, 13, 3],
[0, 13, 3], [0, 13, 3], [0, 13, 3]]
# names = ['CRP_2_275', 'CRP_1_131', 'Yao_6', 'CRP_2_275F', 'CRP_1_275A', 'AB_TMA', 'iPrMeP_stat', 'ABA_TMA', 'ABAB_TMA', 'ABABA_TMA',
# 'TMA_stat', 'ABABA_IPrMeP' ]
# xlocs = [30400, 26100, 21400, 16200, 9600, 5500, -500, -7200,
# -11700, -17200, -23700, -30700]
# ylocs = [0, 0, 0, 0, 0, 0, 0, 0,
# 0, 0, 0, 0]
# zlocs = [2700, 2700, 2700, 2700, 2700, 2700, 2700, 2700,
# 2700, 2700, 2700, 2700]
# x_range=[[0, 500, 11], [0, 500, 11], [0, 500, 11], [0, 500, 11], [0, 500, 11], [0, 500, 11], [0, 500, 11], [0, 500, 11],
# [0, 500, 11], [0, 500, 11], [0, 500, 11], [0, 500, 11]]
# y_range=[[0, 500, 101],[0, 500, 101],[0, 500, 101],[0, 500, 101],[0, 500, 101],[0, 500, 101],[0, 500, 101],[0, 500, 101],
# [0, 500, 101],[0, 500, 101],[0, 500, 101],[0, 500, 101]]
# wa_range=[[0, 26, 5], [0, 13, 3], [0, 26, 5], [0, 26, 5], [0, 26, 5], [0, 13, 3], [0, 13, 3], [0, 13, 3],
# [0, 13, 3], [0, 13, 3], [0, 13, 3], [0, 13, 3]]
user = 'AL'
det_exposure_time(t,t)
    assert len(xlocs) == len(names), f'Number of X coordinates ({len(xlocs)}) is different from number of samples ({len(names)})'
    assert len(xlocs) == len(ylocs), f'Number of X coordinates ({len(xlocs)}) is different from number of Y coordinates ({len(ylocs)})'
    assert len(xlocs) == len(zlocs), f'Number of X coordinates ({len(xlocs)}) is different from number of Z coordinates ({len(zlocs)})'
    assert len(xlocs) == len(x_range), f'Number of X coordinates ({len(xlocs)}) is different from number of X ranges ({len(x_range)})'
    assert len(xlocs) == len(y_range), f'Number of X coordinates ({len(xlocs)}) is different from number of Y ranges ({len(y_range)})'
    assert len(xlocs) == len(wa_range), f'Number of X coordinates ({len(xlocs)}) is different from number of WAXS ranges ({len(wa_range)})'
# Detectors, motors:
dets = [pil300KW, pil1M]
for num, (x, y, sample, x_r, y_r, wax_ra) in enumerate(zip(xlocs, ylocs, names, x_range, y_range, wa_range)):
if num == 0:
proposal_id('2121_1', '307948_Luo')
else:
proposal_id('2121_1', '307948_Luo2')
pil1M.cam.file_path.put('/nsls2/xf12id2/data/images/users/2021_1/307948_Luo2/1M/%s'%sample)
pil300KW.cam.file_path.put('/nsls2/xf12id2/data/images/users/2021_1/307948_Luo2/300KW/%s'%sample)
for wa in np.linspace(wax_ra[0], wax_ra[1], wax_ra[2]):
yield from bps.mv(waxs, wa)
yield from bps.mv(piezo.x, x)
yield from bps.mv(piezo.y, y+500)
name_fmt = '{sam}_4m_16.1keV_wa{waxs}'
sample_name = name_fmt.format(sam=sample, waxs='%2.1f'%wa)
sample_id(user_name=user, sample_name=sample_name)
print(f'\n\t=== Sample: {sample_name} ===\n')
yield from bp.rel_grid_scan(dets, piezo.y, *y_r, piezo.x, *x_r, 0) #1 = snake, 0 = not-snake
sample_id(user_name='test', sample_name='test')
det_exposure_time(0.3,0.3)
| 56.361111 | 138 | 0.4931 | 603 | 4,058 | 3.182421 | 0.202322 | 0.075039 | 0.065659 | 0.066701 | 0.607087 | 0.551329 | 0.551329 | 0.541428 | 0.531006 | 0.531006 | 0 | 0.215301 | 0.323558 | 4,058 | 71 | 139 | 57.15493 | 0.483789 | 0.304337 | 0 | 0 | 0 | 0 | 0.294013 | 0.050606 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.02381 | false | 0 | 0 | 0 | 0.02381 | 0.02381 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
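Each `wa_range`/`x_range`/`y_range` entry above is a `[start, stop, num]` triple consumed by `np.linspace` or the grid scan. The sweep values such a triple produces, sketched with a pure-Python linspace so the example runs without numpy:

```python
def linspace(start, stop, num):
    """Evenly spaced values from start to stop, inclusive, like np.linspace."""
    if num == 1:
        return [float(start)]
    step = (stop - start) / (num - 1)
    return [start + i * step for i in range(num)]

# wa_range entry [0, 13, 3] -> three WAXS detector angles.
waxs_angles = linspace(0, 13, 3)
# x_range entry [0, 500, 21] -> 21 motor positions, 25 units apart.
x_positions = linspace(0, 500, 21)
```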
5d71bda5c90bf70e50fd732656d7765b39f4ad2d | 635 | py | Python | demos/register-login/controllers/handlers.py | karldoenitz/karlooper | 2e1df83ed1ec9b343cdd930162a4de7ecd149c04 | [
"MIT"
] | 161 | 2016-05-17T12:44:07.000Z | 2020-07-30T02:18:34.000Z | demos/register-login/controllers/handlers.py | karldoenitz/karlooper | 2e1df83ed1ec9b343cdd930162a4de7ecd149c04 | [
"MIT"
] | 6 | 2016-08-29T01:40:26.000Z | 2017-12-29T09:20:41.000Z | demos/register-login/controllers/handlers.py | karldoenitz/karlooper | 2e1df83ed1ec9b343cdd930162a4de7ecd149c04 | [
"MIT"
] | 16 | 2016-06-27T02:56:54.000Z | 2019-08-08T08:18:48.000Z | # -*-encoding:utf-8-*-
from base import is_login
from karlooper.web.request import Request
class Login(Request):
def get(self):
return self.render("/register-login.html", button="Login", title="LOGIN")
class Register(Request):
def get(self):
return self.render("/register-login.html", button="SignUp", title="REGISTER")
class MainPage(Request):
@is_login
def get(self):
return self.http_response(
"<html>"
"<head>"
"<title>Main Page</title>"
"</head>"
"<body><h1>Login Successfully!</h1></body>"
"</html>"
)
| 22.678571 | 85 | 0.571654 | 72 | 635 | 5 | 0.430556 | 0.05 | 0.083333 | 0.133333 | 0.366667 | 0.311111 | 0.311111 | 0.311111 | 0.311111 | 0.311111 | 0 | 0.006438 | 0.266142 | 635 | 27 | 86 | 23.518519 | 0.766094 | 0.031496 | 0 | 0.157895 | 0 | 0 | 0.252855 | 0.040783 | 0 | 0 | 0 | 0 | 0 | 1 | 0.157895 | false | 0 | 0.105263 | 0.157895 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
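The `@is_login` guard imported above is a classic handler decorator: check the auth state, short-circuit if it is missing, otherwise call through to the handler. A framework-free sketch with hypothetical names (karlooper's real `is_login` lives in the `base` module and issues an HTTP redirect):

```python
def is_login(handler):
    """Only run the wrapped handler when the request carries a logged-in user."""
    def wrapper(self):
        if not getattr(self, "current_user", None):
            return "redirect:/login"  # stand-in for a real redirect response
        return handler(self)
    return wrapper

class FakeRequest:
    """Tiny stand-in for the framework's Request object."""
    def __init__(self, user=None):
        self.current_user = user

    @is_login
    def get(self):
        return "<h1>Login Successfully!</h1>"
```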
53909611650c8de35a38ffffdd6fe32ea9d62177 | 1,049 | py | Python | Task_6_for_250Movies.py | jyoti140220/python_websraping | 454d8dda66b99f5a209500d2a676855c98e8a92d | [
"MIT"
] | null | null | null | Task_6_for_250Movies.py | jyoti140220/python_websraping | 454d8dda66b99f5a209500d2a676855c98e8a92d | [
"MIT"
] | null | null | null | Task_6_for_250Movies.py | jyoti140220/python_websraping | 454d8dda66b99f5a209500d2a676855c98e8a92d | [
"MIT"
] | null | null | null | from _250_movie_detials import movie_details
from pprint import pprint


def analyse_movies_language():
    """Return a dict mapping each language to the number of movies that list it."""
    # Collect the unique languages that appear across all movies.
    unique_languages = []
    for movie in movie_details:
        for language in movie['language']:
            if language not in unique_languages:
                unique_languages.append(language)
    # Count, for each language, how many movies include it.
    dic = {}
    for language in unique_languages:
        dic[language] = sum(
            1 for movie in movie_details if language in movie['language']
        )
    return dic
pprint(analyse_movies_language()) | 27.605263 | 67 | 0.64061 | 145 | 1,049 | 4.358621 | 0.206897 | 0.265823 | 0.265823 | 0.10443 | 0.094937 | 0.094937 | 0 | 0 | 0 | 0 | 0 | 0.020725 | 0.264061 | 1,049 | 38 | 68 | 27.605263 | 0.797927 | 0.074357 | 0 | 0.25 | 0 | 0 | 0.01658 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03125 | false | 0 | 0.0625 | 0 | 0.125 | 0.0625 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
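The collect-then-count passes above collapse into a single `collections.Counter` pass. A sketch against a tiny stand-in for `movie_details` (the real list holds 250 scraped entries):

```python
from collections import Counter

movie_details = [  # stand-in data with the same shape as the scraped records
    {"language": ["English"]},
    {"language": ["English", "French"]},
    {"language": ["Hindi"]},
]

def count_languages(movies):
    """Tally how many movies list each language."""
    counter = Counter()
    for movie in movies:
        counter.update(movie["language"])
    return dict(counter)

counts = count_languages(movie_details)
```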
53b094d6fa76cf7d6fafafae149fa38a8fc8fbb7 | 598 | py | Python | src/third_party/dart/build/config/linux/sysroot_ld_path.py | rhencke/engine | 1016db292c4e73374a0a11536b18303c9522a224 | [
"BSD-3-Clause"
] | 21 | 2021-06-04T21:08:21.000Z | 2022-03-04T14:21:34.000Z | src/third_party/dart/build/config/linux/sysroot_ld_path.py | rhencke/engine | 1016db292c4e73374a0a11536b18303c9522a224 | [
"BSD-3-Clause"
] | 1 | 2021-01-21T14:45:59.000Z | 2021-01-21T14:45:59.000Z | src/third_party/dart/build/config/linux/sysroot_ld_path.py | rhencke/engine | 1016db292c4e73374a0a11536b18303c9522a224 | [
"BSD-3-Clause"
] | 9 | 2021-03-16T09:29:26.000Z | 2022-01-06T08:38:10.000Z | # Copyright (c) 2013 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
# This file takes two arguments, the relative location of the shell script that
# does the checking, and the name of the sysroot.
# TODO(brettw) the build/linux/sysroot_ld_path.sh script should be rewritten in
# Python in this file.
import subprocess
import sys
if len(sys.argv) != 3:
print "Need two arguments"
sys.exit(1)
result = subprocess.check_output([sys.argv[1], sys.argv[2]]).decode().strip()
print('"' + result + '"')
| 28.47619 | 79 | 0.727425 | 98 | 598 | 4.408163 | 0.663265 | 0.048611 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01626 | 0.177258 | 598 | 20 | 80 | 29.9 | 0.861789 | 0.64214 | 0 | 0 | 0 | 0 | 0.097087 | 0 | 0 | 0 | 0 | 0.05 | 0 | 0 | null | null | 0 | 0.285714 | null | null | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
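`subprocess.check_output` returns the child's stdout as bytes on Python 3 and raises `CalledProcessError` on a non-zero exit. A runnable miniature of the call above, substituting a trivial child process for the sysroot shell helper:

```python
import subprocess
import sys

# Same shape as check_output([sys.argv[1], sys.argv[2]]) in the script,
# but launching a harmless Python child instead of the shell helper.
out = subprocess.check_output([sys.executable, "-c", "print('ld_paths')"])
result = out.decode().strip()
quoted = '"' + result + '"'
```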
53b86ea0157037069b87eb5f32e7375fd41d7f22 | 255 | py | Python | scripts/list1.py | arifulhaqueuc/python-algorithm-excersice | 7f2c93be7f87cbe2a8527f4edbf6c45f01e0597e | [
"MIT"
] | null | null | null | scripts/list1.py | arifulhaqueuc/python-algorithm-excersice | 7f2c93be7f87cbe2a8527f4edbf6c45f01e0597e | [
"MIT"
] | 4 | 2018-05-16T23:06:49.000Z | 2018-10-26T22:47:52.000Z | scripts/list1.py | arifulhaqueuc/python-algorithm-excersice | 7f2c93be7f87cbe2a8527f4edbf6c45f01e0597e | [
"MIT"
] | null | null | null | ## input = 1,2,3,4
## output = ['1','2','3','4'], ('1','2','3','4')
def abc():
    values = input()
    print("----")
    print(values)
    print("----")
    x = values.split(",")
    print(x)
    y = tuple(x)
    print("===")
    print(y)


if __name__ == "__main__":
    abc()
| 12.142857 | 48 | 0.462745 | 37 | 255 | 2.972973 | 0.459459 | 0.054545 | 0.081818 | 0.109091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058252 | 0.192157 | 255 | 20 | 49 | 12.75 | 0.475728 | 0.239216 | 0 | 0.166667 | 0 | 0 | 0.10582 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.083333 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
53c04dfa6be8b727ccac6ec028d6d48549f5ed74 | 966 | py | Python | alphaml/utils/class_loader.py | dingdian110/alpha-ml | d6a7a8a8a3452a7e3362bf0ef32b9ac5fe215fde | [
"BSD-3-Clause"
] | 1 | 2021-09-06T20:21:15.000Z | 2021-09-06T20:21:15.000Z | alphaml/utils/class_loader.py | dingdian110/alpha-ml | d6a7a8a8a3452a7e3362bf0ef32b9ac5fe215fde | [
"BSD-3-Clause"
] | null | null | null | alphaml/utils/class_loader.py | dingdian110/alpha-ml | d6a7a8a8a3452a7e3362bf0ef32b9ac5fe215fde | [
"BSD-3-Clause"
] | null | null | null | import sys
import pkgutil
import inspect
import importlib
from collections import OrderedDict
def find_components(package, directory, base_class):
    components = OrderedDict()
    for module_loader, module_name, ispkg in pkgutil.iter_modules([directory]):
        full_module_name = "%s.%s" % (package, module_name)
        if full_module_name not in sys.modules and not ispkg:
            module = importlib.import_module(full_module_name)
            for member_name, obj in inspect.getmembers(module):
                if inspect.isclass(obj) and issubclass(obj, base_class) and \
                        obj != base_class:
                    # TODO test if the obj implements the interface
                    # Keep in mind that this only instantiates the ensemble_wrapper,
                    # but not the real target classifier
                    classifier = obj
                    components[module_name] = classifier
    return components | 38.64 | 84 | 0.637681 | 111 | 966 | 5.387387 | 0.459459 | 0.100334 | 0.070234 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.305383 | 966 | 25 | 85 | 38.64 | 0.891207 | 0.148033 | 0 | 0 | 0 | 0 | 0.006098 | 0 | 0 | 0 | 0 | 0.04 | 0 | 1 | 0.058824 | false | 0 | 0.352941 | 0 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
53cb95213f77be638f37d090c9c979dc93ecf3a6 | 15,357 | py | Python | RACE_for_thesis/src/nn_layers.py | l11x0m7/master_thesis_public | 0bf3b471e126f2e702be8de9d7af824e391f5a2f | [
"MIT"
] | 1 | 2019-09-05T12:35:54.000Z | 2019-09-05T12:35:54.000Z | RACE_for_thesis/src/nn_layers.py | l11x0m7/master_thesis_public | 0bf3b471e126f2e702be8de9d7af824e391f5a2f | [
"MIT"
] | null | null | null | RACE_for_thesis/src/nn_layers.py | l11x0m7/master_thesis_public | 0bf3b471e126f2e702be8de9d7af824e391f5a2f | [
"MIT"
] | null | null | null | import theano.tensor as T
import lasagne
import lasagne.layers as L
# my addition
class SelfAttention(L.MergeLayer):
    # incomings[0]: B * P * 2D
    # incomings[1]: B * P
    def __init__(self, incomings, num_units,
                 nonlinearity=lasagne.nonlinearities.tanh,
                 mask_input=None,
                 name='',
                 init=lasagne.init.Uniform(), **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(SelfAttention, self).__init__(incomings, **kwargs)
        self.nonlinearity = nonlinearity
        self.num_units = num_units
        self.W0 = self.add_param(init, (self.num_units, self.num_units), name='W0_sa_{}'.format(name))
        self.Wb = self.add_param(init, (self.num_units, ), name='Wb_sa_{}'.format(name))

    # inputs[0]: B * P * 2D
    # inputs[1]: B * P
    def get_output_for(self, inputs, **kwargs):
        B, P, D = inputs[0].shape
        # B * P
        alphas = T.dot(self.nonlinearity(T.dot(inputs[0], self.W0)), self.Wb)
        alphas = T.nnet.softmax(alphas) * inputs[1]
        alphas = alphas / (alphas.sum(axis=1, keepdims=True) + 1e-8) * inputs[1]
        att = T.sum(inputs[0] * alphas.dimshuffle(0, 1, 'x'), axis=1)
        return att

    def get_output_shape_for(self, input_shapes):
        # outputs: B * 2D
        return (input_shapes[0][0], input_shapes[0][2])

class SqueezeLayer(lasagne.layers.Layer):
    def get_output_for(self, input, **kwargs):
        return T.cast(input.sum(axis=-1) > 0, 'int32')

    def get_output_shape_for(self, input_shape):
        return input_shape[:-1]

class CombineLayer(L.MergeLayer):
    # inputs[0]: B * N
    # inputs[1]: B * N
    def get_output_for(self, inputs, **kwargs):
        return (inputs[0] + inputs[1]) / 2.

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]


class OptionGateLayer(L.MergeLayer):
    # inputs[0]: B * 4 * N
    # inputs[1]: B * 4
    def get_output_for(self, inputs, **kwargs):
        return inputs[0] * inputs[1].dimshuffle(0, 1, 'x')

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]

# -----------------------------origin------------------------------
class GatedAttentionLayerWithQueryAttention(L.MergeLayer):
    # inputs[0]: B * N * 2D
    # inputs[1]: B * Q * 2D
    # inputs[2]: B * Q (l_m_q)
    def get_output_for(self, inputs, **kwargs):
        M = T.batched_dot(inputs[0], inputs[1].dimshuffle((0, 2, 1)))  # B x N x Q
        B, N, Q = M.shape
        alphas = T.nnet.softmax(T.reshape(M, (B * N, Q)))
        alphas_r = T.reshape(alphas, (B, N, Q)) * inputs[2].dimshuffle(0, 'x', 1)  # B x N x Q
        alphas_r = alphas_r / alphas_r.sum(axis=2, keepdims=True)  # B x N x Q
        q_rep = T.batched_dot(alphas_r, inputs[1])  # B x N x 2D
        d_gated = inputs[0] * q_rep
        return d_gated

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]

class GateWithQuery(L.MergeLayer):
    # inputs[0]: B * N * 2D
    # inputs[1]: B * 2D
    def get_output_for(self, inputs, **kwargs):
        M = T.batched_dot(inputs[0], inputs[1])
        M = T.nnet.sigmoid(M)
        d_gated = inputs[0] * M.dimshuffle((0, 1, 'x'))
        return d_gated

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]


class QuerySliceLayer(L.MergeLayer):
    # inputs[0]: B * Q * 2D (q)
    # inputs[1]: B (q_var)
    def get_output_for(self, inputs, **kwargs):
        q_slice = inputs[0][T.arange(inputs[0].shape[0]), inputs[1] - 1, :]  # B x 2D
        return q_slice

    def get_output_shape_for(self, input_shapes):
        return (input_shapes[0][0], input_shapes[0][2])


class GatedAttentionLayer(L.MergeLayer):
    # inputs[0]: B * N * 2D
    # inputs[1]: N * 2D
    def get_output_for(self, inputs, **kwargs):
        return inputs[0] * inputs[1].dimshuffle(0, 'x', 1)

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]

class AttentionSumLayer(L.MergeLayer):
    # inputs[0]: batch * len * h (d)
    # inputs[1]: batch * h (q_slice)
    # inputs[2]: batch * len * num_cand (c_var)
    # inputs[3]: batch * len (m_c_var)
    def get_output_for(self, inputs, **kwargs):
        dq = T.batched_dot(inputs[0], inputs[1])  # B x len
        attention = T.nnet.softmax(dq) * inputs[3]  # B x len
        attention = attention / attention.sum(axis=1, keepdims=True)
        probs = T.batched_dot(attention, inputs[2])  # B x num_cand
        probs = probs / probs.sum(axis=1, keepdims=True)
        return probs

    def get_output_shape_for(self, input_shapes):
        return (input_shapes[2][0], input_shapes[2][2])

def stack_rnn(l_emb, l_mask, num_layers, num_units,
              grad_clipping=10, dropout_rate=0.,
              bidir=True,
              only_return_final=False,
              name='',
              rnn_layer=lasagne.layers.LSTMLayer):
    """
    Stack multiple RNN layers.
    """
    def _rnn(backwards=True, name=''):
        network = l_emb
        for layer in range(num_layers):
            if dropout_rate > 0:
                network = lasagne.layers.DropoutLayer(network, p=dropout_rate)
            c_only_return_final = only_return_final and (layer == num_layers - 1)
            network = rnn_layer(network, num_units,
                                grad_clipping=grad_clipping,
                                mask_input=l_mask,
                                only_return_final=c_only_return_final,
                                backwards=backwards,
                                name=name + '_layer' + str(layer + 1))
        return network

    network = _rnn(True, name)
    if bidir:
        network = lasagne.layers.ConcatLayer([network, _rnn(False, name + '_back')], axis=-1)
    return network

class AveragePoolingLayer(lasagne.layers.MergeLayer):
    """
    Average pooling.
    incoming: batch x len x h
    """
    def __init__(self, incoming, mask_input=None, **kwargs):
        incomings = [incoming]
        if mask_input is not None:
            incomings.append(mask_input)
        super(AveragePoolingLayer, self).__init__(incomings, **kwargs)
        if len(self.input_shapes[0]) != 3:
            raise ValueError('the shape of incoming must be a 3-element tuple')

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0][:-2] + input_shapes[0][-1:]

    def get_output_for(self, inputs, **kwargs):
        if len(inputs) == 1:
            # mask_input is None
            return T.mean(inputs[0], axis=1)
        else:
            # inputs[0]: batch x len x h
            # inputs[1] = mask_input: batch x len
            return (T.sum(inputs[0] * inputs[1].dimshuffle(0, 1, 'x'), axis=1) /
                    T.sum(inputs[1], axis=1).dimshuffle(0, 'x'))

class MLPAttentionLayer(lasagne.layers.MergeLayer):
    """
    An MLP attention layer.
    incomings[0]: batch x len x h
    incomings[1]: batch x h
    Reference: http://arxiv.org/abs/1506.03340
    """
    def __init__(self, incomings, num_units,
                 nonlinearity=lasagne.nonlinearities.tanh,
                 mask_input=None,
                 init=lasagne.init.Uniform(), **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(MLPAttentionLayer, self).__init__(incomings, **kwargs)
        self.nonlinearity = nonlinearity
        self.num_units = num_units
        self.W0 = self.add_param(init, (self.num_units, self.num_units), name='W0_mlp')
        self.W1 = self.add_param(init, (self.num_units, self.num_units), name='W1_mlp')
        self.Wb = self.add_param(init, (self.num_units, ), name='Wb_mlp')

    def get_output_shape_for(self, input_shapes):
        return input_shapes[1]

    def get_output_for(self, inputs, **kwargs):
        M = T.dot(inputs[0], self.W0) + T.dot(inputs[1], self.W1).dimshuffle(0, 'x', 1)
        M = self.nonlinearity(M)
        alpha = T.nnet.softmax(T.dot(M, self.Wb))
        if len(inputs) == 3:
            alpha = alpha * inputs[2]
            alpha = alpha / alpha.sum(axis=1).reshape((alpha.shape[0], 1))
        return T.sum(inputs[0] * alpha.dimshuffle(0, 1, 'x'), axis=1)

class LengthLayer(lasagne.layers.Layer):
    def get_output_for(self, input, **kwargs):
        return T.cast(input.sum(axis=-1), 'int32')

    def get_output_shape_for(self, input_shape):
        return input_shape[:-1]


class QuerySliceLayer(L.MergeLayer):
    # inputs[0]: B * Q * 2D (q)
    # inputs[1]: B (q_var)
    def get_output_for(self, inputs, **kwargs):
        q_slice = inputs[0][T.arange(inputs[0].shape[0]), inputs[1] - 1, :]  # B x 2D
        return q_slice

    def get_output_shape_for(self, input_shapes):
        return (input_shapes[0][0], input_shapes[0][2])


class MultiplyLayer(L.MergeLayer):
    # inputs[0]: B * P * 2D
    # inputs[1]: B * P
    def get_output_for(self, inputs, **kwargs):
        return T.sum(inputs[0] * inputs[1].dimshuffle(0, 1, 'x'), axis=1)

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]

class BilinearAttentionLayer(lasagne.layers.MergeLayer):
    """
    A bilinear attention layer.
    incomings[0]: batch x len x h
    incomings[1]: batch x h
    """
    def __init__(self, incomings, num_units,
                 mask_input=None,
                 init=lasagne.init.Uniform(), **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(BilinearAttentionLayer, self).__init__(incomings, **kwargs)
        self.num_units = num_units
        if 'name' not in kwargs:
            self.W = self.add_param(init, (self.num_units, self.num_units), name='W_bilinear')
        else:
            self.W = self.add_param(init, (self.num_units, self.num_units), name='W_bilinear_{}'.format(kwargs['name']))

    def get_output_shape_for(self, input_shapes):
        return input_shapes[1]

    def get_output_for(self, inputs, **kwargs):
        # inputs[0]: batch * len * h
        # inputs[1]: batch * h
        # W: h * h
        M = T.dot(inputs[1], self.W).dimshuffle(0, 'x', 1)
        alpha = T.nnet.softmax(T.sum(inputs[0] * M, axis=2))
        if len(inputs) == 3:
            alpha = alpha * inputs[2]
            alpha = alpha / (alpha.sum(axis=1).reshape((alpha.shape[0], 1)) + 1e-8) * inputs[2]
        return T.sum(inputs[0] * alpha.dimshuffle(0, 1, 'x'), axis=1)

class BilinearAttentionMatLayer(lasagne.layers.MergeLayer):
    """
    A bilinear attention layer.
    incomings[0]: batch x len x h
    incomings[1]: batch x h
    """
    def __init__(self, incomings, num_units,
                 mask_input=None,
                 init=lasagne.init.Uniform(), **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(BilinearAttentionMatLayer, self).__init__(incomings, **kwargs)
        self.num_units = num_units
        if 'name' not in kwargs:
            self.W = self.add_param(init, (self.num_units, self.num_units), name='W_bilinear')
        else:
            self.W = self.add_param(init, (self.num_units, self.num_units), name='W_bilinear_{}'.format(kwargs['name']))

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0][:2]

    def get_output_for(self, inputs, **kwargs):
        # inputs[0]: batch * len * h
        # inputs[1]: batch * h
        # W: h * h
        M = T.dot(inputs[1], self.W).dimshuffle(0, 'x', 1)
        alpha = T.nnet.softmax(T.sum(inputs[0] * M, axis=2))
        if len(inputs) == 3:
            alpha = alpha * inputs[2]
            alpha = alpha / (alpha.sum(axis=1).reshape((alpha.shape[0], 1)) + 1e-8) * inputs[2]
        return alpha

class BilinearDotLayer(lasagne.layers.MergeLayer):
    """
    A bilinear attention layer.
    incomings[0]: batch x len x h
    incomings[1]: batch x h
    """
    def __init__(self, incomings, num_units,
                 mask_input=None,
                 init=lasagne.init.Uniform(), **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(BilinearDotLayer, self).__init__(incomings, **kwargs)
        self.num_units = num_units
        if 'name' not in kwargs:
            self.W = self.add_param(init, (self.num_units, self.num_units), name='W_bilinear')
        else:
            self.W = self.add_param(init, (self.num_units, self.num_units), name='W_bilinear_{}'.format(kwargs['name']))

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0][:2]

    def get_output_for(self, inputs, **kwargs):
        # inputs[0]: batch * len * h
        # inputs[1]: batch * h
        # inputs[2]: batch * len
        # W: h * h
        M = T.dot(inputs[1], self.W).dimshuffle(0, 'x', 1)  # batch * 1 * h
        alpha = T.nnet.softmax(T.sum(inputs[0] * M, axis=2))  # batch * len
        if len(inputs) == 3:
            alpha = alpha * inputs[2]
            alpha = alpha / (alpha.sum(axis=1).reshape((alpha.shape[0], 1)) + 1e-8) * inputs[2]
        return alpha

class BilinearDotLayerTensor(lasagne.layers.MergeLayer):
    """
    A bilinear attention layer.
    incomings[0]: batch x len x h
    incomings[1]: batch x len x h
    """
    def __init__(self, incomings, num_units,
                 mask_input=None,
                 init=lasagne.init.Uniform(), **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(BilinearDotLayerTensor, self).__init__(incomings, **kwargs)
        self.num_units = num_units

    def get_output_shape_for(self, input_shapes):
        return input_shapes[0][:2]

    def get_output_for(self, inputs, **kwargs):
        alpha = T.nnet.softmax(T.sum(inputs[0] * inputs[1], axis=2))
        return alpha

class DotProductAttentionLayer(lasagne.layers.MergeLayer):
    """
    A dot-product attention layer.
    incomings[0]: batch x len x h
    incomings[1]: batch x h
    """
    def __init__(self, incomings, mask_input=None, **kwargs):
        if len(incomings) != 2:
            raise NotImplementedError
        if mask_input is not None:
            incomings.append(mask_input)
        super(DotProductAttentionLayer, self).__init__(incomings, **kwargs)

    def get_output_shape_for(self, input_shapes):
        return input_shapes[1]

    def get_output_for(self, inputs, **kwargs):
        # inputs[0]: batch * len * h
        # inputs[1]: batch * h
        # mask_input (if any): batch * len
        alpha = T.nnet.softmax(T.sum(inputs[0] * inputs[1].dimshuffle(0, 'x', 1), axis=2))
        if len(inputs) == 3:
            alpha = alpha * inputs[2]
            alpha = alpha / alpha.sum(axis=1).reshape((alpha.shape[0], 1))
        return T.sum(inputs[0] * alpha.dimshuffle(0, 1, 'x'), axis=1)
| 36.21934 | 120 | 0.585661 | 2,074 | 15,357 | 4.162488 | 0.07811 | 0.034055 | 0.052821 | 0.033013 | 0.728368 | 0.711109 | 0.69883 | 0.695934 | 0.679022 | 0.670451 | 0 | 0.02587 | 0.277593 | 15,357 | 423 | 121 | 36.304965 | 0.752299 | 0.117666 | 0 | 0.629771 | 0 | 0 | 0.01595 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.183206 | false | 0 | 0.01145 | 0.09542 | 0.423664 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
53d3dd409561952e91405666cc893ee5a1f29e23 | 1,328 | py | Python | core/models.py | Guilehm/Mercado | 4273dc3e71319db325ddda22d4b518177919f0eb | [
"MIT"
] | null | null | null | core/models.py | Guilehm/Mercado | 4273dc3e71319db325ddda22d4b518177919f0eb | [
"MIT"
] | null | null | null | core/models.py | Guilehm/Mercado | 4273dc3e71319db325ddda22d4b518177919f0eb | [
"MIT"
] | 1 | 2018-07-23T19:56:16.000Z | 2018-07-23T19:56:16.000Z | from django.db import models
from django.core.validators import MinValueValidator
# Create your models here.
class Cliente(models.Model):
    cliente = models.CharField('Nome Cliente', max_length=15)
    cpf = models.CharField('CPF', max_length=11)

    def __str__(self):
        return self.cliente


class Produto(models.Model):
    produto = models.CharField('Nome Produto', max_length=20)
    descricao = models.TextField('Descrição', max_length=500)
    preco = models.DecimalField('Preço', decimal_places=2, max_digits=7, validators=[MinValueValidator(0.01)])

    def __str__(self):
        return self.produto


class Pedido(models.Model):
    cliente = models.ForeignKey(Cliente, on_delete=models.CASCADE)
    criado = models.DateField('Criado em', auto_now_add=True)
    modificado = models.DateField('Modificado em', auto_now_add=False, auto_now=True)

    def __str__(self):
        return str(self.id)


class DetalhePedido(models.Model):
    pedido = models.ForeignKey(Pedido, on_delete=models.CASCADE)
    produto = models.ForeignKey(Produto, on_delete=models.CASCADE)
    quantidade = models.PositiveSmallIntegerField('quantidade')
    preco = models.DecimalField('Preço', decimal_places=2, max_digits=7)

    def __str__(self):
        return str(self.pedido) | 34.947368 | 116 | 0.704066 | 160 | 1,328 | 5.64375 | 0.36875 | 0.046512 | 0.044297 | 0.070875 | 0.21041 | 0.166113 | 0.115172 | 0.115172 | 0.115172 | 0.115172 | 0 | 0.014815 | 0.186747 | 1,328 | 38 | 117 | 34.947368 | 0.821296 | 0.018072 | 0 | 0.153846 | 0 | 0 | 0.059862 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.076923 | 0.153846 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
53d4a62b8da9b6712f642fb00a7e3e4c93da2035 | 1,582 | py | Python | model.py | wmqkk/gcnn2.0 | 25ba4d655ef835192908eece13ddc9ecaacdea6a | [
"MIT"
] | 7 | 2021-02-04T12:45:41.000Z | 2021-07-03T14:43:31.000Z | model.py | wmqkk/gcnn2.0 | 25ba4d655ef835192908eece13ddc9ecaacdea6a | [
"MIT"
] | null | null | null | model.py | wmqkk/gcnn2.0 | 25ba4d655ef835192908eece13ddc9ecaacdea6a | [
"MIT"
] | null | null | null | import tensorflow as tf
import numpy as np
from layers import *
class gcnnmodel(tf.Module):
    def __init__(self):
        super(gcnnmodel, self).__init__()
        filters = 64  # Number of convolution kernels
        self.layer = []
        self.layer.append(GraphConvolution(shape=(4, filters), dropout=0, name='layer1'))
        self.layer.append(GraphConvolution(shape=(filters, filters), dropout=0, name='layer2'))
        self.layer.append(GraphConvolution(shape=(filters, filters), dropout=0, name='layer3'))
        self.layer.append(Dense(shape=(filters, 3), dropout=0, name='layer4'))

    def __call__(self, features, L):
        x = features
        n = len(L)
        x = tf.cast(x, dtype=tf.float64)
        for i in range(len(self.layer) - 1):
            x = self.layer[i](x, L, n)
        x = self.layer[-1](x, n)
        x = tf.convert_to_tensor(x)
        return x


class gcnmodel(tf.Module):
    def __init__(self):
        super(gcnmodel, self).__init__()
        filters = 32  # Number of convolution kernels
        self.layer = []
        self.layer.append(GraphConvolution(shape=(4, filters), dropout=0, name='layer1'))
        self.layer.append(GraphConvolution(shape=(filters, filters), dropout=0., name='layer2'))
        self.layer.append(GraphConvolution(shape=(filters, 2), dropout=0., act=lambda x: x, name='layer3'))

    def __call__(self, features, L):
        x = features
        n = 1
        x = tf.cast(x, dtype=tf.float64)
        for i in range(len(self.layer)):
            x = self.layer[i](x, L, n)
        return tf.nn.softmax(x)
| 33.659574 | 106 | 0.610619 | 211 | 1,582 | 4.454976 | 0.28436 | 0.134043 | 0.111702 | 0.197872 | 0.703191 | 0.703191 | 0.652128 | 0.62234 | 0.558511 | 0.558511 | 0 | 0.024227 | 0.243363 | 1,582 | 46 | 107 | 34.391304 | 0.761069 | 0.037295 | 0 | 0.388889 | 0 | 0 | 0.027704 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.083333 | 0 | 0.305556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
53df7d856f98a3d9af332f004ddfbe41acfbec90 | 413 | py | Python | compose.py | raokiey/Data-augmentation-for-object-detection-Pytorch | cdec924e6a21861f1e3577d5f5c32dd041b493bd | [
"MIT"
] | null | null | null | compose.py | raokiey/Data-augmentation-for-object-detection-Pytorch | cdec924e6a21861f1e3577d5f5c32dd041b493bd | [
"MIT"
] | null | null | null | compose.py | raokiey/Data-augmentation-for-object-detection-Pytorch | cdec924e6a21861f1e3577d5f5c32dd041b493bd | [
"MIT"
] | null | null | null | class Compose(object):
"""Composes several transforms together for object detection.
Args:
transforms (list of ``Transform`` objects): list of transforms to compose.
"""
def __init__(self, transforms):
self.transforms = transforms
def __call__(self, image, target):
for t in self.transforms:
image, target = t(image, target)
return image, target
| 27.533333 | 82 | 0.641646 | 46 | 413 | 5.586957 | 0.521739 | 0.171206 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.266344 | 413 | 14 | 83 | 29.5 | 0.848185 | 0.348668 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
53e0a2d0d94420b5afd89c431446d7e0cf672f3f | 409 | py | Python | Test/contiguousSubarray.py | Akash671/coding | 4ef047f8e227074b660a2c7b41aefa377fdc0552 | [
"MIT"
] | null | null | null | Test/contiguousSubarray.py | Akash671/coding | 4ef047f8e227074b660a2c7b41aefa377fdc0552 | [
"MIT"
] | null | null | null | Test/contiguousSubarray.py | Akash671/coding | 4ef047f8e227074b660a2c7b41aefa377fdc0552 | [
"MIT"
] | null | null | null | def subarraysCountBySum(a, k, s):
    ans = 0
    n = len(a)
    t = 0
    ii = 1
    while k + t <= n:
        tmp = []
        for i in range(t, ii + t):
            tmp.append(a[i])
        print(tmp)
        ii += 1
        if len(tmp) <= k and sum(tmp) == s:
            ans += 1
            t += 1
        else:
            break
    return ans


a = list(map(int, input().split()))
k, s = map(int, input().split())
print(subarraysCountBySum(a, k, s))
| 18.590909 | 38 | 0.479218 | 67 | 409 | 2.925373 | 0.462687 | 0.030612 | 0.214286 | 0.22449 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02214 | 0.337408 | 409 | 21 | 39 | 19.47619 | 0.701107 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0 | 0 | 0.1 | 0.1 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |